OT: Help with video cards


mikeszekely

Recommended Posts

Mods, I hope you can find it in your heart to leave this up for a day or two. :(

Anyway, I currently have a GeForce 4 MX 440... a video card known mostly for its mediocrity. But, it does happen to have a DVI out.

And, a few weeks ago, I bought a 52" widescreen HDTV. DLP. Very nice. With an HDMI in.

While browsing, of all places, Wal-Mart, I found a nicely priced 6' Philips DVI to HDMI cable... and I found myself thinking, "wouldn't it be awesome to use my TV as a 52" monitor?" So, I bought the cable, plugged it in, and there it was. MacrossWorld with a life-sized AgentOne.

But all was not well! I can only see part of the image. After tinkering with various settings, I've determined that my video card has no support for widescreen resolutions.

Rather than write it off as a pipe dream or wishful thinking, I've decided to replace my video card. Alas, I must confess that I'm not very knowledgeable about such things, so I thought I'd turn to the more knowledgeable members of the only web community I belong to.

So, I leave it to you... I need a relatively inexpensive video card (say, no more than $150) with DVI out, 16:9 support, AGP, and preferably at least 128MB of onboard RAM.

Suggestions?

Link to comment
Share on other sites

I got a GeForce 6600GT for about $150. Avoid the 6200s if you go for nVidia (they're intended as entry-level cards, so they don't have much horsepower), but the 6600 and 6600GT are great cards. You can get a vanilla 6600 for well under $150 as well. The GTs hover in the $150-and-up range. The current tops are the 6800 and the new 7800 cards. The 6600 is the current mid-level card, if you can call it mid-level, as the card laughs at Doom 3.

Go to Newegg.com and take a peek at the current pricing. Way lower there than in most retail stores. I like nVidia better than ATI personally, but ATI's cards aren't bad either. ATI's Radeon X700 is, I believe, the one that's on par with the 6600s.

This is the card I'm currently running, a PNY 6600GT. It's on sale this weekend too: $145 plus a $20 rebate. I hate rebates, but that's another story. $145 alone is a good price.

I swear by newegg. Great PC parts store.

Edited by Anubis
Link to comment
Share on other sites

That depends on how much video RAM you have and what OS you're using. The GPU doesn't matter that much in this case.

First of all, don't get the cable from Wal-Mart. Get the AR one at Best Buy; the quality is much better. Trust me, it works pretty well on my DLP.

Second, if your video card has less than 256MB of RAM, don't try any resolution other than VGA, but I may be wrong. I like Newegg a lot, and the shipping is really fast.

My suggestion is to pair an nVidia card with an AMD CPU or a non-Intel north/south bridge, and an ATI card with an Intel CPU and Intel chipset; this seems to be more stable from what I have seen.

Third, if you are running Windows XP, you can set widescreen from Display Properties. Otherwise, your only choice is 4:3.

Lastly, the DVI out from the video card is for a monitor; you can connect it to the TV's DVI port or the 15-pin video port. HDMI can only be used to input HD signals (upconverted DVD, tuner, etc.), unless you get a high-end card with HD output; I don't know of one, but I'm sure they exist. Or you can just switch to a Mac or a Unix box.

Link to comment
Share on other sites

But all was not well!  I can only see part of the image.  After tinkering with various settings, I've determined that my video card has no support for widescreen resolutions.

Rather than write it off as a pipe dream or wishful thinking, I've decided to replace my video card. 

I was under the impression that most widescreen TVs had an inverse-letterbox mode for the proper display of 4:3 images. Is there a reason you can't do that instead?

First of all, don't get the cable from Wal-Mart. Get the AR one at Best Buy; the quality is much better. Trust me, it works pretty well on my DLP.

It works pretty well. So would a ratty pile of 20-year-old speaker wire.

We're dealing with DIGITAL signals here. Line quality is MEANINGLESS.

Your signal is perfect or non-existent. There IS NO in-between.

EVEN if that mattered, what store the cable came from is not important, merely the brand.

Second, if your video card has less than 256MB of RAM, don't try any resolution other than VGA, but I may be wrong. I like Newegg a lot, and the shipping is really fast.

You are indeed wrong. It doesn't take much RAM at all to get the GUI past 640*480*8. Not even a meg.
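
For what it's worth, the back-of-the-envelope math is easy to sketch out (a quick illustrative Python snippet; the modes below are just example numbers, not anything measured off a card):

# Back-of-the-envelope framebuffer sizes for a single uncompressed frame.
def framebuffer_mb(width, height, bits_per_pixel):
    # bytes per frame, converted to megabytes
    return width * height * bits_per_pixel / 8 / 2**20

print(framebuffer_mb(640, 480, 8))    # ~0.29 MB -- well under a meg
print(framebuffer_mb(1280, 720, 32))  # ~3.5 MB -- trivial even on a 64MB card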

Third, if you are running Windows XP, you can set widescreen from Display Properties. Otherwise, your only choice is 4:3.

Again you're wrong. My 2k and 98 boxes BOTH have the option of non-4:3 ratios.

Hell, even before it was an OS, Windows supported non-4:3 resolutions.

1280*1024 springs to mind as one that virtually everything supports.

It's got very little to do with the OS, and a whole lot to do with video chipset and drivers.

Lastly, the DVI out from the video card is for a monitor; you can connect it to the TV's DVI port or the 15-pin video port. HDMI can only be used to input HD signals (upconverted DVD, tuner, etc.), unless you get a high-end card with HD output; I don't know of one, but I'm sure they exist. Or you can just switch to a Mac or a Unix box.

Wrong AGAIN. DVI and HDMI carry EXACTLY the same digital video signals. They are ELECTRICALLY IDENTICAL for this task.

FURTHERMORE, HDMI accepts any signal of the appropriate encoding format, not just HDTV signals.

480i, while not HD in the least, is accepted through HDMI. You can even get signals GREATER THAN HD through an HDMI connector, though the odds that your specific TV is capable of dealing with them are somewhat low (possible, but not probable).

ANY video card capable of going past 1024*768 has HD output. HD just means the image has 720 or 1080 lines of resolution. Nothing special there at all.

If you have a TV with RGBHV inputs, you can buy a VGA->BNC cable and use that on ANY video card with an analog VGA connector. Or a VGA->BNC cable plus a DVI->VGA adapter.

But most TVs don't have RGBHV. They usually have component, DVI, and/or HDMI. If you go overseas they have RGBHV, but not through the 5BNC setup I was talking about. So it's fortunate we're dealing with DVI and HDMI here.

As far as the 15-pin video port, that's a standard VGA connector. BUT DVI ALSO carries analog VGA signals on it. So AGAIN it's a matter of a small adapter.

And how in the hell would changing OSes from Windows to Linux fix the absurd hardware limitations you claim exist (which, fortunately for enthusiasts everywhere, do not)?

That is all.

Edited by JB0
Link to comment
Share on other sites

Excuse the double post, but...

Your videocard SHOULD be able to do wide-screen resolutions. Hell, I'm running a GeForce 2 MX and it can do 'em.

Somewhere Windows has an option to "hide video modes this monitor can't display." Find it and kill it. That will likely triple the number of resolutions you have available to choose from.

And after talking to a videophile... your DLP is probably 1280*720 native resolution. So shoot for 1280*720 once you get your resolutions unlocked.

Link to comment
Share on other sites

Avoid the 6200s if you go for nVidia (they're intended as entry-level cards, so they don't have much horsepower)

Well, I don't really game on my PC, so I doubt that I need much horsepower. I just figured that if I was going to invest, I might as well make sure it's really an upgrade. When browsing the net, I was actually looking at the 6200. As I live in an apartment and have a job where I work odd hours, I don't like to shop online, and Best Buy is simply asking too much for the 6600. I do appreciate the suggestion, though. I may consider ordering online if I'll save that much.

Excuse the double post, but...

Excused, on account of you saved me the trouble. ;)

Your videocard SHOULD be able to do wide-screen resolutions. Hell, I'm running a GeForce 2 MX and it can do 'em.

I'm given to understand that, although newer, the GeForce 4 MX 440 was a step down from the GeForce 2.

Mind you, I'm not really looking at the options it was giving me when I had it hooked up to my 17" LCD via VGA. The computer is currently hooked up to the TV, via DVI.

Somewhere Windows has an option to "hide video modes this monitor can't display." Find it and kill it. That will likely triple the number of resolutions you have available to choose from.

I assume you mean under the monitor tab, where it says screen refresh rate? Not only is the box not checked, but it's (oh, what's the term...?) not available to be checked.

Under the adaptor tab, where it lists info on the card, there's a "List All Modes" button. My choices are:

640x480, 256 colors, 60Hz

800x600, 256 colors, 60Hz

640x480, High Color (16-bit), 60Hz

800x600, High Color (16-bit), 60Hz

640x480, True Color (32-bit), 60Hz

800x600, True Color (32-bit), 60Hz (currently using this one)

640x480, 16 Colors, Default Refresh

800x600, 16 Colors, Default Refresh

Under the nVidia tab, there is nothing useful. Mostly driver info, plus some options for full screen overlay that are blanked out.

If it helps you dig up some info, JB0, the TV is a Toshiba 52HM85, 60Hz, NTSC standard. My computer's got a 2.6GHz Intel Pentium 4 with HT, 512mb RAM. DirectX is at 9.something. OS is Windows XP Home SP1 (I just don't trust SP2).

Link to comment
Share on other sites

Somewhere Windows has an option to "hide video modes this monitor can't display." Find it and kill it. That will likely triple the number of resolutions you have available to choose from.

I assume you mean under the monitor tab, where it says screen refresh rate? Not only is the box not checked, but it's (oh, what's the term...?) not available to be checked.

Go to the Control Panel -> Display -> Settings (click on Advanced) -> Monitor and the box should be there.

And if you don't have the latest drivers for your video card, get them.
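
And if you want to double-check exactly which modes the driver is reporting (rather than squinting at that dialog), something like this rough sketch works with the pywin32 package installed; purely illustrative, not required for the fix:

# List every display mode the driver reports for the primary display (pywin32 sketch).
import win32api

modes = set()
i = 0
while True:
    try:
        dm = win32api.EnumDisplaySettings(None, i)  # None = primary display
    except win32api.error:
        break  # no more modes to enumerate
    modes.add((dm.PelsWidth, dm.PelsHeight, dm.BitsPerPel, dm.DisplayFrequency))
    i += 1

for width, height, depth, hz in sorted(modes):
    print("%dx%d, %d-bit, %dHz" % (width, height, depth, hz))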

Link to comment
Share on other sites

I'm given to understand that, although newer, the GeForce 4 MX 440 was a step down from the GeForce 2.

Actually it's pretty much identical.

GeForce4 MX IS a GeForce 2. And in terms of performance, it's just a hair better than the original GeForce2 MX.

Needless to say, when people found their GeForce 4s performing on par with the bottom-end card from two generations ago, after they'd shelled out cash for a current-gen card, hell was raised.

Mind you, I'm not really looking at the options it was giving me when I had it hooked up to my 17" LCD via VGA.  The computer is currently hooked up to the TV, via DVI.

I hate it when Windows tries to be helpful.

Somewhere Windows has an option to "hide video modes this monitor can't display." Find it and kill it. That will likely triple the number of resolutions you have available to choose from.

I assume you mean under the monitor tab, where it says screen refresh rate? Not only is the box not checked, but it's (oh, what's the term...?) not available to be checked.

Under the adaptor tab, where it lists info on the card, there's a "List All Modes" button. My choices are:

640x480, 256 colors, 60Hz

800x600, 256 colors, 60Hz

640x480, High Color (16-bit), 60Hz

800x600, High Color (16-bit), 60Hz

640x480, True Color (32-bit), 60Hz

800x600, True Color (32-bit), 60Hz (currently using this one)

640x480, 16 Colors, Default Refresh

800x600, 16 Colors, Default Refresh

Under the nVidia tab, there is nothing useful. Mostly driver info, plus some options for full screen overlay that are blanked out.

What the... *BOOM!*

...

Sorry, too much Worms. But that's just farted up.

I'm inclined to run with nemesis' suggestion. What version of the drivers are you running?

If it helps you dig up some info, JB0, the TV is a Toshiba 52HM85, 60Hz, NTSC standard.  My computer's got a 2.6GHz Intel Pentium 4 with HT, 512mb RAM.  DirectX is at 9.something.  OS is Windows XP Home SP1 (I just don't trust SP2).

*sniffles*

Everyone has a better computer than me...

...

Except video cards. My primary box has an FX5900 Blowdryer that I got free.

Edited by JB0
Link to comment
Share on other sites

Go to the Control Panel -> Display -> Settings (click on Advanced) -> Monitor and the box should be there.

Or you could right-click on your desktop, then select properties. ;)

Anyways, that's where I've been, and I saw what you were talking about. And like I said, it's blanked out (but it's not checked anyway).

I'm inclined to run with nemesis' suggestion. What version of the drivers are you running?

Well, I suppose that updating my drivers would be cheaper than buying a new video card. Of course, updating my drivers might solve the problem, which would eliminate the excuse to go out and buy a new video card...

*takes a peek at driver info*

4.3.0.3. Dated 3-3-2003.

Well, I'm off to see if I can find an updated driver. But please keep the video card recommendations coming in the meantime.

Link to comment
Share on other sites

You can google PowerStrip, a power-user utility for making customized resolutions or forcing your card to display at your HDTV's resolution. You can download the trial, or search for a copy (BitTorrent).

Here's a quick guide:

HTPCNews Powerstrip 101 guide

Just take your time and read through it; it's not that hard. The latest versions of PowerStrip have all these great predefined resolutions built in that you can choose from. Just make sure you choose the one that matches your HDTV's native resolution.
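
If you're curious what a "custom resolution" actually is under the hood: the refresh rate is just the pixel clock divided by the total (active plus blanking) pixels per frame. A minimal sketch, using the standard CEA 720p timing as the example (these numbers come from the spec, not from the guide above):

# Refresh rate of a display timing: pixel clock divided by total pixels per frame.
def refresh_hz(pixel_clock_mhz, h_total, v_total):
    return pixel_clock_mhz * 1e6 / (h_total * v_total)

# Standard 720p timing: 1280x720 active, 1650x750 total, 74.25 MHz pixel clock.
print(refresh_hz(74.25, 1650, 750))  # -> 60.0 Hz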

Link to comment
Share on other sites

MikeZ,

If you are unable to get it working using your current video card and need to perform a cheap upgrade, I'd second the choice of a 6600GT. Best bang for the buck in that price bracket. An ATI 9800Pro is a very close runner-up and is still a fantastic card even now. My old one still chugs along quite happily in my old P4 2.6 LAN rig.

If you really wanted to make the most of that widescreen HDTV, $300 will buy you an X800XT All-In-Wonder. I bought one of these to replace my old 9800Pro in my current P4 3.4 rig, and the number of features and options is simply staggering. It gives you TV and FM tuners, a multifunction RF remote, innumerable A/V in and out options, time-shifted recording of live video and audio (basically a TIVO unit integrated on-card), and is very flexible in display setup. Paired with your 52" TV, it would be really sweeeet.

Good Luck!

Link to comment
Share on other sites

What really irks me about the new slew of video cards is that they are all PCI-Express... like the new nVidia cards only come in PCI-Express, which would mean that I would have to get a whole new motherboard just to get a video card upgrade from my 6800GT.

Twich

Link to comment
Share on other sites

Well, I will probably still shop around for a new video card, but nemesis and JB0's suggestion to update the driver seems to have worked. I went to 7.8.0.1 (8-2-05), and I'm given A LOT more choices to work with. And although a shade from each edge of my desktop is still offscreen, I can see all my icons, my Start Menu, and my taskbar again.

Still doesn't quite look as good as I'd hoped, but I'm going to play around with the settings some more.

Thanks for your time, guys, and mods, feel free to close this one.

Link to comment
Share on other sites

Well, I will probably still shop around for a new video card, but nemesis and JB0's suggestion to update the driver seems to have worked.  I went to 7.8.0.1 (8-2-05), and I'm given A LOT more choices to work with.  And although a shade from each edge of my desktop is still offscreen, I can see all my icons, my Start Menu, and my taskbar again.
I'm inclined to say the last inch or two is from "overscan" on the TV.

Not REALLY overscan, since there's no scan, but they're pr'ly projecting the image a little larger than the screen surface, creating a similar effect.
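
As a ballpark for how much that hides at 1280*720 (the 3% per edge here is just a common guess, not a measurement of that Toshiba):

# Pixels hidden by a given per-edge overscan percentage at a given resolution.
def hidden_pixels(width, height, overscan_pct):
    hidden_w = int(width * overscan_pct / 100) * 2   # left + right edges
    hidden_h = int(height * overscan_pct / 100) * 2  # top + bottom edges
    return hidden_w, hidden_h

print(hidden_pixels(1280, 720, 3))  # -> (76, 42) pixels lost at ~3% per edge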

I assume from the version #s that you're using the card manufacturer's drivers(got too many decimals for it to be nVidia's).

Dunno what options are or aren't available in the various variants, but nVidia's reference drivers have a screen adjustment option that might get you the last inch or 2.

...

Or not. My GeForce2 only has position controls, not squish/stretch.

Still doesn't quite look as good as I'd hoped, but I'm going to play around with the settings some more.

Thanks for your time, guys, and mods, feel free to close this one.

Good luck!

Edited by JB0
Link to comment
Share on other sites

Using the latest nVidia drivers is usually the best bet for most problems. But keep in mind nVidia optimizes their drivers for the most current generation of cards, so you might not get boosts to that old GeForce4 MX.

As for the DVI output, since it seems you're having problems with the DVI-to-HDMI adaptor, your best bet is to get a DVI-to-VGA adaptor. I bought mine for like $2 so I could hook up an old SVGA monitor and run dual monitors on my ghetto old Radeon 9600 Pro. It might make things a bit easier.

Link to comment
Share on other sites

Wrong AGAIN. DVI and HDMI carry EXACTLY the same digital video signals. They are ELECTRICALLY IDENTICAL for this task.

[...]

And how in the hell would changing OSes from Windows to Linux fix the absurd hardware limitations you claim exist (which, fortunately for enthusiasts everywhere, do not)?

That is all.

I love a man who knows the hole over yonder isn't his ass.

Anyway... not all cards higher than the GT are PCIe. I am currently running a 6800 Ultra AGP. (Much) more expensive than its PCIe counterpart, but it exists.

If you're not a hardcore gamer, a 6800 anything is too much. It's a waste of cash. If you just want something that supports all the newer drivers and DX 9.0c, get a 5500 with 256MB onboard. It's an inexpensive card that does everything. I think the only two games I couldn't play on it were Far Cry and BF2. HL2 ran just fine. Quit telling him to get something he doesn't need. :rolleyes:

And a shameless plug for me... I have the above card available for $75. Lol

- Chewie

Link to comment
Share on other sites

Ummm, I wasn't telling anyone to get anything. I was using this thread, which I believed was a discussion about video cards, to voice my concern about all the next-gen cards being PCI-Express. I have a GeForce 6800 GT OC with 256MB of RAM and it doesn't run BF2 at high settings, which sucks, as I am in a Battlefield 2 gaming clan and would like the best performance. So I was saying that I would have to buy a new motherboard before I could get the new video card. I was basically bitching because I think that sucks! Not telling anyone to get anything.

Twich

Link to comment
Share on other sites

Ummm, I wasn't telling anyone to get anything. I was using this thread, which I believed was a discussion about video cards, to voice my concern about all the next-gen cards being PCI-Express. I have a GeForce 6800 GT OC with 256MB of RAM and it doesn't run BF2 at high settings, which sucks, as I am in a Battlefield 2 gaming clan and would like the best performance. So I was saying that I would have to buy a new motherboard before I could get the new video card. I was basically bitching because I think that sucks! Not telling anyone to get anything.

Twich

Dirrr, it's not your graphics card. Just 'cause your graphics card is nice doesn't mean your machine can run anything. You need more than 512MB of RAM (1 gig is recommended), and an AMD CPU around 2.4GHz or higher is good since they are meant for games. So don't get pissed; every part of the computer matters. A 6800 GT will last you a long time, so keep it.

Link to comment
Share on other sites

I'm inclined to say the last inch or two is from "overscan" on the TV.

Not REALLY overscan, since there's no scan, but they're pr'ly projecting the image a little larger than the screen surface, creating a similar effect.

More or less. I used an option to trim the edges a little, but I still have a tiny bit offscreen. Oh well. Some of the fonts look weird too.

I assume from the version #s that you're using the card manufacturer's drivers(got too many decimals for it to be nVidia's).

Sorta. I pulled both versions by checking the drivers in the Device Manager. The updated driver, though, was downloaded from nVidia's site. I believe nVidia just leaves out some of the decimals, as you noted, and writes it as 78.01.

As for the DVI output, since it seems you're having problems with the DVI-to-HDMI adaptor, your best bet is to get a DVI-to-VGA adaptor

Well, two things.

First, the trouble never seemed to be the cable (it's not really an adaptor at all, just a 6' cable with a male DVI plug on one end and a male HDMI plug on the other). I've determined the majority of the issue to be the drivers, and the rest to simply be the video card.

Second, a DVI to VGA adaptor wouldn't do me much good. My TV doesn't have VGA inputs, just HDMI, component, s-video, and coaxial. If it had a VGA input, I could have just gone VGA to VGA, since my videocard has both VGA and DVI out.

Anyway... not all cards higher than the GT are PCIe. I am currently running a 6800 Ultra AGP. (Much) more expensive than its PCIe counterpart, but it exists.

The only thing that concerns me now is that my motherboard takes AGP. Although, if I do wind up buying a new video card now, my next motherboard will have to be AGP too. Unless, of course, I buy the motherboard first... then the new video card will be picked to fit it instead.

If you're not a hardcore gamer, a 6800 anything is too much. It's a waste of cash. If you just want something that supports all the newer drivers and DX 9.0c, get a 5500 with 256MB onboard. It's an inexpensive card that does everything. I think the only two games I couldn't play on it were Far Cry and BF2. HL2 ran just fine. Quit telling him to get something he doesn't need.

Ah, let's not be too hard on our fellows. I may only be a hardcore console gamer, but that doesn't mean I never try out PC games. Besides, games or not, I do like to tinker with things. Who's to say that I won't play with other applications that will wind up being demanding on the videocard?

In any case, since I'm not a hardcore PC gamer, I certainly won't shell out $400+ for a knock-your-socks-off videocard. But if I set a budget, I am more likely to go right up to the limit and get the best card at the max of my budget rather than play it frugal and only buy what I need.

Link to comment
Share on other sites

I assume from the version #s that you're using the card manufacturer's drivers(got too many decimals for it to be nVidia's).

Sorta. I pulled both versions by checking the drivers in the Device Manager. The updated driver, though, was downloaded from nVidia's site. I believe nVidia just leaves out some of the decimals, as you noted, and writes it as 78.01.

Other way around.

The card manufacturer drivers are based off nVidia's reference drivers. Some of them keep the nVidia numbering scheme, some add decimals, some pull totally new #s out of their ass.

The large version # is because the Detonator drivers have been going for quite a while. They started back on the TNT1 (hence the driver name).

Link to comment
Share on other sites

Other way around.

The card manufacturer drivers are based off nVidia's reference drivers. Some of them keep the nVidia numbering scheme, some add decimals, some pull totally new #s out of their ass.

The large version # is because the Detonator drivers have been going for quite a while. They started back on the TNT1 (hence the driver name).

I stand corrected, and the driver stands at 78.01 then.

Link to comment
Share on other sites

  • 1 month later...

So here I am, resurrecting a long-dead thread. I thought that when I upgraded the drivers I'd solved my problems and could live contented and free until the time came to upgrade the whole PC. Unfortunately, Obsidian decided to develop a sequel to the one PC game I ever truly cared for, Neverwinter Nights. I was looking at some screens and thinking to myself, "I really want to play that game, but I'll bet it won't run on my system!" And since the game is slated to come out in Q2 of 2006, I doubt I'll be ready to replace my whole PC by then, since the 2.6GHz Pentium 4 with HT that's in there will likely serve me fine until I'm ready to upgrade to a multi-core machine and Windows Vista.

And why put off the inevitable? Neverwinter Nights Diamond came out, and it's only $10 more than the Kingmaker expansion. I wanted to get Kingmaker anyway, and the extra $10 seems worthwhile to replace the multiple CDs for Neverwinter, Shadows, and Hordes with a DVD-ROM. Might as well upgrade now and max the visuals on that.

So, for around $80 at Newegg, I can buy two 512MB RAM sticks, and that'd take my system up to a gig and a half (no sense in going all the way to 2 gigs now, since I'd have to buy four 512 sticks and toss the two 256s that are already in there... stupid motherboard with its stupid paired RAM...). But processor and RAM only go so far...

Which brings me back to my shitty 64MB GeForce4 MX 440. The little bastard must go! And that, of course, brings me back to where I was when I started this thread... what to replace it with?

I've decided to stick with nVidia, simply because that's what's already in there. Naturally, it'd still have to be AGP since I didn't replace my motherboard. And it'd still have to have a DVI out, since I'm still using it on my TV. Despite the fact that I'm upgrading mostly because of a game, I still don't consider myself much of a PC gamer, so I still don't want to spend a lot of money (besides, the more I want to spend on upgrades, the more difficult it'll be to convince the other half to go along with the upgrades, especially since she doesn't use my computer).

Long story short, I've narrowed it down to the FX5200, the FX5500, the 6200A, or the 6200. I leave it to you guys to educate me on the pros and cons of those cards. I know earlier in this thread the 6600 was suggested, but once we get that high, we're getting a little pricey. I'd need a pretty convincing argument for that over the 6200.

Thanks in advance.

Link to comment
Share on other sites

The nVidia 6600 is a great card. I picked up a GeForce 6800 LE off of eBay, new, for very cheap; it's about a $200 card and I got it for a fraction of the price. You may think these cards are out of reach because of price, but if you look around you will be quite surprised at what you can find.

chris

Edited by zeo-mare
Link to comment
Share on other sites

These are my specs:

-Intel P4 2.8 GHz w/HTT

-1 GB DDR RAM

-128 MB DDR GeForce FX 5200

I've had no trouble playing all of the above-mentioned games. However, I've only been able to play them at the lowest resolution with the graphics settings set anywhere from Medium to High. I got this computer in August of 2003 and it was supposed to be top of the line back then. Thinking (back then) that I'd be set for at least 5 years, I'm already looking to upgrade my computer. A friend of mine, an advanced user, recommended that I get a GeForce 6600GT w/SLI. But in order for me to do that I'd need to upgrade my mobo, and if I were to do that I'd want to upgrade my CPU from what I have to an AMD 64 3200+ (or higher). I digress, though, getting too far off from the point I'm trying to make. The point I'm trying to make is that the GeForce 6600GT is the way to go, and just about everyone on this board (that's replied) seems to agree. If I were you, I'd upgrade your computer with an SLI mobo and buy an SLI card. That way, down the road when it comes time to upgrade again, all you'd need to do is buy the same card at a much cheaper price. Then you'd be able to watch all of your HD movies on your comp and play all the top-of-the-line games in the end. My two cents.

Edited by Oihan
Link to comment
Share on other sites

IMO, SLI's not worth it.

When the time comes to upgrade, buy a new card. They'll have added new features that aren't in the old card you're replacing.

SLI is a way for people with too much money to masturbate to higher resolutions by running a pair of top-end cards. And not much more than that.

Link to comment
Share on other sites

I would consider a GeForce 6600-series card. But since you are making only minor upgrades (and from the looks of it, low-to-mid-range upgrades) just to bring you up to spec, a 6200 would be fine... but I would definitely think about the 6600. The 5500 would definitely fare better than the 5200, so cross the FX5200 off the list. How much you are willing to pay would fine-tune my answer.

Link to comment
Share on other sites

IMO, SLI's not worth it.

When the time comes to upgrade, buy a new card. They'll have added new features that aren't in the old card you're replacing.

SLI is a way for people with too much money to masturbate to higher resolutions by running a pair of top-end cards. And not much more than that.

...How long has it taken us to finally get into Dual Core CPUs? ...I can't imagine new graphics cards getting any significant new features anytime soon. And if there happens to be some significant new feature...I'm sure the power of the second graphics card w/SLI can make up for that lack of features in the meantime. Going out on a limb here, but if anything, couldn't the power of the second card somehow emulate the features of the newer cards through coding?

Link to comment
Share on other sites

IMO, SLI's not worth it.

When the time comes to upgrade, buy a new card. They'll have added new features that aren't in the old card you're replacing.

SLI is a way for people with too much money to masturbate to higher resolutions by running a pair of top-end cards. And not much more than that.

...How long has it taken us to finally get into Dual Core CPUs?

It'd be more fair to ask how long we had Athlon XPs before we moved into A64s.

...I can't imagine new graphics cards getting any significant new features anytime soon. 

They add some with every generation.

Latest features, added for the GeForce 7 line, include "subsurface scattering" (simulates light penetrating an ordinarily opaque object, like your hand over a flashlight) and "radiosity" (light reflecting off of one object and onto another).

And if there happens to be some significant new feature...I'm sure the power of the second graphics card w/SLI can make up for that lack of features in the meantime.

Unless the game requires said feature. They USUALLY allow a couple of generations' worth of cushion, but sometimes...

Going out on a limb here, but if anything, couldn't the power of the second card somehow emulate the features of the newer cards through coding?

Nope.

Like I said, if it's time to upgrade I'd buy a newer card instead of digging out another one of the same kind.

Link to comment
Share on other sites

Latest features, added for the GeForce 7 line

"radiosity"(light reflecting off of one object and onto another).

Really?! How well done is it? Believable real-time radiosity/global illumination is one of the things that will really change the immersiveness and believability of 3D environments, at least when simulating real-life locations. Does it actually do much to make the lighting look near photo-real?

Link to comment
Share on other sites

Unless the game requires said feature. They USUALLY allow a couple of generations' worth of cushion, but sometimes...

That's what I was trying to get at, you could say. Anyway...

Link to comment
Share on other sites

Well, I think the important thing to remember is, I'm not looking to run top-of-the-line games. Not today, not ever. I know it sounds odd to dedicated PC gamers, but I don't play FPSes on PCs at all... just the occasional RPG. My computer runs Neverwinter Nights with all the max settings, and only lags in very large, busy areas like Neverwinter's central hub... and that's as-is. A gig and a half of memory should be plenty until it's time to replace the whole computer (and I'm not cannibalizing any RAM out of this computer... it's PC2700). It's also very likely that I won't cannibalize the video card either, since I'll probably invest in a newer one that's better designed to work with a newer mobo and dual-core CPU setup.

To that end, Az, I'm looking to get off as cheap as possible. The only real game I'm looking into buying for PC in, say, the next 12 months would be Neverwinter Nights 2. Granted, NWN2 isn't scheduled to release until July or so, so no concrete system requirements are out there, and it'd be hard to say what I'd actually need to run it at decent settings. But, to put it another way, the 6600 is a consideration only if I can find an excellent deal, and the 6600GT is simply out of the question. Once we start getting that expensive, I'd rather just wait until I start building my next PC.

Link to comment
Share on other sites

Just get what you can for now... the 6600 GT and a decent Athlon CPU will do you well.

I've seen Windows Vista running on an Athlon 64 and on a dual-core CPU and it still lags, so all of the stuff out now will be pretty much useless when Microsoft rolls out Vista anyway.

Link to comment
Share on other sites

Well, I think the important thing to remember is, I'm not looking to run top-of-the-line games. Not today, not ever. [...] To that end, Az, I'm looking to get off as cheap as possible. [...] The 6600 is a consideration only if I can find an excellent deal, and the 6600GT is simply out of the question.

Well... if you don't care to watch HD movies and DVDs and the like, then get an entry-level card. There's no point spending more if you're not going to be using it for anything else and your computer can handle NWN with the vid card you have. Just about any new entry-level video card should suit you well for what you seem to be asking. I still suggest the 6600GT.

Link to comment
Share on other sites

Latest features, added for the GeForce 7 line

"radiosity"(light reflecting off of one object and onto another).

Really?! How well done is it? Believable real-time radiosity/global illumination is one of the things that will really change the immersiveness and believability of 3D environments, at least when simulating real-life locations. Does it actually do much to make the lighting look near photo-real?

I have no idea.

I was just yanking it out of the bullet list on nVidia's webpage.

I'm running a GeForce 5900(-ish) myself. And that's because it was free, else I'd be stuck with a woefully inadequate GeForce2.

Link to comment
Share on other sites
