09-14-03, 01:58 AM   #99
saturnotaku
Apple user. Deal with it.
 
Join Date: Jul 2001
Location: The 'burbs, IL USA
Posts: 12,502

Quote:
Originally posted by Edge
I doubt the image quality will make much of a difference in "casual" gamers' eyes. Remember back in the GF2 days when the original Radeon looked MUCH better?
Yes, the original Radeon looked better - when games actually ran on it. This was back in the day when ATI couldn't put together a proper driver to save its soul. For the longest time the original Radeon didn't have working Windows 2000 drivers, and this was exactly when a lot of gamers were switching to Windows 2000 because of 1) its stability and 2) NT's vastly superior OpenGL support. This was right around the launch of Quake III, and for OpenGL there was only one name you needed to know: NVIDIA. Once ATI finally got a working Win2k driver out, I did some comparisons between the Radeon and my GeForce cards. Aside from the sky looking nasty with texture compression enabled (easily solved by turning TC off), the quality differences were barely perceptible - not enough to make anyone put up with ATI's (at the time) lousy driver support.
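
For the curious: in Quake III itself, turning TC off was (if memory serves) just a matter of setting r_ext_compressed_textures 0 in the console. At the OpenGL API level, the same choice boils down to which internal format the engine requests when uploading a texture. A rough sketch of that, assuming an era-appropriate ARB_texture_compression setup (the upload_sky_texture helper is hypothetical, just for illustration):

[code]
/* Minimal sketch: "texture compression on/off" is just the internal
 * format requested at upload time. Assumes an OpenGL 1.2-era context
 * with ARB_texture_compression (GL_COMPRESSED_RGB_ARB is from glext.h). */
#include <GL/gl.h>
#include <GL/glext.h>

void upload_sky_texture(const unsigned char *pixels, int w, int h,
                        int use_compression)
{
    /* GL_COMPRESSED_RGB_ARB lets the driver compress (S3TC on most
     * cards of the day, which is where the banded skies came from);
     * GL_RGB8 keeps the texture uncompressed. */
    GLint internal_format =
        use_compression ? GL_COMPRESSED_RGB_ARB : GL_RGB8;

    glTexImage2D(GL_TEXTURE_2D, 0, internal_format, w, h, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);
}
[/code]

The quality cost was mostly visible on smooth gradients like skies, where the 16-bit color endpoints in S3TC blocks show up as banding, which is why turning it off fixed that one case without hurting much else.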

But I do agree with you overall. Joe Sixpack who buys his FX5600 from Best Buy probably isn't going to see much of a difference between the GeForce and the Radeon. But today's cards are powerful enough that even the casual gamer can afford to put more emphasis on image quality than on all-out speed. Fortunately, ATI (and NVIDIA to a lesser extent) has cards out that offer a nice balance of the two.