Originally posted by creedamd
A statement like you made does not help nvidia but actually just damages your image. I know that a lot of posts here are pro-ati, but it's because facts back them up. We all wish nvidia would get back into the game to have something to brag about as well.
Actually, his comment is one of the most rational in this thread, and if anything it bolsters his image. IQ differences that can only be detected by zooming in on screenshots with a paint program, or by any method other than watching the game in action with your naked eye, are 100% useless for comparing in-game IQ between the cards. They're useful for those curious about how each card handles a particular scene, filtering method, etc., but for in-game action they're meaningless. Why? Because if you can't see a difference with your naked eye while playing, how can you argue that it affects the game's graphical quality? The idea behind a game is to actually play it, not to pick it apart as if it were some digital art masterpiece and then wax poetic about which card does the best filtering in scene X at time Y when you zoom in at 4x and there's a full moon outside.

As for 'facts' backing up posts here, the basic premise behind many of them is flawed. And, to address your last point, Nvidia got back in the game once the FX5900 hit the street.
Re: why did I bring up UltraShadow? Because it's an optimization ATI could likely match in software, and it pertained to the analogy I made in that post. No 'backup arguments' needed; the original point stands: IQ differences that can't be seen while actually playing are meaningless to the game player.