Originally posted by Ruined
IQ differences that can only be detected by zooming in on pictures with paint programs, or by any other method than looking at the game in action with your naked eye, are 100% useless in terms of comparing in-game IQ differences between the cards.
Note that I said "only" - read the statement again. I'm saying that if you can't detect the differences in-game with your naked eye, then any IQ differences that can be found with paint programs, etc., are useless, because in the end you still cannot see them. What is not valid about that?
I assumed you said that because you thought it pertained to the UT2003 issue; am I mistaken about this? You have been asserting all along that the issue is NOT visible, right? I am saying that the UT2003 issue IS visible, without any of the measures listed above. If I misconstrued your intent, then I apologize. Otherwise, I stand by my assessment.
What he did say was that NVIDIA looked slightly superior with AF off and ATI looked slightly superior with AF on, but that the differences were so minor you wouldn't be able to see them while actually playing, presumably unless you stood there staring into the distance looking for whatever it was you were looking for. Not really a dangerous statement, just one that puts into perspective how minor the filtering differences are.
No, it is dangerous. You're still not seeing the gravity of what he said. Paraphrased, he succinctly said this: "I cannot see the issue while playing. That means nobody else can either." Since when did Brent become the law? Since when should a journalist make the blanket statement that just because he can't see something, nobody else can? I agree that it probably is not something that is always visible. But to say that it is never visible is ridiculous. Some people have claimed that FSAA and AF are not noticeable when actually playing a game, which I find ludicrous. Where does it end? There's nothing wrong with an individual saying that they cannot notice something while playing a game. The injustice is committed when they try to claim that their observations hold true for everyone else.
And again, my counterargument is that if this is in fact an optimization, and not a bug, I would prefer it, because it offers faster speed with IQ differences that, according to a major site, are undetectable during actual play. Maybe arguing for an option to revert to the standard method just for kicks would be in order, but if you owned an FX card, would you honestly use it if the reviewers themselves, looking for the differences, didn't notice any during play? I think the point they were making over at HOCP is that the IQ difference is being blown out of proportion: although it can be seen when studying screencaps, it's so minor that it isn't noticeable during actual play. You may disagree, but those are their unbiased findings (again, HOCP has been very hard on NV cards). You may feel differently if you have done a side-by-side comparison of both, but when one of the pickiest sites out there says it's no big deal, you have to wonder whether it's worth worrying about.
I do not feel that [H] is unbiased. Their whole attitude towards NVIDIA's NV3x lineup has been disappointing. Think what you like, but Kyle has shown time and time again that he is in NVIDIA's pocket, and he has never said anything about NVIDIA's methods, nor whether he thinks they are bad. I think Brent's and Pelly's past reviews have been fabulous, but I think they dropped the ball on this one, especially since they did not compare NVIDIA's faux trilinear to ATI's real trilinear. How are people supposed to gauge whether there is a difference, and how large it is?! Furthermore, the issue is clearly not a bug, because the filtering changes only occur when "ut2003.exe" is detected. I agree that it is an optimization. However, because it only exists in UT2003, which happens to be a popular benchmark, it is clearly an "optimization."
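For context on the "faux trilinear" point: full trilinear filtering blends bilinear samples from the two nearest mipmap levels across the entire LOD fraction, while a reduced ("brilinear"-style) approach only blends in a narrow band around each mip transition and falls back to plain bilinear elsewhere. Here is a minimal Python sketch of the difference; `sample_bilinear`, `band`, and the function names are illustrative assumptions, not anything from NVIDIA's actual drivers:

```python
def lerp(a, b, t):
    """Linear interpolation between a and b by fraction t."""
    return a + (b - a) * t

def full_trilinear(sample_bilinear, lod):
    """Real trilinear: blend the two nearest mip levels across the
    whole LOD fraction, so the transition is perfectly smooth."""
    level = int(lod)
    frac = lod - level
    return lerp(sample_bilinear(level), sample_bilinear(level + 1), frac)

def reduced_trilinear(sample_bilinear, lod, band=0.25):
    """Hypothetical reduced filtering: only blend within a narrow band
    around the mip transition; outside it, use pure bilinear from the
    nearest level. Cheaper, but can show visible mip banding."""
    level = int(lod)
    frac = lod - level
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if frac < lo:
        t = 0.0                      # pure bilinear from the nearer level
    elif frac > hi:
        t = 1.0                      # pure bilinear from the next level
    else:
        t = (frac - lo) / band       # short blend across the band only
    return lerp(sample_bilinear(level), sample_bilinear(level + 1), t)
```

The point of the comparison is that outside the blend band the reduced version returns a single mip level unmodified, which is exactly the kind of difference that shows up in colored-mipmap screenshots but may be hard to spot in motion.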
Please agree or disagree with the following statements:
Giving the user an IQ/performance tradeoff is a good thing.
There is something deceitful about changing filtering in one game without telling anyone, especially when users have been led to believe that the Quality setting in the drivers performs full trilinear filtering.
It is not a good thing that users who want full trilinear filtering are unable to get it.
NVIDIA should provide an option in the drivers that does real trilinear filtering if the application asks for it.