Originally Posted by fivefeet8
Performance on these high-end cards in most games is well above what most would arguably call playable. This issue mainly affects benchmark comparisons.
Therein lies the problem. If these IQ-reducing optimizations were control panel settings you had to deliberately switch to, this wouldn't even be an issue.
The fact that they're on the default settings reviewers use to benchmark misleads consumers.
That said, I'm getting some very weird results trying to test this in games with built-in benchmarks, like Metro 2033 and Far Cry 2.
The average framerates were the same in both games at the default "Balanced" setting and at "Quality," two notches up, at 25X14 (although in FC2 the minimum was much higher on "Balanced," while curiously the maximum was lower). I added AA to confirm I was actually GPU-limited, which I'd expect at 25X16 anyway.
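The min/avg/max comparison above can be sketched as a tiny script. This is just an illustration of why minimum and maximum matter when averages look identical; the log format and numbers are hypothetical, not from the actual benchmark tools:

```python
# Minimal sketch: summarize a benchmark run from per-frame times.
# Assumes a list of frametimes in milliseconds; real tools (FC2/Metro
# benches) report their own summaries, so this is purely illustrative.

def fps_summary(frametimes_ms):
    """Return (min, avg, max) FPS from frametimes in milliseconds."""
    fps = [1000.0 / t for t in frametimes_ms if t > 0]
    return min(fps), sum(fps) / len(fps), max(fps)

# Two runs can look alike on average while differing at the extremes,
# which is why the Balanced vs. Quality min/max gap is worth noting.
steady = [16.0, 16.0, 16.0, 16.0]   # constant 62.5 fps
swingy = [10.0, 22.0, 10.0, 22.0]   # alternates ~45.5 and 100 fps

print(fps_summary(steady))
print(fps_summary(swingy))
```

A run with wide frametime swings shows up in the min/max spread even when the headline average hides it, which is exactly the pattern in the FC2 result above.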
The shimmer jumps right out at you in the Far Cry 2 bench, and in UT3.
If I disable Catalyst AI, it may also disable necessary game-specific fixes and invalidate the test. I'm starting to wonder whether the drivers somehow recognize the benchmark tool, because bumping the IQ setting up two notches should, in theory, lower fps.