Originally Posted by Moose
You are absolutely correct, Nvidia cards are at their very best when no AA or AF is used and there are no screenshots to show image quality. They also seem to excel whenever Anandtech reviews them.
As far as methodology, what methodology was used, since none is listed?
dunno.. which is why i said there's not anything "necessarily" wrong with it.. but that wasn't the point.
the point was that in many circumstances someone can point to a benchmark and say, "here's proof", and someone else can point to a benchmark that contradicts it, making any one benchmark useless in the first place. different driver revisions, game revisions.. any number of things can skew a benchmark any number of ways. a number of benchmarks DO use AA or AF and include screenshots to show image quality, and they paint a similar picture to anand's on the games tested.
if you look at ALL the info and the various benchmarks, one thing is clear and beyond argument (well, at least logical argument): despite nvidia's improvements in ps2.0 shader performance in later driver revisions, they are still clearly behind ati. the relevance could certainly be argued, as ps2.0 shader usage is still minimal.. however, in the cases where it is implemented, ati is clearly superior in both performance and quality overall (tho quality could be construed as subjective).
i mean... i've seen and compared stuff myself where, at higher resolutions or higher aa settings, nv performance is actually BETTER (tho this is not the overall case; most times they are neck and neck, excluding the ps2.0 issue). it could be due to game coding, bandwidth requirements (where nv's higher core/memory clocks provide an edge over ati), or whatever.. but again, that's the whole point of my saying there's no reason to get all up in arms over ONE benchmark, regardless of the results - it only shows one part of the equation, and under a specific circumstance at that. it means little by itself.