Originally posted by reever2
If you take one site's word over everybody else's, sure...
Ahh, baron's post just went away.
Well, I think HardOCP came to a sensible conclusion: even if there are minor differences in filtering (slightly favoring the FX 5900 without AF and slightly favoring the 9800 with AF), you can only see those differences in zoomed-in Photoshopped screenshots; in actual gameplay, according to them, you can't tell the difference. They seemed to go to great lengths to detect differences too, but in the end couldn't while playing, so it's a moot point.
HardOCP has been very pro-ATI in the past, so I wouldn't call it an Nvidia-biased site, and they seem to be approaching this in a sensible way: comparing the cards against each other while actually playing, as opposed to studying Photoshopped screencaps...
And yeah, there are a lot of posts on this already, but I thought this was a pretty cool article because it's brand new, done by one of the 'big' sites, and very in-depth. Plus, it specifically addressed the filtering issue and the optimizations everyone has been talking about recently. I know I've read in many threads that Nvidia was 'cheating' in UT and/or offering inferior filtering, but this article is a good counter to that point, since their direct A/B comparisons of actual gameplay found no noticeable differences, and the screenshot comparisons they studied went one way without AF and the other way with AF.
In other words, both cards play Unreal just as well, regardless of how the drivers handle the filtering.