05-18-03, 12:07 PM · #14
gordon151 (Registered User · Join Date: Mar 2003 · Posts: 264)
Quote:
Originally posted by jimmyjames123
The fact of the matter is that the cameras in 3dmark don't get turned around. So NVIDIA is "optimizing" for this particular benchmark. It is certainly no secret that both NVIDIA and ATI optimize for this benchmark. Users of FX cards now get smoother and faster performance and better image quality in 3dmark03 in what we can see. This whole issue is a matter of perspective.
I don't think this can properly be called "optimizing," since it essentially alters the benchmark itself. Optimizing the drivers to improve execution of a specific code path used by the benchmark is fine (and, from my understanding, is the common and accepted practice), but effectively omitting part of the benchmark's workload from rendering changes the rules of how the benchmark is run.

This completely skews the scores in a way that legitimate driver optimizations wouldn't, since optimizing a specific routine is a universal improvement rather than one tied to a single benchmark. It would also follow that if this trick were applied to all the other cards, their scores would normalize back to where they were before.
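The distinction I'm drawing can be made concrete with a toy sketch (purely hypothetical code, nothing from 3DMark or any real driver; every name here is invented): a shortcut that pre-culls work based on the benchmark's known, fixed camera path produces identical scores on that path, but silently diverges from the reference renderer the moment the camera moves anywhere else.

```python
# Hypothetical illustration, NOT real driver code: a fixed-camera-path
# shortcut vs. a reference renderer.

def visible(pos, cam):
    """Toy visibility test: an object is 'seen' if it sits at or past cam."""
    return pos >= cam

def render_full(scene, cam):
    """Reference renderer: pays the cost of every object visible from cam."""
    return sum(cost for pos, cost in scene if visible(pos, cam))

# The benchmark's camera only ever visits these positions, and the vendor
# knows it, so work invisible along this path can be pre-culled.
BENCHMARK_PATH = [0, 1, 2]

def render_shortcut(scene, cam):
    """'Optimized' path: skips objects never visible along BENCHMARK_PATH.
    Matches render_full on the scripted path, wrong anywhere else."""
    culled = {pos for pos, _ in scene
              if not any(visible(pos, c) for c in BENCHMARK_PATH)}
    return sum(cost for pos, cost in scene
               if pos not in culled and visible(pos, cam))

scene = [(-3, 10), (1, 5), (4, 7)]  # (position, render cost) pairs
print(render_full(scene, 0), render_shortcut(scene, 0))    # same on-path
print(render_full(scene, -5), render_shortcut(scene, -5))  # diverge off-path
```

Moving the camera off the scripted path (here, to -5) exposes the skipped geometry, which is exactly the kind of thing a free-look camera in the benchmark would reveal.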