Old 05-18-03, 05:46 PM   #92
legion88 Junkie
Join Date: Jul 2002
Posts: 135

Originally posted by jimmyjames123
Trust me, I fully understand this point of view. However, we all know that NVIDIA and ATI actively "optimize" their drivers for enhanced performance in 3dmark programs. This alone undercuts the argument about an "accurate comparison of performance", because it has been repeatedly shown that driver "optimizations" alone can (sometimes significantly) improve performance.

There is also the issue about what is an "optimization" and what is a "cheat". There is no agreement about what this distinction is. If NVIDIA (or ATI for that matter) can improve performance without compromising image quality, I'd like to think of that as an "optimization".

If anything, maybe this will help Futuremark to create a benchmark that is less susceptible to "optimizations", assuming that they truly want to create an "impartial" benchmarking program.
Using image quality as the test for whether something is a cheat or an optimization is laughable. It is almost as stupid as the notion, pushed by some ATI defenders, that if newer drivers show similar speed improvements without image-quality degradation, that proves the previous drivers weren't cheats.

To cheat means to violate the rules deliberately. In the case of FutureMark's products (or any benchmark, for that matter), the companies are expected not to "optimize" specifically for the benchmark. That is, the optimizations are expected to show up in any Direct3D application, not just 3DMark200x. So those attempts at blurring the textures in 3DMark2001 on the Radeon 8500 (something ATI was also caught doing) are obvious examples of cheats.

When it comes to static benchmarks (like 3DMark or Quake 3 pre-recorded timedemo playbacks), the scenes are rendered essentially the same way every time. (That is why I call them static, as in not changing.) Because of that, adventurous driver programmers can hard-wire code into the drivers that exploits knowing ahead of time exactly what frames are being rendered, how they are being rendered, and when.
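To make the idea concrete, here is a minimal sketch of how a driver could recognize a static workload. All names and the fingerprinting scheme are hypothetical illustrations, not code from any actual driver: the point is only that an identical draw-call sequence every run gives the driver something stable to match against.

```python
import hashlib

# Hypothetical table of fingerprints for known static benchmark
# scenes (illustrative only; no real driver is being quoted here).
KNOWN_BENCHMARK_SCENES = {}

def fingerprint(draw_calls):
    """Hash the sequence of draw calls. A static benchmark submits
    the identical sequence on every run, so its hash is stable."""
    h = hashlib.sha256()
    for call in draw_calls:
        h.update(repr(call).encode())
    return h.hexdigest()

def choose_render_path(draw_calls):
    """If the workload matches a known static scene, substitute a
    pre-tuned fast path; otherwise render honestly, as the driver
    must for unpredictable real-time gameplay."""
    if fingerprint(draw_calls) in KNOWN_BENCHMARK_SCENES:
        return "precomputed_fast_path"
    return "general_path"
```

A real-time game never produces a predictable enough stream to match such a table, which is exactly why this kind of "optimization" cannot carry over to actual gameplay.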

For playing games in real-time, adventurous driver programmers are not psychic and thus do not know ahead of time what frames are going to be rendered. Techniques that rely on knowing ahead of time what frames will be rendered, how they will be rendered, and when, do not transfer to real-time gaming.

It should go without saying that consumers play games in real-time.