First off... here are the cheats FutureMark found, as taken from their PDF (I've highlighted the things of greatest concern to me):
What Are The Identified Cheats?
Futuremark’s audit revealed cheats in NVIDIA Detonator FX 44.03 and 43.51 WHQL drivers. Earlier GeForceFX drivers include only some of the cheats listed below.
1. The loading screen of the 3DMark03 test is detected by the driver. This is used by the driver to disregard the back buffer clear command that 3DMark03 gives. This incorrectly reduces the
workload. However, if the loading screen is rendered in a different manner, the driver seems to fail to detect 3DMark03, and performs the back buffer clear command as instructed.
2. A vertex shader used in game test 2 (P_Pointsprite.vsh) is detected by the driver. In this case the driver uses instructions contained in the driver to determine when to obey the back buffer
clear command and when not to. If the back buffer were not cleared at all in game test 2, the stars in the view of outer space in some cameras would appear smeared, as has been reported in the articles mentioned earlier. Back buffer clearing is turned off and on again so that the back buffer is cleared only when the default benchmark cameras show outer space. In free camera mode one can keep the camera outside the spaceship through the entire test, and see how the sky smearing is turned on and off.
3. A vertex shader used in game test 4 (M_HDRsky.vsh) is detected. In this case the driver adds two static clipping planes to reduce the workload. The clipping planes are placed so that the
sky is cut out just beyond what is visible in the default camera angles. Again, using the free camera one can look at the sky to see it abruptly cut off. Screenshot of this view was also
reported in the ExtremeTech and Beyond3D articles. This cheat was introduced in the 43.51 drivers as far as we know.
4. In game test 4, the water pixel shader (M_Water.psh) is detected. The driver uses this detection to artificially achieve a large performance boost - more than doubling the early
frame rate on some systems. In our inspection we noticed a difference in the rendering when compared either to the DirectX reference rasterizer or to the output of other hardware. It appears
the water shader is being totally discarded and replaced with an alternative, more efficient shader implemented in the drivers themselves. The drivers produce a similar looking rendering, but not an identical one.
5. In game test 4 there is detection of a pixel shader (m_HDRSky.psh). Again it appears the shader is being totally discarded and replaced with an alternative more efficient shader in a similar fashion to the water pixel shader above. The rendering looks similar, but it is not identical.
6. A vertex shader (G_MetalCubeLit.vsh) is detected in game test 1. Preventing this detection proved to reduce the frame rate with these drivers, but we have not yet determined the cause.
7. A vertex shader in game test 3 (G_PaintBaked.vsh) is detected, and preventing this detection drops the scores with these drivers. This cheat causes the back buffer clearing to be
disregarded; we are not yet aware of any other cheats.
8. The vertex and pixel shaders used in the 3DMark03 feature tests are also detected by the driver. When we prevented this detection, the performance dropped by more than a factor of
two in the 2.0 pixel shader test.
We have used various techniques to prevent NVIDIA drivers from performing the above detections. We have been extremely careful to ensure that none of the changes we have introduced causes differences in either rendering output or performance. In most cases, simple alterations in the shader code – such as swapping two registers – have been sufficient to prevent the detection.
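That last detail, that merely swapping two registers defeats the detection, strongly suggests the driver is fingerprinting the exact bytes of each shader rather than understanding what it does. Here's a hedged sketch of the idea: the shader text, the hash choice, and the driver logic below are illustrative guesses on my part, not NVIDIA's actual mechanism.

```python
# Sketch: a driver fingerprints a known shader by hashing it, then takes a
# special "cheat" code path on a match. Swapping two registers leaves the
# shader's math identical but changes the bytes, so the hash no longer matches.
import hashlib

def fingerprint(shader_asm: str) -> str:
    # A real driver would likely hash compiled bytecode; hashing the
    # source text illustrates the same fragility.
    return hashlib.md5(shader_asm.encode()).hexdigest()

ORIGINAL = """
vs_1_1
dcl_position v0
mov r0, v0
mov r1, c4
add oPos, r0, r1
"""

# Same computation, but r0 and r1 swapped everywhere. Output is identical.
SWAPPED = """
vs_1_1
dcl_position v0
mov r1, v0
mov r0, c4
add oPos, r1, r0
"""

KNOWN_TARGETS = {fingerprint(ORIGINAL)}

def driver_detects(shader_asm: str) -> bool:
    """Return True if this shader matches a hard-coded fingerprint."""
    return fingerprint(shader_asm) in KNOWN_TARGETS

print(driver_detects(ORIGINAL))  # True: the special code path fires
print(driver_detects(SWAPPED))   # False: Futuremark's trivial edit defeats it
```

This is exactly why Futuremark's countermeasure works: the edit is semantically a no-op, so any score drop after the swap can only come from the driver losing its detection, not from a real workload change.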
Not that clipping planes are acceptable... or turning the background on and off in space as you see fit, for that matter... but in the two cases I highlighted, NVidia didn't simply edit their drivers to skip rendering work that is never seen... THEY COMPLETELY REWROTE SHADER CODE TO REPLACE THE SHADERS INCLUDED IN THE BENCHMARK!
This is sick... people using this incident against Futuremark really upset me. Completely separate from the fact that this totally invalidates the GFFX's results... does it occur to you that NVidia's replacement shader is probably not even a DX9 shader? Or that NVidia was wasting their driver development time writing new shader code for 3DMark instead of optimizing for games that are already out there? It is also my theory that NVidia is doing something similar with the PS2.0 tests... as the GFFX performs abysmally in PS2.0.
Seriously, how can anyone defend NVidia at this point?
Now onto a reply...
Originally posted by Morrow
After officially knowing now that both nvidia and ATI cheat in 3dmark03 (anyone surprised?), what do we learn from this incident?
ATI hasn't officially cheated; that is still under investigation. They could be using a game-specific optimization that improves rendering efficiency without changing what is rendered, which is perfectly acceptable. Until Futuremark and ATI comment on this, your claim has no merit.
We learn of course that we can no longer trust synthetic benchmarks, isn't it obvious?
I cringed at this comment... because it is completely asinine. In the past this may have been true... but today Futuremark proved that from here on we can trust synthetic benchmarks, specifically because they can be policed like this.
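Futuremark's audit hints at what that policing looks like: they compared the driver's output against the DirectX reference rasterizer and flagged renderings that were "similar looking, but not identical." A minimal sketch of that kind of comparison follows; the frame data and the tolerance value are invented for illustration, and real auditing would compare full-resolution captures, not two-pixel frames.

```python
# Sketch: flag a driver whose rendered frame diverges from a trusted
# reference render by more than normal floating-point rounding noise.
# Frames are modeled as lists of (R, G, B) tuples.

def frame_divergence(frame_a, frame_b):
    """Mean absolute per-channel difference between two equal-size frames."""
    assert len(frame_a) == len(frame_b)
    total = sum(abs(ca - cb)
                for pa, pb in zip(frame_a, frame_b)
                for ca, cb in zip(pa, pb))
    return total / (len(frame_a) * 3)

ROUNDING_TOLERANCE = 2.0  # illustrative: tiny differences are legitimate

reference = [(100, 150, 200), (10, 20, 30)]   # trusted reference rasterizer
honest    = [(101, 150, 199), (10, 21, 30)]   # within rounding noise
replaced  = [(140, 150, 160), (60, 20, 80)]   # "similar looking, not identical"

print(frame_divergence(reference, honest)   <= ROUNDING_TOLERANCE)  # True: pass
print(frame_divergence(reference, replaced) >  ROUNDING_TOLERANCE)  # True: flag
```

The point is that a benchmark vendor with a reference implementation can catch shader replacement after the fact, which is exactly the opposite of "we can no longer trust synthetic benchmarks."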
Cheating in games is certainly easier to hide, but not as easy to implement, because the cheats found in 3dmark03 do not work in games where the camera path is random.
But they could work in the fixed camera benchmarks within games... and switch off during normal gameplay.
Another thing we learn from this is that FutureMark now has also officially stated that their shader routines are inefficient! They say that nvidia managed to implement nvidia hw optimized shaders which are sometimes more than twice as fast as the shaders used by FutureMark. What does this say about FutureMark's credibility having released a benchmark optimized for future hardware! Well, nothing positive in any case...
Wow... I hadn't even read this when I wrote the part above. It means that NVidia can quickly run shaders that are hand-coded specifically for their hardware... woo... who'd have thought! That doesn't mean they can run standard DX9 shaders at an acceptable speed... and guess what, DX9 games won't be using NVidia's OpenGL functions if they aren't coded into the DX9 spec. So yes... this is a good indication of what you'll see in the future with PS2.0 running on NVidia's FX line. Take your blinders off, sonny.