fyi... from the 3dmark03 driver audit report...
"In our testing, all identified detection mechanisms stopped working when we altered the benchmark code just trivially and without changing any of the actual benchmark workload. With this altered benchmark, certain NVIDIA products had a performance drop of as much as 24.1%, while the competition's products' performance drop stayed within the margin of error of 3%. To our knowledge, all drivers with these detection mechanisms were published only after the launch of 3DMark03. According to industry terminology, this type of driver design is defined as 'driver cheats'."
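the report doesn't say how the drivers detected the benchmark, but the tell that detection "stopped working" after trivial code changes is consistent with fingerprinting the submitted work (e.g. hashing shader code) and swapping in a cheaper substitute on a match. a hypothetical sketch of that idea in python (the table, names, and shader strings are all illustrative, not from the report):

```python
import hashlib

# Hypothetical driver-side table: hash of a known benchmark shader -> a
# cheaper replacement shader. Everything here is illustrative.
CHEAT_TABLE = {}

def register_cheat(original_shader: str, replacement: str) -> None:
    key = hashlib.sha1(original_shader.encode()).hexdigest()
    CHEAT_TABLE[key] = replacement

def compile_shader(source: str) -> str:
    """Return the shader the driver actually runs."""
    key = hashlib.sha1(source.encode()).hexdigest()
    # If the hash matches a fingerprinted benchmark shader, swap in the cheat.
    return CHEAT_TABLE.get(key, source)

benchmark_shader = "mul r0, v0, c0  ; benchmark workload (illustrative)"
register_cheat(benchmark_shader, "mov r0, c0  ; cheaper substitute")

# Unmodified benchmark: the fingerprint matches, the substitute runs.
assert compile_shader(benchmark_shader) != benchmark_shader

# Trivially altered benchmark (extra whitespace changes the hash but not
# the workload): detection silently fails and the real shader runs,
# exposing the performance drop the report describes.
altered = benchmark_shader + " "
assert compile_shader(altered) == altered
```

this matches the report's observation exactly: the workload is unchanged, yet the "optimization" vanishes, which is why the altered build is such a clean test for app-specific cheats.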
note the +/- 3% margin of error they list... vs. the 1.9% variance shown by the ati driver set overall...
however, with the roughly 8% variance in game test 4 (gt4), there may well be some optimization going on... thus far no one has reported on it...
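the comparison above boils down to one question: does a given score delta exceed the stated 3% margin of error? a quick sketch using the figures quoted in this thread (the helper names are mine):

```python
def percent_change(before: float, after: float) -> float:
    """Signed percent change between two benchmark scores."""
    return (after - before) / before * 100.0

def outside_margin(delta_pct: float, margin_pct: float = 3.0) -> bool:
    """True if a score delta cannot be explained by run-to-run noise."""
    return abs(delta_pct) > margin_pct

# Deltas quoted in this thread (percent):
assert outside_margin(-24.1)      # nvidia's drop with the altered benchmark
assert not outside_margin(-1.9)   # ati's overall variance: inside the noise
assert outside_margin(-8.0)       # the gt4 variance worth investigating
```

by this yardstick the 1.9% figure is indistinguishable from noise, while both the 24.1% and the ~8% deltas are not... which is exactly why gt4 deserves a closer look even if the overall ati numbers are clean.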
it will be interesting to read the reports on the catalyst 3.4 series to see just what the 'optimizations' were...
but those same people who were deflecting criticism of nvidia's cheats are now taking it upon themselves to bash ati over a 1.9% variance, versus well over 20% for nvidia's driver hacks... it is a little hard to understand why...