So now we know. I think these two statements say a lot.
NVIDIA: Since NVIDIA is not part of the Futuremark beta program (a program which costs hundreds of thousands of dollars to participate in), we do not get a chance to work with Futuremark on writing the shaders like we would with a real application developer.
ATI: The 1.9% performance gain comes from optimization of the two DX9 shaders (water and sky) in Game Test 4. We render the scene exactly as intended by Futuremark, in full-precision floating point. Our shaders are mathematically and functionally identical to Futuremark's and there are no visual artifacts; we simply shuffle instructions to take advantage of our architecture. These are exactly the sort of optimizations that work in games to improve frame rates without reducing image quality and as such, are a realistic approach to a benchmark intended to measure in-game performance. However, we recognize that these can be used by some people to call into question the legitimacy of benchmark results, and so we are removing them from our driver as soon as is physically possible. We expect them to be gone by the next release of CATALYST.
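For what it's worth, the kind of optimization ATI is describing is plain instruction scheduling: reorder independent operations so they fit the hardware's pipeline better, without changing the math. Here's a toy sketch in Python (the "instructions" in the comments are made up for illustration; this is not the actual 3DMark shader):

```python
# Toy illustration of instruction shuffling: two orderings of the
# same arithmetic. Reordering independent operations can keep a
# hypothetical GPU pipeline busy, yet the result is bit-identical.

def shader_original(a, b, c, d):
    # Straight-line order, as a compiler might emit it.
    t0 = a * b      # mul r0, a, b
    t1 = t0 + c     # add r1, r0, c  (depends on t0)
    t2 = c * d      # mul r2, c, d
    t3 = t2 + a     # add r3, r2, a  (depends on t2)
    return t1 * t3  # mul out, r1, r3

def shader_reordered(a, b, c, d):
    # The two independent multiplies hoisted together, so they can
    # issue back to back without waiting on each other's results.
    t0 = a * b      # mul r0, a, b
    t2 = c * d      # mul r2, c, d   (no dependency on t0)
    t1 = t0 + c     # add r1, r0, c
    t3 = t2 + a     # add r3, r2, a
    return t1 * t3  # mul out, r1, r3

# Same inputs, same full-precision result: "mathematically and
# functionally identical", exactly as the statement claims.
print(shader_original(1.5, 2.0, 0.25, 4.0) == shader_reordered(1.5, 2.0, 0.25, 4.0))
```

Since nothing is reordered across a data dependency, the two versions always produce the same bits; that's why this kind of tweak gives a speedup with zero image-quality change.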
As I see it, Nvidia got ****ed by not staying in FM's beta program.
ATI worked hand in hand with FM to make sure their shader was at top performance for "their bench".
Nvidia did not. But game companies do work very closely with Nvidia, even more so than with ATI.
That is why you see the FX struggling on PS 2.0 in 3DMark03 and kicking butt in games.
It's easy to read between the lines on this one just from what both companies have said.
edit by StealthHawk: don't circumvent the swear filter. Thanks.