A few questions and not many answers:
o How far in the 'future' are the 3DMark2003 games set? 3DMark2001 lasted for two years or so, so perhaps these are like games we might actually play in 2004 or 2005?
o Would game programmers actually write code and graphics programs like those in 3dm2k3? Or is that code an inefficient use of the hardware? Hey, the nature scene is pretty, but full 3D grass with poor LODing? And what about the rest of the world, beyond those few meters in front of the camera?
o Will future games use relatively less CPU time, or will they be like 3dm2k3, only far slower, because they hog the CPU as well?
o How much influence did ATI have, since they bought the highest level of partnership with Futuremark? (Don't even think of responding to this, please.) E.g. all you have to know is that your theoretical vertex throughput is 20% better than the competition's, and then you build a scene that is simple but has a high vertex count to look smarter.
o nVidia's touted strength all before and through the GeForce FX launch was 'long, complex shaders, both pixel and vertex', yet we haven't seen shaders that do anything more than be more efficient versions of DX8-level code. Perhaps developers (including Futuremark) can't think of anything to do with this feature. Related example: we can now mix 8 textures together in one pass, 1 diffuse, 1 bump, 1 LUT, 1 something, yet the feature goes to waste because no one is showing off its potential. At least you could render 8 layers of 'fur' in a single pass. Don't think I saw any fur in 3dm2k3.
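For what it's worth, the 8-layer fur idea is just shell compositing: each texture layer is a semi-transparent "shell" of fur, and the layers are blended back-to-front with ordinary 'over' alpha blending. Here's a rough sketch of what such a pass effectively computes per pixel, written in plain Python rather than shader code, with made-up layer values (the function name and the sample shell data are mine, not from any real benchmark or API):

```python
def composite_shells(shells):
    """Blend fur 'shell' layers back-to-front with standard
    'over' alpha blending: out = src * a + out * (1 - a).
    shells: list of (color, alpha) pairs, innermost shell first.
    Colors and alphas are single floats here for simplicity;
    real hardware would do this per channel, per texture stage.
    """
    out = 0.0  # start from a black background
    for color, alpha in shells:
        out = color * alpha + out * (1.0 - alpha)
    return out

# 8 hypothetical shells: an opaque base coat, then 7 layers
# that get sparser (lower alpha) toward the fur tips.
shells = [(0.6, 1.0)] + [(0.7, 0.8 - 0.1 * i) for i in range(7)]
result = composite_shells(shells)
```

The point is only that the blend per layer is trivial, so chaining 8 texture stages this way is exactly the kind of thing that one-pass multitexturing hardware was built for.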
What is a bit embarrassing for our beloved friend nVidia is that the competition is regularly beating them in current AND future benchmarks. Maybe we need to see benchmarks that show off the strengths of nVidia's products, if that is possible. Then again, the hidden surface removal of the Kyro, the displacement mapping of the Matrox, and the TruForm of ATI were great features that (to varying degrees) really didn't mean much at all.
That was quite a long rant for me, since I'm downloading some drivers and really should be going to bed. Night all.