Originally posted by Cotita
Again, Futuremark didn't use HLSL. So your argument is not valid.
Eh? I know, I asked, and I documented that! I'm merely pointing out that if they were to have used an HLSL, then it would make no sense for them to use Cg, given that they are making a DirectX benchmark and DirectX ships with its own native HLSL.
By the way HLSL won't produce ATI optimized code either. So why develop in HLSL when using Cg will deliver faster nvidia performance and same ATI performance as HLSL?
And that's exactly what Futuremark do not want: they have created optimised code, but not code optimised for any particular vendor's product; it's optimised for DirectX.
Futuremark are already being called 'biased' for sticking to what DirectX offers. If they started using an HLSL produced by a vendor, one that generates code optimised for that vendor's hardware, despite the API they are working with providing a perfectly functional and neutral HLSL of its own, what credibility would they have then?
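To make the compiler point concrete: Cg and DirectX HLSL share nearly identical syntax, so (as a rough sketch, not taken from the benchmark itself) the very same pixel shader source could in principle be fed to either toolchain. The shader below and the compiler invocations in its comments are hypothetical illustrations, not Futuremark code; the difference the thread is arguing about lives entirely in which compiler emits the final shader assembly, not in the source.

```hlsl
// Hypothetical example: a trivial textured pixel shader whose source is
// valid under both the DirectX HLSL compiler and NVIDIA's Cg compiler.
//
// DirectX route (vendor-neutral output):
//   fxc /T ps_2_0 /E main shader.hlsl
// Cg route (output tuned with NVIDIA hardware in mind):
//   cgc -profile ps_2_0 shader.hlsl
// (Exact flags may vary by compiler version.)

sampler2D baseMap;            // texture bound by the application

float4 main(float2 uv : TEXCOORD0) : COLOR
{
    // Identical source text; the two compilers can still schedule and
    // order the generated instructions differently.
    return tex2D(baseMap, uv);
}
```

Which is exactly the issue: same source, but the instruction ordering each compiler produces can favour one vendor's pipeline over another's.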