Originally posted by ChrisRay
I think the thing is this.
ATI and Nvidia have very different visions about the future of shaders and how they will come to pass.
Currently Microsoft seems to favor ATI's vision, as ATI's vision is part of Microsoft's standard.
Nvidia favors its own ideas about how shader applications will be run.
We need to diagnose the real problem here, not who is right and who is wrong.
Well, I feel like saying what's right and wrong. Nvidia has done some very good things in the past, but they also made a massive mistake, and thankfully Microsoft didn't listen to them.
Nvidia developed one of the Sega chips, and do you know what? IT DIDN'T SUPPORT TRIANGLES. I'm serious: you could only render quads, and when Microsoft was developing DirectX, Nvidia tried to convince them to use quads. Thankfully Microsoft ignored them and went with the strong industry standard of triangles.
Now when DirectX 9.0 came around, Microsoft fell under pressure from Nvidia and allowed a second shader language, Cg.
Now I don't know the whole story of Cg, but I do know this: in Cg, arrays are indexed with FLOATS. OMG, that is completely stupid, because a) you can have fractions. I believe (presume at least) that if you tried to enter 1.5 as the index it would spit chips at you, but nevertheless it's still stupid. And b) even if in practice it isn't a problem, in theory it is: when you take a large float and perform operations on it, you don't get exact answers, and large numbers can't be expressed accurately. So even though you aren't going anywhere near the limit of the float's range, you still can't safely use it as an array index.
On a side note, NV30/31/34 was a flop. We are still waiting for MikeC to post his review; I think he never will. NV35 looks to be a decent card, and I wish good luck to the owners of NV35s.
OH, AND SOMEONE MAKE NVIDIA RE-ENABLE FSAA IN MY TNT2 DRIVERS. I DON'T LIKE USING THE 2.XX DETS!!!!!