Originally posted by Chalnoth
I think nVidia is pushing Cg because it's the only realistic way to get optimum performance out of an NV3x part on the shader side (in OpenGL, at least...Microsoft's own HLSL compiler seems to work fairly well, though I think MS still should have some integer data types in PS 2.0).
nVidia would kill for integer support in DX9. I doubt they'd even care whether it landed in PS 2.0 itself - PS 2.0+ would be sufficient for them.
INT12 is 3 or 4 times faster than FP16 on an NV3x, depending on the operation.
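For context, Cg exposes those precisions as distinct scalar types - `fixed` (which maps to NV3x's 12-bit fixed-point path), `half` (FP16), and `float` (FP32) - so a fragment shader can opt into the faster registers explicitly. A minimal sketch (the type names and `tex2D` are standard Cg; the shader itself is hypothetical, and the speedup figure is the claim above, not something this code demonstrates):

```cg
// Hypothetical Cg fragment shader: color math kept in 'fixed' (INT12 on NV3x)
// instead of 'half' (FP16) or 'float' (FP32), since [0,1] color data
// doesn't need the extra range or precision.
fixed4 main(half2 uv : TEXCOORD0,
            uniform sampler2D tex0,
            uniform fixed4 tint) : COLOR
{
    fixed4 c = tex2D(tex0, uv);  // texture result held in fixed-point registers
    return c * tint;             // fixed-point multiply: the fast path on NV3x
}
```

Interpolants and anything involving texture coordinates generally still want `half` or `float`; `fixed` is only safe where values stay in a small range.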
IMO, that's one of the best threads about the NV3x. Ever.
If you're serious about understanding the NV3x, it's a must-read.