Old 03-30-03, 01:35 PM   #12
Captain Beige
Join Date: Feb 2003
Posts: 59

Originally posted by ChrisRay
To be honest, I think it will depend on the app. It might be similar to vsync: "Application preference" and/or always force 16-bit or always force 32-bit.

I really don't know.

I don't really know what to think of the scenario right now. Do I believe 16-bit precision is enough for games of today and possibly tomorrow? Yeah, I think 16-bit will be plenty.

Do I think this is good PR for Nvidia? No, I do not; Nvidia is damned if they do and damned if they don't. I'm thankful that I'm in a situation where I can watch and see without being dramatically affected by what's to come.

In the end, though, I think most vendors will opt to use 16-bit on both ATI and Nvidia cards. I have two reasons to believe this: performance with negligible IQ increases, and Nvidia's influence in the gaming division right now.

I'm not quite sure I follow here.

What exactly are you proposing?
ATI cards don't support FP16, only FP24, since FP24 is part of the DX9 specification and FP16 is not, which makes FP16 useless for a true DX9 card. This is not like vsync: vsync is an option, not part of a standard. Nvidia cards defaulting to FP16 unless FP32 is specifically requested is ridiculous.
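To put a rough number on the precision gap being argued about: here is a minimal Python sketch comparing the rounding error of IEEE half precision (16-bit, the same 10-bit-mantissa layout as NV3x's FP16) against single precision (32-bit). FP24 has no standard `struct` format code, so it is not shown; this is only an illustration of why fewer mantissa bits lose more detail, not a measurement of any particular card.

```python
import struct

def roundtrip(fmt, x):
    """Pack x into the given float format and unpack it back,
    so the result carries that format's rounding error."""
    return struct.unpack(fmt, struct.pack(fmt, x))[0]

x = 1.0 / 3.0
fp16 = roundtrip('e', x)  # 'e' = IEEE half precision, 10 mantissa bits
fp32 = roundtrip('f', x)  # 'f' = IEEE single precision, 23 mantissa bits

print(f"fp32 error: {abs(fp32 - x):.2e}")
print(f"fp16 error: {abs(fp16 - x):.2e}")
```

The FP16 error comes out several orders of magnitude larger than the FP32 error, which is the kind of gap that can show up as banding in long shader calculations.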

It would be like a company claiming to have an equal-opportunities policy but discriminating against people unless you specifically told them not to be prejudiced against every possible kind of lifestyle; if you accidentally left anyone out, they'd bully them until they accepted lower pay, and then they'd say it was okay because you never said you wanted those people treated fairly, all while boasting about how great they are at cutting costs.
"If you want a picture of the future, imagine a fan blowing on a human face - forever." ([I]GeForce Orwell, 2004[/I])

Last edited by Captain Beige; 03-30-03 at 01:41 PM.