
View Full Version : An interesting read about FX cards.


Lucien1964
06-27-03, 03:37 PM
http://www.digit-life.com/articles2/ps-precision/index.html

Oops, I mean an interesting read about 3D cards and the technology behind DX9, etc.

digitalwanderer
06-27-03, 03:55 PM
Man, that is some heavy-duty explaining of some of the stuff that I just haven't been able to get my head around until now!

Thanks for pointing it out, I'm still digesting it slowly... :)

EDITED BITS: Screw the fantastic tutorial on what 16/24/32FP actually means, did you read this bit?!?!?
It's well known that ATi chips use 24-bit floating-point numbers internally in the R300 core, and that this precision is not influenced by the partial precision modifier. But it's interesting that NVIDIA uses 16-bit floating-point numbers irrespective of the precision requested(!), even though the partial precision term was introduced at NVIDIA's request, NV3x GPUs support 32-bit floating-point precision under the OpenGL NV_fragment_program extension, and NVIDIA advertised their new-generation video chips as capable of TRUE 32-bit floating-point rendering!
Question: how the hell did these get M$ WHQL certification?!?!?! :mad:
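For anyone who glazed over the 16/24/32FP tutorial part, here's a rough Python sketch of what those formats actually buy you. The sign/exponent/mantissa splits below (s10e5 for NVIDIA's FP16, s16e7 for ATi's FP24, s23e8 for full FP32) are the commonly quoted layouts and an IEEE-style bias is assumed; they aren't numbers lifted straight out of the article:

```python
# Rough comparison of the three shader float formats discussed in the article.
# Bit splits (exponent / mantissa) are the commonly quoted ones:
#   FP16 (NV3x partial precision): 1 sign / 5 exp / 10 mantissa
#   FP24 (ATi R300 internal):      1 sign / 7 exp / 16 mantissa
#   FP32 (full single precision):  1 sign / 8 exp / 23 mantissa

import math

formats = {
    "FP16 (s10e5)": (5, 10),
    "FP24 (s16e7)": (7, 16),
    "FP32 (s23e8)": (8, 23),
}

for name, (exp_bits, man_bits) in formats.items():
    # Spacing between representable numbers just above 1.0 ("machine epsilon")
    eps = 2.0 ** -man_bits
    # Roughly how many decimal digits that mantissa corresponds to
    digits = man_bits * math.log10(2)
    # Largest finite value, assuming an IEEE-style biased exponent
    max_exp = 2 ** (exp_bits - 1) - 1
    max_val = 2.0 ** max_exp * (2 - eps)
    print(f"{name}: eps ~ {eps:.2e}, ~{digits:.1f} decimal digits, max ~ {max_val:.3e}")
```

Running it gives roughly 3 significant decimal digits for FP16 against about 5 for FP24 and 7 for FP32, which is why the article's chart makes FP16 look so coarse next to the other two.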

stncttr908
06-27-03, 04:26 PM
When I have more time, I'll sit down and read this over more carefully. I have an interest in understanding this sort of thing better, but at the moment, numbers are the devil thanks to work. :D

Lucien1964
06-27-03, 04:27 PM
Yeah, I agree it's a heavy-duty read. I got some of it, but it seems NVIDIA's cards use 16-bit floating-point numbers irrespective of the precision requested. I guess that means they're pushing 16-bit in most cases. I must say it's an interesting read, and it seems the certified drivers don't fully support DX9 according to the spec.

Nutty
06-27-03, 04:41 PM
Yeah it seems NV30/31/34 only do FP16.

But the NV35 seems to do it right, at least:

The NV35 demonstrates varied, and the most correct, behavior among NVIDIA's video chips. We can see that calculations are performed with 32-bit precision in the standard mode, in line with the Microsoft specifications, but when partial precision is indicated, temporary and constant registers use 16-bit precision while texture registers use 32-bit precision, though according to the Microsoft specification texture registers could also use 16-bit precision.
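If you want a feel for what keeping temporaries at 16 bits can do to a longer calculation, here's a quick toy in Python/numpy. It's plain CPU float16 vs float32 arithmetic, nothing to do with the actual shader hardware or drivers, just an illustration of how a running sum drifts at the lower precision:

```python
# Toy illustration: accumulate a per-texel step across a 1000-texel texture,
# once with 16-bit floats and once with 32-bit floats.  Ordinary CPU/numpy
# math, not real shader code; it only shows how rounding error builds up
# when the running value is kept at FP16.

import numpy as np

def walk(dtype, texels=1000):
    coord = dtype(0.0)
    step = dtype(1.0) / dtype(texels)   # ideal step is exactly 1/1000
    for _ in range(texels):
        coord = dtype(coord + step)     # keep the running sum in 'dtype'
    return float(coord)

fp32 = walk(np.float32)
fp16 = walk(np.float16)

print("exact answer: 1.0")
print(f"FP32 result:  {fp32:.6f}  (error {abs(fp32 - 1.0):.2e})")
print(f"FP16 result:  {fp16:.6f}  (error {abs(fp16 - 1.0):.2e})")
```

The mechanism is the interesting part: near 1.0 an FP16 number can only change in steps of about 0.0005, so each of those thousand additions can be off by a sizable fraction of the 0.001 step, whereas FP32 spacing there is around 0.0000001. That gap between the printed errors is the sort of thing that shows up as banding or shimmering when a shader does a lot of dependent math in half-precision temporaries.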

digitalwanderer
06-27-03, 04:49 PM
Originally posted by stncttr908
When I have more time, I'll sit down and read this over more carefully. I have an interest in understanding this sort of thing better, but at the moment, numbers are the devil thanks to work. :D
I was doing really well through the beginning of the explanation, and with the scientific notation and such, but when they started talking hexadecimal I started glazing over a bit and feeling like a thicky. :(

But that chart explained a lot, and I get the huge difference between 16FP & 24FP now, and why, even though 32FP is better, 24FP looks like more than enough for now.

A really good read, with some truly sensational stuff buried in it!

Lucien1964
06-27-03, 04:57 PM
After reading that article my head hurts. No worries I'll just crack open a Pale Ale and go ahhhhhhh mmmmmmmm burp!!

Skuzzy
06-27-03, 05:05 PM
Okay... that was an easy read... uhmm... might need to change my perspective level a bit when posting from here on out. :D

Sazar
06-27-03, 06:47 PM
so basically only the NV35 is doing things correctly in the FX lineup?

that was indeed a good read, and that's the main reason I come to sites like this :D

to learn more...

cheers lucien for the link :)

/me takes a bucket full of aspirins and starts wearing coke bottle bottoms after frying his eyes AND brain :(

john19055
06-27-03, 10:04 PM
It would be nice if NVIDIA and ATI could get together and make their cards have the same support. It reminds me of when ATI had pixel shader 1.4 and NVIDIA used pixel shader 1.3. Maybe on their next generation of cards they can both have the same support, to make it easier for game developers to write code for future games, so they both run at the same quality and one doesn't have to downgrade because the other won't support it.

StealthHawk
06-28-03, 12:17 AM
I knew this had been posted before http://www.nvnews.net/vbulletin/showthread.php?s=&threadid=13567

Sazar
06-28-03, 01:57 AM
Originally posted by StealthHawk
I knew this had been posted before http://www.nvnews.net/vbulletin/showthread.php?s=&threadid=13567

does this mean a merge is coming :D

DivotMaker
06-28-03, 09:11 AM
Originally posted by john19055
It would be nice if NVIDIA and ATI could get together and make their cards have the same support. It reminds me of when ATI had pixel shader 1.4 and NVIDIA used pixel shader 1.3. Maybe on their next generation of cards they can both have the same support, to make it easier for game developers to write code for future games, so they both run at the same quality and one doesn't have to downgrade because the other won't support it.

John,

As long as there is competition and these two companies are striving to differentiate themselves, I don't think you will ever see them have identical "support". Not even MS has been able to pull that one off yet. This also gets down to the benchmark/performance measurement situation. Neither ATI nor nVidia is in any big hurry to make their products similar so one can be declared "the winner", because of the high stakes involved. I am not sure you will ever see a "standardization" between these two as long as the stakes are as high as they are....

digitalwanderer
06-28-03, 09:41 AM
Originally posted by BigBerthaEA
Neither ATI nor nVidia is in any big hurry to make their products similar so one can be declared "the winner", because of the high stakes involved. I am not sure you will ever see a "standardization" between these two as long as the stakes are as high as they are....
(Standard Dig disclaimer that he does not intend this as a flame at BB, merely a difference of opinion)

I disagree, I don't see ATi shying away from an apples-to-apples comparison at all. It's nVidia that is trying to re-write the standards to get their hardware compliant rather than get their hardware compliant with the standards. :(

I agree that one of the companies involved is desperately trying to avoid a fair comparison of their card with their competitors at any cost, but it sure ain't ATi. :)

(Again; the standard Dig disclaimer that he does not intend this as a flame at BB, it is merely a difference of opinion)

Sazar
06-28-03, 11:01 AM
Originally posted by digitalwanderer
(Standard Dig disclaimer that he does not intend this as a flame at BB, merely a difference of opinion)

I disagree, I don't see ATi shying away from an apples-to-apples comparison at all. It's nVidia that is trying to re-write the standards to get their hardware compliant rather than get their hardware compliant with the standards. :(

I agree that one of the companies involved is desperately trying to avoid a fair comparison of their card with their competitors at any cost, but it sure ain't ATi. :)

(Again; the standard Dig disclaimer that he does not intend this as a flame at BB, it is merely a difference of opinion)

hmm... standards, eh?

that's a very interesting concept :D

digitalwanderer
06-28-03, 11:13 AM
Originally posted by Sazar
hmm... standards, eh?

that's a very interesting concept :D
Well? Don't ATi's cards actually pretty much conform to and perform at the industry standards? (I'm being serious here and not flaming again, I THINK Ati does but mebbe I'm wrong....feel free to disagree. :) )

Sazar
06-28-03, 11:15 AM
Originally posted by digitalwanderer
Well? Don't ATi's cards actually pretty much conform to and perform at the industry standards? (I'm being serious here and not flaming again, I THINK Ati does but mebbe I'm wrong....feel free to disagree. :) )

yes... 'twas funny, digi :D that's why I posted it..