Do we need FP32 for the future?



Nv40
06-04-03, 12:57 AM
yes... :D

Indeed, Nvidia made a good choice with FP32 (128-bit). Everything is looking clearer now, since FP16 (64-bit) is now/will be part of DirectX 9 and FP16 is supported in OpenGL, and since DirectX 8 games will be with us for another 1-2 years. It's clear that Nvidia's vision on precisions will pay off in the long run: FP16 for today's games, and FP32 for the professional graphics industry and as a tool for game developers for the games coming in the next 2-3 years.

What I find funny is that whenever company X has something better, the only thing company Y does is say "we don't need that, what we have is enough." It's the same old line from 3dfx: "we don't need 32-bit, 16-bit is enough for our games," and look what happened to them :). Same thing when Nvidia said a 256-bit bus was "overkill," and now we see its great benefits in the NV35. :D

Why did I post this? Even if we don't need something today, it's really important that IHVs supply game developers with all the tools necessary for the games they'll ship 3-4 years from now. Right now it seems ATI is following the 3dfx philosophy of "we don't need that." ATI (http://www.beyond3d.com/forum/viewtopic.php?t=6223&postdays=0&postorder=asc&start=0&sid=f89af82b21c9ff78e0626310ed9e6e9f) shows the same arrogant attitude of "what we have is enough," and they fail to see the bigger picture that game development takes time. Meanwhile they keep overclocking their cards again and again, re-releasing them with 20% speed improvements and almost nothing new, the same mistake Nvidia made with the GeForce3. Releasing DirectX 8 cards in the low-end market repeats Nvidia's GeForce4 MX mistake (holding back the industry), and judging by the comments from ATI people, it looks like there will be no FP32 in ATI cards this year or for a long time :( , maybe not until next year's DirectX 10 cards, with luck.

At least Nvidia acted quickly with the 256-bit bus (3-4 months after NV30), but ATI still holds its original position on FP precisions, with the same products one year later. That is bad news for all gamers, since it means FP32 as a minimum in games won't be available until ATI supports it (or something better), and until both vendors' cards perform well with it. Yes, it's slow today on NV3x cards, but it's there!!! That means developers can already experiment and program with it in mind (cough) JCarmack. The NV35 has already improved the performance, and by the end of the year you can be sure NV40 performance at full FP32 in games will be excellent.

So any ATI fans here who don't want to see their games stuck at FP24 for the next 2-3 years, with pixel shaders limited to 96 instructions (the F-buffer is supported only on the R9800, only in OpenGL, and has limitations for use in real-time games), had better start their own threads in the Rage3D forums and bitch as much as they can :) for FP32 (at least) in the next hardware (R390), and for a more hardware-driven approach to DX9+ a la Nvidia, or better.

solofly
06-04-03, 01:01 AM
Go nVIDIA go!;)

muzz
06-04-03, 01:04 AM
Yeah, let's have a card that can do 32-bit, and drop it down to 16 just so we can compete.....

Do you mean the pics I have seen a lot lately?
That's 32-bit, huh?....
Hmmmmm
I thought it was rendered in my washing machine.

gokickrocks
06-04-03, 01:05 AM
What you fail to look at is the framerate that comes with using FP32 on the NV3x. It's nice that Nvidia thought ahead for the future, but as of now it is just a slide show. Sure, we will eventually want to use FP32, but we would like to use it at a speed smooth enough that we can actually play.

Nv40
06-04-03, 01:16 AM
Originally posted by gokickrocks
What you fail to look at is the framerate that comes with using FP32 on the NV3x. It's nice that Nvidia thought ahead for the future, but as of now it is just a slide show. Sure, we will eventually want to use FP32, but we would like to use it at a speed smooth enough that we can actually play.

It's the same thing as 32-bit color when Nvidia first introduced it: it was very slow in Quake 3 on my TNT2, only playable on higher-end machines. But thanks to that early move to 32-bit, the jump from 16 to 32 bits happened very fast, in just one year. If ATI doesn't support FP32 until 2004-2005, game developers will be forced to target their only precision, FP24, and when ATI finally releases fast FP32 hardware, you will still need to wait 2-3 more years (from the day ATI chooses to support it) for game developers to code for it. That means 4-5 years before games require FP32 as a minimum, since, as we already know, game developers target the minimum common hardware for compatibility reasons. For that very reason game developers are already using the "slow" PS/VS of the NV3x cards for future game development over faster but more limited PS/VS cards. Don't you think?

muzz
06-04-03, 01:19 AM
That's OK, 2-5 years is fine with me.

StealthHawk
06-04-03, 02:03 AM
No, you won't have to "wait for developers to code for it." FP24 is the minimum precision in DX9, meaning that all nvidia cards will normally use FP32. And using non-proprietary extensions in OGL, nvidia cards will again use their highest precision, FP32.

As soon as we have games that use PS2.0, we should have FP32 support.
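
As an aside on what the precision gap actually means in practice: Python's struct module can round a value through the IEEE half (FP16) and single (FP32) formats, a rough stand-in for what a shader register holds. This is a minimal sketch of mine, not something from the thread:

```python
import struct

def through(fmt, x):
    """Round a value through an IEEE format: 'e' = FP16 (half), 'f' = FP32 (single)."""
    return struct.unpack(fmt, struct.pack(fmt, x))[0]

# A texture coordinate on a large tiled surface: FP16 has only 11
# significant bits, so at magnitudes above 2048 it cannot even
# represent fractional texel positions.
u = 2048.25
print(through('e', u))  # FP16 -> 2048.0, the fractional part is lost
print(through('f', u))  # FP32 -> 2048.25, exact
```

The point StealthHawk is making still holds: the app just asks for full precision, and whichever format the hardware runs internally determines where errors like this appear.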

Dazz
06-04-03, 02:16 AM
I don't know, just using 128-bit FP32 with the Dawn demo makes even the mighty FX5900 crawl.

StealthHawk
06-04-03, 05:38 AM
Originally posted by Dazz
I don't know, just using 128-bit FP32 with the Dawn demo makes even the mighty FX5900 crawl.

Because it isn't mighty with FP32 whatsoever ;)

With that said, I'm sure NV40 and R400 will have full speed FP32 shaders, so even if we don't need FP32 in the future, it would be pointless not to use it.

Chalnoth
06-04-03, 06:26 AM
Originally posted by muzz
Yeah, let's have a card that can do 32-bit, and drop it down to 16 just so we can compete.....
FP32 can be used on up to 1/3 of the instructions at full performance.

Not all instructions need 32-bit accuracy.
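
To illustrate Chalnoth's point with a sketch of my own (not from the thread), using struct's half-precision format as a stand-in for an FP16 register: a single blend of colors in [0,1] loses essentially nothing at FP16, but a long chain of dependent operations can fall apart.

```python
import struct

def fp16(x):
    """Round a value through IEEE half precision, like an FP16 shader register."""
    return struct.unpack('e', struct.pack('e', x))[0]

# One color blend in [0,1]: the FP16 error is far below what an
# 8-bit-per-channel framebuffer can display.
blend = fp16(fp16(0.3) * 0.5 + fp16(0.7) * 0.5)
print(abs(blend - 0.5))  # negligible

# But a long chain of dependent adds collapses: once the running sum
# reaches 0.25, adding 0.0001 is less than half an FP16 ulp there and
# rounds away entirely, so the sum stops growing.
acc = 0.0
for _ in range(10000):
    acc = fp16(acc + fp16(0.0001))
print(acc)  # stalls at 0.25 instead of reaching ~1.0
```

That accumulation behavior is why a compiler or developer can safely demote isolated color math to FP16 while keeping iterated or position-dependent math at full precision.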

Hanners
06-04-03, 07:06 AM
FP32 will be great for future generations of cards that can support and run it 'properly' (i.e. at full speed), but for now it really has no great use (in the gaming market at least, professional rendering is a different kettle of virtual fish).

vandersl
06-04-03, 09:03 AM
Actually, the FX could hold back the use of FP32 in applications.

By default, a PS2.0 shader will use full precision (spec'd to be a minimum of FP24). However, if a developer doesn't specify the _PP hint, the FX will need to run it at FP32 (unless the driver does something it shouldn't behind the app's back). Since this will result in poor performance on the FX, developers are going to be pressured to either use the _PP hint in their shaders (I believe this then applies to the whole shader, not individual values) or use Cg in the app development (which automatically uses FP32, FP16, or even FX12).

With ATI's 'always on' FP24, the developer doesn't need to think about it - they can just assume everything will be running at FP24 minimum (and maximum). Also note that when ATI hardware does support FP32 existing shaders will automatically use it.

You tell me - which is more likely to result in use of higher precision in the near future?

And please - stop using the enter key when making a post - word wrap works wonders you know.
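
For a feel of how much headroom separates the three formats being argued about, here is an illustrative sketch of mine (not from the thread), assuming the usual mantissa widths: 10 explicit bits for FP16, 16 for ATI's FP24, 23 for FP32, with exponent-range limits ignored.

```python
import math

def quantize(x, mantissa_bits):
    """Round x to a float with the given number of explicit mantissa bits
    (FP16 = 10, FP24 = 16, FP32 = 23); exponent range is not modeled."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)               # x = m * 2**e with 0.5 <= |m| < 1
    scale = 2.0 ** (mantissa_bits + 1)
    return math.ldexp(round(m * scale) / scale, e)

x = 1.0 / 3.0
for name, bits in [("FP16", 10), ("FP24", 16), ("FP32", 23)]:
    print(f"{name}: error = {abs(quantize(x, bits) - x):.1e}")
```

Each extra mantissa bit halves the worst-case rounding error, so FP24 sits roughly 64x tighter than FP16, and FP32 roughly 128x tighter than FP24, which is the gap the _PP hint trades away for speed.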

SnakeEyes
06-04-03, 09:30 AM
Good point, vandersl. Basically, thanks to the FX's setup, if developers write specifically to support the mixed 16/32 modes (or God help us, 16-only modes), the games that come out WON'T gain anything once the hardware is capable of actually doing 100% FP32 with good performance.

ATI LoVeR 9700
06-04-03, 10:31 AM
NV40...

Do you type into a translator or something? :confused:

I don't think the R300 and R350 will be in many computers in 3 years... We don't have to worry about FP32 at the moment. FP24 is fine right now. :)

We'll have the R500 by then. ;)

R.Carter
06-04-03, 11:21 AM
Originally posted by Nv40
Right now it seems ATI is following the 3dfx philosophy of "we don't need that." ATI (http://www.beyond3d.com/forum/viewtopic.php?t=6223&postdays=0&postorder=asc&start=0&sid=f89af82b21c9ff78e0626310ed9e6e9f) shows the same arrogant attitude of "what we have is enough," and they fail to see the bigger picture that game development takes time. Meanwhile they keep overclocking their cards again and again, re-releasing them with 20% speed improvements and almost nothing new, the same mistake Nvidia made with the GeForce3.

(stuff deleted)

So any ATI fans here who don't want to see their games stuck at FP24 for the next 2-3 years, with pixel shaders limited to 96 instructions (the F-buffer is supported only on the R9800, only in OpenGL, and has limitations for use in real-time games), had better start their own threads in the Rage3D forums and bitch as much as they can :) for FP32 (at least) in the next hardware (R390), and for a more hardware-driven approach to DX9+ a la Nvidia, or better.

Dunno. I thought the ATI drivers understand FP32 just fine. Internally the hardware will only use FP24, but that shouldn't really matter to game developers, should it? They just care about the calls they make to whatever interface they are programming for and expect the OS / drivers / hardware to do it.

As to the issue of shader program length, well it really depends on how fast such large programs run and if there is a real need for it. Do you know of developers who are unhappy and need FP32 today? If there was a real need, then I would expect the hardware guys to provide solutions. In my opinion, having insane hardware to play with can lead to sloppy coding.

Sadly, it seems that ATI is lengthening its product cycle from 18 months to 24 months due to slowing computer sales. So we won't be seeing anything really new for quite a while.

But from that thread it's clear that ATI knows that FP32 will be needed in the future, it's just not really needed now.

As well, since DirectX 9 only requires 24 bits, game developers who are making DirectX 9 games shouldn't write their code assuming that they will get 32 bits of precision.

Chalnoth
06-04-03, 11:28 AM
Originally posted by vandersl
or use Cg in the app development (which automatically uses FP32, FP16, or even FX12).
Cg doesn't "automatically" use any precision. It has data types that are cast to whatever precisions are supported in the target assembly.

And except for particular recursive algorithms, 32-bit floats aren't going to be required throughout the calculation of pretty much any shader.

Cotita
06-04-03, 01:10 PM
Originally posted by ATI LoVeR 9700
NV40...

Do you type into a translator or something? :confused:

I don't think the R300 and R350 will be in many computers in 3 years... We don't have to worry about FP32 at the moment. FP24 is fine right now. :)

We'll have the R500 by then. ;)

Really?

There are still millions of TNT2s and GeForce2 MXs out there. And the GeForce3, which was high-end just a couple of years ago, also has a large user base.

So I think that current ATI and Nvidia cards will last more than 3 years.

PreservedSwine
06-04-03, 01:19 PM
Originally posted by Cotita
Really?

There are still millions of TNT2s and GeForce2 MXs out there. And the GeForce3, which was high-end just a couple of years ago, also has a large user base.

So I think that current ATI and Nvidia cards will last more than 3 years.

Yep, I feel the same way... there are always users who *never* upgrade; they simply buy a whole new computer from Best Buy or Circuit City..... My father-in-law still uses a P100.......

Chalnoth
06-04-03, 01:39 PM
Originally posted by Cotita
Really?

There are still millions of TNT2s and GeForce2 MXs out there. And the GeForce3, which was high-end just a couple of years ago, also has a large user base.

So I think that current ATI and Nvidia cards will last more than 3 years.
But those who are going to be playing the games will upgrade.

Most of those who still own TNTs don't play games except in passing. Buying a GF2-level card now costs no more than purchasing a new game.

And I think that within two years, DX9-level will be expected. This didn't happen with DX8 because nobody put out a low-cost DX8 card soon enough. But now we do have a low-cost DX9 card, and within a year, those cards will be no more expensive than buying a new game. It'll take a bit for game developers to catch up, and they'll need to rely on games that benefit largely from DX9 hardware before any take the plunge on requiring it.

But I think requiring DX9 hardware in games will come sooner than many think.

Skuzzy
06-04-03, 02:12 PM
Chalnoth is pretty spot on with that post.

DX9 games will come much sooner than DX8 games came. It is happening. There are compelling reasons to use DX9 versus DX8.
The API is better defined and it is going to be around longer than one year, as DX10 is still a couple of years away and appears to completely redefine DX and how it interfaces with the hardware.

PS2.0 shaders are being churned out already. I have 33 PS2.0 shaders in the library now.

On long shaders: I always thought the PR around them was a joke. A good programmer will not use long shaders except for test purposes. Why? The same reason you write short functions in C: long shaders are poor programming practice, just as long C functions are.

It is cheaper and faster to use multiple shaders per frame, rather than one long shader per frame.

StealthHawk
06-04-03, 04:47 PM
LOL, TNT users. The only thing TNTs play fine is Counter-Strike....somewhat :p

Skuzzy
06-04-03, 05:41 PM
Unfortunately, Stealth, HP/Compaq are still shipping systems with TNT2 chips in them.

It is pretty disgusting to see a 2.4Ghz P4 on the shelf with a TNT2 card/chip in it. OY!

StealthHawk
06-04-03, 05:43 PM
Originally posted by Skuzzy
Unfortunately, Stealth, HP/Compaq are still shipping systems with TNT2 chips in them.

It is pretty disgusting to see a 2.4Ghz P4 on the shelf with a TNT2 card/chip in it. OY!

Yikes. I knew TNT2s were still being shipped in OEM systems last year. But I thought they had graduated to gf2mx/gf4mx by now. Pathetic.

Skuzzy
06-04-03, 05:52 PM
It really is sad. The bad thing is that those large OEMs skew the video card market numbers when you survey what performance levels are out there.

Thus we end up waiting three or more years for the market to allow devs to crank up the levels in games.

Ooops, I just realized I am party to a thread hijacking!

Uh,..back on topic

FP32 is not ready for prime time. Next-generation chips from the major manufacturers will solve that. The current generation of FP32 products is too slow to be useful in a real-time 3D graphics application such as games. OY!

Phew

Nv40
06-04-03, 06:01 PM
Here is some interesting info I found: IEEE 32-bit precision used by NASA.

http://svs.gsfc.nasa.gov/stories/zooms/zoompg1.html