PDA

View Full Version : 128 Bit color (Again)


harl
03-23-03, 03:27 PM
I have read in the wishlists:
True 128-bit colour

So on the 5800 & 5600, is the colour
calculated in 24 bits instead of 32?

What about Ati's 9700 & 9800?

I have not found any info about the
internal implementation of their
respective pixel shaders.


Can someone shed some light on my confusion?

PS:

Sorry for my ugly English :angel:

I'm Spanish

StealthHawk
03-23-03, 09:41 PM
no, the gfFX series can do color in FP32 and FP16.

the R300 series of cards (and R350) can only do FP24.

volt
03-23-03, 10:27 PM
In that case the wish is driver-related.

Uttar
03-24-03, 01:00 AM
Originally posted by volt
In that case the wish is driver-related.

Not exactly. Forcing FP32 everywhere in the current drivers is possible. It just isn't reasonable, due to the performance hit.
Doing some hardware tweaking would be required.

And anyway, wanting "true 128-bit color" doesn't make sense - 16-bit is sufficient for most calculations.
IMO, it's all about making sure the programmer *does* get FP32 when he asks for it. Right now, in DX9, nVidia forces FP16 for many types of instructions.
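To see why 16 bits can be enough for colour yet fail elsewhere, here's a rough Python sketch using the IEEE 754 half-precision format as a stand-in for FP16 (the specific values are just illustrative):

```python
import struct

def to_fp16(x):
    # Round-trip a Python float through IEEE 754 half precision
    # (10 mantissa bits, roughly 3 decimal digits of accuracy).
    return struct.unpack('<e', struct.pack('<e', x))[0]

# A colour channel survives: 8-bit colour only needs steps of ~1/256,
# well within FP16's resolution near 0.5.
print(to_fp16(0.5039))          # close to 0.5039

# But larger intermediate values lose fine increments: at magnitude 2048
# the FP16 step size is 2.0, so 2048.5 rounds down to 2048.0.
print(to_fp16(2048.0) == to_fp16(2048.5))  # True: both round to 2048.0
```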


Uttar

Chalnoth
03-24-03, 07:47 AM
Originally posted by Uttar
Not exactly. Forcing FP32 everywhere in the current drivers is possible. It just isn't reasonable, due to the performance hit.
Doing some hardware tweaking would be required.
I'm really not so sure about that, Uttar. As some NV_fragment_program target shaders have shown, the difference in performance between FP16 and FP32 can indeed be very small.

One thing to note is that part of essentially every shader that uses dependent texture reads must be executed at FP32. This precision is required for proper texturing (this precision is also almost certainly used for fixed-function texture coordinate calculations).
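A hypothetical sketch of why texturing needs the extra precision: with FP16's 10 mantissa bits, texture coordinates near 1.0 are spaced too coarsely to pick out individual texels of a large texture (the 8192-texel width here is purely illustrative):

```python
import struct

def fp16(x):
    """Round-trip a float through IEEE 754 half precision (10 mantissa bits)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# Normalised coordinates near 1.0 have FP16 steps of 2**-11, which is
# coarser than the 1/8192 spacing of texels in an 8192-wide texture.
width = 8192
u0 = (6000 + 0.5) / width    # centre of texel 6000
u1 = (6001 + 0.5) / width    # centre of texel 6001

# At FP32 (Python floats are even wider) the two coordinates map back
# to distinct texels; at FP16 they collapse onto the same one, so a
# dependent read computed at FP16 would fetch the wrong data.
print(int(u0 * width), int(u1 * width))              # prints: 6000 6001
print(int(fp16(u0) * width), int(fp16(u1) * width))  # prints: 6000 6000
```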

It appears the primary problem with FP32 performance is that only half the hardware registers are available (input/output doesn't yet appear to be an issue, though it will in the future). The pure processing power is the same whether using FP16 or FP32.
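The register-pressure point can be sketched with back-of-the-envelope arithmetic (the register file size below is an assumed, illustrative figure, not a documented NV30 number):

```python
# Hypothetical model of a fixed-size register file where one FP32 temp
# occupies a full slot but two FP16 temps pack into one slot, so a
# shader running at FP32 has half as many temporaries to work with.
REGISTER_FILE_SLOTS = 8          # assumed size, purely illustrative

def max_live_temps(precision_bits):
    values_per_slot = 32 // precision_bits   # 1 FP32 or 2 FP16 per slot
    return REGISTER_FILE_SLOTS * values_per_slot

print(max_live_temps(32))   # 8 temps at full precision
print(max_live_temps(16))   # 16 temps at half precision
```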

I really do think that once nVidia has optimized their PS 2.0 compiler in the drivers (that compiles to machine code), performance in FP32 will improve dramatically.

volt
03-24-03, 10:20 AM
Correct me if I'm wrong, but with a 128-bit internal pipeline it can render both FP16 and FP32. FP16 looks like crap (at least when comparing against 9700 shots in 3DMark03). I don't know about the newest drivers, but the last time I checked, it forced 16-bit.

Uttar
03-24-03, 12:26 PM
Originally posted by Chalnoth
I'm really not so sure about that, Uttar. As some NV_fragment_program target shaders have shown, the difference in performance between FP16 and FP32 can indeed be very small.

...

I really do think that once nVidia has optimized their PS 2.0 compiler in the drivers (that compiles to machine code), performance in FP32 will improve dramatically.

You are thinking about pocketmoon's tests, right? Seen those too, quite interesting indeed. FP16 often isn't two times faster, but in some cases it can still be noticeably faster.

As you noted yourself, NV30 is faster with FP16 only because FP16 values take up half the register space.

Your conclusion that "half the hardware registers are available at FP32" isn't exactly correct, AFAIK. To be accurate, it's double the hardware registers available at FP16.
Looking at nVidia's papers, nVidia lists all of their hardware register counts as if they were FP32. And even when doing so, they're superior to the R300.

Of course, nVidia could optimize their drivers to be more efficient in using registers. But then, FP16 probably will *also* improve dramatically! Maybe not as much, but still...

And considering the NV3x's *horrible* arithmetic performance ( let's hope it's fixed in the NV35 ), the FP16 performance boost may still be essential to remain competitive.


Uttar


Post Scriptum to Volt: Please note that FP16 *doesn't* look like crap in most screenshots. In some situations, however, such as in 3DMark 2003 shots, it does. But those cases are very, very rare in today's games - even nonexistent, you might say.

jjjayb
03-25-03, 10:34 AM
I really do think that once nVidia has optimized their PS 2.0 compiler in the drivers (that compiles to machine code), performance in FP32 will improve dramatically.


But you've been saying this for how long? First it was "well they're still using beta drivers", "they're not even shipping boards yet, this should be resolved by the time they start shipping boards." Well, they are shipping boards and people are receiving them. Where are the performance improvements? How long do you expect people to wait? The nv30's have been in hardware for months now. Plenty of time to "optimize" ps2.0 performance. As of right now ps2.0 performance is still dreadful on the nv30. Which isn't exactly great seeing as this is touted as a dx9 board. What good is being a dx9 board if dx9 performance frankly sucks? Do users have to wait 6 months for the performance gains like they did with the gf3? Hell, 6 months is a loooong time in the video card industry. By then the nv35 will be out. You'd be better off waiting for the nv35 to come out if you want a good-performing Nvidia dx9 card then.

StealthHawk
03-25-03, 09:56 PM
Originally posted by jjjayb
But you've been saying this for how long? First it was "well they're still using beta drivers" "they're not even shipping boards yet, this should be resolved by the time they start shipping boards." Well, they are shipping boards and people are receiving them. Where are the performance improvements? How long do you expect people to wait? The nv30's have been in hardware for months now.

and there are no official gfFX drivers yet, so we don't know whether or not performance will improve. supposedly new drivers come out this week. then we'll know, and if you're right, continue your bashing.

Plenty of time to "optimize" ps2.0 performance. As of right now ps2.0 performance is still dreadfull on the nv30. Which isn't exactly great seeing as this is touted as a dx9 board. What good is being a dx9 board if dx9 performance frankly sucks?

and people care because they will get poor performance in all the DX9 games which they don't own....right? as long as it's fixed before any games ship then nvidia got by fine.

Do users have to wait 6 months for the performance gains like they did with the gf3? Hell 6 months is a loooong time in the video card industry. By then the nv35 will be out. You'd be better of waiting for the nv35 to come out if you want a good performing Nvidia dx9 card then.

gf3 performed fine. it was slower than a gf2Ultra in 16bit and low resolution. the gf3 delivered as promised in 32bit, higher resolution(1024+), and whenever FSAA was added.

jjjayb
03-25-03, 11:48 PM
and there are no official gfFX drivers yet, so we don't know whether or not performance will improve. supposedly new drivers come out this week. then we'll know, and if you're right, continue your bashing.

I have every reason to "bash". How long has this card been out? And no official drivers? Who the heck releases new hardware without official drivers? You don't find anything wrong with that?

Nvidia: "hey go ahead and buy our card now, we promise that we will release higher performance official drivers sometime in the future. In the meantime, we recommend you and the review sites use these 'hacked' low quality performance drivers."

I find it appalling that a company would do that, yet find it even more appalling that some people don't seem to have a problem with that. They think I'm "bashing" because I do. I thought Nvidia was the king of drivers. At this point I don't think anyone can seriously claim that anymore.

Typedef Enum
03-26-03, 02:09 AM
3 1/2 months since they last updated their drivers.

It's a real cop-out to excuse nVidia because they haven't released drivers that should be out for a product that has supposedly been shipping (although there's some doubt about the extent of the shipping)...

I agree with the guy above me in that it's very hard to understand giving them a pass because the drivers are internally considered Beta...despite their claim of shipping.

These are the moves we used to come to expect from the ATI's of the world a few years ago, not nVidia.

StealthHawk
03-26-03, 03:11 AM
Originally posted by jjjayb
I have every reason to "bash". How long has this card been out? And no official drivers? Who the heck releases new hardware without official drivers? You don't find anything wrong with that?

all i'm saying is you can't expect increased performance if there are no drivers! so the optimistic statements made by people saying performance would/should increase cannot be verified yet. you have to wait until drivers are released before an assessment can be made.

I agree with you, it is fair to bash them for not releasing drivers. they've had ample time. but in that recent nvidia chat we learned that the Detonator50's would come out this week. so we'll know once they are out whether performance increases or not.

but honestly, as I pointed out before, what's the big deal about DX9 performance when there are no DX9 games? and I don't think there will be any till Summer at the earliest.

Grrrpoop
03-26-03, 10:10 AM
Originally posted by StealthHawk
but honestly, as I pointed out before, what's the big deal about DX9 performance when there are no DX9 games? and I don't think there will be any till Summer at the earliest.
If there are no reliable DX9 drivers for the FX, how can devs be expected to optimise for it?

S.T.A.L.K.E.R is being dev'd on 9700Pro's because it's the only card available with solid, fully featured DX9 driver support. From the way it's looking, this could be a BIG game in 2003.

If nV's DX9 drivers appear and work flawlessly straight off then sweet, but I think it's better for card optimisations to be worked in from an early stage rather than bolted on at the end in a rush before the game is released. I'm not a coder so I could be way out on this :)

There's always patches, but nobody likes having to buy a game and then d/l a patch straight away just to get their card working optimally.

We need DX9 early in order for developers to feel confident enough to introduce it into their games. If one of the major card manufacturers out there has lacklustre DX9 support then we're moving forward at a slower pace.

I think by nV35 most of these points will be non-issues tho.

Uttar
03-26-03, 12:51 PM
Originally posted by Grrrpoop
If there are no reliable DX9 drivers for FX, how can dev's be expected to optimise for it?

There are developer drivers, you know. Those are drivers nVidia gives to registered developers ( *not* only members of The Way It's Meant To Be Played ), and according to some people such as pocketmoon, they are really quite good.


Uttar

AngelGraves13
03-26-03, 03:59 PM
Originally posted by Grrrpoop

S.T.A.L.K.E.R is being dev'd on 9700Pro's because it's the only card available with solid, fully featured DX9 driver support. From the way it's looking, this could be a BIG game in 2003.

uhh...yeah, that game is being developed on the nv30. Do your homework, read the F.A.Q. on their official site.