
View Full Version : 128 bit color 10 bit DAC



harl
11-20-02, 03:22 PM
If the new cards like the GeForce FX have 64/128 bit color modes,
why the 10-bit DAC?
So all the rendering is done with 16/32 bits per channel,
and at the output it's rounded to 10 bits?


And what happens with digital flat panels?

Bigus Dickus
11-20-02, 04:12 PM
10 bit final output is probably a finer gradation than the human eye can detect, and probably about as good as current displays can handle anyway.

It's during repeated internal calculations where errors can accumulate, and that's where the increased precision comes in useful. Once the final color value for the pixel to be displayed on screen is computed, rounding off to 10 bits (or even 8) is perfectly fine.
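A toy C program makes the point. This is purely illustrative, not drawn from any real renderer: it accumulates 100 very dim light contributions, once in floating point and once rounded to 8 bits after every pass. In float the passes add up to about 30 code values; rounded per pass, each contribution falls below half a code value and is lost entirely.

#include <stdio.h>
#include <math.h>

/* Toy illustration: accumulate 100 dim light contributions of 0.3/255
   each, keeping one copy in double precision and rounding the other
   to 8 bits after every pass. */
int main(void)
{
    double contribution = 0.3 / 255.0;
    double precise = 0.0, rounded = 0.0;
    for (int i = 0; i < 100; ++i) {
        precise += contribution;
        rounded = floor((rounded + contribution) * 255.0 + 0.5) / 255.0;
    }
    printf("float result: %5.1f/255   8-bit-per-pass result: %5.1f/255\n",
           precise * 255.0, rounded * 255.0);
    return 0;
}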

am2020
11-20-02, 07:46 PM
I have to disagree. There are applications where the final output bits are especially important. We are talking about sensor simulations. I, for one, was confused by the 32 bits per channel and told my boss "We need that card." Now, after this thread, I am re-evaluating my position and will call back the SGI people, because they give me 12 bits per channel.

Chalnoth
11-20-02, 07:46 PM
Actually, I believe high-end DACs generally max out at about 12 bits.

Regardless, a 10-bit DAC is quite a bit better than today's 8-bit DACs, though I wonder if/when flat panels will be able to support the higher-definition color.

And don't forget that those formats are floating-point. I believe the mantissa of 16-bit half-floats is only 10 bits anyway (meaning that there would be no dithering...but I might be off by one or two bits on this format...).

Bigus Dickus
11-20-02, 08:04 PM
Originally posted by am2020
I have to disagree.

OK, so it is important for some uses, but sensor simulation isn't exactly Quake III. ;)

Chalnoth
11-20-02, 10:41 PM
Originally posted by am2020
I have to disagree. There are applications where the final output bits are especially important. We are talking about sensor simulations. I, for one, was confused by the 32 bits per channel and told my boss "We need that card." Now, after this thread, I am re-evaluating my position and will call back the SGI people, because they give me 12 bits per channel.

I'm curious about this, then.

The rendering power of the NV30 is just phenomenal. I would think that it could be highly useful for offline rendering applications, if modified appropriately (multichip configs, optimizations for transferring data from the card to system memory, etc.). Here the NV30 would be able to write out at the full 128 bits per pixel of accuracy.

But what do you need 12 bits per channel of accuracy for in realtime rendering? I can definitely see the need for it in offline rendering, but I would also think that realtime rendering would be just fine at 10 bits per channel.

Just as an aside, FP16 wouldn't be accurate enough for 12 bits per channel at the DAC. You would need to move up to FP32 for a 12-bit DAC to actually display an improvement (in anything but very dark scenes...).
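To put rough numbers on that (a back-of-the-envelope sketch, nothing official): FP16 stores 10 mantissa bits plus an implicit leading 1, so values in the brightest range [0.5, 1.0) are spaced 2^-11 apart. That is finer than a 10-bit DAC's steps but coarser than a 12-bit DAC's, which is why the extra DAC bits would only show up in the darker parts of the image.

#include <stdio.h>
#include <math.h>

/* Sketch: compare the spacing of FP16 values near white against
   10- and 12-bit DAC step sizes. */
int main(void)
{
    double fp16_ulp_bright = ldexp(1.0, -11);   /* spacing in [0.5, 1.0) */
    double dac10_step = 1.0 / 1023.0;
    double dac12_step = 1.0 / 4095.0;

    printf("FP16 spacing near white: %.6f\n", fp16_ulp_bright);
    printf("10-bit DAC step:         %.6f\n", dac10_step);
    printf("12-bit DAC step:         %.6f\n", dac12_step);
    /* 0.000488 is finer than 0.000978 (10-bit) but coarser than
       0.000244 (12-bit), so a 12-bit DAC only shows extra detail
       in darker parts of the image. */
    return 0;
}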

harl
11-21-02, 09:25 AM
So can we expect better DACs in professional versions
like the Quadro?

About output support: in theory monitors are analog
and can support unlimited colors, so there's no problem with the number of bits
you use.

Chalnoth
11-21-02, 09:39 AM
Originally posted by harl
So can we expect better DACs in professional versions
like the Quadro?

I doubt it. As I said, a 12-bit DAC would be pretty much useless for FP16.

What I forgot to mention was that FP32 is not currently used as a framebuffer format. FP32 is only to be used for pbuffers, as an intermediate step in rendering.

am2020
11-22-02, 12:02 PM
Can we summarize this somehow?
Question 1: What is the maximum number of bits of
resolution I can get out of the framebuffer on
the GeForce FX?
Question 2: Is this different from the DAC bits necessary for driving the displays?
Question 3: Are we going to see any difference in
the "workstation-quality" Quadro line?

Thanks for the answers!

Is anyone from NVIDIA reading this? There is a
huge sector out here that needs more bits in the
framebuffer... seriously. I am even considering
using software renderers like Mesa (s..l..o..w..)
just to get the necessary resolution.

nutball
11-22-02, 12:07 PM
Well what exactly are you trying to achieve? Do you need high-precision colour on-screen, or would it be sufficient in an off-screen buffer?

I don't think the width of the DACs has been widely discussed.

Chalnoth
11-22-02, 02:28 PM
Originally posted by am2020
Can we summarize this somehow?
Question 1: What is the maximum number of bits of
resolution I can get out of the framebuffer on
the GeForce FX?
Question 2: Is this different from the DAC bits necessary for driving the displays?
Question 3: Are we going to see any difference in
the "workstation-quality" Quadro line?

Thanks for the answers!

Is anyone from NVIDIA reading this? There is a
huge sector out here that needs more bits in the
framebuffer... seriously. I am even considering
using software renderers like Mesa (s..l..o..w..)
just to get the necessary resolution.

As I said, you can output the full 32-bit floating point per channel (23-bit mantissa...far more than necessary for final output) to a pbuffer, and then pull that back over the AGP bus into a bitmap, if you really need the extra precision.
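A minimal sketch of what that readback could look like, assuming a floating-point pbuffer has already been created and made current (e.g. through WGL_ARB_pbuffer with a float pixel format); the function name and dimensions are just placeholders:

#include <GL/gl.h>
#include <stdlib.h>

/* Hypothetical sketch: after rendering into a float pbuffer that is the
   current context, pull the full-precision result back to system memory
   with glReadPixels.  Caller frees the buffer; rows come back bottom-up. */
float *read_back_float_image(int width, int height)
{
    float *pixels = malloc((size_t)width * height * 4 * sizeof(float));
    if (!pixels)
        return NULL;
    glReadBuffer(GL_FRONT);   /* pbuffers are typically single-buffered */
    glReadPixels(0, 0, width, height, GL_RGBA, GL_FLOAT, pixels);
    return pixels;
}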

However, I don't believe NV30-level hardware will ever support higher than 10-bit output to the screen.

Obviously, with this kind of system, it would be far, far slower than plain screen output, but if you're doing some sort of rendering that doesn't absolutely need to be realtime, it may still be beneficial.

Whichever it turns out to be, you do have time to decide. The NV30 cards won't be out for a little bit yet.

harl
11-22-02, 02:39 PM
24 bits of mantissa (the leading 1 is implicit),
at least in IEEE 754 :D

am2020
11-22-02, 10:42 PM
Rendering to a pbuffer will be fine, if I can get the 32-bit levels. Now, I hope that the use of the pbuffer will not introduce tradeoffs... you know... like not being able to render an image bigger than
the physical display (even when it's a pbuffer), etc.

By the way, how big a pbuffer (horiz * vert * RGBA color depth) can I get?

Bigus Dickus
11-22-02, 10:57 PM
I posted a reply in the other thread on this topic concerning the R300's capability for this purpose. It sounds like both the NV30 and the R300 will require some specialized program to enable the rendering to a texture which could then be read from the buffer. See the other thread for OpenGL guy's comment on this.

nutball
11-23-02, 01:56 AM
Originally posted by am2020
Rendering to a pbuffer will be fine, if I can get the 32-bit levels. Now, I hope that the use of the pbuffer will not introduce tradeoffs... you know... like not being able to render an image bigger than
the physical display (even when it's a pbuffer), etc.

By the way, how big a pbuffer (horiz * vert * RGBA color depth) can I get?

There are two sources of limitation.

Firstly, I don't think there's a restriction on size, at least not one so daft as being limited to screen size. I've done rendering into a pbuffer bigger than my screen. There is a memory limit, of course: your pbuffer can't be bigger than graphics memory, so that should let you estimate what you might get.
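A rough way to estimate it (purely illustrative numbers, ignoring the front buffer, textures, and any alignment overhead): an RGBA FP32 pbuffer costs 16 bytes per pixel, so dividing card memory by that gives an upper bound on the pixel count.

#include <stdio.h>

/* Illustrative only: largest square RGBA FP32 pbuffer that fits in a
   given amount of graphics memory at 16 bytes per pixel. */
int main(void)
{
    long mem_bytes = 128L * 1024 * 1024;   /* e.g. a 128 MB card */
    long max_pixels = mem_bytes / 16;      /* RGBA x 4 bytes each */
    long side = 1;
    while ((side + 1) * (side + 1) <= max_pixels)
        ++side;
    printf("~%ld x %ld FP32 RGBA pbuffer in %ld MB\n",
           side, side, mem_bytes / (1024 * 1024));
    return 0;
}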

Secondly there are limitations on what operations can be performed when rendering to a floating-point buffer. The most notable of these is that the standard alpha blending operations are not available.

Take a look at the PDFs on this page; they give some more insight:

http://developer.nvidia.com/view.asp?IO=nv30_emulation

There are a bunch of slides outlining the NV30 OpenGL extensions, and the extension specifications themselves. Essential reading, and should answer your questions.

harl
01-23-04, 01:05 PM
It's been so long...

But I'm still confused.

Now I have an FX.
Are there any programs where I can see the 10-bit DAC in action?

saturnotaku
01-23-04, 01:58 PM
Wow, this is close for the all-time award for resurrecting threads from the dead. :eek:

Morrow
01-23-04, 02:41 PM
Originally posted by harl
It's been so long...

But I'm still confused.

Now I have an FX.
Are there any programs where I can see the 10-bit DAC in action?

You wanna see 10-bit in action?

Well, then you have to buy a monitor which supports 10 bits per channel...

harl
01-23-04, 02:56 PM
CRTs are analog.
Do you need special speakers for 24-bit audio?

In theory a CRT can display unlimited colors.

Wow, this is close for the all-time award for resurrecting threads from the dead.

Posts never die, they only go down and down...

bkswaney
01-23-04, 03:15 PM
WOW... I agree this one was pulled from way at the bottom. :) 2002. :eek:

When I saw the post saying the NV30 won't be out for a little while yet... :D hehehe
I wish it had never come out.
They should have skipped it and put out the NV35.
The R300's 256-bit bus made sure it was DOA.

Razor04
01-23-04, 03:32 PM
Originally posted by Chalnoth
The rendering power of the NV30 is just phenomenal. I would think that it could be highly useful for offline rendering applications, if modified appropriately (multichip configs, optimizations for transferring data from the card to system memory, etc.). Here the NV30 would be able to write out at the full 128 bits per pixel of accuracy.
Can you tell someone was eating up any hype produced by NV back then? Looking back on all this, you would have had a ton of problems in that type of situation... getting the drivers to produce the correct output... the sheer amount of air conditioning needed to support a room full of comps with dustbusters... and a bunch of other stuff too.

I don't fault Chalnoth at all though... this was back in 2002 and he was seduced by the NV hype monster. Not that I agree with all his views right now... but this is such a clear example of hype it isn't even funny, and also a good example of why you shouldn't believe anything before you have seen it in action.

The Baron
01-23-04, 03:37 PM
you mean Chalnoth isn't seduced by the NV hype monster anymore?

Chalnoth
01-23-04, 04:33 PM
Originally posted by Razor04
Can you tell someone was eating up any hype produced by NV back then?
Just to put this in perspective, I was speaking from the claims of "32 functional units" and similarly-massive claims of FP performance. Those claims were either obviously misrepresented, or the final NV30 had silicon problems that prevented full functionality of the FP units (this latter one actually seems quite likely when you consider that the NV35 doubled the FP32 units with a very small increase in transistors).

Anyway, one major flaw in my logic (and that of most people at the time) was that the NV3x has a very significant register count limitation. If there were no such limitation, and the NV30 had the same FP unit configuration as the NV35, then it would indeed have had excellent FP performance. I don't think anybody outside nVidia expected that limitation, and it seriously throws a wrench into things.

One thing you should notice is that nVidia is being exceedingly silent about the next architecture. I think this is a very good sign, as those architectures about which we heard next to nothing prior to launch have all been quite good (note: the last chip that nVidia overhyped similarly to the NV30 was the original TNT, which was vastly underclocked and ended up with performance around half that of expectations).

Chalnoth
01-23-04, 04:34 PM
Originally posted by harl
Now I have an FX.
Are there any programs where I can see the 10-bit DAC in action?
As it turns out, no. To see the 10-bit DAC in action, you would need a framebuffer with higher precision than 8 bits. The GeForce FX doesn't support framebuffers with higher than 8-bit precision.

DSC
01-23-04, 05:03 PM
The Parhelia has 10-bit output per colour channel; their marketing dept even came up with a marketing term for it, GigaColor...... :D

http://www.matrox.com/mga/products/parhelia512/technology/gigacolor.cfm

Neither NV3x nor R3xx/RV3xx has higher than 8-bit output per colour channel, which is quite strange since they exceed the Parhelia in DX support level and have all the fancy DX9 features, and yet are limited to 8-bit output like previous-generation hardware.
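For reference, a 10-bit-per-channel output format like GigaColor packs into an ordinary 32-bit word as 2:10:10:10 (2 bits of alpha, 10 bits each of red, green and blue). A quick sketch of that layout, with the function name made up for illustration:

#include <stdint.h>

/* Sketch of a 2:10:10:10 pixel layout: alpha in the top 2 bits, then
   10 bits each for red, green, blue.  Channel inputs are 0..1023,
   alpha 0..3. */
uint32_t pack_a2r10g10b10(unsigned a, unsigned r, unsigned g, unsigned b)
{
    return ((uint32_t)(a & 0x3)   << 30) |
           ((uint32_t)(r & 0x3FF) << 20) |
           ((uint32_t)(g & 0x3FF) << 10) |
            (uint32_t)(b & 0x3FF);
}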