View Full Version : What does 128 bit HDR really mean?


sillyeagle
11-01-06, 02:31 AM
So the G80 does 128-bit HDR, but what the hell does that mean? Any good salesperson knows you sell benefits, not features!!! :D

HDR in Source already seems pretty good to me, so how will 128-bit improve upon what we see now?

As many times as I have heard some of you guys boasting about this feature, I hope one of you can explain.

I'm assuming this is something that is not going to improve existing HDR implementations.

jolle
11-01-06, 05:26 AM
FP16 is 64-bit, so 128-bit HDR would be FP32.
Larger dynamic range, that's what it's about.
Special effects production often uses FP32, but sometimes it's overkill even there, and that's why OpenEXR has FP16 support.
HDR is a bit complicated, so it's hard to say exactly what the benefit of the extra dynamic range is. I suppose more contrast between the brightest and dimmest points of light, which could probably be used for better auto-exposure effects and the like.
And when we start using image-based lighting it might make a noticeable difference.
Support for it on the new GPUs is welcome, but we've only just started to see FP16 HDR used in games, so I doubt it will be a prominent feature in the near future, at least for gaming.
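
To see where the 64 and 128 come from: these labels just count bits per pixel, assuming the usual 4-channel RGBA layout. A minimal arithmetic sketch in C:

#include <stdio.h>

/* Bits per pixel = number of channels * bits per channel. */
int main(void)
{
    const int channels = 4;    /* R, G, B, A */
    printf("FP16 RGBA pixel: %d bits\n", channels * 16);  /* 64  */
    printf("FP32 RGBA pixel: %d bits\n", channels * 32);  /* 128 */
    return 0;
}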

AdmiralHalsey..
11-01-06, 07:02 AM
It means nothing unless the performance hit is minimal. It just won't get used.

fellix
11-01-06, 08:02 AM
Higher-precision storage of image data is useful mostly to prevent or reduce error accumulation in heavy multi-pass effects (blending and so on), and thus to increase image/FX quality. Remember the 16-bit frame buffer format, widely used in the 3dfx era, that often caused bad-looking dither artifacts in games using alpha-blend effects.
As for the display, at present such a frame buffer color depth doesn't contribute to better perception, as the majority of CRTs/TFTs max out at 8 bits per component.
It's more important now that all the "internal" calculations -- from vertex data to fragment/pixel batches -- be done in the highest feasible data format. In the case of SM3 (DX9.0c), FP32 is mandatory for both VS and PS, and DX10 now requires the entire pipeline -- all the way down to the frame buffer and even the textures -- to be FP32 capable, which wasn't the case for DX9 at all (the very first specs didn't even define what should count as full-precision PS processing -- ATI's 24-bit or IEEE's 32-bit).
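
To make the error-accumulation point concrete, here's a toy C sketch (the blend amount and pass count are made up purely for illustration): blending a small value into an 8-bit channel over many passes drifts badly, because each pass rounds back to 1/255 steps, while a float channel stays close to the exact sum.

#include <stdio.h>

/* Blend a small amount into one color channel 200 times, once through an
   8-bit integer buffer (rounded every pass) and once through a float buffer. */
int main(void)
{
    unsigned char buf8 = 0;     /* 8 bits per channel: 0..255 maps to 0.0..1.0 */
    float buf_fp = 0.0f;        /* floating-point channel */
    const float pass = 0.002f;  /* contribution of one blend pass */

    for (int i = 0; i < 200; i++) {
        float v = buf8 / 255.0f + pass;            /* unpack and blend...          */
        if (v > 1.0f) v = 1.0f;
        buf8 = (unsigned char)(v * 255.0f + 0.5f); /* ...then round back to 8 bits */
        buf_fp += pass;                            /* float path just accumulates  */
    }

    printf("8-bit buffer : %.4f (ideal 0.4)\n", buf8 / 255.0f);
    printf("float buffer : %.4f\n", buf_fp);
    return 0;
}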

P.S.: try running RTHDRIBL -- the famous DX9 HDR demo -- on any GeForce FX card and you will see what the lack of native support for FP16 textures (or even INT16) can do to the final image output.

SH64
11-01-06, 06:01 PM
Thanks for the info fellix :thumbsup:

Yeah, the way I understand it, 128-bit or FP32 HDR = a higher-precision HDR format.

AliceCooper
11-03-06, 03:03 PM
In marketing terms it means that Nvidia have now caught up with ATI who have had this feature for a while.

But hey it sounds better!!

I suppose it means that HDR with AA enabled will now work as it has done on the Radeon X1k series from launch. BTW HL2 uses a software HDR that works on all cards that support it.

Just like ATI finally came on board with SM3.0.

Cheers:

:captnkill:

hemmy
11-03-06, 05:57 PM
That is not what it means; only the G80 can do 128-bit HDR (and maybe the R600 too, it's too early to know).

And the Source Engine doesn't use "software" HDR; it's done in the pixel shaders.

PeterJensen
11-03-06, 06:49 PM
It means SEX!

jolle
11-03-06, 07:22 PM
And the Source Engine doesn't use "software" HDR; it's done in the pixel shaders.
I believe they use an FP16 framebuffer where supported, render to an FP16 texture where not, and an even lower FX12 format for lower precision.
Not sure though; there was an article on it when it came out outlining the different methods they use.
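
That fallback chain would look roughly like the sketch below -- a purely hypothetical C sketch, not Valve's actual code, with placeholder capability flags standing in for whatever the engine really queries from the driver.

#include <stdio.h>

/* Hypothetical HDR path selection: prefer a float framebuffer, fall back to
   rendering into a float texture, then to a lower-precision integer format. */
typedef enum {
    HDR_FP16_FRAMEBUFFER,   /* float framebuffer with blending */
    HDR_FP16_TEXTURE,       /* render into a float texture instead */
    HDR_INTEGER_FALLBACK    /* lower-precision integer format */
} hdr_mode;

static hdr_mode pick_hdr_mode(int has_fp16_blending, int has_fp16_textures)
{
    if (has_fp16_blending)  return HDR_FP16_FRAMEBUFFER;
    if (has_fp16_textures)  return HDR_FP16_TEXTURE;
    return HDR_INTEGER_FALLBACK;
}

int main(void)
{
    /* e.g. hardware with FP16 textures but no FP16 framebuffer blending */
    printf("chosen mode: %d\n", (int)pick_hdr_mode(0, 1));
    return 0;
}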

I assume the feature being talked about here is FP32 framebuffer support + blending + texture and filtering support, with full AA support as well (mentioned in recent rumors at least).

SH64
11-03-06, 07:24 PM
I believe they use an FP16 framebuffer where supported, render to an FP16 texture where not, and an even lower FX12 format for lower precision.
Not sure though; there was an article on it when it came out outlining the different methods they use.

I remember reading that Source used only an Integer10 format for HDR (or something close).