Old 01-05-06, 05:23 PM   #1
faculaganymede
Registered User
 
Join Date: Jan 2006
Posts: 3
Question: 10 bit per channel DVI output

Graphics gurus,

The ATI X1800 spec states "16 bit per channel floating point HDR and 10 bit per channel DVI output." I need some help understanding what this means.

Does "16 bit per channel floating point" mean 16 bits of dynamic range for each of the RGB channels in the frame buffer?

Does "10 bit per channel DVI output" mean 10 bits of dynamic range for each of the RGB channels on the DVI output? If so, is 10 bits per RGB channel currently the best there is for PC graphics cards?

I couldn't seem to find this kind of info in the Nvidia card specs.
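In case it helps clarify what I'm asking, here is roughly how I understand the two formats at the Direct3D 9 level. This is just a sketch, and it assumes these are the formats the spec is referring to; I may be wrong about that.

[code]
// Sketch: ask whether the default adapter exposes an FP16 render target and a
// 10:10:10:2 full-screen display format under Direct3D 9. Assumes a desktop
// mode of D3DFMT_X8R8G8B8 for the render-target check; link against d3d9.lib.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // "16 bit per channel floating point HDR": a 16-bit-per-channel floating
    // point texture that can be bound as a render target.
    HRESULT fp16 = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET, D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F);

    // "10 bit per channel DVI output": a 10:10:10:2 full-screen mode, which is
    // what would actually have to be scanned out to the display.
    HRESULT rgb10 = d3d->CheckDeviceType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A2R10G10B10,
        D3DFMT_A2R10G10B10, FALSE);

    std::printf("FP16 render target:     %s\n", SUCCEEDED(fp16)  ? "yes" : "no");
    std::printf("10:10:10:2 back buffer: %s\n", SUCCEEDED(rgb10) ? "yes" : "no");

    d3d->Release();
    return 0;
}
[/code]

In other words, is the first item about the precision of the render target in the frame buffer, and the second about the precision of the signal actually sent out over DVI?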
Old 01-05-06, 06:56 PM   #2
Victorshen
Registered User
 
Join Date: Nov 2004
Location: Sydney, Australia
Posts: 30
Re: 10 bit per channel DVI output

The DVI 1.0 spec only allows 8 bits per channel.
Old 01-05-06, 08:53 PM   #3
faculaganymede
Registered User
 
Join Date: Jan 2006
Posts: 3
Re: 10 bit per channel DVI output

Victorshen, could you explain?

I got "10 bit per channel DVI output" directly from the ATI x1800 spec: http://www.ati.com/products/RadeonX1800/specs.html

Does anyone know where I can find documents that explain how the data bits flow from the framebuffer to the DVI output (for any high-end graphics card)?

Thanks.
Old 01-05-06, 09:31 PM   #4
Victorshen
Registered User
 
Join Date: Nov 2004
Location: Sydney, Australia
Posts: 30
Re: 10 bit per channel DVI output

http://www.ddwg.org/lib/dvi_10.pdf

have a look
Old 01-06-06, 08:26 AM   #5
hordaktheman
Hans... boobie...
 
Join Date: Aug 2002
Location: Iceland
Posts: 273
Re: 10 bit per channel DVI output

From what I can tell, this "10 bit per channel" only seems to apply to the RGB channels, with the alpha channel limited to 2 bits. That amounts to 32 bits total, which would fall within spec. And seeing as an 8-bit alpha channel is largely redundant, it should allow greater color fidelity. I'm not entirely sure about this, though.
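To illustrate the bit budget, here's a quick sketch of that 10:10:10:2 packing. I've used the D3D-style A2R10G10B10 bit ordering; whatever layout the card uses internally is an assumption on my part.

[code]
#include <cstdint>
#include <cassert>

// Pack 10-bit R, G, B and a 2-bit alpha into one 32-bit word.
// Bit layout follows the common A2R10G10B10 ordering:
//   bits 31..30 = A, 29..20 = R, 19..10 = G, 9..0 = B
// 10 + 10 + 10 + 2 = 32 bits total, the same storage cost as 8:8:8:8.
uint32_t pack_a2r10g10b10(uint32_t r, uint32_t g, uint32_t b, uint32_t a)
{
    assert(r <= 1023 && g <= 1023 && b <= 1023 && a <= 3);
    return (a << 30) | (r << 20) | (g << 10) | b;
}
[/code]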

Of course, native support requires a 10-bit-capable display, but 8- and 6-bit displays get an internally dithered, downsampled version of the image. Gamma and color correction, for example, are both done in 10-bit and dithered down to 8 or 6 bits for those displays, just like previous 8-bit cards did on 6-bit displays.

I imagine the dithering artifacts are much less noticeable, though.
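For what it's worth, here's a toy sketch of what dithering a 10-bit channel down to 8 bits could look like, using a plain 2x2 ordered-dither pattern. How the hardware actually does it isn't documented here, so treat this as illustrative only.

[code]
#include <algorithm>
#include <cstdint>

// Toy ordered dither from a 10-bit channel value (0..1023) down to 8 bits
// (0..255): add a position-dependent offset covering the two bits that get
// truncated, then shift. Purely illustrative, not necessarily how real
// display hardware does it.
uint8_t dither_10_to_8(uint32_t v10, int x, int y)
{
    static const uint32_t bayer2x2[2][2] = { { 0, 2 },
                                             { 3, 1 } };
    uint32_t v = (v10 + bayer2x2[y & 1][x & 1]) >> 2;         // truncate 10 -> 8 bits
    return static_cast<uint8_t>(std::min<uint32_t>(v, 255));  // clamp overflow at white
}
[/code]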