nV News Forums (http://www.nvnews.net/vbulletin/index.php)
-   NVIDIA Linux (http://www.nvnews.net/vbulletin/forumdisplay.php?f=14)
-   -   Wrongly rejected interlaced modes from EDID? (http://www.nvnews.net/vbulletin/showthread.php?t=83670)

sandeen 01-04-07 11:41 PM

Wrongly rejected interlaced modes from EDID?
 
log snippet:

Code:

(--) NVIDIA(0): Connected display device(s) on GeForce FX 5200 at PCI:1:0:0:
(--) NVIDIA(0):    TSB TOSHIBA-TV (DFP-0)
(--) NVIDIA(0): TSB TOSHIBA-TV (DFP-0): 135.0 MHz maximum pixel clock
(--) NVIDIA(0): TSB TOSHIBA-TV (DFP-0): Internal Single Link TMDS
(--) NVIDIA(0): TSB TOSHIBA-TV (DFP-0): Native FlatPanel Scaling is supported
(--) NVIDIA(0): TSB TOSHIBA-TV (DFP-0): DFP modes are limited to 60 Hz refresh
(--) NVIDIA(0):    rate
(--) NVIDIA(0):
(--) NVIDIA(0): --- EDID for TSB TOSHIBA-TV (DFP-0) ---
(--) NVIDIA(0): EDID Version                : 1.3
(--) NVIDIA(0): Manufacturer                : TSB
(--) NVIDIA(0): Monitor Name                : TSB TOSHIBA-TV
(--) NVIDIA(0): Product ID                  : 515
(--) NVIDIA(0): 32-bit Serial Number        : 0
(--) NVIDIA(0): Serial Number String        :
(--) NVIDIA(0): Manufacture Date            : 2006, week 0
(--) NVIDIA(0): DPMS Capabilities            :
(--) NVIDIA(0): Prefer first detailed timing : Yes
(--) NVIDIA(0): Supports GTF                : No
(--) NVIDIA(0): Maximum Image Size          : 1960mm x 1420mm
(--) NVIDIA(0): Valid HSync Range            : 15.0 kHz - 46.0 kHz
(--) NVIDIA(0): Valid VRefresh Range        : 59 Hz - 61 Hz
(--) NVIDIA(0): EDID maximum pixel clock    : 80.0 MHz
...
(II) NVIDIA(0): Frequency information for TSB TOSHIBA-TV (DFP-0):
(II) NVIDIA(0):  HorizSync  : 15.000-46.000 kHz
(II) NVIDIA(0):  VertRefresh : 59.000-61.000 Hz
(II) NVIDIA(0):    (HorizSync from EDID)
(II) NVIDIA(0):    (VertRefresh from EDID)
(II) NVIDIA(0):
(II) NVIDIA(0): --- Building ModePool for TSB TOSHIBA-TV (DFP-0) ---
(II) NVIDIA(0):  Validating Mode "1920x1080":
(II) NVIDIA(0):    1920 x 540 @ 60 Hz
(II) NVIDIA(0):    For use as DFP backend.
(II) NVIDIA(0):    Mode Source: EDID
(II) NVIDIA(0):      Pixel Clock      : 74.25 MHz
(II) NVIDIA(0):      HRes, HSyncStart : 1920, 2008
(II) NVIDIA(0):      HSyncEnd, HTotal : 2052, 2200
(II) NVIDIA(0):      VRes, VSyncStart :  540,  542
(II) NVIDIA(0):      VSyncEnd, VTotal :  547,  562
(II) NVIDIA(0):      H/V Polarity    : +/+
(II) NVIDIA(0):      Extra            : Interlace
(WW) NVIDIA(0):    Mode is rejected: VertRefresh (120.1 Hz) out of range
(WW) NVIDIA(0):    (59.000-61.000 Hz).

But the thing is, this is an -interlaced- mode. The actual vertical refresh should be 60.05 Hz by my calculations. Only 1080p (a full frame at a time) would require 120.1 Hz, but this is 1080i. Is this a bug in the nvidia driver's calculations? Rejecting EDID-reported modes as invalid based on EDID-reported thresholds seems like a hint that something has gone wrong here... :)
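
For reference, here's the back-of-the-envelope arithmetic I'm doing (a quick Python check, using the timing numbers from the log above):

Code:

# Timings of the EDID-sourced 1920x1080 interlaced mode above
pixel_clock_hz = 74.25e6
htotal = 2200
vtotal = 562   # per field; a full interlaced frame is two fields

field_rate = pixel_clock_hz / (htotal * vtotal)
print(f"field rate : {field_rate:.2f} Hz")      # ~60.05 Hz, what I expect for 1080i
print(f"doubled    : {2 * field_rate:.2f} Hz")  # ~120.1 Hz, the figure the driver rejects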

sandeen 01-04-07 11:45 PM

Re: Wrongly rejected interlaced modes from EDID?
 
1 Attachment(s)
Oh, this is with the 1.0-9746 driver, btw; bug-report info is attached.

sandeen 01-05-07 12:03 AM

Re: Wrongly rejected interlaced modes from EDID?
 
Maybe I'm confused about how interlacing affects vertical refresh... but here's another oddity. I put the EDID-interpreted timings in as a modeline, and the driver seems to treat it differently even though the values are the same...

Modeline:

Code:

Modeline "1920x1080" 74.25 1920 2008 2052 2200 540 542 547 562 +hsync +vsync Interlace
Code:

(II) NVIDIA(0):  Validating Mode "1920x1080":
(II) NVIDIA(0):    1920 x 540 @ 60 Hz
(II) NVIDIA(0):    Mode Source: X Configuration file ModeLine
(II) NVIDIA(0):      Pixel Clock      : 74.25 MHz
(II) NVIDIA(0):      HRes, HSyncStart : 1920, 2008
(II) NVIDIA(0):      HSyncEnd, HTotal : 2052, 2200
(II) NVIDIA(0):      VRes, VSyncStart :  540,  542
(II) NVIDIA(0):      VSyncEnd, VTotal :  547,  562
(II) NVIDIA(0):      H/V Polarity    : +/+
(II) NVIDIA(0):      Extra            : Interlace
(WW) NVIDIA(0):    Mode is rejected: HorizSync (33.8 kHz) out of range
(WW) NVIDIA(0):    (28.000-33.000 kHz).

EDID:
Code:

(II) NVIDIA(0):  Validating Mode "1920x1080":
(II) NVIDIA(0):    1920 x 540 @ 60 Hz
(II) NVIDIA(0):    Mode Source: EDID
(II) NVIDIA(0):      Pixel Clock      : 74.25 MHz
(II) NVIDIA(0):      HRes, HSyncStart : 1920, 2008
(II) NVIDIA(0):      HSyncEnd, HTotal : 2052, 2200
(II) NVIDIA(0):      VRes, VSyncStart :  540,  542
(II) NVIDIA(0):      VSyncEnd, VTotal :  547,  562
(II) NVIDIA(0):      H/V Polarity    : +/+
(II) NVIDIA(0):      Extra            : Interlace
(WW) NVIDIA(0):    Mode is rejected: VertRefresh (120.1 Hz) out of range
(WW) NVIDIA(0):    (59.000-61.000 Hz).

At least the parsing/validation is going differently, because it chokes for different reasons despite identical values... what's up with this?

netllama 01-05-07 10:38 AM

Re: Wrongly rejected interlaced modes from EDID?
 
The bug report that you attached doesn't contain any of the X log output that you quoted above. Please provide a bug report which includes the verbose output.

Thanks,
Lonni

sandeen 01-05-07 10:52 AM

Re: Wrongly rejected interlaced modes from EDID?
 
1 Attachment(s)
Sorry about that; here you go.

On a possibly related note (related to rejected modelines): how is the native resolution of the DFP determined - does it come directly from the EDID, or is it calculated from other information?

Thanks,
-Eric

netllama 01-05-07 11:15 AM

Re: Wrongly rejected interlaced modes from EDID?
 
The only defined 1920x1080 mode in the EDID is:
(--) NVIDIA(0): 1920 x 1080 @ 60 Hz
(--) NVIDIA(0): Pixel Clock : 74.18 MHz
(--) NVIDIA(0): HRes, HSyncStart : 1920, 2008
(--) NVIDIA(0): HSyncEnd, HTotal : 2052, 2200
(--) NVIDIA(0): VRes, VSyncStart : 540, 542
(--) NVIDIA(0): VSyncEnd, VTotal : 547, 562
(--) NVIDIA(0): H/V Polarity : +/+
(--) NVIDIA(0): Extra : Interlaced
(--) NVIDIA(0): CEA Format : 5

However, it's actually 1920x540 since it's interlaced. That mode is failing validation:
(II) NVIDIA(0): Validating Mode "1920x1080":
(II) NVIDIA(0): 1920 x 540 @ 60 Hz
(II) NVIDIA(0): Mode Source: EDID
(II) NVIDIA(0): Pixel Clock : 74.25 MHz
(II) NVIDIA(0): HRes, HSyncStart : 1920, 2008
(II) NVIDIA(0): HSyncEnd, HTotal : 2052, 2200
(II) NVIDIA(0): VRes, VSyncStart : 540, 542
(II) NVIDIA(0): VSyncEnd, VTotal : 547, 562
(II) NVIDIA(0): H/V Polarity : +/+
(II) NVIDIA(0): Extra : Interlace
(WW) NVIDIA(0): Mode is rejected: VertRefresh (120.1 Hz) out of range
(WW) NVIDIA(0): (59.000-61.000 Hz).

You can attempt to work around that by adding the following to the Device section of xorg.conf:
Option "ModeValidation" "NoVertRefreshCheck"

However, you're not requesting any modes in your xorg.conf, so if you want 1920x1080, you'll need to explicitly request it.
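
Roughly like this (an untested sketch; "Videocard0", "Screen0" and "Monitor0" are placeholder identifiers, so merge these lines into your existing Device and Screen sections rather than copying them verbatim):

Code:

Section "Device"
    Identifier  "Videocard0"
    Driver      "nvidia"
    Option      "ModeValidation" "NoVertRefreshCheck"
EndSection

Section "Screen"
    Identifier   "Screen0"
    Device       "Videocard0"
    Monitor      "Monitor0"
    DefaultDepth 24
    SubSection "Display"
        Depth    24
        Modes    "1920x1080"
    EndSubSection
EndSection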

The native resolution can only be determined from the EDID.

sandeen 01-05-07 11:39 AM

Re: Wrongly rejected interlaced modes from EDID?
 
Quote:

However, it's actually 1920x540 since it's interlaced. That mode is failing validation:
(II) NVIDIA(0): Validating Mode "1920x1080":
(II) NVIDIA(0): 1920 x 540 @ 60 Hz
(II) NVIDIA(0): Mode Source: EDID
(II) NVIDIA(0): Pixel Clock : 74.25 MHz
(II) NVIDIA(0): HRes, HSyncStart : 1920, 2008
(II) NVIDIA(0): HSyncEnd, HTotal : 2052, 2200
(II) NVIDIA(0): VRes, VSyncStart : 540, 542
(II) NVIDIA(0): VSyncEnd, VTotal : 547, 562
(II) NVIDIA(0): H/V Polarity : +/+
(II) NVIDIA(0): Extra : Interlace
(WW) NVIDIA(0): Mode is rejected: VertRefresh (120.1 Hz) out of range
(WW) NVIDIA(0): (59.000-61.000 Hz).
And is 120.1 Hz really the right vertical refresh for that mode? In other words, does this mean that the EDID is inconsistent, reporting modes that do not fit within the device's stated parameters, or is the calculation of 120.1 Hz for that mode incorrect?

Quote:

However, you're not requesting any modes in your xorg.conf, so if you want 1920x1080, you'll need to explicitly request it.
Yes, I understand.

Quote:

The native resolution can only be determined from the EDID.
But is that native res explicitly communicated via EDID, or is it calculated from other values?

Any comment on the apparently different parsing of modes from ModeLines vs. modes from EDID?

Thanks,
-Eric

netllama 01-05-07 11:46 AM

Re: Wrongly rejected interlaced modes from EDID?
 
120.1 Hz is the VertRefresh defined in the EDID for that mode.

pe1chl 01-05-07 12:40 PM

Re: Wrongly rejected interlaced modes from EDID?
 
There is quite a detailed article about EDID on Wikipedia. It answers your question about the native resolution (it is in the EDID, but in a convoluted way, and it seems unwise to rely on it for checks and calculations; treat it as a hint only).

Many values are stored in the EDID in a minimal number of bits, using tricky formulas.
This easily leads to mistakes.

The real vertical refresh is PixelRate / (HTotal * VTotal).
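
Applied to the mode in question (a quick Python sketch; the HSync formula, PixelRate / HTotal, is just the standard companion calculation):

Code:

def mode_rates(pclk_mhz, htotal, vtotal):
    """Return (horizontal sync in kHz, vertical refresh in Hz) for a set of mode timings."""
    hsync_khz = pclk_mhz * 1000.0 / htotal
    vrefresh_hz = pclk_mhz * 1e6 / (htotal * vtotal)
    return hsync_khz, vrefresh_hz

# The 1920x1080 interlaced timing from the logs: 74.25 MHz, HTotal 2200, VTotal 562
hsync, vrefresh = mode_rates(74.25, 2200, 562)
print(f"HSync    ~ {hsync:.2f} kHz")   # ~33.75 kHz, matching the 33.8 kHz in the HorizSync rejection
print(f"VRefresh ~ {vrefresh:.2f} Hz") # ~60.05 Hz per field, not the 120.1 Hz being rejected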

rgifford 01-26-07 02:00 PM

Re: Wrongly rejected interlaced modes from EDID?
 
1 Attachment(s)
Eric,
Looking at your EDID information, we have the same TV model and the same problem. I recently upgraded my MythTV hardware and everything else in the process.

The good news is that I did have 1080i working for this display under the 1.0-8686 32 bit driver.

I have attached my config file. I got it to work by turning off the vertical refresh rate check (Option "ModeValidation" "NoVertRefreshCheck").

Doing the same thing under 1.0-9746, or any later driver that I tested, causes the display to be cropped in half vertically. As you stated, it is clearly not being treated as an interlaced display.

Furthermore, prior to my recent upgrade, the same hardware worked fine under Windows XP. I do not see this as a hardware issue, but as a driver bug.

-Rob

sandeen 02-04-07 10:47 PM

Re: Wrongly rejected interlaced modes from EDID?
 
Quote:

The real vertical refresh is PixelRate / (HTotal * VTotal).
So from this bit of the EDID:

Code:

(--) NVIDIA(0):  1920 x 1080 @ 60 Hz
(--) NVIDIA(0):    Pixel Clock      : 74.25 MHz
(--) NVIDIA(0):    HRes, HSyncStart : 1920, 2008
(--) NVIDIA(0):    HSyncEnd, HTotal : 2052, 2200
(--) NVIDIA(0):    VRes, VSyncStart : 540, 542
(--) NVIDIA(0):    VSyncEnd, VTotal : 547, 562
(--) NVIDIA(0):    H/V Polarity    : +/+
(--) NVIDIA(0):    Extra            : Interlaced

we have 74,250,000 / (2200 * 562) = 60.05 Hz - and this is what the mode says too.

But, the nvidia driver says, for this mode:

Code:

(II) NVIDIA(0):    Mode Source: EDID
(II) NVIDIA(0):      Pixel Clock      : 74.25 MHz
(II) NVIDIA(0):      HRes, HSyncStart : 1920, 2008
(II) NVIDIA(0):      HSyncEnd, HTotal : 2052, 2200
(II) NVIDIA(0):      VRes, VSyncStart :  540,  542
(II) NVIDIA(0):      VSyncEnd, VTotal :  547,  562
(II) NVIDIA(0):      H/V Polarity    : +/+
(II) NVIDIA(0):      Extra            : Interlace
(WW) NVIDIA(0):    Mode is rejected: VertRefresh (120.1 Hz) out of range
(WW) NVIDIA(0):    (59.000-61.000 Hz).

So what's going on here? Why is it reporting 120.1 Hz?

I could try overriding the checks but I think the nvidia driver is going astray here.

pe1chl 02-05-07 03:36 AM

Re: Wrongly rejected interlaced modes from EDID?
 
Quote:

Originally Posted by sandeen
So what's going on here? Why is it reporting 120.1 Hz?

One possibility is that the EDID is wrong. As I said before, the EDID format makes the classic mistake (quite common in the early days of the PC) of using the minimum number of bits to specify information. They apparently wanted to keep the data block very small.

In particular, the vertical refresh rate is stored in a 6-bit field, and to give it a useful range the value specified in the EDID is the actual refresh rate minus 60. So, when the refresh rate is 75, the value 15 is stored; for 60 Hz it would be zero (see the decoding sketch further down).
Maybe the display manufacturer forgot to subtract 60 and put the value 60 in the EDID, and NVIDIA adds 60 (which is correct), with 120 as the result.

One can easily see the stupidity of this EDID design:
1. the value 50 Hz, a common TV refresh rate in Europe, cannot be specified;
2. the maximum refresh rate is 60 + 63 = 123 Hz, which means that very fast refresh rates cannot be specified either;
3. such a formula always leads to confusion and error.

Using 2 more bits would have given a 0-255 Hz range, with the possibility of using 0 to mean "another value stored elsewhere".

Similar stupidity is seen with the "native resolution" fields. One would assume that two values are given, the horizontal and the vertical resolution, each with a reasonable range (say, 16 bits for 0-65535).
But NO. They chose to use only one byte! You have to multiply the value by 8 and then add 248 to get the horizontal size (yielding a range of 248 to 2288 in increments of 8).
The vertical size is not given at all! There is a 2-bit code for the aspect ratio, which can be 4:3, 5:4, 16:9 or 16:10, and you are supposed to calculate the vertical size from the horizontal size and the aspect ratio.

UNBELIEVABLE!!!!

This means that common TV display sizes like 1366 x 768 cannot even be represented. So manufacturers have to use other values that come close, and 1:1 pixel mapping becomes a nightmare.
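
To make this concrete, here is a rough decoder for one "standard timing" byte pair, based on my reading of the EDID 1.3 description (an illustration only, not a reference implementation):

Code:

# Rough decode of one EDID 1.3 "standard timing" entry (two bytes).
# The aspect-ratio codes follow my reading of the 1.3 spec; earlier EDID versions differ.
ASPECT = {0b00: (16, 10), 0b01: (4, 3), 0b10: (5, 4), 0b11: (16, 9)}

def decode_standard_timing(byte1, byte2):
    hres = (byte1 + 31) * 8           # i.e. byte1 * 8 + 248; only multiples of 8 can be expressed
    num, den = ASPECT[byte2 >> 6]     # 2-bit aspect-ratio code
    vres = hres * den // num          # the vertical size is never stored, only implied
    refresh = (byte2 & 0x3F) + 60     # 6-bit field, stored as refresh rate minus 60
    return hres, vres, refresh

# 1920x1080 @ 60 Hz fits: byte1 = 1920/8 - 31 = 209, byte2 = aspect 16:9 with (60 - 60)
print(decode_standard_timing(209, (0b11 << 6) | 0))   # -> (1920, 1080, 60)

# A 1366-pixel-wide mode does not fit: (1366 - 248) / 8 = 139.75, so only 1360 or 1368 are encodable
print((1366 - 248) / 8)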

What I find even worse is that NVIDIA has based its parameter validity checks on the output of this EDID query. The wacky data returned from the display, with all its limitations and errors, is trusted more than what the administrator writes in xorg.conf :-(
So, when a manufacturer decides that the best way to represent its 1366x768 display is to return the reasonably common value 1280x768, the result is that all your modelines get rejected because they are "invalid". Not good, IMHO.
Furthermore, modes are also rejected because of internal inconsistencies in the EDID, as is happening in your case. The PixelClock, HTotal and VTotal are all OK and completely specify a correct mode, but the for-information-only VertRefresh value is wrong, and now the entire mode is rejected. WHY???

