02-05-07, 03:36 AM   #12
pe1chl
Re: Wrongly rejected interlaced modes from EDID?

Quote:
Originally Posted by sandeen
So what's going on here? Why is it reporting 120.1 Hz?
One possibility is that the EDID is wrong. As I said before, the EDID protocol makes the classic mistake (quite common in the early days of the PC) of using the minimum number of bits to specify each piece of information. The designers apparently wanted to keep the data block very small.

In particular, the vertical refresh rate is stored in a 6-bit field, and to give it a useful range the value specified in the EDID is the actual refresh rate minus 60. So when the refresh rate is 75 Hz, the value 15 is stored; for 60 Hz it would be zero.
Maybe the screen manufacturer forgot to subtract 60 and put the value 60 in the EDID; NVIDIA then adds 60 (which is correct per the spec), and 120 is the result.
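
To illustrate the suspected bug, here is a minimal C sketch of the decoding (the function name is mine, not from any real driver; the field layout is as described above):

Code:
#include <stdio.h>
#include <stdint.h>

/* Sketch (my own code): decode the 6-bit refresh field of an EDID
 * standard timing. The spec stores the refresh rate minus 60 in
 * the low 6 bits of the descriptor's second byte. */
static int decode_refresh(uint8_t byte2)
{
    return (byte2 & 0x3F) + 60;
}

int main(void)
{
    printf("stored  0 -> %d Hz\n", decode_refresh(0));   /* correct encoding of 60 Hz */
    printf("stored 60 -> %d Hz\n", decode_refresh(60));  /* buggy encoding -> 120 Hz  */
    return 0;
}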

One can easily see the stupidity of this EDID design:
1. the value 50 Hz, a common TV refresh rate in Europe, cannot be specified.
2. the maximum refresh rate is 60 + 63 = 123 Hz, so very fast refresh rates cannot be specified either.
3. such a formula always leads to confusion and error.

Using 2 more bits would have given a 0-255 Hz range, with the possibility of using 0 to mean "another value stored elsewhere".

Similar stupidity is seen with the "native resolution" fields. One would expect two values to be given, the horizontal and the vertical resolution, each with a reasonable range (say, 16 bits for 0-65535).
But NO. They chose to use only one byte! You have to multiply the value by 8 and then add 248 to get the horizontal size (yielding a range of 256 to 2288 in steps of 8, since the byte value 0 is reserved).
The vertical size is not given at all! There is a 2-bit code for the aspect ratio, which can be 4:3, 5:4, 16:9 or 16:10, and you are supposed to calculate the vertical size from the horizontal size and the aspect ratio.

UNBELIEVABLE!!!!
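
For the curious, here is my own sketch of how one of these 2-byte standard timings decodes (assuming the EDID 1.3 aspect ratio codes):

Code:
#include <stdio.h>
#include <stdint.h>

/* My own sketch of decoding one 2-byte EDID 1.3 standard timing.
 * Byte 1: horizontal pixels / 8 - 31. Byte 2: bits 7-6 = aspect
 * ratio code, bits 5-0 = refresh rate - 60. */
static void decode_standard_timing(uint8_t b1, uint8_t b2)
{
    int h = ((int)b1 + 31) * 8;         /* same as b1 * 8 + 248 */
    int v;
    switch (b2 >> 6) {
    case 0:  v = h * 10 / 16; break;    /* 16:10 */
    case 1:  v = h * 3 / 4;   break;    /* 4:3   */
    case 2:  v = h * 4 / 5;   break;    /* 5:4   */
    default: v = h * 9 / 16;  break;    /* 16:9  */
    }
    printf("%dx%d @ %d Hz\n", h, v, (b2 & 0x3F) + 60);
}

int main(void)
{
    decode_standard_timing(0xD1, 0xC0);  /* prints: 1920x1080 @ 60 Hz */
    return 0;
}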

This means that common TV panel sizes like 1366x768 cannot even be represented, because 1366 is not a multiple of 8. So manufacturers have to use other values that come close, and 1:1 pixel mapping becomes a nightmare.
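
A quick check (my own helper) shows why:

Code:
#include <stdio.h>

/* My own helper: a horizontal size fits the standard timing format
 * only if it is a multiple of 8 in the 256..2288 range. */
static int fits(int h)
{
    return h % 8 == 0 && h >= 256 && h <= 2288;
}

int main(void)
{
    printf("1366: %s\n", fits(1366) ? "yes" : "no");  /* no: 1366 = 8*170 + 6       */
    printf("1360: %s\n", fits(1360) ? "yes" : "no");  /* yes: nearest multiple of 8 */
    return 0;
}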

What I find even worse is that NVIDIA has based its parameter validity checks on the output of this EDID query. The wacky data returned from the display, with all its limitations and errors, is trusted more than what the administrator writes in xorg.conf :-(
So, when a manufacturer decides that the best way to represent his 1366x768 display is to return a reasonably common value of 1280x768, the result is that all your modelines get rejected because they are "invalid". Not good, IMHO.
Furthermore, they also reject modes over internal inconsistencies in the EDID, as is happening in your case. The PixelClock, Htotal and Vtotal are all OK and completely specify a correct mode, but the for-information-only VertRefresh value is wrong, and now they reject the entire mode. WHY???
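
To make that concrete: the real refresh rate follows directly from those three numbers, so the stored VertRefresh value is redundant. A sketch with illustrative numbers (not taken from your actual EDID or log):

Code:
#include <stdio.h>

int main(void)
{
    /* Illustrative numbers (a standard 1920x1080 @ 60 Hz timing). */
    double pixclk = 148500000.0;       /* PixelClock: 148.5 MHz     */
    int htotal = 2200, vtotal = 1125;  /* totals including blanking */

    /* VertRefresh = PixelClock / (Htotal * Vtotal) */
    printf("VertRefresh = %.2f Hz\n", pixclk / (htotal * (double)vtotal));
    return 0;
}

The driver could simply compute this itself instead of trusting a stored value it knows can be bogus.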