nV News Forums


nV News Forums (http://www.nvnews.net/vbulletin/index.php)
-   NVIDIA Linux (http://www.nvnews.net/vbulletin/forumdisplay.php?f=14)
-   -   Request for new Sticky about resolution issue! (http://www.nvnews.net/vbulletin/showthread.php?t=96006)

ThinkTiM 08-03-07 01:02 PM

Request for new Sticky about resolution issue!

I spent quite a while researching a problem with the new nvidia driver (100.14.11) where it would not utilize my LCD monitor's native screen resolution.

It turns out that the issue is resolved by adding the following line into my xorg.conf:

Option "ModeValidation" "NoMaxPClkCheck"
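(For anyone unsure where this line goes: it belongs in the Device section of xorg.conf. A minimal sketch — the Identifier and BusID are placeholders; keep whatever your existing config already uses:)

```
Section "Device"
    Identifier "Videocard0"     # use your existing identifier
    Driver     "nvidia"
    Option     "ModeValidation" "NoMaxPClkCheck"
EndSection
```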

While researching my problem, I noticed that there are a LOT of people having this problem.

Would it be possible to create a sticky on this forum so that people are more aware that the default behavior of the driver has changed?

Paul Sorensen

Jaws_2 11-02-07 12:09 PM

Re: Request for new Sticky about resolution issue!
Hey Paul,

Here's a big Thank you!

I just spent a day and a half trying to figure out why I couldn't get my new LCD monitor, when connected with the DVI cable, to work with its native resolution of 1680x1050.

After searching, hacking and borking xorg.conf to no end, I happened on the post where you linked to this one.

At first I was stuck with a 1280x1024 resolution which, in nVidia's configuration settings, was listed as the native resolution for this monitor.

I was able to sneak up to the proper resolution by adding this line to the Device Section...

Option "ModeValidation" "NoDFPNativeResolutionCheck"

which at least got me to 1440x900. Close but no cigar, and the nVidia settings were still listing 1280x1024 as the monitor's native resolution.

Finally your suggested...

Option "ModeValidation" "NoMaxPClkCheck"

got me to 1680x1050, and now the nVidia configuration settings also list 1680x1050 as the monitor's native resolution.

BTW, my card is an FX5500, and all the modelines generated by gtf were left as they were when I initially set up the nVidia drivers (100.14.19); I didn't mess with any EDID options. Even with the 147.14 pixel clock settings for 1680x1050@60.

Therefore, I second the request for a sticky.


zander 11-03-07 03:08 PM

Re: Request for new Sticky about resolution issue!
Please note that by overriding the NVIDIA X driver's mode validation (or parts of it), e.g. by ignoring the maximum pixel clock, you're defeating capability checks. The resulting configuration may work for you, but overriding sanity checks, especially against hardware capabilities such as the maximum pixel clock, is not recommended. If you believe that the NVIDIA X driver is in error and mis-detects your hardware's capabilities, please post a bug report (see http://www.nvnews.net/vbulletin/showthread.php?t=46678).

Jaws_2 11-03-07 06:23 PM

Re: Request for new Sticky about resolution issue!

I would, but being a Linux noob, I have absolutely no idea what 3a) is asking me for or how to do it. I understand that there are a multitude of configurations out there, but I'm sorry: these kinds of basic problems, where a monitor and video card can't communicate a native resolution, shouldn't exist. I do appreciate your concern and the time you put into the forum.


pe1chl 11-04-07 05:10 AM

Re: Request for new Sticky about resolution issue!

Originally Posted by Jaws_2
I'm sorry, these kinds of basic problems where a monitor and video card can't communicate a native resolution shouldn't exist.

I think there are two issues mentioned in the second article in this thread:

1. the driver misdetects the native resolution. This is the one that can be defeated by the "NoDFPNativeResolutionCheck" parameter.
As mentioned before, it looks like the driver makes an invalid assumption about the EDID data returned by the monitor: that the native resolution is the first one in the returned list of resolutions. The VESA document about EDID data clearly states that this assumption should NOT be made.
What happens in this case is that the driver selects a smaller resolution and rejects the proper native resolution as invalid because it is larger than this wrongly determined native resolution.
I think this is a bug. But nvidia seems to hold the opinion it is a bug in the monitor, so we seem to be stuck with it.

2. the driver determines that the native resolution of the monitor requires a higher clock than the maximum allowed value, and rejects the resolution based on that check. This can be defeated by the "NoMaxPClkCheck" parameter.
This is not really a bug, just a limitation of your videocard: the card is too limited (too old) to support the resolution of the shiny new panel you bought. It is a problem with your configuration, not a driver defect.

So while it may be worthwhile to list "solutions" like the one in the first article in this thread somewhere, it is not appropriate to call them bugs or miscommunications.
But, the other issue (the problem solved by "NoDFPNativeResolutionCheck") deserves that description.
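(If both checks turn out to be involved, the driver README, if I read it correctly, describes "ModeValidation" as taking a comma-separated list of tokens, so the two overrides can be combined on one line — with the same caveats zander raised above:)

```
Option "ModeValidation" "NoDFPNativeResolutionCheck, NoMaxPClkCheck"
```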

Jaws_2 11-04-07 08:20 AM

Re: Request for new Sticky about resolution issue!
Interesting info, Pe1chl, thanks.

So I assume from what you said that my video card is the problem and there is NO resolution (no pun intended) to my problem other than to upgrade it?

That assumption, if correct, begs the question: why does the card work now, when bypassing the check with the "NoMaxPClkCheck" parameter, if the card is too old?

And why (I'm not smart enough to understand the intricacies of digital vs analog output) does it work at its native resolution when connecting analog to analog?

Just to note, I did try modelines with lower pixel clock speeds at the native resolution, but they were rejected. As a matter of fact, all the modelines were rejected at first, according to the xorg log file. Only after adding the "NoDFPNativeResolutionCheck" parameter was I able to get at least the 1440x900 resolution.


pe1chl 11-04-07 09:24 AM

Re: Request for new Sticky about resolution issue!
The reason it works over analog is that analog output can handle much higher pixel rates.
With analog output, each pixel is sent as 3 analog values of 8 bits each (0-255), and the card can do that at about 350 MHz.
For digital output, the 3 8-bit values for each pixel are serialized, so even with a much higher serial clock the card can actually output only about 135 MHz at the pixel level (which is 24 times more at the bit level).

The maximum bitrate is a "soft" limit. Your card's specs may say the maximum is 135 MHz, but your actual card's performance could be a bit better than that.
So turning off the check can make it work even though the guaranteed performance of the card would not allow it.

Newer cards are guaranteed to work up to 155 MHz for single link DVI.
To solve the problem for larger resolutions, some cards have dual link DVI, which means the card can output two pixels at the same time over two sets of wires, effectively doubling the pixel rate.
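The arithmetic behind these limits is easy to check: a mode's required pixel clock is just the total pixels per frame (including blanking) times the refresh rate. A small sketch — the 2240x1089 totals are the standard CVT timing for 1680x1050@60, and 1840x1080 is its reduced-blanking variant:

```python
# Required pixel clock for a mode: total frame size (with blanking) x refresh.
def required_pclk_mhz(htotal, vtotal, refresh_hz):
    return htotal * vtotal * refresh_hz / 1e6

# Standard CVT timing for 1680x1050@60 uses 2240x1089 total pixels.
full_blanking = required_pclk_mhz(2240, 1089, 60)     # ~146.4 MHz

# CVT reduced blanking shrinks the totals to 1840x1080.
reduced_blanking = required_pclk_mhz(1840, 1080, 60)  # ~119.2 MHz

# A 135 MHz TMDS encoder rejects the first mode but can drive the second.
print(round(full_blanking, 1), round(reduced_blanking, 1))
```

This is why the same panel that fails over single link DVI on an older card can work fine over analog (~350 MHz budget) or with a reduced-blanking modeline.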

mc42 11-22-07 11:49 AM

Re: Request for new Sticky about resolution issue!
After some googling I discovered that all our GeForce FX cards (5200, etc.) have a broken TMDS encoder that works below the DVI specification.
(see: http://en.wikipedia.org/wiki/Digital_Visual_Interface
"The DVI specification mandates a fixed single link maximum pixel clock frequency of 165 MHz")

The driver detects the lower maximum frequency and blocks the right timing for the monitor, i.e.:

#(--) NVIDIA(0): Detailed Timings:
#(--) NVIDIA(0):  1680 x 1050 @ 60 Hz
#(--) NVIDIA(0):    Pixel Clock      : 146.25 MHz
#(--) NVIDIA(0):    HRes, HSyncStart : 1680, 1784
#(--) NVIDIA(0):    HSyncEnd, HTotal : 1960, 2240
#(--) NVIDIA(0):    VRes, VSyncStart : 1050, 1053
#(--) NVIDIA(0):    VSyncEnd, VTotal : 1059, 1089
#(--) NVIDIA(0):    H/V Polarity    : -/+

The best solution I found for this is to use a reduced-blanking modeline (119 MHz).

In the Section "Monitor":
Option "ExactModeTimingsDVI" "true"
Modeline "1680x1050rb" 119.00 1680 1728 1760 1840 1050 1053 1059 1080 -hsync +vsync

In the Section "Device":
Option "ModeValidation" "NoDFPNativeResolutionCheck"

and in the SubSection "Display":
Modes "1680x1050rb"
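(Assembled in one place, the pieces above amount to something like the following xorg.conf fragment — the Identifiers are placeholders; keep the ones your config already uses:)

```
Section "Monitor"
    Identifier "DFP0"
    Option     "ExactModeTimingsDVI" "true"
    Modeline   "1680x1050rb" 119.00 1680 1728 1760 1840 1050 1053 1059 1080 -hsync +vsync
EndSection

Section "Device"
    Identifier "Videocard0"
    Driver     "nvidia"
    Option     "ModeValidation" "NoDFPNativeResolutionCheck"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Videocard0"
    Monitor    "DFP0"
    SubSection "Display"
        Modes  "1680x1050rb"
    EndSubSection
EndSection
```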

If that for some reason doesn't work, you can try the

Option "ModeValidation" "NoMaxPClkCheck"

but as zander wrote, this may kill the TMDS encoder on the card a bit sooner than the new 3D AMD open source drivers arrive ;)

energyman76b 11-22-07 01:20 PM

Re: Request for new Sticky about resolution issue!
and before you start overriding, maybe removing modes & modelines is the answer?

mc42 11-22-07 02:38 PM

Re: Request for new Sticky about resolution issue!
There is absolutely nothing to remove. As I wrote, the problem lies
in the crappy TMDS encoders used on FX cards.

A standard 1680x1050 resolution needs a pixel clock of 146.25 MHz, and that is more than the encoder was designed for. This modeline uses a supported 119 MHz clock. New TFT monitors don't, in principle, need any "blanking time", so reduced blanking is perfectly sane. The nvidia driver doesn't grok this, so you must override with ExactModeTimingsDVI and NoDFPNativeResolutionCheck.

All times are GMT -5. The time now is 10:51 PM.

Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.