Old 03-15-07, 04:20 AM   #1
wernerf
Registered User
 
Join Date: Mar 2007
Location: Nuernberg, Germany
Posts: 6
Default FX 5200 DVI: No valid modes for "1680x1050"; removing.

Hi,
My new LCD monitor LG L204WT does not work with 1680x1050 over DVI. The driver output is:
No valid modes for "1680x1050"; removing.

The first working mode is 1280x1024.
After "startx -- -loglevel 6" I found in /var/log/Xorg.0.log:

(II) NVIDIA(0): Validating Mode "1680x1050":
(II) NVIDIA(0): 1680 x 1050 @ 60 Hz
(II) NVIDIA(0): For use as DFP backend.
(II) NVIDIA(0): Mode Source: EDID
(II) NVIDIA(0): Pixel Clock : 146.25 MHz
(II) NVIDIA(0): HRes, HSyncStart : 1680, 1784
(II) NVIDIA(0): HSyncEnd, HTotal : 1960, 2240
(II) NVIDIA(0): VRes, VSyncStart : 1050, 1053
(II) NVIDIA(0): VSyncEnd, VTotal : 1059, 1089
(II) NVIDIA(0): H/V Polarity : -/+
(WW) NVIDIA(0): Mode is rejected: PixelClock (146.2 MHz) too high for
(WW) NVIDIA(0): Display Device (Max: 135.0 MHz).

I also tested some options found in internet threads:

Option "UseEDIDFreqs" "FALSE"
Option "UseEDIDDpi" "FALSE"
or
Option "ExactModeTimingsDVI" "true"

but without success. Mostly I just got a lower resolution (800x600?).
I have not tried 'Option "UseEDID" "FALSE"' yet because I don't know the
consequences.
Modelines seem to be ignored by the driver (?)

opensuse 10.2
nvidia driver 1.0.9631
NVIDIA GPU GeForce FX 5200
Monitor: LG L204WT

Currently I use the open-source "nv" driver, which works with 1680x1050 but
does not support 3D. And I want to use Google Earth and ...

Any hints for me?

I attached Xorg.0.log from a run of "startx -- -logverbose 6".

Thanks, Werner
Attached Files
File Type: gz Xorg.0.log.gz (12.7 KB)
Old 03-15-07, 05:16 AM   #2
PhilippeP
Registered User
 
Join Date: Nov 2005
Posts: 9
Default Re: FX 5200 DVI: No valid modes for "1680x1050"; removing.

You have to add a modeline entry in your xorg.conf. You can generate one with one of the modeline generators found on the web (based on your hardware specs):

http://koala.ilog.fr/cgi-bin/nph-colas-modelines
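
For example, something like this (an untested sketch; the identifiers must match your own xorg.conf, and the Modeline numbers are just sample generator output for 1680x1050 at 55 Hz, so use the values for your own hardware):

Section "Monitor"
    Identifier "Monitor0"
    # example generator output; replace with a modeline for your monitor
    Modeline "1680x1050_55.00" 133.55 1680 1784 1960 2240 1050 1051 1054 1084 -HSync +Vsync
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Videocard0"    # must match your existing Device section
    Monitor    "Monitor0"
    SubSection "Display"
        Modes "1680x1050_55.00"    # refer to the new modeline by name
    EndSubSection
EndSection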
Old 03-15-07, 06:29 AM   #3
wernerf
Registered User
 
Join Date: Mar 2007
Location: Nuernberg, Germany
Posts: 6
Default Re: FX 5200 DVI: No valid modes for "1680x1050"; removing.

Hi PhilippeP,

I had already added some Modelines in the Section "Modes" before I ran the
test shown in the attached file Xorg.0.log.gz, but the result was the same.
It looks like the driver does not use any modelines when the monitor is connected via DVI.
I created the modelines with:

gtf 1680 1050 55

# 1680x1050 @ 55.00 Hz (GTF) hsync: 59.62 kHz; pclk: 133.55 MHz
Modeline "1680x1050_55.00" 133.55 1680 1784 1960 2240 1050 1051 1054 1084 -HSync +Vsync

("gtf 1680 1050 60" creates one with pclk 147.14 MHz)

Am I doing something wrong?

Thanks, Werner
Old 03-15-07, 06:38 AM   #4
pgs
Registered User
 
Join Date: Apr 2006
Posts: 73
Default Re: FX 5200 DVI: No valid modes for "1680x1050"; removing.

Better to use cvt (I do not have the link with me) than gtf, since it allows a reduced-blanking modeline, which fits the 135 MHz constraint of the 5200's DVI at 60 Hz.

cvt 1680 1050 60 -r

# 1680x1050 @ 60.00 Hz Reduced Blank (CVT)
# field rate 59.88 Hz; hsync: 64.67 kHz; pclk: 119.00 MHz
Modeline "1680x1050_60.00_rb" 119.00 1680 1728 1760 1840 1050 1053 1059 1080 +HSync -Vsync

This works for my LCD.
It might be that the driver is only willing to support full blanking, while the monitor will also accept a reduced-blanking mode.
The driver has several options to force the modeline; for example, in the Section "Screen" of xorg.conf (or equivalent), you could try:

option "ExactModeTimingsDVI" "True"

Check the nVidia readme...
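
Putting it together, the Screen section would look roughly like this (an untested sketch; the identifiers are examples and must match your config, and the cvt modeline above goes into the matching Monitor section):

Section "Screen"
    Identifier "Screen0"
    Device     "Videocard0"
    Monitor    "Monitor0"
    Option     "ExactModeTimingsDVI" "True"
    SubSection "Display"
        Modes "1680x1050_60.00_rb"    # the cvt modeline from above
    EndSubSection
EndSection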

Hope this helps.

pg
Old 03-15-07, 07:37 AM   #5
wernerf
Registered User
 
Join Date: Mar 2007
Location: Nuernberg, Germany
Posts: 6
Default Re: FX 5200 DVI: No valid modes for "1680x1050"; removing.

Sorry, but the Nvidia README says:

Option "ExactModeTimingsDVI" "boolean"
Forces the initialization ... (for DVI devices, the X server initializes with the closest mode in the EDID list).

The result is 800x600.

Does anybody know what the driver uses when I set Option "UseEDID" "FALSE"?
Does it then use the specified modelines?

Thanks, Werner
Old 03-15-07, 08:02 AM   #6
pgs
Registered User
 
Join Date: Apr 2006
Posts: 73
Default Re: FX 5200 DVI: No valid modes for "1680x1050"; removing.

Quote:
Originally Posted by wernerf
Sorry, but the Nvidia README says:

Does anybody know what the driver uses when I set Option "UseEDID" "FALSE"?
Does it then use the specified modelines?

Thanks, Werner
I use:

option "UseEDIDFreqs" "false"

together with the option above and something else.
As I wrote before, check the docs carefully; there are several options to disable the driver's checks.

For example:

option "ModeValidation" ...

offers several tuning solutions.
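
For example (the token list here is only an illustration; see the README for the exact syntax and the full list of tokens):

Option "ModeValidation" "DFP-0: NoMaxPClkCheck"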

bye,

pg
Old 03-15-07, 08:57 AM   #7
wernerf
Registered User
 
Join Date: Mar 2007
Location: Nuernberg, Germany
Posts: 6
Default Re: FX 5200 DVI: No valid modes for "1680x1050"; removing.

Is your monitor connected via DVI or analog?

From the README:
Option "ModeValidation" "CRT-0:NoEdidModes"
Do not use EDID modes and do not perform the maximum pixel clock check on CRT-0.
----
This sounds to me like the driver would then use the calculated modes with a
146.25 MHz pixel clock, and that is too high for my monitor (135 MHz). I don't
want to kill it.

The README does not describe what happens when the driver does not use
EDID data.

Greetings, Werner
Old 03-15-07, 09:11 AM   #8
pgs
Registered User
 
Join Date: Apr 2006
Posts: 73
Default Re: FX 5200 DVI: No valid modes for "1680x1050"; removing.

My monitor is an LCD connected via DVI (is there any other connection for LCDs... ;-))

The 135 MHz limit is the capability of the DVI interface (of the monitor and/or the 5200).

I understood we are talking about LCDs; these are less sensitive (or not sensitive at all) to a frequency mismatch. In case of problems they usually just display nothing, or only a warning message.

In any case, the "-r" option of cvt is there to _reduce_ the frequency below the 135 MHz limit.
Specifically, in the modeline I posted the clock is 119 MHz, which should be acceptable to all devices.
The only problem _could_ be that the monitor does not like the reduced blanking, but if it doesn't accept it, then it's a dumb one... :-)

Note that modes like 1920x1200 _must_ use reduced blanking or dual-link DVI in order to work properly (to stay below the 165 MHz single-link limit).
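
You can check this from the modeline totals yourself, since pixel clock = HTotal x VTotal x refresh rate:

EDID mode:   2240 x 1089 x ~60 Hz   = ~146.3 MHz  (above the 135 MHz limit)
CVT-RB mode: 1840 x 1080 x ~59.9 Hz = ~119.0 MHz  (fits)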

bye,

pg

Old 03-15-07, 12:31 PM   #9
netllama
NVIDIA Corporation
 
Join Date: Dec 2004
Posts: 8,763
Default Re: FX 5200 DVI: No valid modes for "1680x1050"; removing.

If the mode is failing to validate because the pixel clock is too high, then your only safe option is to generate a custom modeline (with cvt, as others suggested).

You could set the NoMaxPClkCheck parameter for the ModeValidation option; however, since your delta is 11 MHz, I can't see how that would ever work properly.
Old 03-16-07, 06:08 AM   #10
wernerf
Registered User
 
Join Date: Mar 2007
Location: Nuernberg, Germany
Posts: 6
Default Re: FX 5200 DVI: No valid modes for "1680x1050"; removing.

FX 5200 DVI: No valid modes for "1680x1050"; removing. Solved.

Yesterday I went through a marathon of reboots.
All my tests with various settings in xorg.conf had the effect that the X server
only worked once (on the DVI connector). My guess is that after the X server stops, the graphics card or driver switches to the analog connector.
In any case the LCD monitor then got no signal, but the X server could be started again; I could tell because audio output (started by KDE's autostart) played.

All the modelines I defined in the "Modes" section were ignored by the driver.

In the end, the following options solved my problem:

Option "UseEDIDFreqs" "false"
Option "ModeValidation" "DFP-0: NoMaxPClkCheck, NoEdidMaxPClkCheck, AllowNon60HzDFPModes"

in the section "Device".
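
The whole section now looks roughly like this (my Identifier will differ from yours):

Section "Device"
    Identifier "Videocard0"    # use your existing identifier
    Driver     "nvidia"
    Option     "UseEDIDFreqs" "false"
    Option     "ModeValidation" "DFP-0: NoMaxPClkCheck, NoEdidMaxPClkCheck, AllowNon60HzDFPModes"
EndSection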
With this, starting the X server multiple times also works again.
Via "startx -- -logverbose 6" I now find the following in "/var/log/Xorg.0.log":

(II) NVIDIA(0): Mode Validation Overrides for LG L204WT (DFP-0):
(II) NVIDIA(0): AllowNon60HzDFPModes
(II) NVIDIA(0): NoMaxPClkCheck
(II) NVIDIA(0): NoEdidMaxPClkCheck
(II) NVIDIA(0): Frequency information for LG L204WT (DFP-0):
(II) NVIDIA(0): HorizSync : 30.000-83.000 kHz
(II) NVIDIA(0): VertRefresh : 56.000-75.000 Hz
(II) NVIDIA(0): (HorizSync from HorizSync in X Config Monitor section)
(II) NVIDIA(0): (VertRefresh from VertRefresh in X Config Monitor
(II) NVIDIA(0): section)
(II) NVIDIA(0):
(II) NVIDIA(0): --- Building ModePool for LG L204WT (DFP-0) ---
(II) NVIDIA(0): Validating Mode "1680x1050":
(II) NVIDIA(0): 1680 x 1050 @ 60 Hz
(II) NVIDIA(0): For use as DFP backend.
(II) NVIDIA(0): Mode Source: EDID
(II) NVIDIA(0): Pixel Clock : 146.25 MHz
(II) NVIDIA(0): HRes, HSyncStart : 1680, 1784
(II) NVIDIA(0): HSyncEnd, HTotal : 1960, 2240
(II) NVIDIA(0): VRes, VSyncStart : 1050, 1053
(II) NVIDIA(0): VSyncEnd, VTotal : 1059, 1089
(II) NVIDIA(0): H/V Polarity : -/+
(II) NVIDIA(0): Mode is valid.

Many thanks to everybody who helped me solve the problem.

Werner