nV News Forums > Linux Support Forums > NVIDIA Linux

Old 09-25-06, 08:28 PM   #1
transonic
Registered User
 
Join Date: Sep 2006
Location: Atlanta
Posts: 5
Default Dell 2405 on FX5200 - Interlace bug?

I've read the prior Dell 2405 posts regarding the FX5200 pixel clock limit of 135MHz and the 2405's requirement for 60Hz refresh.

Using the 1.0-8774 driver on Fedora Core 5 (FC5), I can find no valid modeline. One old solution that would suit me fine is to specify an interlaced modeline to reduce the required bandwidth. The driver's ModeValidation does not seem to properly account for interlacing: the debug invocation of Xorg ("startx -- -verbose 6 -logverbose 6") reports a Vertical Refresh of only 30.1 Hz instead of the required 60 Hz. I suppose that depends on how you define Vertical Refresh, but several old posts report that the 2405 runs on a 128MB FX5200 using the exact same modeline that is now considered invalid. The only change is the driver's accounting, which must be wrong if people had it working under prior releases.

Although I run CRTs at 85Hz because I'm really flicker sensitive, I hope that an interlaced 60Hz will be satisfactory on the 2405's LCD. I look at code all day, and couldn't care less about games or mpeg playback frame rates. In fact, I don't think I need 3D at all. I'm using the nVidia driver rather than the libvga 'nv' driver only because I couldn't get interlacing to work on 'nv' either. I was hoping that the NVCONTROL options would help me weasel my monitor past the ModeValidator.

I'm connecting over a DVI-D link.
My modelines are listed in a stand-alone "Modes" section of xorg.conf.

The modeline that I think should work is:
> ModeLine "1920x1200_2405a" 87.7 1920 1952 2280 2312 1200 1227 1233 1261 interlace
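For what it's worth, the driver's 30.1 Hz figure is consistent with it computing the refresh of an interlaced mode as a frame rate rather than a field rate. A quick back-of-the-envelope check, assuming refresh = pixel clock / (HTotal * VTotal), with the numbers from the modeline above:

```python
# Sanity-check the refresh rate of the interlaced '2405a' modeline.
# Assumption: the driver reports frames/sec, while an interlaced mode
# delivers two fields per frame.
pclk_hz = 87.7e6          # pixel clock from the modeline
htotal, vtotal = 2312, 1261

frame_rate = pclk_hz / (htotal * vtotal)  # full frames per second
field_rate = 2 * frame_rate               # fields per second (interlaced)

print(f"frame rate: {frame_rate:.1f} Hz")  # ~30.1 Hz - what the driver reports
print(f"field rate: {field_rate:.1f} Hz")  # ~60.2 Hz - what the monitor sees
```

So depending on which number you call "Vertical Refresh", the same modeline is either 30 Hz or 60 Hz.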

Telling the Device section of xorg.conf
> Option "UseEDIDFreqs" "False" # (IgnoreEDID & NoDDC have been deprecated)
does not help get my '2405a' modeline validated.

And putting this line in the Device section:
> Option "ModeValidation" "NoEdidModes"
causes the driver to default to something like 640x480.
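For context, here is roughly how these pieces are laid out in my xorg.conf (the Identifier names below are placeholders, not my actual ones; the modeline is my '2405a' one from above):

```
Section "Modes"
    Identifier "CustomModes"
    ModeLine "1920x1200_2405a" 87.7 1920 1952 2280 2312 1200 1227 1233 1261 interlace
EndSection

Section "Monitor"
    Identifier "Dell2405"
    UseModes   "CustomModes"
EndSection

Section "Device"
    Identifier "FX5200"
    Driver     "nvidia"
    Option     "UseEDIDFreqs" "False"
EndSection
```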

Which trick(s) will get this combo working with the 8774 driver?
Old 09-26-06, 04:00 AM   #2
pe1chl
Registered User
 
Join Date: Aug 2003
Posts: 1,026
Default Re: Dell 2405 on FX5200 - Interlace bug?

When I still had a 5200, my 2405 worked perfectly by just using that 60Hz modeline and lowering the clock to 135.00
This results in something like 52Hz vertical, which my 2405 accepts without problem.
(I know the spec says it doesn't)
Old 09-26-06, 08:10 AM   #3
transonic
Registered User
 
Join Date: Sep 2006
Location: Atlanta
Posts: 5
Default Re: Dell 2405 on FX5200 - Interlace bug?

Thanks for the reply. I read your earlier post, and don't understand what you meant by "lowering the clock". If I use http://xtiming.sourceforge.net/cgi-bin/xtiming.pl and put in the 2405's numbers, except for "lying" about the 2405's acceptable range of VertRefresh (claiming that it will work from 51-76, instead of the actual EDID spec of 56-76), and specifying a clock of 135 instead of 170, I still get a modeline of
> Modeline "1920x1200@52" 173.27 1920 1952 2608 2640 1200 1225 1235 1261

This has a clock of 173.27, which I think I tried yesterday - the 'nvidia' driver refuses to consider it valid. I'll go try it again, with and without various NVControl options, so that this thread has a definitive record.

Can you post the actual modeline that worked for you (if you still have it....)?
Old 09-26-06, 09:14 AM   #4
transonic
Registered User
 
Join Date: Sep 2006
Location: Atlanta
Posts: 5
Default Re: Dell 2405 on FX5200 - Interlace bug?

All 3 of my trials failed with the same /var/log/Xorg.0.log complaint:

(II) NVIDIA(0): Validating Mode "1920x1200@52":
(II) NVIDIA(0): 1920 x 1200 @ 52 Hz
(II) NVIDIA(0): Mode Source: Custom ModeLine
(II) NVIDIA(0): Pixel Clock : 173.27 MHz
(II) NVIDIA(0): HRes, HSyncStart : 1920, 1952
(II) NVIDIA(0): HSyncEnd, HTotal : 2608, 2640
(II) NVIDIA(0): VRes, VSyncStart : 1200, 1225
(II) NVIDIA(0): VSyncEnd, VTotal : 1235, 1261
(II) NVIDIA(0): H/V Polarity : +/+
(WW) NVIDIA(0): Mode is rejected: Only 60 Hz VertRefresh modes are allowed
(WW) NVIDIA(0): for this TMDS encoder; this mode had VertRefresh 52.0 Hz.

I used this modeline:
> ModeLine "1920x1200@52" 173.27 1920 1952 2608 2640 1200 1225 1235 1261

With separate attempts using each of:
> Option "UseEDIDFreqs" "False"
> Option "ModeValidation" "NoEdidModes"

All 3 trials produced the same frequency rejection message. The "NoEdidModes" attempt came up at 800x600.

Any ideas?
Old 09-26-06, 10:49 AM   #5
transonic
Registered User
 
Join Date: Sep 2006
Location: Atlanta
Posts: 5
Default Re: Dell 2405 on FX5200 - Interlace bug?

After reading http://download.nvidia.com/XFree86/L...ppendix-d.html

I tried
> Option "ModeValidation" "AllowNon60HzDFPModes, NoMaxSizeCheck, NoEdidDFPMaxSizeCheck"

With 'pe1chl's "@52" mode, the modeline was rejected for the pixel clock being 173 MHz, well above the 135 MHz limit.

With my "_2405a" modeline, the mode was declared valid in the log, but was rejected on the very next line because the 1920x1200 "size is too large for" the 2405. huh? It was designed for this size!

Next, I added the Device option:
> Option "ExactModeTimingsDVI" "true"
to run the interlaced "_2405a" modeline again. Apparently the nVidia driver actually tried this one, because the 2405 briefly reported that it could not display the input on the DVI-D connector before dropping back to 1600x1200. I thought this mode was the one that worked for some other posters and don't understand why it won't here.

I sure hope that pe1chl can explain how I've misunderstood his "135MHz clock" advice, because the interlace didn't work and I'm running out of ideas.
Old 09-26-06, 12:20 PM   #6
pe1chl
Registered User
 
Join Date: Aug 2003
Posts: 1,026
Default Re: Dell 2405 on FX5200 - Interlace bug?

I can't find the modeline on my system (I think I posted it on this forum before, but the search is often quite hard to use).
However, it is easy to re-create it.

First, generate a 1920x1200 at 60Hz reduced blanking modeline using cvt:
cvt 1920 1200 60 -r

Modeline "1920x1200_60.00_rb" 154.00 1920 1968 2000 2080 1200 1203 1209 1235 +HSync -Vsync

The 154.00 is the pixel clock, the value that is too high for the 5200 card.
Change this value to 135.00. This lowers the 60 Hz refresh to about 135/154*60 = 52.6 Hz.
So:

Modeline "1920x1200_52.60_rb" 135.00 1920 1968 2000 2080 1200 1203 1209 1235 +HSync -Vsync

This should be accepted by the driver. Whether it works on your monitor depends on whether it accepts the 52.6 Hz vertical refresh. My 2405 accepted it OK, but apparently others have had problems with it.
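As a quick check of that arithmetic (assuming the driver computes refresh as pixel clock divided by total pixels per frame):

```python
# Vertical refresh of a modeline is pixel clock / (HTotal * VTotal).
# Numbers taken from the 135.00 MHz modeline above.
pclk_hz = 135.00e6
htotal, vtotal = 2080, 1235

refresh = pclk_hz / (htotal * vtotal)
print(f"{refresh:.1f} Hz")  # ~52.6 Hz, close to the 135/154*60 estimate
```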

What you can also try is to use the unmodified modeline above (or no modeline at all), and tell the driver to skip the maximum pixel clock check using the "NoMaxPClkCheck" option. This may work OK with the 154.00 MHz modeline.
Old 09-29-06, 10:51 AM   #7
transonic
Registered User
 
Join Date: Sep 2006
Location: Atlanta
Posts: 5
Default Re: Dell 2405 on FX5200 - Interlace bug?

To finish this thread:
0) To run 'startx -- -verbose 6 -logverbose 6', I had to drop to runlevel 3 ('init 3'). I could not prevent 'gdm' from restarting every time I 'kill -9'ed it or Xorg.

1) The interlace that claims to be 'enabled' in the Xorg.0.log output never worked. I still suspect that driver version 1.0-8774 didn't really try it, though I'm not sure.

2) The 52Hz vertical refresh that results from slowing the 154MHz modeline to 135MHz did not work on my Dell 2405 over DVI. The monitor reported that it "can not display this mode". My monitor EDID reports that it was made in 2005. In order to get the driver to even attempt this modeline, I used
> Option "ModeValidation" "AllowNon60HzDFPModes"
> Option "ExactModeTimingsDVI" "true"
Read Appendix D!
http://download.nvidia.com/XFree86/L...ppendix-d.html

3) Running 'pe1chl's 154MHz modeline does work on my FX5200 (VideoBIOS 04.34.20.69.01) so far. It has been 2 days, and it has not burned out the video card yet. Since this is the monitor's native mode, I'm not worried about damaging the $800 monitor, but I don't know how long the card will hold up - hopefully long enough to replace this antique AGP-less computer. From Appendix D, I used:
> Option "ModeValidation" "NoMaxPClkCheck, AllowNon60HzDFPModes"
> Option "ExactModeTimingsDVI" "true"
I think the "AllowNon60HzDFPModes" is not needed, but have not cleaned up my xorg.conf yet.
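For anyone finding this thread later, the working setup boils down to something like the following sketch (the Identifier names are placeholders; adapt them to your own config):

```
Section "Monitor"
    Identifier "Dell2405"
    # 1920x1200 60 Hz reduced-blanking modeline from cvt - the one that worked
    ModeLine "1920x1200_60.00_rb" 154.00 1920 1968 2000 2080 1200 1203 1209 1235 +HSync -Vsync
EndSection

Section "Device"
    Identifier "FX5200"
    Driver     "nvidia"
    Option     "ModeValidation"      "NoMaxPClkCheck, AllowNon60HzDFPModes"
    Option     "ExactModeTimingsDVI" "true"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "FX5200"
    Monitor    "Dell2405"
    SubSection "Display"
        Modes  "1920x1200_60.00_rb"
    EndSubSection
EndSection
```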

4) Configuring TwinView dropped the monitor back to 1600x1200. After a few trials at running the second 1280x1024 monitor off the FX5200's VGA output, I reverted to running it off the onboard Intel 865 video instead. Now the PC boots with 1600x1200 on the 2405 and 1280x1024 on the Sceptre 19" LCD, and I can use Gnome's System->Preferences->Screen Resolution to change the 2405 to 1920x1200 once it's up.

5) As I told my client before I started fooling with this, it would be cheaper to buy an AGP or PCIe computer than to pay my rate for the number of hours it would probably take to get the FX5200 driving 1920x1200. A modern box would support any modern video card, which would easily handle the Dell 2405 native bandwidth of 154MHz. That prediction was certainly accurate - but the client is always right, huh?

Thanks pe1chl!! This wasn't much fun, but I'd never have gotten it going without your help!