Old 06-15-05, 03:24 PM   #1
haynold
Registered User
 
Join Date: Jun 2005
Posts: 6
Reduced Blanking on DVI-D

Hello:

I'm trying to get an HP 2335 flat panel display to work with an NVIDIA GeForce FX 5500. If I use an analog cable everything works very well, and I can run X at 1920x1200 using the default mode selected by Xorg. With the DVI-D cable things get more complicated, though. Up to 1600x1200 resolution everything works perfectly with the default settings. At 1920x1200, however, I need reduced blanking, which doesn't seem to be among the Xorg default modes. That's where the trouble starts.

The monitor manual says that 1920x1200 should be run at a 154.0 MHz pixel clock, 74.04 kHz horizontal, and 60.0 Hz vertical refresh. From these parameters I computed the following Modeline:

ModeLine "1920x1200" 154.0 1920 1940 2012 2080 1200 1201 1204 1234

The sync pulses are just guesses because I couldn't find any
information on the right sync timing.
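At least the rates implied by those totals check out against the manual's figures (a quick sanity check, dividing the clock by the totals):

# 154.0 MHz / 2080 total pixels per line ≈ 74.04 kHz horizontal
# 74.04 kHz / 1234 total lines ≈ 60.0 Hz vertical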

If X starts with these settings, the monitor claims there's no input signal and goes to sleep. The Windows driver manual for the card mentions that the older Nvidia cards apparently can't modify timing for digital output, but it didn't say whether there is a predefined mode for reduced blanking that does work.

I'd be very grateful for any suggestions on how to make this work.

Oliver
Old 06-16-05, 02:57 PM   #2
SaTaN0rX
Registered User
 
Join Date: Apr 2005
Posts: 86
Re: Reduced Blanking on DVI-D

http://www.vesa.org/Public/CVT/CVTd6r1.xls
works with Gnumeric.

If you enter 1920x1200 with reduced blanking, it says:

H front porch: 48 pix
H sync pulse: 32 pix
H back porch: 80 pix

V front porch: 3 lines
V sync pulse: 6 lines
V back porch: 26 lines

so it should be something like:

Modeline "002MA-R" 154.0 1920 1968 2000 2080 1200 1203 1209 1235 +hsync -vsync
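(The porch numbers map onto the modeline fields like this, if you want to re-derive it; each field is the previous one plus the next porch or pulse width:

# H: 1920 + 48 = 1968 sync start, + 32 = 2000 sync end, + 80 = 2080 total
# V: 1200 + 3 = 1203 sync start, + 6 = 1209 sync end, + 26 = 1235 total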
Old 06-16-05, 10:22 PM   #3
haynold
Registered User
 
Join Date: Jun 2005
Posts: 6
Re: Reduced Blanking on DVI-D

Thank you for the modeline. The one you gave me works very well on analog output, but with DVI-D the screen stays blank.

Does anyone know whether the 5500 FX supports 1920x1200 digital at all? I used to have an old Quadro card with very fine analog output but unusable digital (I think up to 1024x768 or so), and I'm starting to fear that the digital output in the Nvidia cards is mostly for decoration.
Old 06-17-05, 05:35 AM   #4
kulick
Todd Kulick
 
Join Date: Jun 2005
Location: Like, in the Bay Area
Posts: 23
Re: Reduced Blanking on DVI-D

Oliver,

[Sorry this got so long. I'm hoping it's remotely helpful, and who knows? Maybe the search indexes will make it useful for others too...]

I too am trying to get my nVidia-based graphics card to work at 1920x1200 with DVI. I too am running into some difficulties. I'll tell you where I'm at and maybe we can help each other...

I have a Samsung SyncMaster 243T. It is a 24" LCD with both DVI and VGA inputs. Its native resolution is 1920x1200. It came with a DVI-D Single Link cable (and a VGA cable, but where's the fun in using that?).

My graphics card is an nVidia FX5700 Ultra-based card. I don't have the actual card manufacturer name and manual handy at the moment, but the card has a DVI output, a VGA output, and an S-Video output. The card has 128MB of video memory.

My PC is currently set up to dual-boot XP and Fedora Core 3 (Linux). Under Windows, using relatively recent nVidia drivers, I cannot get 1920x1200 to work via DVI. Things work fine over DVI if I use 1600x1200. Since I'm not a Windows guru, I'm not very good at diagnosing the problem there. Instead, I tried to get things working in Linux first. This is where I made a few potentially interesting discoveries...

First, I can get the whole mess to work at 1920x1200 via DVI in Linux...*but* only if I use the Xorg distributed 'nv' driver. If I use the nVidia supplied, closed-source driver, called 'nvidia', it won't work. I would prefer to use nVidia's driver, so the current arrangement isn't really acceptable for me. (Not to mention that the current Linux setup doesn't help me at all with Windows...)

Here's what I figured out while mucking around with my xorg.conf file. My monitor is fairly persnickety about the timings that it will accept. It will only see 60Hz timings. What's more, it seems to be particular about the sync pulse timings as well. Perhaps if your monitor is less touchy, you might be able to work around the problems that I am seeing by trying a timing with a smaller sync/porch profile. This might help you avoid what I think is the problem (see below).

If you examine your monitor specifications, via the manual or maybe the vendor's web site, you can probably find the exact recommended timings. Alternatively, in Linux with Xorg, you can examine the Xorg log file (in /var/log/Xorg.0.log for me) to see what the X server and driver are able to deduce about your monitor from its EDID responses. In my case, the EDID data from the monitor in these logs said exactly what timing the monitor wanted for its native resolution (1920x1200). This seemed great. I should just plug those numbers in and I'm good to go. Right? Apparently not.

The problem that I seem to be running into is that the monitor's preferred timing for 1920x1200 resolution is actually 2080x1235 pixels once you add in the "invisible" sync times. This size multiplied by the monitor's required 60Hz refresh results in a total pixel clock of 154 MHz. "So what?", you might ask. Welp, it seems that DVI-D Single Link cabling is only rated up to 150 MHz. The nVidia drivers, on both Windows and Linux, seem to know about this. They refuse to even try driving the card at that timing. Ironically, the Xorg 'nv' driver doesn't pretend to be nearly as clever, and it seems to just work.
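For the record, the arithmetic behind that number:

# 2080 x 1235 = 2,568,800 total pixels per frame
# 2,568,800 x 59.95 Hz (the EDID's exact refresh) ≈ 154.0 MHz pixel clock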

I honestly don't know if the nVidia driver limitation is due to the card's TMDS capabilities or its assumptions about the capabilities of Single Link DVI cables. If anyone knows, do tell! I'm trying to get my hands on a Dual Link DVI cable to try it, but it seems silly that I need one when I know that the current cable *can* work, since I've seen it!

Now some more technical mumbo-jumbo if you want to try some of this yourself under Linux. My monitor wants the following timing...

# Modeline reported by the Samsung SyncMaster 243T EDID...
# 1920x1200 @ 59.95 Hz (GTF) hsync: 74.04 kHz; pclk: 154.00 MHz
ModeLine "1920x1200" 154.00 1920 1968 2000 2080 1200 1203 1209 1235

I tried making up some timings that were below 150 MHz but still 1920x1200 visible. Here are a few of them. For me, they didn't work even though the nVidia driver would actually accept them and drive the card. The monitor wouldn't sync to them, though. Perhaps other monitors are more forgiving with their timing ranges.

# None of the funny (half-baked) modes below are loved by my 243T.
ModeLine "1920x1200" 162.00 1920 1984 2176 2480 1200 1201 1204 1250
ModeLine "1920x1200" 148.99 1920 1968 2000 2080 1200 1203 1209 1235
ModeLine "1920x1200" 148.20 1920 1936 1984 2000 1200 1203 1209 1235
ModeLine "1920x1200" 138.60 1920 1968 2000 2080 1200 1203 1209 1235
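For reference, here is what those work out to (same clock-divided-by-totals arithmetic as above); only the third keeps 60Hz, by shrinking the blanking even further, though my panel balked even at that one:

# 162.00 MHz / (2480 x 1250) ≈ 52.3 Hz
# 148.99 MHz / (2080 x 1235) ≈ 58.0 Hz
# 148.20 MHz / (2000 x 1235) ≈ 60.0 Hz
# 138.60 MHz / (2080 x 1235) ≈ 54.0 Hz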

So, anywho, I feel kinda stuck. Maybe one of those modes above helps you. For me, I wonder the following...

2005/06/18 9:30pm - Ok, I went back and edited this message to include my newer understanding of things, based on all the great comments and support from others here on the boards. Thanks for the help, everybody! I can answer some of my own questions now, so I've updated this message with what I think I've figured out, for posterity's sake.

o Is the problem with the nVidia drivers fundamentally a DVI Single Link bandwidth issue or a TMDS issue with my card?
It seems most likely that the problem is a limit in my graphics card. Single Link DVI itself is rated up to 165MHz (according to a Silicon Image white paper), and my monitor's preferred (only!) 1920x1200 mode is 154MHz, so the link should be able to carry it with the current cabling. What's more, the 150MHz limit that I was seeing is most likely one the driver imposed based on its understanding of my graphics card's limits, not the DVI cabling. Other people (running the more advanced (and more expensive!) 6800 cards) have seen the driver limit them to 165MHz instead of the 150MHz limit that I see.
o Is there a way to tell the nVidia drivers to "just do it"? Will that work? It seems to work with the 'nv' driver...
I have not been able to figure out a way to tell the nVidia 7664 build driver to "just do it". *But* the 7174 build and the Xorg 'nv' driver both seem to lack the offending checking code, so they can be made to work.
o Is anyone else able to use an nVidia driver at a pixel clock above 150 MHz on Single Link DVI?
Yes. See above. Additionally, Single Link DVI is really only rated for 1600x1200 with CRT-sized sync timings. *But* with reduced sync profile timings, Single Link DVI can drive 1920x1200 or 1920x1080 (HDTV-like) resolutions; see the numbers after this list.
o Even if I could get this to work under Linux, how do you do equivalent timing tweaking under Windows?
I saw recommendations to look into a tool called PowerStrip. I haven't tried it myself...yet.
o Would a Dual Link cable just solve all my problems? (If so, why the heck didn't Samsung just ship one in the box with this product that is obviously intended to run at 1920x1200!?!?)
Unlikely. Given the supposition above that the problem is really the capabilities of my graphics card's TMDS signalling chip, a better cable would probably have no effect whatsoever.
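The numbers promised above (CVT figures quoted from memory, so treat them as approximate):

# CVT 1920x1200 @ 60Hz, normal blanking: ~193.25 MHz on a 2592 x 1245 frame -- over the 165MHz Single Link limit
# CVT-RB 1920x1200 @ 60Hz: 154.00 MHz on a 2080 x 1235 frame -- fits within Single Link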
Any of you big brained people out there willing to help clue me in?

TIA!

-t

Last edited by kulick; 06-19-05 at 12:38 AM. Reason: Answering some of my own questions based on subsequent advice and observations...
Old 06-17-05, 08:33 AM   #5
davemoore
Registered User
 
Join Date: Sep 2004
Posts: 50
Re: Reduced Blanking on DVI-D

Two ideas:

1. The previous version of the Linux driver (7174) was not very strict about maximum pixel clocks over DVI. For me, that driver claims my maximum pixel clock is 400 MHz, so with it you may be able to "just do it".
2. You may want to see if adding
Option "IgnoreEDID" "yes"
helps. Sometimes a new modeline won't be recognized without it.
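If it helps, here is roughly where those pieces would go in xorg.conf (just a sketch; the identifiers are placeholders, and the modeline is the one quoted earlier in the thread):

Section "Monitor"
    Identifier "Monitor0"
    # Reduced-blanking mode from earlier in the thread
    ModeLine "1920x1200" 154.00 1920 1968 2000 2080 1200 1203 1209 1235 +hsync -vsync
EndSection

Section "Device"
    Identifier "Videocard0"
    Driver "nvidia"
    Option "IgnoreEDID" "yes"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device "Videocard0"
    Monitor "Monitor0"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "1920x1200"
    EndSubSection
EndSection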
Old 06-17-05, 03:50 PM   #6
kulick
Todd Kulick
 
Join Date: Jun 2005
Location: Like, in the Bay Area
Posts: 23
Re: Reduced Blanking on DVI-D

Thanks for the response...

Quote:
Originally Posted by davemoore
1. The previous version of the Linux driver (7174) was not very strict about maximum pixel clocks over DVI. For me, that driver claims my maximum pixel clock is 400 MHz, so with it you may be able to "just do it".
I'm not at home now and the machine is off, so I can't grab the Xorg.0.log file, but I vaguely remember that the 400 MHz maximum pixel clock applies to the analog/VGA output. I can get things to work in analog. I just don't want to.

Quote:
Originally Posted by davemoore
2. You may want to see if adding
Option "IgnoreEDID" "yes"
helps. Sometimes a new modeline won't be recognized without it.
I have used this option, but it doesn't seem to help with my problem. It allows me to ignore the EDID-reported vertical and horizontal frequency requirements of the monitor, but it doesn't seem to convince the driver/Xorg to ignore the DVI Single Link bandwidth constraint.

Thanks for the ideas. Got any others?
Old 06-17-05, 04:00 PM   #7
haynold
Registered User
 
Join Date: Jun 2005
Posts: 6
Re: Reduced Blanking on DVI-D

Hi!

I tried using the open "nv" driver. The good thing is that with this driver I get more detail about the panel's expected timing: 1920 1968 2000 2080 1200 1203 1209 1235, as SaTaN0rX suggested, is indeed correct. However, even with EDID turned off I get an error message from the "nv" driver:

(II) NV(0): Mode "1920x1200" is larger than BIOS programmed panel size of 1280 x 1024. Removing.

I assume that the BIOS referred to is the graphics card's, and I have no clue where it got its notion of the panel size from.

Hell, I'm getting a Ph.D. today but I can't get a fregging graphics card to work... sad world.
Old 06-17-05, 04:17 PM   #8
kulick
Todd Kulick
 
Join Date: Jun 2005
Location: Like, in the Bay Area
Posts: 23
Re: Reduced Blanking on DVI-D

Quote:
Originally Posted by haynold
Hi!
Heyas!

Quote:
Originally Posted by haynold
I tried using the open "nv" driver. The good thing is that with this driver I get more detail about the panel's expected timing: 1920 1968 2000 2080 1200 1203 1209 1235, as SaTaN0rX suggested, is indeed correct. However, even with EDID turned off I get an error message from the "nv" driver:

(II) NV(0): Mode "1920x1200" is larger than BIOS programmed panel size of 1280 x 1024. Removing.

I assume that the BIOS referred to is the graphics card's, and I have no clue where it got its notion of the panel size from.
Oy! I don't know about that one. For me, the custom mode gets eliminated by a line like the following, but only if the dot clock is over 150 MHz.

(II) NVIDIA(0): Not using mode "1920x1200" (bad mode clock/interlace/doublescan)

If the dot clock is less than 150 MHz, then Xorg and the driver seem willing to try it, but my monitor won't sync to it. My monitor *really* wants the timing that both you and I quoted above.

Did you try any of the other timings that I included? Maybe keep trying the lower dot clock ones until the Xorg log shows that the card is indeed trying the timing that you requested. Then once you have the card driving it, you can validate whether or not your monitor is as persnickety as mine.

I'm somewhat skeptical that this will work for you though, given the error that you are getting. Maybe the source code would elucidate things some. Maybe I'll try to dig through the Xorg stuff when I get home tonight...*hrrrrm*...

Quote:
Originally Posted by haynold
Hell, I'm getting a Ph.D. today but I can't get a fregging graphics card to work... sad world.
Congrats, and oh boy. It is sad, isn't it? I used to do this graphics thing for a living, and I can't get it to work either.

Old 06-17-05, 04:55 PM   #9
SaTaN0rX
Registered User
 
Join Date: Apr 2005
Posts: 86
Re: Reduced Blanking on DVI-D

Option "ExactModeTimingsDVI" "boolean"
    Forces the initialization of the X server with the exact
    timings specified in the ModeLine. Default: for DVI
    devices, the X server initializes with the closest mode in
    the EDID list.

Have you tried this?
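In the Device section of xorg.conf, that would look something like this (just a sketch; the identifier is a placeholder):

Section "Device"
    Identifier "Videocard0"
    Driver "nvidia"
    Option "ExactModeTimingsDVI" "True"
EndSection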

Maybe it helps; otherwise, I can't help.

I could get my 1280x1024 monitor running with some 76 Hz Sun mode over DVI...
Old 06-17-05, 05:50 PM   #10
kulick
Todd Kulick
 
Join Date: Jun 2005
Location: Like, in the Bay Area
Posts: 23
Re: Reduced Blanking on DVI-D

I'll try that one tonight and let ya know what I find. Thanks again for the advice!

-t
Old 06-18-05, 03:55 PM   #11
kulick
Todd Kulick
 
Join Date: Jun 2005
Location: Like, in the Bay Area
Posts: 23
Re: Reduced Blanking on DVI-D

Ok, I tried "ExactModeTimingsDVI", but it didn't seem to help. The fundamental problem appears to be the following...

When I run with the Xorg 'nv' driver, I see the following in Xorg.0.log...

...
(--) NV(0): Chipset: "GeForce FX 5700 Ultra"
...
(II) NV(0): Probing for analog device on output A...
(--) NV(0): ...found one
(II) NV(0): Probing for analog device on output B...
(--) NV(0): ...can't find one
(II) NV(0): Probing for EDID on I2C bus A...
(II) NV(0): I2C device "DDC:ddc2" registered at address 0xA0.
(II) NV(0): I2C device "DDC:ddc2" removed.
(II) NV(0): ... none found
(II) NV(0): Probing for EDID on I2C bus B...
(II) NV(0): I2C device "DDC:ddc2" registered at address 0xA0.
(II) NV(0): I2C device "DDC:ddc2" removed.
(--) NV(0): DDC detected a DFP:
(II) NV(0): Manufacturer: SAM Model: f7 Serial#: 1312961076
...
(II) NV(0): Monitor0: Using default hsync range of 30.00-80.00 kHz
(II) NV(0): Monitor0: Using default vrefresh range of 55.00-75.00 Hz
(II) NV(0): Clock range: 12.00 to 400.00 MHz
...
(**) NV(0): *Mode "1920x1200": 154.0 MHz, 74.0 kHz, 60.0 Hz
(II) NV(0): Modeline "1920x1200" 154.00 1920 1968 2000 2080 1200 1203 1209 1235
...from here things work fine. You can see that the driver finds the card correctly. Then it detects the monitor on the DVI port. The monitor's EDID data reports a 1920x1200 mode, and since the driver believes that the clock range goes up to 400.00 MHz (the "Clock range" line above), the driver uses that 1920x1200 modeline and the monitor displays things just fine.

When I run with the nVidia driver ('nvidia'), I see the following in Xorg.0.log...

(II) NVIDIA(0): NVIDIA GPU detected as: GeForce FX 5700 Ultra
(II) NVIDIA(0): Chip Architecture: 0x30
(II) NVIDIA(0): Chip Implementation: 0x36
(II) NVIDIA(0): Chip Revision: 0xa1
...
(II) NVIDIA(0): Connected display device(s): DFP-0
(II) NVIDIA(0): Enabled display device(s): DFP-0
(II) NVIDIA(0): Mapping display device 0 (DFP-0) to CRTC 0
(--) NVIDIA(0): DFP-0: maximum pixel clock: 150 MHz
(--) NVIDIA(0): DFP-0: Internal Single Link TMDS

...
(II) NVIDIA(0): Monitor0: Using default hsync range of 30.00-80.00 kHz
(II) NVIDIA(0): Monitor0: Using default vrefresh range of 55.00-75.00 Hz
(II) NVIDIA(0): Clock range: 12.00 to 150.00 MHz
...
(II) NVIDIA(0): Not using mode "1920x1200" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using mode "1920x1200" (no mode of this name)
...
(**) NVIDIA(0): Validated modes for display device DFP-0:
(**) NVIDIA(0): Mode "1280x800": 107.2 MHz, 62.6 kHz, 75.0 Hz
...doh! Here you can see the driver correctly detects the card. Then it goes looking for the monitor and realizes that the monitor is connected via Single Link DVI. Because of this, it limits the clock range to a 150.00 MHz maximum (the "maximum pixel clock" and "Clock range" lines above). Then, since it has constrained itself to a 150.00 MHz clock, it eliminates the monitor-recommended 1920x1200 modeline.

Ok, so how do I lie to this driver and tell it not to assume a 150.00 MHz clock maximum for my Single Link DVI connection?

I read a lot of the open-source Xorg and 'nv' driver code last night, and I think I understand the interaction between the driver and Xorg somewhat. The problem is...I don't have access to the source for the nVidia driver, and it seems that the badness is being injected in there. Any of you nVidia Linux driver gurus out there know of a backdoor or driver option that might help me?

I tried reading through 'strings nvidia_drv.o | less' last night, but I didn't see anything promising. I think that's a sign that I'm getting pretty desperate.

-t
Old 06-18-05, 04:25 PM   #12
kulick
Todd Kulick
 
Join Date: Jun 2005
Location: Like, in the Bay Area
Posts: 23
Re: Reduced Blanking on DVI-D

Hey Oliver,

I did look at the driver a little regarding your problem while I had all the code here in front of me. I was hoping that maybe you could at least get things working with the 'nv' driver like the arrangement that I have.

Quote:
Originally Posted by haynold
I tried using the open "nv" driver. The good thing is that with this driver I get more detail about the panel's expected timing: 1920 1968 2000 2080 1200 1203 1209 1235, as SaTaN0rX suggested, is indeed correct. However, even with EDID turned off I get an error message from the "nv" driver:

(II) NV(0): Mode "1920x1200" is larger than BIOS programmed panel size of 1280 x 1024. Removing.

I assume that the BIOS referred to is the graphics card's, and I have no clue where it got its notion of the panel size from.
It seems the BIOS in question is either that of the graphics card or, more likely I think, that of the display device itself. The 'nv' driver seems to get that panel size by rooting around in the graphics card's memory. My card seems to report 1920x1200.
(--) NV(0): Panel size is 1920 x 1200
It seems like you could do 'Option "FlatPanel" "False"' in your driver section. This would stop it from determining the bogus BIOS size. Unfortunately, it will probably affect a lot of other things too, so I'm not sure if it would really help. The code checks the value of the flat panel setting in a lot of places.

The code that is causing your problem is...

static ModeStatus
NVValidMode(int scrnIndex, DisplayModePtr mode, Bool verbose, int flags)
{
    NVPtr pNv = NVPTR(xf86Screens[scrnIndex]);

    /* fpWidth/fpHeight hold the BIOS-programmed panel size;
       any mode larger than that gets thrown out. */
    if (pNv->fpWidth && pNv->fpHeight) {
        if ((pNv->fpWidth < mode->HDisplay) || (pNv->fpHeight < mode->VDisplay)) {
            xf86DrvMsg(scrnIndex, X_INFO, "Mode \"%s\" is larger than "
                       "BIOS programmed panel size of %d x %d. Removing.\n",
                       mode->name, pNv->fpWidth, pNv->fpHeight);
            return (MODE_BAD);
        }
    }

    return (MODE_OK);
}
The only way that I could see to make those fpWidth and fpHeight variables not get assigned was to set FlatPanel to false or make the card think it is attached to a TV. Again, not very helpful. Sorry man!

-t