06-18-05, 03:55 PM   #11
kulick
Todd Kulick
 
Join Date: Jun 2005
Location: Like, in the Bay Area
Posts: 23
Re: Reduced Blanking on DVI-D

Ok, I tried "ExactModeTimingsDVI", but it didn't seem to help. The fundamental problem appears to be the following...
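
In case it matters, this is roughly how I had it set in xorg.conf when I tried it (trimmed down, and the Identifier is just a placeholder, so treat it as a sketch of my config rather than the real thing):

Section "Device"
    Identifier  "Videocard0"
    Driver      "nvidia"
    Option      "ExactModeTimingsDVI" "TRUE"
EndSection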

When I run with the Xorg 'nv' driver, I see the following in Xorg.0.log...

...
(--) NV(0): Chipset: "GeForce FX 5700 Ultra"
...
(II) NV(0): Probing for analog device on output A...
(--) NV(0): ...found one
(II) NV(0): Probing for analog device on output B...
(--) NV(0): ...can't find one
(II) NV(0): Probing for EDID on I2C bus A...
(II) NV(0): I2C device "DDC:ddc2" registered at address 0xA0.
(II) NV(0): I2C device "DDC:ddc2" removed.
(II) NV(0): ... none found
(II) NV(0): Probing for EDID on I2C bus B...
(II) NV(0): I2C device "DDC:ddc2" registered at address 0xA0.
(II) NV(0): I2C device "DDC:ddc2" removed.
(--) NV(0): DDC detected a DFP:
(II) NV(0): Manufacturer: SAM Model: f7 Serial#: 1312961076
...
(II) NV(0): Monitor0: Using default hsync range of 30.00-80.00 kHz
(II) NV(0): Monitor0: Using default vrefresh range of 55.00-75.00 Hz
(II) NV(0): Clock range: 12.00 to 400.00 MHz
...
(**) NV(0): *Mode "1920x1200": 154.0 MHz, 74.0 kHz, 60.0 Hz
(II) NV(0): Modeline "1920x1200" 154.00 1920 1968 2000 2080 1200 1203 1209 1235
...and from here things work fine. You can see that the driver identifies the card correctly and then detects the monitor on the DVI port. The monitor's EDID data reports a 1920x1200 mode, and since the driver believes the pixel clock can go as high as 400.00 MHz (the "Clock range: 12.00 to 400.00 MHz" line above), it accepts that 1920x1200 modeline and the monitor displays everything just fine.
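
(For what it's worth, that modeline can also be written out explicitly in the Monitor section of xorg.conf, using the numbers straight from the log above; I've left the sync polarity flags off since the log doesn't show them, so consider this a sketch:)

Section "Monitor"
    Identifier  "Monitor0"
    Modeline "1920x1200"  154.00  1920 1968 2000 2080  1200 1203 1209 1235
EndSection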

When I run with the nVidia driver ('nvidia'), I see the following in Xorg.0.log...

(II) NVIDIA(0): NVIDIA GPU detected as: GeForce FX 5700 Ultra
(II) NVIDIA(0): Chip Architecture: 0x30
(II) NVIDIA(0): Chip Implementation: 0x36
(II) NVIDIA(0): Chip Revision: 0xa1
...
(II) NVIDIA(0): Connected display device(s): DFP-0
(II) NVIDIA(0): Enabled display device(s): DFP-0
(II) NVIDIA(0): Mapping display device 0 (DFP-0) to CRTC 0
(--) NVIDIA(0): DFP-0: maximum pixel clock: 150 MHz
(--) NVIDIA(0): DFP-0: Internal Single Link TMDS

...
(II) NVIDIA(0): Monitor0: Using default hsync range of 30.00-80.00 kHz
(II) NVIDIA(0): Monitor0: Using default vrefresh range of 55.00-75.00 Hz
(II) NVIDIA(0): Clock range: 12.00 to 150.00 MHz
...
(II) NVIDIA(0): Not using mode "1920x1200" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using mode "1920x1200" (no mode of this name)
...
(**) NVIDIA(0): Validated modes for display device DFP-0:
(**) NVIDIA(0): Mode "1280x800": 107.2 MHz, 62.6 kHz, 75.0 Hz
...doh! Here you can see that the driver detects the card correctly. Then it goes looking for the monitor and notices that it is connected via single-link DVI. Because of that, it seems to cap the pixel clock at 150.00 MHz (the "maximum pixel clock: 150 MHz" and "Clock range: 12.00 to 150.00 MHz" lines above). And since it has constrained itself to a 150.00 MHz clock, it throws out the monitor-recommended 1920x1200 modeline.
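
Just to show the math: that modeline has a frame of 2080 x 1235 total pixels, so at 60 Hz it needs a pixel clock of 2080 x 1235 x 60 ≈ 154.1 MHz, which is the 154.00 MHz in the modeline and is just barely over the 150 MHz ceiling the driver imposes for single-link DVI. (For comparison, a standard non-reduced-blanking GTF timing for 1920x1200 at 60 Hz needs somewhere around 193 MHz if I have my numbers right, so the reduced blanking timing is already doing most of the work here; it just needs a few more MHz of headroom.)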

Ok, so how do I lie to this driver and tell it not to assume a 150.00 MHz clock maximum for my Single Link DVI connection?
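
(What I'm imagining is something in the Device section along the lines of

Option      "DFPMaxPixelClock" "160"

except that, as far as I know, no such option exists; that name is completely made up, just to illustrate the kind of knob I'm hoping for.)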

I read a lot of the open source Xorg and 'nv' driver code last night, and I think I understand the interaction between the driver and Xorg reasonably well now. The problem is that I don't have access to the source for the nVidia driver, and that's where the badness seems to be getting injected. Do any of you nVidia Linux driver gurus out there know of a backdoor or driver option that might help me?

I tried reading through the output of 'strings nvidia_drv.o | less' last night, but I didn't see anything promising. I think that's a sign that I'm getting pretty desperate.
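
If anyone else wants to poke at the binary, paging through all of that output is pretty tedious; something along the lines of

strings nvidia_drv.o | grep -i -E 'clock|pclk|dvi|tmds'

narrows it down a bit (the patterns are just my guesses at what an override option might be called, not anything I know to be in there).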

-t