pe1chl
Re: CRT detection and/or DFP ignore no longer works in newer driver versions

Quote:
Originally Posted by netllama
Different functionality changes are implemented for different reasons. There isn't an overarching policy dictating driver changes, other than to improve functionality, performance & stability.
You must have noticed that over the last few driver releases there has been an increasing number of posts about situations where a user wants to run a certain resolution or refresh rate, is quite sure his display supports it (or at least wants to take responsibility for trying it), and yet the driver rejects the configuration "because it knows better".
The "automatic detection" side of things has probably improved, but the experience of the user (especially users who still remember the days when the computer did what you told it to do) perhaps has not.

Quote:
Originally Posted by netllama
As for your examples:
* "the DVI pixel clock limit that makes high resolution modes fail because the clock is above 135 MHz, while the card usually works to 155". I'm not sure what you mean by "the card usually works to 155".
What I meant here is the situation where older cards (often 5200-based) worked just fine with older Nvidia drivers, and still work fine with the nv driver, but newer Nvidia drivers reject the proper video mode because of a pixel clock limit that has apparently been set somewhere in the EEPROM. I ran into this when I bought a Dell 2405FPW: its EDID suggests a 154 MHz pixel clock, and it worked fine with nv and older Nvidia drivers, but it refused to work with the then-current driver. Later an option was added to override this check and it worked fine again. The mode already used reduced blanking.
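If I remember correctly, the override that later appeared was a mode validation token, roughly like this (option and token names are from memory, and the Screen section identifier is just illustrative, so take this as a sketch rather than exact syntax):

Section "Screen"
    Identifier  "Screen[0]"
    Device      "Device[0]"
    # Relax the DVI pixel clock check so the 154 MHz reduced-blanking
    # mode suggested by the 2405FPW EDID is no longer rejected.
    Option      "ModeValidation" "NoMaxPClkCheck"
EndSection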

Quote:
Originally Posted by netllama
There are planned improvements to better handle obviously bogus EDID data. This can always be overridden with the UseEdid parameter, so that should satisfy your 'expert' mode request in this particular case.
That is good to hear. Another idea might be to always allow a user-specified modeline that falls within the user-specified horizontal and vertical frequency limits, without checking it against the EDID-provided resolution limits. Resolution limits are probably pointless anyway, and they sometimes seem to be reported incorrectly; the frequency limits are what actually have to be observed for (CRT) monitors.
There is nothing more frustrating than having a shiny new 1280x1024 LCD and seeing it start up in 640x480 because "the vertical refresh for the 1280x1024 mode is 52 Hz and the driver wants to see 60 Hz" or some such.
(I know there is yet another option to override that specific check.)
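To make that concrete, what I have in mind is roughly this: the user states the limits, supplies a modeline, and the driver only checks the modeline against those limits. The sync ranges below are just illustrative numbers, not taken from any real monitor:

Section "Monitor"
    Identifier  "Monitor[0]"
    # User-stated limits; for a (CRT) monitor these are what must be observed.
    HorizSync   30-83
    VertRefresh 50-76
    # Standard VESA 1280x1024 @ 60 Hz modeline; as long as it stays within
    # the ranges above it should be accepted, regardless of EDID resolution
    # limits (combined with Option "UseEdid" "false" to ignore EDID entirely).
    Modeline "1280x1024_60" 108.00 1280 1328 1440 1688 1024 1025 1028 1066 +hsync +vsync
EndSection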

Quote:
Originally Posted by netllama
* "the complicated situation around "ConnectedMonitor" and "UseDisplayDevice" that recently changed again. Now, when I boot the system I have to have all monitors switched on. I can't understand for the **** of it why I should be forced to switch on my VGA monitor (actually my TV) for the driver to believe I have that monitor, when I TELL the driver that I have it." This sounds like a bug, based on the limited information you've provided. YOu shouldn't need to have a TV powered on from bootup, to later use it in X. Please create a new thread with additional details, and an nvidia-bug-report which captures the problem.
I'll do that when I have to reboot the system for some reason. For now I can mention that with the previous-to-last release of the driver, when "ConnectedMonitor" was replaced by "UseDisplayDevice" for some reason, I had to fight quite a while to get it working again.
My configuration has a Dell 2405FPW on the first output via DVI, and an LCD TV on the second output via VGA (through an adaptor). Don't focus on the word TV: it is a TV, but for your purposes it is just a VGA monitor.
At first (still using the old config with "ConnectedMonitor") it had simply swapped my monitor devices, apparently assuming that a VGA device should be the first monitor and a DVI device the second (which is documented, but I find it rather backwards). So all my user programs ended up on the TV, and my Dell only showed the background I keep on the TV until I start a movie there.
After I changed to "UseDisplayDevice", like this:

Section "Device"
BoardName "GeForce 6600 GT/AGP/SSE2"
VendorName "NVidia"
BusID "1:0:0"
Driver "nvidia"
Identifier "Device[0]"
Option "AllowDDCCI" "on"
Option "Coolbits" "1"
Option "UseDisplayDevice" "DFP-0"
Screen 0
EndSection

Section "Device"
BoardName "GeForce 6600 GT/AGP/SSE2"
VendorName "NVidia"
BusID "1:0:0"
Driver "nvidia"
Identifier "Device[1]"
Option "Coolbits" "1"
Option "UseDisplayDevice" "CRT-1"
Screen 1
EndSection

Now it works fine when both monitors are ON during boot, but when the VGA TV is OFF during boot, the driver declares the second device not present and omits the entire screen 1 from the running configuration. This in turn leads to KDE not seeing that screen, assuming it is gone, and aborting its startup sequence at the point where existing windows are restored. Turning on the TV at that point of course does nothing; I have to restart X.
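What I would like is simply to be able to declare the display, regardless of what is detected at boot, much like the old option let me do; roughly this (syntax from memory of the older README, so it is only a sketch):

Section "Device"
    Identifier  "Device[1]"
    Driver      "nvidia"
    BusID       "1:0:0"
    Screen      1
    # Declare that a CRT/VGA display is attached to the second output,
    # even if the TV happens to be powered off when X starts.
    Option      "ConnectedMonitor"  "CRT-1"
    Option      "UseDisplayDevice"  "CRT-1"
EndSection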

All this worked fine before. I will try to provide more detail at the next opportunity (but I rarely boot the system more than twice a year).