08-15-06, 11:59 AM   #5
netllama
NVIDIA Corporation
 
Join Date: Dec 2004
Posts: 8,763
Re: CRT detection and/or DFP ignore no longer works in newer driver versions

lathans,
Using a Y-split connector to two display devices off of one CRTC has never been a supported configuration, and you were lucky that it ever worked with older drivers. My best guess is that there was a bug lurking somewhere that happened to allow your corner case to work, and that bug was fixed in the latest driver.

pe1chl,
Different functionality changes are implemented for different reasons. There isn't an overarching policy dictating driver changes beyond improving functionality, performance, and stability.

As for your examples:
* "the DVI pixel clock limit that makes high resolution modes fail because the clock is above 135 MHz, while the card usually works to 155". I'm not sure what you mean by "the card usually works to 155". What is actually happening here is that a reduced blanking mode was silently used in the past. There was a bug in 1.0-7676 where reduced blanking was broken, causing some modes whose pixel clock was slightly above the maximum to fail. That bug was resolved a while ago. Reduced blanking is still used when possible. If you have a specific case where this is failing, please start a new thread with a -logverbose 6 bug report.
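
The arithmetic behind that point is simple: a mode's pixel clock is the total pixels per frame (including blanking) times the refresh rate, and single-link DVI tops out at 135 MHz. Here's a minimal sketch; the timing totals below are approximate, illustrative values for 1600x1200@60 (roughly DMT-style conventional blanking vs. a reduced-blanking variant), not exact figures from the driver:

```python
# Illustrative only: compare a conventional-blanking mode against a
# reduced-blanking variant of 1600x1200@60, relative to the 135 MHz
# single-link DVI pixel clock limit. Timing totals are approximations.

DVI_SINGLE_LINK_LIMIT_HZ = 135_000_000

def pixel_clock_hz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame * frames per second."""
    return h_total * v_total * refresh_hz

# Conventional blanking (roughly the DMT timing for 1600x1200@60):
standard = pixel_clock_hz(2160, 1250, 60)   # ~162 MHz: over the limit

# Reduced blanking shrinks the horizontal and vertical blanking intervals,
# so the same visible resolution needs fewer total pixels per frame:
reduced = pixel_clock_hz(1760, 1235, 60)    # ~130 MHz: under the limit

print(f"standard blanking: {standard / 1e6:.1f} MHz "
      f"(over limit: {standard > DVI_SINGLE_LINK_LIMIT_HZ})")
print(f"reduced blanking:  {reduced / 1e6:.1f} MHz "
      f"(over limit: {reduced > DVI_SINGLE_LINK_LIMIT_HZ})")
```

This is why silently substituting a reduced-blanking mode can rescue a resolution whose conventional timing is just over the limit, and why breaking that substitution made those modes fail.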
* "postings by others that indicate that their monitor returns unusual EDID data and the driver goes by that information, rejecting the valid configuration info they provide". The information in a display's EDID is usually preferable to a random modeline generated by gtf (or the equivalent). There are planned improvements to better handle obviously bogus EDID data. This can always be overridden with the UseEDID option, so that should satisfy your 'expert' mode request in this particular case.
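
For context on what "obviously bogus" EDID data means: a base EDID block is 128 bytes, begins with a fixed 8-byte magic header, and its bytes must sum to zero modulo 256. A sketch of those sanity checks (the function and its name are illustrative, not the driver's actual logic):

```python
# Sketch of the kind of sanity checks that can be applied to an EDID block
# before trusting it. Illustrative only; not the driver's implementation.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_looks_sane(block: bytes) -> bool:
    if len(block) != 128:
        return False          # base EDID block is exactly 128 bytes
    if block[:8] != EDID_HEADER:
        return False          # fixed magic header missing or corrupt
    if sum(block) % 256 != 0:
        return False          # checksum byte doesn't balance the block
    return True

# An all-zero block fails the header check even though its sum is 0:
print(edid_looks_sane(bytes(128)))   # False
```

Data that fails checks like these is clearly garbage, but an EDID can pass them and still describe timings the monitor can't actually display, which is why a manual override remains necessary.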
* "the complicated situation around "ConnectedMonitor" and "UseDisplayDevice" that recently changed again. Now, when I boot the system I have to have all monitors switched on. I can't understand for the **** of it why I should be forced to switch on my VGA monitor (actually my TV) for the driver to believe I have that monitor, when I TELL the driver that I have it." This sounds like a bug, based on the limited information you've provided. You shouldn't need to have a TV powered on from bootup to later use it in X. Please create a new thread with additional details, and an nvidia-bug-report which captures the problem.
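
For reference, those two options go in the Device section of xorg.conf. The fragment below is an example only; the device names ("CRT-0", "TV-0") and which outputs you list depend on your card, so check the driver README for the values it reports:

```
# Illustrative xorg.conf fragment -- device names are examples.
Section "Device"
    Identifier  "nvidia0"
    Driver      "nvidia"
    # Tell the driver which outputs to treat as connected, instead of
    # relying on runtime detection of powered-on displays:
    Option      "ConnectedMonitor" "CRT-0, TV-0"
    # Restrict which of the connected devices X actually uses:
    Option      "UseDisplayDevice" "TV-0"
EndSection
```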

Thanks,
Lonni