12-04-06, 10:57 PM   #1
alaric (Amsterdam)
Monitor is not detected at all, and ConnectedMonitor / UseDisplayDevice do not help.

I have a CRT monitor that is not detected when connected to the DVI output of an FX5200 (using a DVI->VGA adaptor). Now you might say: then plug it into the VGA output -- but that isn't relevant, imho. Moreover, I have reasons to want it plugged into the DVI output: I'm trying to set up a multi-monitor configuration. That didn't work, and I've tracked the problem down to this simple case with a single monitor.

What I need, for a start, is to find the options that will make this particular setup work.

Out of the whole Xorg.0.log file, these are the relevant lines:

(WW) NVIDIA(0): No connected display devices detected; assuming 1 CRT
(II) NVIDIA(0): Assigned Display Device: CRT-0
(II) NVIDIA(0): CRT-0 assigned CRTC 0
Apparently, CRTC 0 is the VGA output, while the only monitor that I have is
connected to the DVI output. Therefore I need to convince the nvidia driver,
somehow, to use the DVI output anyway (instead of CRTC 0).
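
For reference, the kind of override I'm after would look something like the
following in the Device section. Note this is only a sketch: "CRT-1" is my
guess at the name the driver gives the analog device behind the DVI connector
(with the DVI->VGA adaptor) -- I haven't found that mapping documented anywhere.

Section "Device"
        Identifier      "FX5200_DVI"
        Driver          "nvidia"
        BusID           "PCI:1:0:0"
        # Guess: CRT-1 would be the analog device on the DVI connector
        Option          "ConnectedMonitor" "CRT-1"
EndSection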

I'm using a very basic xorg.conf file, the relevant parts of which are:

Section "Device"
        Identifier      "FX5200_DVI"
        Driver          "nvidia"
        BusID           "PCI:1:0:0"
        Screen          0
        Option          "IgnoreDisplayDevices" "DFP, TV"
EndSection

Section "Monitor"
        Identifier      "VMP17"
        Option          "DPMS"
        HorizSync       27-96
        VertRefresh     50-160
EndSection

Section "Screen"
        Identifier      "Default Screen"
        Device          "FX5200_DVI"
        Monitor         "VMP17"
        DefaultDepth    24
        SubSection "Display"
                Depth           24
                Modes           "1024x768"
        EndSubSection
EndSection
I've tried the option "ConnectedMonitor" -- but that doesn't work,
because it only forces an already detected monitor to change
type (ie, when a CRT is detected you can force it to be treated as a DFP).
When no monitor is detected at all and you only have CRTs,
using 'Option "ConnectedMonitor" "CRT"' doesn't help one bit: the driver
was already assuming a CRT anyway. The chosen output is still
the VGA plug.

I've also tried the option "UseDisplayDevice" -- but that doesn't
help either. What happens is this: first, a list of available devices is constructed.
This list would be "CRT-0, CRT-1", "DFP-0, CRT-1", or just "CRT-0" or
"DFP-0" (a driver bug caused it to be "CRT-0, DFP-0" in another attempt
of mine, which is nonsense -- but that is not relevant to this thread).
Next, you can assign X display(s) to entries from that list. In this simple case I
obviously have a single X display, and there is only a single device, namely "CRT-0".
So there is nothing to assign here, and CRT-0 is still happily sending signals to the
wrong output.
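
For completeness, this is roughly the combination I would expect to need, if a
specific device name were accepted -- again, "CRT-1" as the name of the DVI
connector's analog device is only an assumption of mine:

Section "Device"
        Identifier      "FX5200_DVI"
        Driver          "nvidia"
        BusID           "PCI:1:0:0"
        # Claim a CRT is present on CRT-1, and make X use that device.
        # "CRT-1" as the DVI connector's name is my assumption.
        Option          "ConnectedMonitor" "CRT-1"
        Option          "UseDisplayDevice" "CRT-1"
EndSection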

Does anyone have a suggestion of how to solve this problem?
Attached Files
File Type: gz nvidia-bug-report.log.gz (20.4 KB)