|11-19-07, 04:39 AM||#1|
Join Date: Nov 2007
I've got a problem using an NVIDIA GeForce 9750GT on a Linux box.
When I hook up a monitor directly, a resolution of 1920x1200 is used automatically, which is fine. I then hook up a DVI extender, which is capable of carrying the same resolution, and the image at the remote end of the extender is fine.
However, when I restart the computer (or just the X server), the resolution is forced to 1024x768 and won't go any higher. The detected monitor is a Cat5 DVI extender, which is correct, but the resolution stays stuck.
My workaround is quite cumbersome: I carry the large LCD monitor over to the computer, hook it up directly, reboot, hook the extender back up, and carry the monitor back to the remote location.
Is there a way to disable the auto-detection of monitors? It should just stay at 1920x1200.
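For what it's worth, the usual way to do this with the proprietary NVIDIA driver is to override EDID-based detection in xorg.conf. Below is a sketch, not tested on this particular card; the option names (`UseEDID`, `ConnectedMonitor`, `ModeValidation`) come from the NVIDIA Linux driver README, and the sync ranges are placeholders you'd replace with the panel's actual specs:

```
# Hedged example xorg.conf fragment — section identifiers and sync ranges
# are assumptions; substitute the values from your monitor's manual.
Section "Monitor"
    Identifier  "Monitor0"
    HorizSync   30.0 - 81.0      # placeholder range; required once EDID is ignored
    VertRefresh 56.0 - 76.0      # placeholder range
EndSection

Section "Device"
    Identifier  "Device0"
    Driver      "nvidia"
    Option      "UseEDID" "FALSE"               # ignore the EDID seen through the extender
    Option      "ConnectedMonitor" "DFP"        # always treat the DVI output as connected
    Option      "ModeValidation" "NoEdidModes"  # don't limit modes to the extender's EDID list
EndSection

Section "Screen"
    Identifier  "Screen0"
    Device      "Device0"
    Monitor     "Monitor0"
    SubSection "Display"
        Modes   "1920x1200"
    EndSubSection
EndSection
```

An alternative that avoids disabling EDID entirely: capture the monitor's EDID while it is hooked up directly (nvidia-settings can save it to a file) and then point the driver at that file with `Option "CustomEDID" "DFP-0:/path/to/edid.bin"`, so the driver always sees the real panel's capabilities regardless of what the extender reports.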