08-13-06, 08:25 AM
Problems with TwinView on Ubuntu


I'm trying to set up TwinView on Ubuntu Linux 6.06 LTS. My GPU is a GeForce 6800 and the monitors are both 20" Samsung SyncMaster 20GLsi.
Direct rendering works fine.
TwinView worked twice, but after a reboot it failed again: the NVIDIA logo wasn't split across both screens, it just showed up on one monitor.
Both monitors are detected and TwinView is activated. The NVIDIA driver shows two display devices (CRT-0 and CRT-1), but X only starts on one of them.

Here's a copy of the Xorg log

Here's a copy of my xorg.conf

Does anybody have a clue what's wrong?

EDIT: After many tries, it seems that the problem is located here:

(**) NVIDIA(0): TwinView enabled
(WW) NVIDIA(0): Unable to read EDID for display device CRT-0
(II) NVIDIA(0): NVIDIA GPU GeForce 6800 at PCI:1:0:0
(--) NVIDIA(0): VideoRAM: 131072 kBytes
(--) NVIDIA(0): VideoBIOS:
(II) NVIDIA(0): Detected AGP rate: 8X
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(--) NVIDIA(0): Connected display device(s) on GeForce 6800 at PCI:1:0:0:
(--) NVIDIA(0):     CRT-0
(--) NVIDIA(0):     Samsung (CRT-1)
(--) NVIDIA(0): CRT-0: 400.0 MHz maximum pixel clock
(--) NVIDIA(0): Samsung (CRT-1): 400.0 MHz maximum pixel clock
(II) NVIDIA(0): Assigned Display Devices: CRT-0, CRT-1
(II) NVIDIA(0): Validated modes:
(II) NVIDIA(0):     "1600x1200,1600x1200"
(II) NVIDIA(0): Virtual screen size determined to be 1600 x 1200
(WW) NVIDIA(0): Unable to get display device CRT-0's EDID; cannot compute DPI
(WW) NVIDIA(0):     from EDID.
(==) NVIDIA(0): DPI set to (75, 75); computed from built-in default
Even though the NVIDIA driver detects two monitors and validates the mode "1600x1200,1600x1200" (so virtually 3200x1200 side by side), the virtual screen it chooses is only 1600x1200.
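Since the log shows the driver can't read CRT-0's EDID, it may be falling back to a clone/single-screen layout instead of placing the screens side by side. One thing I'm going to try is forcing the layout explicitly in the Device section of xorg.conf. This is an untested sketch based on the TwinView options documented in the NVIDIA driver README; the sync and refresh ranges are guesses for the SyncMaster and should be checked against the monitor manual:

Section "Device"
    Identifier  "NVIDIA GeForce 6800"
    Driver      "nvidia"
    Option      "TwinView" "true"
    # Force side-by-side instead of the Clone fallback
    Option      "TwinViewOrientation" "RightOf"
    Option      "MetaModes" "1600x1200,1600x1200"
    # Ranges for the second display device (CRT-1); guessed values,
    # check the SyncMaster 20GLsi manual before using them
    Option      "SecondMonitorHorizSync"   "30-96"
    Option      "SecondMonitorVertRefresh" "50-85"
EndSection

Note that since it's CRT-0 (the first display device) whose EDID fails, its ranges come from the HorizSync/VertRefresh lines of the regular Monitor section, so those need to be set there as well.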

Last edited by Lumbermatt; 08-13-06 at 09:21 AM.