nV News Forums - NVIDIA Linux - Can't I run 2 X servers on the one card (http://www.nvnews.net/vbulletin/showthread.php?t=143886)

jdobry 01-19-10 03:45 PM

Can't I run 2 X servers on the one card
 
1 Attachment(s)
Hello,

Does anybody know why I can't start 2 independent X servers on one card?

The second server crashes every time with:
Code:

(EE) Jan 19 22:32:43 NVIDIA(0): EVO Push buffer channel allocation failed
(EE) NVIDIA(0):  *** Aborting ***
(EE) Jan 19 22:32:43 NVIDIA(0): Failed to allocate EVO DMA push buffer
(EE) NVIDIA(0):  *** Aborting ***
(II) UnloadModule: "nvidia"
(II) UnloadModule: "wfb"
(II) UnloadModule: "fb"
(EE) Screen(s) found, but none have a usable configuration.

I need 2 independent X servers, not TwinView.
Google tells me that I am not alone, but offers no solution.
The card is an nVidia G130.

AaronP 01-19-10 03:54 PM

Re: Can't I run 2 X servers on the one card
 
How are you starting the second X server? I.e., what command line options are you using? Please note that multiple X servers should work, but not on the same VT (i.e. using the -sharevts or -novtswitch options).
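
For reference, a second server on its own VT would be started with something along these lines (the client program, display number and VT here are only an example):
Code:

# second X server on its own display and virtual terminal, without -sharevts
xinit /usr/bin/xterm -- :1 vt8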

jdobry 01-19-10 04:20 PM

Re: Can't I run 2 X servers on the one card
 
I added my xorg.conf to the first post.

The second X server is started by:
Code:

export QT_XFT=1
xinit  -- :1  -sharevts -layout Myth -br -dpms

And I think that is correct, because it works when I use a combination of the integrated GPU and the G130 card.
But I want to use the G130 only (the integrated 8300 is too weak for HD video).

jdobry 01-19-10 04:27 PM

Re: Can't I run 2 X servers on the one card
 
1 Attachment(s)
The complete log from the second X server's startup is in the attachment.

AaronP 01-19-10 05:49 PM

Re: Can't I run 2 X servers on the one card
 
You can't drive the same graphics card with two different X servers at the same time.

jdobry 01-19-10 06:41 PM

Re: Can't I run 2 X servers on the one card
 
It was possible and supported by NVIDIA. See here: http://http.download.nvidia.com/XFre...ppendix-p.html

EDIT: Oops. The referenced page covers a different situation: two X screens, but one X server.

AaronP 01-19-10 07:01 PM

Re: Can't I run 2 X servers on the one card
 
Quote:

Originally Posted by jdobry (Post 2168679)
The referenced page covers a different situation: two X screens, but one X server.

Correct. That's supported.

mpaganini 05-30-10 01:39 PM

Re: Can't I run 2 X servers on the one card
 
I'm really surprised that this configuration is not supported. I've been running two X-servers on the same graphics adapter (same display, different virtual terminals: Alt-F7/F8) on Debian/Ubuntu for years. My last successful configuration was on Ubuntu 8.04, with nVidia driver 169.12. Now I have upgraded to 10.04 (which comes with 173 and makes it very hard to go back to 169.12). It allows me to start another X server, but once I log out of that server, the display goes into power-save mode until I reboot the workstation (switching to a text-based console does not work either).

Curiously, the problem seems to happen when the second X server exits. If I start the second server and switch between them with Ctrl-Alt-F7 / Ctrl-Alt-F8, everything works. If I go back to the first server and kill the second server manually (sudo kill <pid>), things work. It's only when I log out of the second server (or kill it with Ctrl-Alt-Backspace) that things break horribly.

I'm using an oldie FX 5900 Ultra. It would be fantastic if we could go back to the behavior we saw on 169.12 regarding this. This is a very useful feature.

JaXXoN 05-30-10 05:47 PM

Re: Can't I run 2 X servers on the one card
 
Quote:

Originally Posted by mpaganini (Post 2260667)
I've been running two X-servers on the same graphics adapter (same display, different virtual terminals: Alt-F7/F8) on Debian/Ubuntu for years.

Please note that this is a different topic: this thread is about running two X-Servers concurrently, where the first X-Server should use the first output of a video card while the second X-Server should use the second output of the video card. The two X-Servers could be configured to take their input from separate sets of mice and keyboards, so that two people could work at the same time using the same PC. This would be called a "multiseat" setup, but as of today this only works with "Xephyr", where you start one "physical" X-Server with two screens and then run an instance of Xephyr on top of each X screen. Unfortunately, 3D support in Xephyr is experimental and its performance lags behind the physical X-Server. I guess XGL would also work (where 3D performance is acceptable), but XGL is deprecated.
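
For reference, such a multiseat setup with Xephyr looks roughly like the following sketch; the display numbers and the -fullscreen option are just an example, assuming the physical X-Server is already running with two X screens (:0.0 and :0.1):
Code:

# one Xephyr instance per physical X screen; each seat's session
# is then started against display :1 or :2 respectively
DISPLAY=:0.0 Xephyr :1 -fullscreen &
DISPLAY=:0.1 Xephyr :2 -fullscreen &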

Now, what you are asking for is running two X-Servers concurrently, where only one of the two X-Servers "owns" (both outputs of) the video card at a time and you can switch between the two X-Servers with ALT+CTRL+F7/F8. This means you cannot see the output of both X-Servers at the same time. To my knowledge, this feature has never been officially supported by nvidia, but it worked flawlessly for many years; at least for me it still works with 185.18.14 on a GTX260. Can you please post an nvidia-bug-report.log? (I'd recommend opening a new thread.) Maybe your setup just needs some fine-tuning.
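
If you haven't generated one before: the log comes from a script that ships with the driver. Run it as root right after the problem has occurred (older drivers write nvidia-bug-report.log, newer ones compress it to nvidia-bug-report.log.gz):
Code:

# writes the bug report into the current working directory
sudo nvidia-bug-report.sh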

regards

Bernhard

mpaganini 05-31-10 12:20 AM

Re: Can't I run 2 X servers on the one card
 
Quote:

Originally Posted by JaXXoN (Post 2260762)
Now, what you are asking for is running two X-Servers concurrently, where only one of the two X-Servers "owns" (both outputs of) the video card at a time and you can switch between the two X-Servers with ALT+CTRL+F7/F8. This means you cannot see the output of both X-Servers at the same time. To my knowledge, this feature has never been officially supported by nvidia, but it worked flawlessly for many years; at least for me it still works with 185.18.14 on a GTX260. Can you please post an nvidia-bug-report.log? (I'd recommend opening a new thread.) Maybe your setup just needs some fine-tuning.

I'm trying to keep a "standard" Lucid system, where 173 is the current driver. Naturally, if I can't fix it there, I'll start playing around with newer versions.

Thanks for the idea, Bernhard. Strangely, I tried to reproduce the bug by running X by hand, but it is a little hard to do. I managed to capture an nvidia-bug-report log while the problem was happening (X sessions with GNOME in them). I'm opening another thread with the bug. Thanks a lot!

MP

kauos 10-16-10 08:21 AM

Re: Can't I run 2 X servers on the one card
 
Quote:

Originally Posted by AaronP (Post 2168644)
You can't drive the same graphics card with two different X servers at the same time.


Not entirely true.

The NVIDIA driver definitely makes it difficult, but it can be done.

At least I have achieved it with a VGA and a TV-OUT connection on the same card.

I have a GeForce 6200 card (VGA, DVI, TV-OUT) in an Athlon 2700+ (32-bit) machine. I am running Ubuntu 10.04 with the latest updates.

The trick, it seems, is to start both X servers as close in time to one another as possible.

From reading other posts, and from observing the behaviour myself, what seems to happen when the NVIDIA driver loads is that it tries to grab all the video outputs for the current X server. Therefore the last X server you run will grab all the video outputs, regardless of whether or not it intends to use them.

If you start both X servers at the same time (or as close as possible), they both end up keeping the video output that they initially requested.

I've been running it for a few hours now and the results seem stable.

To make it easier on myself, I have written two simple bash scripts that wait until the other script is running before they start the X servers. This ensures both start very close together.
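
Roughly, each launcher looks something like this sketch (the flag files, display, VT and layout name are placeholders; the seat-2 script is the mirror image with the flag files, display and VT swapped):
Code:

#!/bin/bash
# announce that this seat is ready, wait for the other seat's launcher,
# then start the X server so both servers come up at (nearly) the same time
touch /tmp/seat1-ready
while [ ! -e /tmp/seat2-ready ]; do
    sleep 0.1
done
exec xinit -- :1 vt8 -layout Seat1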

It's not perfect. There is some minor occasional video corruption on the VGA output, but the TV-OUT seems fine and it's definitely usable.

It would still be nice if NVIDIA could fix this properly :)


2010-10-08 UPDATE
------------------------
I got the right DVI cable today and have tested with the DVI output. I can get DVI + TV-OUT and DVI + VGA to work, but a three-seat system of DVI + VGA + TV-OUT does not seem to work.

One additional pointer to get it working is that the DVI display seems to want to be started just before the TV-OUT display.

It seems, though, that if you start the computer with the DVI connected, then VGA + TV-OUT stops working (error message: (EE) Oct 18 15:34:57 NVIDIA(0): Error setting DVC), but it's not really a huge problem.

One extra positive is that using DVI + TV-OUT instead of VGA + TV-OUT seems to fix the display corruption issue.

mpaganini 10-16-10 12:52 PM

Re: Can't I run 2 X servers on the one card
 
Well, I found a workaround to my problem. I discovered that the crash happens when you close X, not when you start it. So now, instead of starting the second X server "on demand", I have it running all the time. Switching between VTs works like a charm. Just don't try to log out, or everything will go down the drain (at least with my driver/card combination). :)
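
In practice that just means launching the second server once and leaving it in the background, something like this (the display, VT and log path are only placeholders):
Code:

# start the second X server once and leave it running;
# switch sessions with Ctrl-Alt-F7 / Ctrl-Alt-F8, but never log out of it
nohup xinit -- :1 vt8 >/tmp/x1.log 2>&1 &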

-- MP

