nV News Forums > Linux Support Forums > NVIDIA Linux
Old 07-31-06, 09:02 PM   #1
greystock
Registered User
 
Join Date: Aug 2005
Posts: 3
pbuffer rendering in background X-session

Hello,

I have a rendering program running that uses wglCreatePbufferARB to create a context for OpenGL rendering. When I use a single X-session, the program works fine and rendered images can be retrieved and used (sent to a web server).

What I would like to do is set things up entirely in the background:

1. start two X sessions, e.g. via
startx (start first session)
startx -- :1 (start second session from a different console)
2. start the rendering program in one X-session
3. use the computer from the other X-session, i.e. make this the foreground X-session via, e.g., CTRL-ALT-F7/F8

I've tried setting this up, but things only work when the rendering program is in the foreground X-session (I can happily switch between them, but rendering only occurs at those times that the program X-session is active).

Any feedback on whether it is possible to do this would be gratefully received. The hardware is a Tyan K8WE motherboard with two Quadro FX3450 PCI Express cards (I'm only using one currently, but if separate cards for separate X servers is the way to go, I can configure this).

Many thanks
Old 08-01-06, 03:40 AM   #2
Thunderbird
 
Join Date: Jul 2002
Location: Netherlands, Europe
Posts: 2,105
Re: pbuffer rendering in background X-session

Yuck, this sounds very dirty. If there's a good reason for doing things this way, have a look at 'GLX_EXT_import_context', which lets you share GLX contexts between multiple X displays.
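For reference, a minimal (untested) sketch of how GLX_EXT_import_context is typically used, assuming the owning process can hand its context ID to the other process somehow (pipe, socket, file — that part is up to you). Note the extension only supports sharing of indirect rendering contexts:

```c
#include <X11/Xlib.h>
#include <GL/glx.h>

/* In the process that owns the (indirect) context:
 * obtain an ID that can be shipped to another process. */
GLXContextID export_context(GLXContext ctx)
{
    return glXGetContextIDEXT(ctx);
}

/* In the process connected to the other display:
 * import the context by ID. Release it with
 * glXFreeContextEXT(), not glXDestroyContext(). */
GLXContext import_context(Display *other_dpy, GLXContextID id)
{
    return glXImportContextEXT(other_dpy, id);
}
```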

Why are you doing things this way? It is normal that a second X server basically gets disabled when you ctrl-alt-fN away from it.
Old 08-01-06, 05:18 AM   #3
greystock
Registered User
 
Join Date: Aug 2005
Posts: 3
Re: pbuffer rendering in background X-session

Thanks for the reply, Thunderbird.

It's partly trying to make the most of limited resources. The machine is the main visualisation server used within a VR lab. The web server running on the machine is set up to provide real-time interactive graphics to any connected web browser (a similar approach to SGI's VizServer, but via a standard browser interface). As well as keeping the web server running continually, I also want to allow people to log in at the machine and use its graphics features normally, so it is desirable to give their login a separate X-session (and maybe the second FX3450 card) from the one that the web server uses for its own rendering needs.

A side issue is that I'm also trying to measure how fully the graphics card resources on a multi-card / multi-processor setup can be used. For a lot of the work, the VGA/DVI card outputs aren't needed, but the ability to render to pixel buffers and pull these back into main memory as fast as possible is. It would be useful to know if separate X servers, each using its own card, will work, or if a single X server with access to both cards is going to be better.

Hope that this makes sense. I'm not sure that I've explained it as well as I might, but it's a bit of an odd configuration and I'm still working through my own mental model of the architecture.

All the best,

Mike
Old 08-01-06, 06:41 AM   #4
Thunderbird
 
Join Date: Jul 2002
Location: Netherlands, Europe
Posts: 2,105
Re: pbuffer rendering in background X-session

I don't know if you need much performance but perhaps using Mesa is enough for your purpose?
Old 08-01-06, 09:37 AM   #5
AaronP
NVIDIA Corporation
 
 
Join Date: Mar 2005
Posts: 2,487
Re: pbuffer rendering in background X-session

@greystock:
You should be able to do what you want to do with a single X server with a separate X screen per card. Multiple OpenGL clients should be able to use pbuffers on the same X screen at the same time.
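AaronP's suggestion would look roughly like this in xorg.conf. The identifiers and BusIDs below are illustrative only — check `lspci` for the real bus locations of the two Quadros:

```
Section "Device"
    Identifier "Quadro0"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"
EndSection

Section "Device"
    Identifier "Quadro1"
    Driver     "nvidia"
    BusID      "PCI:2:0:0"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Quadro0"
EndSection

Section "Screen"
    Identifier "Screen1"
    Device     "Quadro1"
EndSection

Section "ServerLayout"
    Identifier "Layout0"
    Screen 0 "Screen0"
    Screen 1 "Screen1" RightOf "Screen0"
EndSection
```

The renderer would then target the second screen with DISPLAY=:0.1 while users work on :0.0.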
Old 08-06-06, 09:37 PM   #6
greystock
Registered User
 
Join Date: Aug 2005
Posts: 3
Re: pbuffer rendering in background X-session

Thanks, Thunderbird and Aaron.

I'd forgotten about the possibility of using Mesa, and that might be quite a good approach for developing the work. Eventually, though, I would really be trying to exploit hardware acceleration if I can. Using Mesa for now is something I'll definitely try.

I've experimented with the single X server / multiple X screens and this nearly does what I'm hoping for. I've allowed a gap between the screens (http://gentoo-wiki.com/HOWTO_Dual_Mo...Using_Xinerama) and thereby locked the mouse to the first X screen so that the user is not really aware of the existence of the second X screen that is used for the renderer server.
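The screen-gap trick described above can be expressed in the ServerLayout section. The offset value here is an arbitrary example; the point is simply that the screens are not adjacent, so the pointer cannot move onto the renderer's screen:

```
Section "ServerLayout"
    Identifier "Layout0"
    Screen 0 "Screen0" Absolute 0 0
    # Non-adjacent placement: the gap keeps the mouse on Screen0.
    Screen 1 "Screen1" Absolute 10000 0
EndSection
```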

The remaining problem is how to configure the system so that users can locally log on/off the machine without affecting the renderer.

If I write a normal daemon (that uses no GLX resources), I can start this, leave it running, and users logging on have no effect.

Would you be able to advise me on the best approach to writing a similar daemon, but one which does need GLX resources? This is a bit more general a question, but given the trend towards GPGPU programming, one that a number of people may have an interest in.
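One pattern that may fit (a sketch only — the paths, display number, and screen name below are assumptions, not taken from this thread): start a dedicated X server for the renderer from an init script, outside the display manager's control, so user logins and logouts on the other server never touch it:

```
# /etc/init.d/render-daemon (sketch)
# Dedicated X server on display :1, restricted to the renderer's screen:
/usr/X11R6/bin/X :1 -screen "Screen1" &
sleep 5                          # crude wait for the server to accept clients
DISPLAY=:1 /usr/local/bin/render-daemon &
```

Conversely, the display manager can be pointed at a layout containing only the users' screen, so its session teardown never releases the renderer's GLX resources.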

I've used graphical login:
id:5:initdefault: (in inittab)

From inspecting Xorg.0.log, it appears that when a user closes a session and another user logs in, the X server is not restarted, but the NV-GLX module is reloaded (have I interpreted this correctly?).

If this is the case, would there be hooks that I could use to detect when the GLX module is about to become unavailable; and, similarly, to detect when it is again available? Or, with the present X architecture, do I have to leave myself permanently logged in, if I wish to have access to GLX computation?

Many thanks

Mike