03-10-03, 07:49 AM   #1
Master_Yoda
Registered User
 
Join Date: Mar 2003
Posts: 5
NVIDIA drivers not thread-safe?

I'm currently working on a project that needs a thread-safe OpenGL implementation. When I compile the project against the latest Mesa libs, the program runs fine. If I compile it against the NVIDIA GLX package, I receive a segmentation fault as soon as I try to call any OpenGL function. However, if I only make rendering calls from the thread that runs the GTK+ part of the program, it works with the NVIDIA GLX package. Everything seems to point to thread-unsafe software from NVIDIA.

As I mentioned, I do need to make rendering calls from more than one thread, so what are my options? Can anyone confirm that the NVIDIA GLX package isn't thread-safe? If so, are there any plans to release a thread-safe version, or do I have to throw our GF4 Ti 4200 out with the trash and find a graphics card that has working, 3D-accelerated, thread-safe Linux drivers?

Any help is much appreciated.

-Peder

Department of Computer Science,
University of Copenhagen
03-10-03, 11:39 AM   #2
bwkaz
Registered User
 
Join Date: Sep 2002
Posts: 2,262

I don't know if they're thread-safe or not, but as a workaround, you could put mutexes around your rendering code so that only one thread is rendering at any given time. This should be the behavior anyway, AFAIK -- I don't think any GL lib can handle >1 thread calling separate functions at the same time, though I could be wrong.

For example, there is only one set of matrix stacks on the card; if your code is pushing and popping GL matrices, then you have to leave the stacks in the same state you got them in before your thread is preempted. Which means you have to have mutex protection from glPushMatrix() all the way to glPopMatrix() (or whatever the calls are; I don't remember at the moment).
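
Something like this, roughly -- an untested sketch with pthreads, where render_object() and the transform are just made-up examples:

Code:
#include <pthread.h>
#include <GL/gl.h>

/* one lock shared by every rendering thread */
static pthread_mutex_t gl_lock = PTHREAD_MUTEX_INITIALIZER;

void render_object(void)
{
    pthread_mutex_lock(&gl_lock);    /* nobody else touches GL now */

    glPushMatrix();
    glTranslatef(1.0f, 0.0f, 0.0f);  /* example transform */
    /* ... draw calls ... */
    glPopMatrix();                   /* stack restored before we unlock */

    pthread_mutex_unlock(&gl_lock);
}

That way the matrix stacks always look untouched to every other thread.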
__________________
Registered Linux User #219692
03-10-03, 12:17 PM   #3
Master_Yoda
Registered User
 
Join Date: Mar 2003
Posts: 5

>I don't know if they're thread-safe or not, but as a workaround,
>you could put mutexes around your rendering code so that only
>one thread is rendering at any given time.

The code is already protected to avoid the situation you describe. Only one thread renders at a time, and it works fine with the software libs from Mesa. The problem is that even though the code is protected, it segfaults when I use the NVIDIA libs.

-Peder

Department of Computer Science,
University of Copenhagen
03-10-03, 02:25 PM   #4
bwkaz
Registered User
 
Join Date: Sep 2002
Posts: 2,262

Really? That's ... strange.

Does the first thread's GL stuff work, and then the rest of them segfault?

Is TLS enabled on this machine (is it RH Phoebe)?

You said single-threading works, so it shouldn't be a bug in the code... perhaps the libs require a different GLX context per thread, or something like that?
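
If it is the context thing, I'd guess each thread would need something like this (pure speculation on my part; dpy, vis, and win stand in for whatever your app already set up):

Code:
#include <GL/glx.h>

extern Display     *dpy;  /* your existing X connection */
extern XVisualInfo *vis;  /* the visual you picked */
extern Window       win;  /* the drawable */

void *render_thread(void *arg)
{
    /* each thread gets its own context... */
    GLXContext ctx = glXCreateContext(dpy, vis, NULL, True);

    /* ...and binds it to itself before making any GL calls */
    glXMakeCurrent(dpy, win, ctx);

    /* ... GL calls should be legal from this thread now ... */

    glXMakeCurrent(dpy, None, NULL);
    glXDestroyContext(dpy, ctx);
    return NULL;
}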
__________________
Registered Linux User #219692
03-10-03, 03:21 PM   #5
Master_Yoda
Registered User
 
Join Date: Mar 2003
Posts: 5

Yep. The first thread can use GL stuff just fine, but any other thread segfaults the second a GL function is called. It's running on a fresh RH 8.0 install with the newest NVIDIA driver package.

It's the fact that the exact same code runs perfectly with the software GLX but crashes with the NVIDIA GLX that makes me suspect something fishy is going on in the NVIDIA code.

-Peder

Department of Computer Science,
University of Copenhagen
03-10-03, 05:02 PM   #6
bwkaz
Registered User
 
Join Date: Sep 2002
Posts: 2,262

Hmm, yeah, the best I can come up with is maybe your threads are (or aren't) sharing some data structure that the GL library thinks they shouldn't (or should) be sharing.

If they're sharing it, then it's probably data that should be different per thread. If they aren't, then it's probably uninitialized in the other threads. But I really don't know what it would be...
__________________
Registered Linux User #219692
03-11-03, 04:04 AM   #7
Master_Yoda
Registered User
 
Join Date: Mar 2003
Posts: 5

I have now done a temporary rewrite of the code. The second thread no longer renders anything; instead it emits a GTK signal when it needs something rendered. However, this still segfaults.

So far the only way I can get the scene rendered on a regular basis is to make a gtk_timeout_add call in the GTK thread (the one that calls gtk_main). That way the application gets a timeout every 10 ms and renders the scene to the main camera.

This is a very poor solution, because in some cases every thread will have its own camera that needs updating once in a while. That means I'll have to write some code that manages render requests from the cameras. While that isn't difficult, it makes the code a lot uglier than it should be. It will also hurt performance, since threads can't have the scene rendered when they want it; they have to wait for the next 10 ms timeout, so many render requests can stack up.
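
For reference, the timeout version boils down to this (render_main_camera() is just a stand-in name for my actual drawing code):

Code:
#include <gtk/gtk.h>

void render_main_camera(void);  /* stand-in for the real draw code */

/* called every 10 ms in the GTK thread; returning TRUE keeps it alive */
static gint on_timeout(gpointer data)
{
    render_main_camera();
    return TRUE;
}

/* installed once in the thread that calls gtk_main() */
void install_render_timeout(void)
{
    gtk_timeout_add(10, on_timeout, NULL);
}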

The Mesa version can do all this correctly with a mutex and a render call from the threads. The NVIDIA version needs a lot of hacks and ugly restructuring... and unfortunately I need high rendering speed :-(.

It would all be easier if the NVIDIA GLX was working :-).

-Peder

Department of Computer Science,
University of Copenhagen
03-11-03, 08:17 AM   #8
bwkaz
Registered User
 
Join Date: Sep 2002
Posts: 2,262

Are you using a Gtk GL widget? (Is there even such a thing?) I'm wondering if there's an issue with Gtk itself -- although probably not, because Mesa works. Hmm, wish I could help more than just guessing like this...
__________________
Registered Linux User #219692
03-11-03, 08:54 AM   #9
Master_Yoda
Registered User
 
Join Date: Mar 2003
Posts: 5

Yes, I'm using gtkglarea, which is a Gtk GL widget. I'm getting more and more convinced it's the NVIDIA GLX that isn't thread-safe, though. As you mentioned yourself, it works fine with Mesa but crashes with NVIDIA. I have furthermore found that any attempt to use the widget from any other thread results in a segfault; even emitting a signal that starts a rendering of my scene segfaults. If I move the instantiation of the camera (which contains the widget) into the rendering thread (I only have one at the moment), the program doesn't segfault, but the widget doesn't get redrawn for some reason.

Everything points to the NVIDIA GLX crashing on any attempt to use the GL widget outside of the thread that initialized it. Debugging with DDD shows that the segfaulting call is always the first GL call made when attempting to render to the GL widget.

Since I only need the scene rendered on-screen from one camera, I think I will stick with the timeout version for the moment. The other threads that need rendering will each get their own Gtk GL widget, as that doesn't seem to segfault. I won't be able to show those additional widgets on-screen, but as long as the scene is rendered to the framebuffer I can copy it from there and do the processing I need.
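
The copy step itself should be simple enough; something like this rough sketch (grab_frame() and the RGB format are just my assumptions for now):

Code:
#include <GL/gl.h>
#include <stdlib.h>

/* read the rendered scene back from the framebuffer for processing */
unsigned char *grab_frame(int width, int height)
{
    unsigned char *pixels = malloc(width * height * 3);

    glPixelStorei(GL_PACK_ALIGNMENT, 1);  /* tightly packed rows */
    glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, pixels);

    return pixels;
}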

Thanks for your input and ideas :-)

-Peder

Department of Computer Science,
University of Copenhagen
03-11-03, 10:14 AM   #10
merlin42
Registered User
 
Join Date: Sep 2002
Posts: 52

Quote from: http://oss.sgi.com/projects/ogl-sample/ABI/

Quote:
3.7. Thread safety (the ability to issue OpenGL calls to different graphics contexts from different application threads) is required. Multithreaded applications must use -lpthread.
bwkaz is correct: each thread needs a different graphics context. It may be that Mesa relaxes this restriction.

Having one gtkglarea per thread seems to mostly solve your problem, probably because each widget gets a separate graphics context.
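
In other words, each thread should only ever touch its own widget, roughly like this (assuming the usual gtkglarea calls; thread_render() is a made-up name):

Code:
#include <gtk/gtk.h>
#include <gtkgl/gtkglarea.h>

/* each thread renders only through its own GtkGLArea, whose private
   GLX context gets bound to the calling thread by make_current() */
void thread_render(GtkWidget *my_area)
{
    if (gtk_gl_area_make_current(GTK_GL_AREA(my_area))) {
        /* ... this thread's GL calls ... */
        gtk_gl_area_swap_buffers(GTK_GL_AREA(my_area));
    }
}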

What does sound like a real bug is the lack of output, which may be a longstanding problem: http://www.nvnews.net/vbulletin/show...&threadid=1777

PS: You may want to look into moving to gtkglext instead of gtkglarea: see http://www.student.oulu.fi/~jlof/gtkglarea/