02-01-04, 06:36 PM   #7
m2-
Registered User
 
Join Date: Apr 2003
Location: None of your Bizland.
Posts: 11

After noticing that there are TLS versions of the glx module, too (which makes zero sense to me), I replaced the non-TLS version in /usr/X11R6/lib/modules/ with the TLS version, and everything works. Thanks!
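For anyone wanting to reproduce this, a minimal sketch of the swap. The extensions subdirectory and the filenames are assumptions based on a typical XFree86 layout; check what your driver version actually installed before copying anything, and restart the X server afterwards so it reloads the module:

    # Back up the non-TLS glx module and drop in the TLS build.
    # Paths and filenames are illustrative; adjust to your driver version.
    cd /usr/X11R6/lib/modules/extensions
    cp libglx.so libglx.so.non-tls          # keep the original around
    cp /path/to/tls-build/libglx.so .       # hypothetical location of the TLS build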

Now for a question about understanding the situation; I hope someone from NVIDIA is willing to explain what's going on here. If the glx module is loaded by the X server, which is not threaded, why does it matter which threading model the module uses? And why does it segfault during X server initialization? The only thing I can figure right now is that the X server uses this for indirect rendering, but that doesn't explain the problem to me. In short: why does it make a difference whether the glx module uses TLS or not?

(it's pretty annoying to have to switch between the TLS and the non-TLS version when changing kernels -- but that's something else)

To zander: yes and no. You are right that the non-TLS module was being used, but the problem is not the GL libraries; it's the glx module for the X server.

For future reference: if you are using a 2.6 kernel _and_ a TLS-capable libc, check which version of the modules your X server is using.
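A quick way to check is to look at the server log or at the running process. This is a sketch; the log file name and the server's process name depend on whether you run XFree86 or X.Org, so adjust to your setup:

    # See which glx module the X server loaded at startup.
    grep -i libglx /var/log/XFree86.0.log

    # Or inspect the running server's memory mappings
    # (the process may be named XFree86, Xorg, or X on your system).
    grep libglx /proc/$(pidof XFree86)/maps

The log should contain a LoadModule line followed by the full path of the libglx.so that was actually picked up.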