Originally posted by bwkaz
You are doing something like an xhost +remote_host_name before starting up the program, right? And you are doing something similar to export DISPLAY=xserver_ip_address:0.0, right?
Dumb questions maybe, but you didn't say...
No, that's not at all dumb. I've probably left out quite a bit of important information, which I'd be happy to provide (although it will have to wait until tomorrow, since I don't have access to the machine right now). BTW, I've noticed one other person with what looks like a similar problem (GLXBadRenderRequest remote display SGI or SUN). But yes, I had done this, and I got the same result whether or not I set and exported DISPLAY. Gears and sproingies will run on the remote machine and appear to make use of the hardware acceleration on the local machine.
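For reference, the setup I used looks roughly like this (the hostnames are placeholders, not the real machine names):

```shell
# On the local machine (the one with the NVIDIA card, running the X server),
# allow the remote host to connect to this display:
#   xhost +remote_host_name

# On the remote machine (where the screensaver client runs), point it at
# the local display before launching the client:
export DISPLAY=xserver_ip_address:0.0
echo "DISPLAY is now: $DISPLAY"
```

The xhost line is commented out here only because it needs a live X server to run; on the actual machines both steps were done.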
When you say "the machine that's running the client", you mean the machine whose CPU the client is primarily using (the one you started the screensaver from), not the machine whose graphics card it's using, right? You may not need the drivers over there, but I don't know for sure. It depends on whether the screensaver uses libGL or the X server does -- and I think it's the client that uses it.
Yes; when I said "the machine that's running the client", I did indeed mean the machine that was running the screensaver, and not the one that's running the X server.
I've noticed that similar errors, when running locally, are caused by pre-existing GL libraries that conflict with NVIDIA's, and I'm wondering whether this is related. I'm not sure, but I think the client uses libGL, even for GLX. What's confusing is that when I run the client on the same remote machine but display it on another local machine that uses the XFree86 driver rather than NVIDIA's, I don't get this error. It's a little slow, of course, but I don't think it's much slower than running the screensaver locally.
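One way to check for that kind of library conflict (just a sketch; `glxgears` stands in for whatever client binary you're actually testing, and paths vary by distribution) is to ask the dynamic linker which libGL the client would load:

```shell
# Which libGL would this GLX client resolve at runtime?
# (glxgears is a stand-in for the actual screensaver binary.)
BIN=$(command -v glxgears || true)
if [ -n "$BIN" ]; then
    ldd "$BIN" | grep -i libgl || echo "no libGL in the link map?"
else
    echo "glxgears not found; substitute your client binary"
fi

# List every libGL copy the runtime linker knows about -- more than one
# (e.g. Mesa's alongside NVIDIA's) is a red flag:
ldconfig -p 2>/dev/null | grep -i libgl || echo "no libGL registered"
```

If both a Mesa libGL and NVIDIA's show up, the client may be resolving its GLX calls against the wrong one, which could explain why the NVIDIA-driven display fails while the XFree86-driven one works.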
Thanks for helping.