View Full Version : Linux SLI Shader Performance

12-16-06, 12:09 PM
I have written a ray-casting program using GLSL shaders. Needless to say, it is a GPU-intensive program! I would like to try SLI, but I'm finding little information regarding shader performance while using SLI in Linux. The information I've found so far suggests that there is absolutely no benefit to SLI in Linux, but the evidence provided is derived from a different class of application. I don't use anti-aliasing and don't render millions of primitives; I just upload the models into texture memory and cast rays against them. The resulting 10 fps on a 7800 GTX is nice, but will using SLI with two 7800 GTXs in Linux result in 20 fps?
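The original post doesn't include any code, but the described approach (model data uploaded to texture memory, rays cast against it in a fragment shader) can be sketched roughly as below. This is a hypothetical illustration, not the poster's actual shader: it assumes the model is stored as a density volume in a `sampler3D`, and that the vertex shader passes the ray's texture-space entry point and direction in the varyings `vEntry` and `vDir`.

```glsl
// Hedged sketch only -- uVolume, uStepSize, vEntry, and vDir are
// assumed names, not taken from the original program.
uniform sampler3D uVolume;  // model data uploaded to texture memory
uniform float uStepSize;    // march step, e.g. 1.0 / 256.0
varying vec3 vEntry;        // ray entry point in texture coords [0,1]^3
varying vec3 vDir;          // normalized ray direction in texture space

void main()
{
    vec3 pos = vEntry;
    vec4 accum = vec4(0.0);
    // Fixed iteration count: GLSL compilers of this hardware
    // generation generally require statically bounded loops.
    for (int i = 0; i < 256; ++i) {
        vec4 s = texture3D(uVolume, pos);
        // Front-to-back alpha compositing
        accum.rgb += (1.0 - accum.a) * s.a * s.rgb;
        accum.a   += (1.0 - accum.a) * s.a;
        pos += vDir * uStepSize;
        if (accum.a > 0.99)
            break;  // early ray termination once nearly opaque
    }
    gl_FragColor = accum;
}
```

A per-fragment loop like this is dominated by shader ALU and texture-fetch throughput rather than fill rate or geometry, which is exactly why benchmarks from games are a poor predictor for this workload.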

An alternative would be to get an 8800 GTX. Does anyone have a "gut feel" for the frame-rate difference between a shader-intensive program on a 7800 GTX and the same program on an 8800 GTX in Linux? Again, I'm not finding any data for a non-game application.

The final alternative is to port the loader code to Windows, which apparently has no difficulty utilizing SLI. As distasteful as this would be to me (Jihad! Jihad!), it may be the most time/cost effective solution.

I'm obviously looking for anecdotal evidence, otherwise I would have posted the code for benchmarking. If someone has a link to a particularly enlightening web page, I would appreciate the assistance.

Thank you for your time,

Jasper Milquetoast

12-16-06, 12:36 PM
You'll more than likely find the answers you want in the NVIDIA Linux (http://www.nvnews.net/vbulletin/forumdisplay.php?f=14) forum. GL!

12-16-06, 04:49 PM
Thank you; I've posted in the Linux forum too!