nV News Forums (http://www.nvnews.net/vbulletin/index.php)
-   NVIDIA Linux (http://www.nvnews.net/vbulletin/forumdisplay.php?f=14)
-   -   280 GTX / 180.44 low OpenGL performance? (http://www.nvnews.net/vbulletin/showthread.php?t=131190)

okreylos 04-06-09 06:29 PM

280 GTX / 180.44 low OpenGL performance?
 
Hi,

I just put together a new computer (Intel i7 2.67 GHz, 6GB RAM, 280 GTX, Fedora 10 x86_64) and was expecting a significant OpenGL performance increase compared to the old one (Intel Xeon 3.6 GHz, 2GB RAM, 8800 GTX, Fedora 8 x86_64).

CPU-wise, the new computer is at least 2x faster in every application. With OpenGL, however, I got nuthin'. :(

I'm developing and running a visualization app that creates very large triangle meshes using vertex buffer objects and indexed triangle sets. When creating a mesh of ~2M triangles, I get ~21 fps on the old computer. On the new computer, I get -- guess what -- 21 fps.

On top of that, I initially got significant stutter for about 20 seconds right after uploading the mesh into the VBO on the new computer. I then applied the nopat kernel option as suggested here, and now the stutter is barely noticeable, but still there.
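For reference, the upload step mentioned above boils down to something like the code below. This is only a simplified sketch, not the app's actual code: "Vertex" and "uploadMesh" are illustrative names, and the GLEW include just stands in for whatever extension-loading mechanism is actually used. The interleaved normal/position layout corresponds to OpenGL's GL_N3F_V3F vertex format.

#include <GL/glew.h>
#include <vector>

struct Vertex // interleaved layout matching GL_N3F_V3F: normal first, then position
{
    GLfloat normal[3];
    GLfloat position[3];
};

void uploadMesh(const std::vector<Vertex>& vertices,const std::vector<GLuint>& indices,GLuint vertexBufferId,GLuint indexBufferId)
{
    // Upload the interleaved vertex data; GL_STATIC_DRAW_ARB because the mesh is created once and rendered many times:
    glBindBufferARB(GL_ARRAY_BUFFER_ARB,vertexBufferId);
    glBufferDataARB(GL_ARRAY_BUFFER_ARB,vertices.size()*sizeof(Vertex),&vertices[0],GL_STATIC_DRAW_ARB);

    // Upload the triangle indices (three GLuints per triangle, matching a GL_UNSIGNED_INT draw call):
    glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB,indexBufferId);
    glBufferDataARB(GL_ELEMENT_ARRAY_BUFFER_ARB,indices.size()*sizeof(GLuint),&indices[0],GL_STATIC_DRAW_ARB);

    // Unbind so later client-side array calls are not affected:
    glBindBufferARB(GL_ARRAY_BUFFER_ARB,0);
    glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB,0);
}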

On the old computer, I'm running driver 173.14.12, on the new one 180.44.

I've heard everywhere that the 280 GTX beats the stuffing out of the 8800 GTX -- have I been lied to? Have the Nvidia developers successfully throttled OpenGL performance in the 180.* driver series to force people to spend more money on new graphics cards? Am I doing something wrong? What's going on?

Does anybody have any insight? Are there any Xorg configuration options I should try? Apart from the disappointing performance and the (now barely noticeable) stutter, things are working fine.

AaronP 04-06-09 06:40 PM

Re: 280 GTX / 180.44 low OpenGL performance?
 
Please attach a bug report and a test application that reproduces the problem you're seeing.

okreylos 04-06-09 07:00 PM

Re: 280 GTX / 180.44 low OpenGL performance?
 
1 Attachment(s)
Wow, thanks for the immediate reply, and sorry for not posting the bug report right away. Here goes...

As for the application exhibiting the problem: it's rather large. However, it is officially released under the GPL, and you can download a slightly older version from http://www.keckcaves.org/software/VISUALIZERCG if you really want. I can dig up a link to the particular data file I was using and post it in a follow-up. Steps to reproduce the problem:

- Run Visualizer on large 3D input file.

- Extract large isosurface using a seeded isosurface tool.

- Observe the displayed fps value on both machines and compare / contrast (see the fps sketch below).
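
The displayed fps value in the last step is essentially a frame counter averaged over wall-clock time. As an illustration (this is not the app's exact code, just a minimal sketch):

#include <sys/time.h>
#include <cstdio>

double now(void) // current wall-clock time in seconds
{
    timeval tv;
    gettimeofday(&tv,0);
    return double(tv.tv_sec)+double(tv.tv_usec)*1.0e-6;
}

void countFrame(void) // call this once per rendered frame
{
    static int numFrames=0;
    static double lastTime=now();
    ++numFrames;
    double time=now();
    if(time-lastTime>=1.0) // report the average frame rate about once per second
    {
        std::printf("%.1f fps\n",double(numFrames)/(time-lastTime));
        numFrames=0;
        lastTime=time;
    }
}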

In case it helps, here is the gist of the actual rendering code:

// Fixed-function lighting state for a two-sided, non-culled surface:
glDisable(GL_CULL_FACE);
glEnable(GL_LIGHTING);
glEnable(GL_NORMALIZE);
glLightModeli(GL_LIGHT_MODEL_TWO_SIDE,GL_TRUE);
glDisable(GL_COLOR_MATERIAL);

// Bind the mesh's vertex and index buffers:
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY);
glBindBufferARB(GL_ARRAY_BUFFER_ARB,dataItem->vertexBufferId);
glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB,dataItem->indexBufferId);

// Set up the interleaved vertex arrays and render the indexed triangle set:
glInterleavedArrays(GL_N3F_V3F,0,0);
glDrawElements(GL_TRIANGLES,numRenderTriangles*3,GL_UNSIGNED_INT,static_cast<const Index*>(0)); // numRenderTriangles is about 2M
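
Not shown in the gist above: after the draw call, the corresponding state would typically be undone again, along these lines (simplified sketch):

glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB,0);
glBindBufferARB(GL_ARRAY_BUFFER_ARB,0);
glDisableClientState(GL_NORMAL_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);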

