nV News Forums > Linux Support Forums > NVIDIA Linux


Old 04-06-12, 06:23 AM   #1
ragtag
Registered User
 
Join Date: Apr 2012
Posts: 1
Occlusion culling causing slowdown

We have several GTX 580 cards in Linux workstations running Maya, with the 290.10 NVIDIA driver.

In Maya and Houdini, we have an issue with playback becoming very jerky when a high number of polygons enters or leaves the viewport. A simple test is to create a sphere with 100,000 polygons and move the camera so that the sphere goes in and out of the viewport. While it's fully visible, everything plays back smoothly, but as soon as it touches the edge of the viewport, playback becomes jerky.

I imagine this has to do with occlusion culling, where the GPU or CPU is churning through the polygons trying to figure out which ones not to draw. Is there any way to fix this?
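For context on what culling involves, here is a minimal sketch (not Maya or driver code, just an illustration) of the standard view-frustum test against an object's bounding sphere. The point is that rejecting or accepting a whole object is cheap; the expensive case is exactly the one described above, where the object straddles the frustum boundary and per-polygon clipping or culling work kicks in.

```python
# Sketch of view-frustum culling against a bounding sphere.
# Plane normals point inward, so a point p is inside a plane
# when dot(normal, p) + d >= 0.

def classify_sphere(planes, center, radius):
    """Return 'outside', 'intersects', or 'inside' for a bounding sphere.

    planes: list of (normal, d) pairs with inward-facing unit normals.
    """
    result = "inside"
    for normal, d in planes:
        # Signed distance from the sphere's center to this plane.
        dist = sum(n * c for n, c in zip(normal, center)) + d
        if dist < -radius:
            return "outside"        # fully behind one plane: culled, no draw
        if dist < radius:
            result = "intersects"   # straddles the boundary: partial visibility
    return result

# A toy two-plane "frustum": the slab 0 <= x <= 10.
planes = [((1.0, 0.0, 0.0), 0.0), ((-1.0, 0.0, 0.0), 10.0)]
print(classify_sphere(planes, (5.0, 0.0, 0.0), 1.0))   # fully visible
print(classify_sphere(planes, (0.0, 0.0, 0.0), 1.0))   # touching the edge
print(classify_sphere(planes, (-5.0, 0.0, 0.0), 1.0))  # culled
```

The "intersects" case is the one that matches the symptom: while the sphere is fully inside (or fully outside) the frustum, the whole-object test settles it in one step, but when it touches the viewport edge, finer-grained work per polygon may be triggered.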



P.S. Interestingly, this does not happen in Blender, but that could just mean Blender is drawing everything all the time, even geometry outside the viewport.
