05-10-09, 08:41 AM   #1
bwheaton
Registered User
 
Join Date: Nov 2008
Location: California
Posts: 59
VSync Failing under load?

I have a fairly complex OpenGL-based app, and after some struggling I have a working vsync setup. I can increase the swap interval (the number of frames between swaps) and see the difference, and I have a white-flash test pattern that shows a good sync when my app starts. The problem is when I start to run video: I get tearing in the center of the screen.
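For reference, the vsync setup is roughly this (a minimal sketch using GLX_SGI_swap_control, the swap-control extension the NVIDIA Linux driver exposes; the helper name is illustrative, and it has to be called with a GLX context current):

#include <string.h>
#include <GL/glx.h>

typedef int (*SwapIntervalSGIProc)(int);

/* Ask the driver to wait `interval` vertical retraces per buffer swap:
 * 1 = ordinary vsync, 2 = one swap every two retraces, and so on. */
static int set_swap_interval(Display *dpy, int screen, int interval)
{
    const char *exts = glXQueryExtensionsString(dpy, screen);
    if (!exts || !strstr(exts, "GLX_SGI_swap_control"))
        return -1;                       /* extension not available */

    SwapIntervalSGIProc swap_interval = (SwapIntervalSGIProc)
        glXGetProcAddress((const GLubyte *)"glXSwapIntervalSGI");
    return swap_interval ? swap_interval(interval) : -1;
}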

In my app, I have:
a thread per screen,
a shared context that does nothing (see the context-sharing sketch after this list),
a thread per video I play.
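The contexts are all created against that do-nothing "hub" context, roughly like this (a sketch; the helper name and the setup in the usage comment are illustrative):

#include <GL/glx.h>

/* Create a per-thread context that shares objects (e.g. the uploaded
 * video-frame textures) with the do-nothing hub context. */
GLXContext create_shared_context(Display *dpy, XVisualInfo *vis,
                                 GLXContext share_hub)
{
    return glXCreateContext(dpy, vis, share_hub, True);
}

/* Usage, one context per thread:
 *   GLXContext hub   = glXCreateContext(dpy, vis, NULL, True);
 *   GLXContext video = create_shared_context(dpy, vis, hub);
 *   GLXContext scr0  = create_shared_context(dpy, vis, hub);
 * Each thread then makes its own context current with glXMakeCurrent(). */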

The video threads upload frames so that any screen can use them, and they keep shader recompilations etc. (as the video type changes) from affecting the screen threads. Each video thread uses a fence so I know when it has completed drawing a frame; at that point the frame becomes available to the screen threads.
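The per-frame handshake is roughly this (a sketch assuming GL_NV_fence, which these GPUs support; the VideoFrame struct and `ready` flag are mine, and `ready` would need a proper mutex or atomic rather than a plain int):

#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>

typedef struct {
    GLuint texture;        /* the uploaded video frame */
    GLuint fence;          /* created once with glGenFencesNV() */
    volatile int ready;    /* set once the GPU is done with the frame */
} VideoFrame;

/* Video thread, after uploading/drawing the frame in its own context.
 * NV_fence objects are not shared between contexts, so the video thread
 * waits on its own fence and only then publishes the frame. */
void video_thread_finish_frame(VideoFrame *f)
{
    glSetFenceNV(f->fence, GL_ALL_COMPLETED_NV);
    glFlush();                  /* make sure the fence reaches the GPU */
    glFinishFenceNV(f->fence);  /* block until all prior commands finish */
    f->ready = 1;               /* screen threads may now sample f->texture */
}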

The exact same problem appears if I have only one screen running, full screen on one display of a TwinView pair.

As I say, it seems that the load is making vsync fail. This is with driver 180.44 on a current Arch Linux-based system, with either a built-in laptop 9300, an 8200 on a Mini-ITX motherboard, or 8800 cards.

Any clues? I'm a bit under the gun on this issue.

Bruce