10-21-11, 01:41 AM   #17
lexa2
Registered User
 
Join Date: Jul 2011
Location: Moscow, Russian Federation
Posts: 58
Re: Wine + VSync forced in nvidia-settings doesn't work

Here is the source code for a test case demonstrating the problem: http://lexa2.ru/lx2/lx2-opengl-vsync...c-test.tar.bz2

To compile this simple demo you will need development headers for glibc, any GLUT 3.x-compatible implementation, and OpenGL+GLX development headers (including libGLU, which was deprecated as of OpenGL 3.0). The demo code is very quick-and-dirty and isn't portable because it uses GLX- and Xlib-specific calls. OTOH it shouldn't be hard to port to WGL in case someone needs that.

By default the demo starts up in an 800x600 window with an attached GL rendering context and enters an endless loop displaying a teapot that quickly moves up and down. Use the "+" and "-" keys to change the threshold of the demo's internal FPS limiter, Alt+Enter or F11 to toggle fullscreen mode, "S" or F2 to toggle vsync, and ESC to exit the demo.
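
For context, the key handling is roughly along the lines of the sketch below. This is a minimal reconstruction with illustrative names, not the actual demo code; the F2/F11 special keys would go through glutSpecialFunc and are omitted here.

Code:
/* Rough sketch of the demo's keyboard handling (illustrative names only). */
#include <stdlib.h>
#include <GL/glut.h>

static int fps_limit = 60;   /* internal FPS limiter threshold */
static int vsync_on  = 0;    /* toggled with "S"/F2 via glXSwapIntervalSGI */

static void on_key(unsigned char key, int x, int y)
{
    (void)x; (void)y;
    switch (key) {
    case '+': fps_limit++; break;                 /* raise FPS limiter threshold */
    case '-': if (fps_limit > 1) fps_limit--; break;
    case 's':
    case 'S': vsync_on = !vsync_on; break;        /* actual swap-interval call not shown */
    case 27:  exit(0);                            /* ESC quits */
    }
}

/* registered during setup with: glutKeyboardFunc(on_key); */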

As part of GL context initialization the demo obtains a pointer to glXSwapIntervalSGI and calls it as glXSwapIntervalSGI(0). This is the bug trigger.
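
The relevant call sequence looks roughly like this (a minimal sketch reconstructed from the description above, not the exact code from the tarball):

Code:
/* Minimal sketch of the bug trigger: resolve the GLX_SGI_swap_control entry
 * point and explicitly request a swap interval of 0 once at startup. */
#include <GL/glx.h>

typedef int (*swap_interval_sgi_fn)(int interval);

static swap_interval_sgi_fn swap_interval_sgi;

static void init_swap_control(void)
{
    /* Resolve the extension entry point from the driver. */
    swap_interval_sgi = (swap_interval_sgi_fn)
        glXGetProcAddressARB((const GLubyte *)"glXSwapIntervalSGI");

    if (swap_interval_sgi)
        swap_interval_sgi(0);   /* <-- the bug trigger: after this call the
                                 *     "Sync to VBlank" / __GL_SYNC_TO_VBLANK
                                 *     setting no longer forces vsync on */
}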

Expected behavior: vsync is always enabled when I set "Sync to VBlank" in the driver control panel or export the __GL_SYNC_TO_VBLANK environment variable with its value set to "1".

Actual behavior: vsync stays disabled (tearing is easily noticeable, and FPS isn't capped to the monitor refresh rate) until it is explicitly enabled in the demo with the "S" or F2 hotkeys.

Upd.: I have attached the source tarball to this post just in case.
Attached Files
File Type: bz2 2011-10-21-R01-lx2-opengl-vsync-test.tar.bz2 (4.7 KB, 60 views)