We're porting our app to Linux and trying to get it to run well with a couple of HD monitors. One monitor is 1366x768 @ 60 Hz and the other is 1920x1080 @ 49 Hz. Following various pieces of NVIDIA advice, I've created modelines for these monitors that exactly match the timings used under Windows.
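For reference, this is the shape of what we added to the Monitor section of xorg.conf. The numbers below are placeholders generated with `cvt 1920 1080 60`, not our actual Windows-matched timings; our real modelines copy the exact pixel clock and porch values reported by the Windows driver:

```
Section "Monitor"
    Identifier "Monitor0"
    # Placeholder timings from `cvt 1920 1080 60`; substitute the
    # Windows-matched values here.
    Modeline "1920x1080_60.00"  173.00  1920 2048 2248 2576  1080 1083 1088 1120 -hsync +vsync
    Option "PreferredMode" "1920x1080_60.00"
EndSection
```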
The xorg.conf and nvidia-bug-report.log are attached.
My application uses JOGL (Java + OpenGL). We ordinarily call gl.setSwapInterval(1) to limit our frame rate to the monitor refresh and avoid tearing. When I render a trivial scene on Windows, I get 60 FPS; if I instead call gl.setSwapInterval(0) to let GL run as fast as it can, I get 300-340 FPS.
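For what it's worth, here is roughly how we measure the frame rates quoted above. FpsCounter is a hypothetical stand-in for our logging helper, not the actual app code; we call its frame() method once per display() callback, right after the frame that setSwapInterval throttles:

```java
// Hypothetical sketch of our FPS measurement: count frames over ~1 second
// windows and report the rate of the last full window. In the real app,
// frame(System.nanoTime()) is called from JOGL's display() callback after
// gl.setSwapInterval(1) (or 0) has been applied.
public class FpsCounter {
    private long windowStartNanos = -1;
    private int frames;
    private double lastFps;

    // Call once per rendered frame; returns the FPS of the last full window.
    public double frame(long nowNanos) {
        if (windowStartNanos < 0) {       // first frame starts the window
            windowStartNanos = nowNanos;
            return lastFps;
        }
        frames++;
        long elapsed = nowNanos - windowStartNanos;
        if (elapsed >= 1_000_000_000L) {  // a full second has passed
            lastFps = frames * 1e9 / elapsed;
            frames = 0;
            windowStartNanos = nowNanos;
        }
        return lastFps;
    }
}
```

With swap interval 1 on the Windows box this reports a steady 60; on Linux it reports ~55 no matter what interval we set.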
Now, running that same app in Linux on the same hardware, I'm getting 55 FPS regardless of whether I ask GL to wait for vsync or not. It consistently tears, since 55 != 60 (on the first monitor) and 55 != 49 (on the second monitor).
I've tried export __GL_SYNC_TO_VBLANK=1 and =0; neither has any effect.
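I also tried setting it per-process rather than exporting it, to rule out the variable not reaching the app's environment (printenv here stands in for our actual java launch command):

```shell
# Set the variable for just this one process; the driver reads it at startup.
__GL_SYNC_TO_VBLANK=0 printenv __GL_SYNC_TO_VBLANK
# prints: 0
```

Same result either way: still 55 FPS.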
It seems that *something* is trying to sync to vblank, and that something is convinced we're running at about 55 Hz.