|05-16-07, 06:55 PM||#1|
Tearing in YUV display
I posted this message on the NVIDIA nZone forums as well.
I work for Telanetix, a net-presence and teleconferencing company. We are using NVIDIA video cards on Linux machines to render our displays, with dual-head cards in a dual-monitor configuration. The problem I have been tasked with curing is tearing.
First off, the best cure so far has been to enable XVideoTextureSyncToVBlank. However, when more than one decoder renders to a screen, the CPU load on the X server increases; with four decoders per screen, the decoders start dropping packets because they can't process them all in time. That setting also cures the tearing on one monitor only. The other VBlank-related settings, XVideoBlitterSyncToVBlank and SyncToVBlank, don't seem to have any effect.
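For reference, the three settings named above are exposed as nvidia-settings attributes; a minimal sketch of toggling them from a shell, assuming nvidia-settings is installed and a running X session:

```shell
# Enable vsync for the XVideo texture adaptor (the setting that helped above)
nvidia-settings -a XVideoTextureSyncToVBlank=1
# The other two VBlank-related attributes mentioned above
nvidia-settings -a XVideoBlitterSyncToVBlank=1
nvidia-settings -a SyncToVBlank=1
```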
Other than that, I have tried numerous techniques to cure tearing. The most luck I have had is with the GLX extension call glXWaitVideoSyncSGI. Unfortunately, this call seems to return on a timed estimate rather than the actual retrace, and it is not very accurate: when I use it, I get a tear that slowly marches up the screen. When I run only one screen, the results are flawless; however, we need to be able to use both monitors and potentially run as many as four decoders per monitor.
Is there an accurate way for me to detect the VBlank pulse?
Thank you for your support