nV News Forums > Linux Support Forums > NVIDIA Linux

Old 02-18-06, 06:06 PM   #1
pe1chl
Registered User
 
Join Date: Aug 2003
Posts: 1,026
XvMC de-interlacing and CPU usage when playing HDTV

I tried XvMC on my new 6600GT (AGP) card, using mplayer 1.0pre7try2.

What I find is that when playing a recorded standard-definition TV stream, XvMC roughly halves the CPU usage (from about 16% to about 8%). But normally I use a de-interlacing postprocessing filter, and that cannot be combined with xvmc. Is there another way to enable de-interlacing? I understand the 6600GT supports de-interlacing in hardware, but how can that be enabled?
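For what it's worth, the XvMC output driver in mplayer of this vintage has its own bob de-interlacing suboption, so de-interlacing can be done at display time instead of with a software postprocessing filter. A minimal sketch, assuming an MPEG-2 recording (`recording.mpg` is a placeholder filename):

```shell
# bobdeint is a suboption of the xvmc video output driver;
# -vc ffmpeg12mc selects the XvMC-capable MPEG-1/2 decoder,
# which -vo xvmc requires.
mplayer -vo xvmc:bobdeint -vc ffmpeg12mc recording.mpg
```

This only applies to XvMC output; with plain `-vo xv` you would still need a software filter such as `-vf pp=lb`.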

Even more important: when I play an HDTV stream, the CPU usage does not decrease; it increases to the point where the system can no longer keep up. Normal HD playback via -vo xv and without filtering uses 60-90% CPU (depending on the content), but with xvmc it immediately pegs at 100%, even for scenes that need only 60% with xv.
Do others see this? Is this normal?
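One way to make the xv-vs-xvmc comparison more precise than eyeballing `top` is mplayer's `-benchmark` mode, which decodes as fast as possible and prints timing statistics on exit. A rough sketch, assuming an HDTV transport-stream sample (`hdtv-sample.ts` is a placeholder):

```shell
# Decode and display the first 2000 frames as fast as possible,
# then print timing stats (VC = video decode, VO = video output).
mplayer -benchmark -nosound -frames 2000 -vo xv hdtv-sample.ts
mplayer -benchmark -nosound -frames 2000 -vo xvmc -vc ffmpeg12mc hdtv-sample.ts
```

Comparing the VC and VO totals between the two runs should show whether the extra load comes from the XvMC decode path or from the output driver.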