Old 07-11-09, 05:55 AM   #1
Sky777
Registered User
 
Join Date: May 2007
Posts: 65
Why does the Modeline "1920x1080@50i" have such bad quality on an LCD TV set?

My PCI GeForce 8400 card is connected to a full-HD TV set (an LCD Philips 47PFL9703) by a DVI-HDMI cable.

Currently I'm using the Modeline "1920x1080@50p".

But when I tried to use the Modeline "1920x1080@50i" in my xorg.conf, I got very bad picture quality on my TV set, even on static images (the KDE desktop, for example). So it seems my card can't output interlaced video properly. Am I right?

My idea was to use a 1080i modeline and turn off the VDPAU deinterlacer, and use the TV's deinterlacer instead. But with such bad picture quality I have to drop that idea.
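
For reference, the interlaced mode I'm testing uses the standard CEA-861 1080i@50 timings, so the relevant part of my config looks roughly like this (a sketch; the section identifiers and the exact timing numbers here are the standard values, not necessarily what's in my real xorg.conf):

Code:
Section "Monitor"
    Identifier "TV"
    # standard CEA-861 1080i @ 50 Hz timings (vertical values are per frame)
    ModeLine "1920x1080@50i" 74.25  1920 2448 2492 2640  1080 1084 1094 1125 Interlace +HSync +VSync
EndSection

Section "Screen"
    Identifier   "Screen0"
    Device       "nvidia0"      # placeholder name for my Device section
    Monitor      "TV"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "1920x1080@50i"
    EndSubSection
EndSection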
Attached Files
File Type: gz nvidia-bug-report.log.gz (35.1 KB, 113 views)
__________________
PCI Sparkle GeForce 8400 GPU G98 A2 512 MB + AMD Sempron(tm) Processor 2800+ + svn Mplayer + xine-vdpau rev. 279 + vdr 1.7.9 & Nvidia 190.25 + XBMC
Old 07-11-09, 08:51 AM   #2
jusst
Registered User
 
Join Date: Mar 2006
Posts: 99
Re: Why does the Modeline "1920x1080@50i" have such bad quality on an LCD TV set?

It would be helpful if you posted a picture of how the "bad picture" looks, or at least described it in more detail than just saying it's bad.
Old 07-11-09, 10:37 AM   #3
Sky777
Registered User
 
Join Date: May 2007
Posts: 65
Re: Why does the Modeline "1920x1080@50i" have such bad quality on an LCD TV set?

It's hard for me to explain.

Every 5-15 seconds I get an interlaced drift/jitter/flicker effect, even on lines that aren't moving.

I can't read the names of the shortcuts on the KDE desktop because the letters are distorted.
__________________
PCI Sparkle GeForce 8400 GPU G98 A2 512 MB + AMD Sempron(tm) Processor 2800+ + svn Mplayer + xine-vdpau rev. 279 + vdr 1.7.9 & Nvidia 190.25 + XBMC

Last edited by Sky777; 07-11-09 at 01:54 PM. Reason: add
Old 07-16-09, 01:00 PM   #4
Sky777
Registered User
 
Join Date: May 2007
Posts: 65
Re: Why does the Modeline "1920x1080@50i" have such bad quality on an LCD TV set?

With the line

Code:
Option "ExactModeTimingsDVI" "False"
I can get good picture quality,

but my TV reports that the video mode is 1080p, not 1080i.

I don't know why...
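
Maybe with "False" the driver just picks the closest mode from the TV's EDID instead of my exact timings, and that happens to be 1080p? For completeness, this is roughly where the option sits in my config, plus a ModeValidation line I might try next to keep the exact interlaced timings (only a sketch, I haven't confirmed that combination helps):

Code:
Section "Device"
    Identifier "nvidia0"        # placeholder name
    Driver     "nvidia"
    # "False" lets the driver adjust/replace the DVI mode timings I ask for
    Option "ExactModeTimingsDVI" "False"
    # untested idea: keep exact timings but relax EDID-based mode validation
    #Option "ExactModeTimingsDVI" "True"
    #Option "ModeValidation"      "AllowNonEdidModes"
EndSection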
__________________
PCI Sparkle GeForce 8400 GPU G98 A2 512 MB + AMD Sempron(tm) Processor 2800+ + svn Mplayer + xine-vdpau rev. 279 + vdr 1.7.9 & Nvidia 190.25 + XBMC
Old 07-17-09, 09:39 AM   #5
ambro887
Registered User
 
Join Date: Nov 2007
Posts: 31
Re: Why does the Modeline "1920x1080@50i" have such bad quality on an LCD TV set?

Why would you want to send an interlaced signal to your LCD TV? Interlacing is meant for analog TVs, and will only result in loss of quality if used on an LCD, especially for fine still images. If you're getting interlaced video from somewhere (like IPTV), just enable deinterlacing in the video player.
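
With the MPlayer/VDPAU setup from your signature, deinterlacing is just a suboption of the vdpau video output, something like this (a sketch; the codec list and file name are only examples):

Code:
# deint: 0 = off, 2 = bob, 3/4 = motion-adaptive temporal deinterlacers
mplayer -vo vdpau:deint=3 -vc ffmpeg12vdpau,ffh264vdpau, recording.ts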
Old 07-17-09, 11:46 AM   #6
Sky777
Registered User
 
Join Date: May 2007
Posts: 65
Re: Why does the Modeline "1920x1080@50i" have such bad quality on an LCD TV set?

I would like to try using the TV's hardware deinterlacer instead of the VDPAU bob deinterlacer and compare the picture quality.
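
In other words, roughly this comparison (a sketch; the file name is just an example):

Code:
# now: bob deinterlacing in VDPAU, progressive 1080p50 sent to the TV
mplayer -vo vdpau:deint=2 -vc ffmpeg12vdpau,ffh264vdpau, recording.ts

# what I want to try: no deinterlacing in VDPAU, 1080i50 sent to the TV
mplayer -vo vdpau:deint=0 -vc ffmpeg12vdpau,ffh264vdpau, recording.ts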

Quote:
result in loss of quality if used on an LCD, especially for fine still images.
Why do you think so?
__________________
PCI Sparkle GeForce 8400 GPU G98 A2 512 MB + AMD Sempron(tm) Processor 2800+ + svn Mplayer + xine-vdpau rev. 279 + vdr 1.7.9 & Nvidia 190.25 + XBMC
Old 07-17-09, 12:42 PM   #7
ambro887
Registered User
 
Join Date: Nov 2007
Posts: 31
Re: Why does the Modeline "1920x1080@50i" have such bad quality on an LCD TV set?

Quote:
Originally Posted by Sky777
I would like to try using the TV's hardware deinterlacer instead of the VDPAU bob deinterlacer and compare the picture quality.
Why do you think so?
I think the video card is applying a filter (anti-aliasing) when producing an interlaced signal, which causes unrecoverable quality loss. So the picture your LCD shows, even though it has been deinterlaced, is worse than the original. See:
http://en.wikipedia.org/wiki/Interla...by_interlacing

I suppose turning off the filter would improve the quality of still images, but I have no idea how, or even whether it's possible.
Even if you manage that, regular (non-interlaced) video would look worse, because interlacing is a lossy process. I'm also not sure your interlaced video will be deinterlaced properly, because the video card doesn't expect its input to be interlaced already.