10-03-09, 02:53 PM   #2
FFmpeg developer
Join Date: Jan 2009
Location: Vienna, Austria
Posts: 467
Re: can Nvidia vdpau-cards do right interlaced output?

Originally Posted by Sky777:
The idea is to use the hardware deinterlacer in the TV set for 1080i video. It should be better than the VDPAU deinterlacer. But for such tests, the graphics card needs to output a correct interlaced video signal.
If you do not specify a de-interlacing algorithm, the VDPAU video mixer outputs exactly the video it was fed, which in the case of libavcodec H.264 decoding is bit-exact with the encoded stream (it is difficult to find streams that libavcodec does not decode bit-exact). Assuming NVIDIA's hardware decoder is also bit-exact (I have not tested this yet), you get a correctly decoded picture, including the combing artefacts you are looking for.
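As a concrete illustration of leaving the mixer's de-interlacing off, MPlayer's vdpau video output driver takes a deint suboption, where 0 disables de-interlacing so fields pass through untouched. The file name here is of course just a placeholder:

```shell
# Sketch: play a 1080i H.264 clip through VDPAU with de-interlacing
# disabled (deint=0), so the TV receives the fields as decoded.
# "clip1080i.mkv" is a hypothetical input file.
mplayer -vo vdpau:deint=0 -vc ffh264vdpau clip1080i.mkv
```

Whether the fields then arrive at the TV with the timing it needs is a separate problem, as discussed below.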

Note that for the TV to be able to de-interlace, exact field timing would be necessary, which is difficult to achieve. There is a discussion about this in the mplayer-vdpau thread.

Note 2: this "interlaced" signal has, of course, nothing to do with interlaced analog television broadcast (where interlacing might once have made sense).

Carl Eugen