Originally Posted by AlphaWolf_HK
So long as your TV can de-interlace proper there is no difference between 1080i and 1080p.
Is this entirely correct?? I've never fully understood it...
If you are receiving a signal from whatever device at 60Hz and it is sent to a 1080p HDTV, then if the signal sent is:
a) 1080i: then for every 1/60th of a second the HDTV receives half of the picture (540 lines), waits for the alternating lines to arrive in the next 1/60th-second interval, and then displays both together in its native progressive format. Overall, you get a fully updated screen every 1/30th of a second. Is this correct?
b) 1080p: then for every 1/60th of a second the TV receives a full-screen picture (1080 lines) and displays it progressively, so the whole screen updates every 1/60th of a second.
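If I've got (a) right, the TV is basically doing a "weave" deinterlace: interleaving two successive 540-line fields into one 1080-line frame. Here's a toy sketch of that idea (the function name and the string "lines" are just illustrative, not any real API):

```python
# Toy "weave" deinterlacer: combine two successive 540-line fields
# (even-numbered lines, then odd-numbered lines) into one 1080-line frame.

def weave(even_field, odd_field):
    """Interleave an even-line field and an odd-line field into a full frame."""
    assert len(even_field) == len(odd_field)
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)  # scan lines 0, 2, 4, ...
        frame.append(odd_line)   # scan lines 1, 3, 5, ...
    return frame

# Two 540-line fields arrive in consecutive 1/60 s intervals...
even = [f"even-{i}" for i in range(540)]
odd = [f"odd-{i}" for i in range(540)]

# ...and only after both are in hand can the full 1080-line frame be shown.
frame = weave(even, odd)
assert len(frame) == 1080
assert frame[0] == "even-0" and frame[1] == "odd-0"
```

The point being: the full frame can only be shown once both fields have arrived, which is where the effective 1/30th-second full-screen update comes from.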
Now, the way I understand it is that the film content on an HD-DVD or Blu-ray (or even a normal TV transmission) is only ~24fps, so it makes no difference whether the entire screen updates every 1/60th of a second or every 1/30th of a second; either way there's more than enough time to display every frame.
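Quick sanity check on that ~24fps claim with some back-of-the-envelope arithmetic (this ignores the actual 3:2 pulldown cadence used to map 24fps onto 60Hz, it's just comparing intervals):

```python
# A new source frame arrives every 1/24 s; compare that against how often
# the screen can complete a full update in each mode.
frame_interval = 1 / 24   # ~41.7 ms between new film frames
update_30 = 1 / 30        # ~33.3 ms per full update (1080i after weave)
update_60 = 1 / 60        # ~16.7 ms per full update (1080p)

# Both update intervals are shorter than the source frame interval,
# so every film frame gets at least one complete screen update either way.
assert update_30 < frame_interval
assert update_60 < frame_interval
```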
For a games console you'd really want progressive output to a progressive TV, since games often run at up to 60fps, so you'd need the screen to update fully every 1/60th of a second to get every frame shown.
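The same kind of arithmetic shows why 60fps games suffer at an effective 30Hz full-frame rate. This is a simplified sketch: in practice, weaving fields taken from two different game frames produces combing artifacts rather than clean frame drops, but the frame-budget math is the gist of it:

```python
# A 60 fps game renders 60 distinct frames per second, but a screen that
# only completes a full update every 1/30 s can show at most 30 of them.
game_fps = 60
full_updates_interlaced = 30   # effective full-frame rate from 1080i
full_updates_progressive = 60  # 1080p

shown_interlaced = min(game_fps, full_updates_interlaced)
missed = game_fps - shown_interlaced
assert missed == 30  # half the rendered frames never appear as a whole frame

shown_progressive = min(game_fps, full_updates_progressive)
assert game_fps - shown_progressive == 0  # 1080p60 can show every frame
```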
Is this all correct?