09-20-08, 05:55 PM | #1
Join Date: Aug 2004
Location: Portland, OR
1080i modeline not outputting interlaced
I have two HDTV displays. One is a brand new 1080p display and the other is an older 1080i display. Both are connected to my GeForce 7600 via HDMI through an HDMI receiver. I'm running driver 173.14.12.
I want to output 1080i to the 1080p display, since its internal deinterlacer is far superior to the bob deinterlacing done by the Nvidia card. I'm using a standard modeline for this:
ModeLine "This1080i" 74.52 1920 1952 2016 2208 1080 1084 1096 1126 interlace
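As a sanity check on those timings, the refresh rate a modeline implies is just the pixel clock divided by the total frame pixels; for interlaced modes the vertical values describe the full frame, so the field rate is twice the frame rate. A quick sketch of that arithmetic:

```python
# Check the refresh rate implied by the "This1080i" modeline above.
pixel_clock_mhz = 74.52   # first modeline value (pixel clock in MHz)
htotal = 2208             # last horizontal timing value
vtotal = 1126             # last vertical timing value

frame_rate = pixel_clock_mhz * 1e6 / (htotal * vtotal)
field_rate = 2 * frame_rate  # interlaced: two fields per frame

print(f"frame rate: {frame_rate:.2f} Hz")  # ~29.97 Hz
print(f"field rate: {field_rate:.2f} Hz")  # ~59.95 Hz
```

So the modeline itself works out to a normal 29.97 Hz frame / 59.95 Hz field 1080i signal; the timings aren't the problem.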
While Xorg.0.log says (II) NVIDIA(0): Setting mode "This1080i", the display is actually receiving 1080p instead of interlaced 1080i when the 1080p display is connected.
To prove this, I start X with the older 1080i display connected. Once again the X log says (II) NVIDIA(0): Setting mode "This1080i", but this display really is getting a 1080i signal. I can confirm this by switching the receiver from the 1080i display to the 1080p display: the 1080p display correctly reports that it's receiving 1080i. If I then restart X, it's back to 1080p.
If it says it's setting the "This1080i" mode, how could it be changing its mind later? And why would it do that?
I've also done this without the receiver by simply switching cables, so it's not the source of the problem.
I've attached my xorg.conf and two verbose 8 logs. 1080p.log is when the 1080p display is connected and the 1080i.log is when the 1080i display is connected. I had to edit out the many irrelevant mode validations to keep the logs under 100K.
If I disable EDID completely, then X says "This1080i" is not a valid mode. So there must be an option that will make this work, although since the log says it's selecting "This1080i", it should already be working, right?
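One thing that may be worth trying (a sketch, not tested against this setup): rather than disabling EDID entirely, the nvidia driver's mode-validation options can let a custom modeline through even when it isn't among the EDID-provided modes, and can ask the driver to use the modeline's timings exactly. Something along these lines in the Device section of xorg.conf:

```
Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    # Accept modes that are not listed in the display's EDID.
    Option "ModeValidation" "AllowNonEdidModes"
    # Use the modeline's timings as given instead of driver-adjusted ones.
    Option "ExactModeTimingsDVI" "TRUE"
EndSection
```

Both options are documented in the driver README; whether they stop the driver from silently substituting a progressive mode here is exactly what I'd want someone to confirm.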