nV News Forums > Linux Support Forums > NVIDIA Linux

Old 05-31-07, 11:28 AM   #1
tweeknockr
Registered User
 
Join Date: May 2007
Posts: 8
Default 1080i Modeline outputs 1080p, not 1080i

Hi All,

So I'm trying to get 1080i working on my SONY 55A2000. It does support 1080p input, but I am trying to get my HTPC to output at 1080i. I am able to do this in windows, and using powerstrip I get the following Modeline:

Code:
ModeLine       "1920x1080i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 interlace +hsync +vsync
However, when I attempt to use this modeline in Linux (Ubuntu Feisty), I always get a 1080p output. Even when I configure the display with nvidia-settings and the GUI says it is outputting at 1920x1080 @ 30 Hz (Interlaced), the output is still 1080p (as indicated by the TV's on-screen display).

I've attached my bug report, which indicates that the 1920x1080i modeline is in fact being selected, but why is the output still 1080p?!
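For what it's worth, the refresh rate a modeline implies can be sanity-checked from its own numbers: refresh = pixel clock / (HTotal x VTotal), with two fields per frame when interlaced. A quick sketch (plain Python, nothing driver-specific) against the Powerstrip modeline above:

```python
# Sanity-check the timing numbers in the Powerstrip modeline:
# ModeLine "1920x1080i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 interlace

pclk_hz = 74.25e6   # pixel clock in Hz
htotal = 2200       # total pixels per scanline, including blanking
vtotal = 1125       # total lines per frame, including blanking

frame_rate = pclk_hz / (htotal * vtotal)   # full frames per second
field_rate = 2 * frame_rate                # interlaced: two fields per frame

print(f"{frame_rate:.2f} Hz frames, {field_rate:.2f} Hz fields")
# 30.00 Hz frames, 60.00 Hz fields
```

So the modeline itself really is a 1080i timing, matching the "1920x1080 @ 30 Hz (Interlaced)" readout in nvidia-settings; the question is why the driver outputs it progressive anyway.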

Thanks for any help!
Attached Files
File Type: gz nvidia-bug-report.log.gz (37.9 KB, 125 views)
Old 05-31-07, 12:24 PM   #2
rerushg
Registered User
 
Join Date: Mar 2007
Location: South Carolina, USA
Posts: 99
Default Re: 1080i Modeline outputs 1080p, not 1080i

Sounds like you know what you're doing, so I expect that if you look at your attachment you'll figure it out. There's a lot of dense stuff there, but the basics are understandable. As you scroll through, note that NV reads the EDID from your display. Then it evaluates its own list of modes against what your display can handle and accepts/rejects as it sees fit. Then X passes judgment on them as well. The result is the mode pool further down in the log. Note that while NV considers your 1080i mode valid, it uses 1080p because of "best backend fit". Not sure why that is. Suspect you'll need to add a few more mode validation options to xorg.conf to prevent this.
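For example (untested sketch; the option names are from the NVIDIA driver README, and which ModeValidation tokens you actually need depends on what the log says is rejecting or overriding the mode):

```
Section "Device"
    Identifier  "nvidia"
    Driver      "nvidia"
    # Ask the driver to use the modeline's timings as given over DVI
    Option      "ExactModeTimingsDVI" "TRUE"
    # Relax validation checks so the custom 1080i modeline survives
    Option      "ModeValidation" "NoEdidModes, NoMaxPClkCheck, NoVertRefreshCheck, NoHorizSyncCheck"
EndSection
```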
See this thread: http://www.nvnews.net/vbulletin/showthread.php?t=90084

You'll probably get this working so some food for thought:
1. Most of us 1080i guys would love to have 1080p. Are we wrong about something?
2. Overscan will likely be an issue. The best fix I've found is to generate a custom timing in Powerstrip. Something like 1616x884 might work for you. Just start in P'strip with the same timing you're using and customize it by changing the H & V resolution; P'strip will adjust the timing details to suit. Print it and plug it into xorg.conf. You'll need to set the scaling option in the NV config to "center" and you should be good to go.
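A rough sketch of how that might sit in xorg.conf (the 1616x884 numbers and the elided timing details are placeholders; use whatever P'strip actually prints, and "Centered" tells the driver to letterbox the smaller mode rather than scale it back up):

```
Section "Monitor"
    Identifier "TV"
    # Underscanned custom timing generated in Powerstrip
    # (fill in the real timing details from P'strip's printout)
    ModeLine "1616x884" ... interlace +hsync +vsync
EndSection

Section "Screen"
    Identifier "Screen0"
    Monitor    "TV"
    # Display the smaller mode centered instead of scaling it to fill
    Option     "FlatPanelProperties" "Scaling = Centered"
    SubSection "Display"
        Modes "1616x884"
    EndSubSection
EndSection
```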
Old 05-31-07, 12:35 PM   #3
tweeknockr
Registered User
 
Join Date: May 2007
Posts: 8
Default Re: 1080i Modeline outputs 1080p, not 1080i

Thanks for the reply.

Yes, I had the same observations as you: it is actually using the EDID. I have tried another configuration where I ignore the EDID and use enough ModeValidation options to force it to accept the 1920x1080i mode, but then my TV says "Unsupported resolution," which is troublesome because (supposedly) the exact same modeline works fine in Windows.

As to why I am trying to get 1080i to work... I am unhappy with 1080i content playback. I have tried countless combinations of XvMC and deinterlacers, as well as several different video cards (6600, 7600). Ultimately, 1080i playback is "jerky" when compared to the actual source (when decoded and played back directly from my cable box, or through my TV's tuner). The main problem comes during horizontal panning scenes.

I am moving to MythTV from a BeyondTV Windows box, where I was using an ATI X1300 which output at 1080i, and was very happy with playback over there. So my thought is that something is happening on the conversion from 1080i source to 1080p output. One obvious (to me) thing is that outputting at 1080p forces the GPU to deinterlace, where I should be able to turn off deinterlacing when outputting at 1080i and let the TV take care of it.

One thing, however, that I have not wrapped my head around is what happens to 720p content when I am outputting at 1080i... do I lose half of the frames? Either way, if I am able to get smooth 1080i content playback I will be more than happy, as this goose-chase has gone on way too long for my taste.
Old 06-01-07, 08:30 AM   #4
rerushg
Registered User
 
Join Date: Mar 2007
Location: South Carolina, USA
Posts: 99
Default Re: 1080i Modeline outputs 1080p, not 1080i

Quote:
Originally Posted by tweeknockr
....works fine in Windows.
Yep. Heard that before a million times.
Don't understand why a TV that can run 1080p won't accept a (much less demanding) 1080i resolution. Maybe the darn thing is just particular. Looked back through your log but don't see anything of major import.
Only possibility I see might be a modeline change. Seems your display likes this mode:
Code:
CEA-861B Timings:
(--) NVIDIA(0): 1920 x 1080 @ 60 Hz
(--) NVIDIA(0): Pixel Clock      : 74.18 MHz
(--) NVIDIA(0): HRes, HSyncStart : 1920, 2008
(--) NVIDIA(0): HSyncEnd, HTotal : 2052, 2200
(--) NVIDIA(0): VRes, VSyncStart : 540, 542
(--) NVIDIA(0): VSyncEnd, VTotal : 547, 562
(--) NVIDIA(0): H/V Polarity     : +/+
(--) NVIDIA(0): Extra            : Interlaced
(--) NVIDIA(0): CEA Format       : 5

You might try this in lieu of your present 74.25 pix clock mode.
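If I'm reading the log right, those per-field vertical numbers double up into a full-frame X modeline something like this (the doubling is my assumption -- X interlaced modelines count whole-frame lines, so 540/542/547 become 1080/1084/1094, and VTotal lands at the usual 1125 to account for the interlace half-line):

```
# CEA format 5 timing from the log, rewritten as an X modeline
# (74.18 MHz is the 1000/1001 "NTSC-friendly" variant of 74.25)
ModeLine "1920x1080i_cea" 74.18 1920 2008 2052 2200 1080 1084 1094 1125 interlace +hsync +vsync
```

Note it's identical to the Powerstrip modeline except for the pixel clock, which is why it's worth a shot.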

As for playback. Outside my expertise. There are other threads here on that. "Sync to Vblank" maybe?
...from a fellow goose-chaser
Old 06-01-07, 10:52 AM   #5
tweeknockr
Registered User
 
Join Date: May 2007
Posts: 8
Default Re: 1080i Modeline outputs 1080p, not 1080i

Interesting that you mention that ModeLine. As of last night, that is the one I've had "the most success" with: it is the only one with which I have gotten a viewable interlaced output (where the TV doesn't tell me "unsupported signal"). However, as I would expect, this limits the vertical resolution to 540 lines, so the desktop appears stretched out.

Not exactly sure what is going on here, or how I would tweak the modeline to get a full 1080 pixel height.
Old 06-01-07, 08:49 PM   #6
rerushg
Registered User
 
Join Date: Mar 2007
Location: South Carolina, USA
Posts: 99
Default Re: 1080i Modeline outputs 1080p, not 1080i

Quote:
Originally Posted by tweeknockr
Interesting that you mention that ModeLine.
If you've got a timing that it accepts you must be getting pretty close. (Maybe you're already fixed by now. If so. Great.)
Several people in the thread I referenced above had that problem and got fixed. But looking back through I couldn't find a "silver bullet" that made everything work. Seems every CRT has its own glitches that have to be overcome. A few more "options" ought to get it.
Good luck.
Old 06-01-07, 10:12 PM   #7
tweeknockr
Registered User
 
Join Date: May 2007
Posts: 8
Default Re: 1080i Modeline outputs 1080p, not 1080i

Yes, I think I probably was getting close. When I got home from work, I decided to try using component out instead of DVI. Set it up as per nvidia's instructions and it worked right away. Video playback is better in 1080i than 1080p, but still not perfect. However, it is now tolerable.
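For anyone finding this thread later, the component-out route goes through the driver's TV encoder options rather than modelines; a minimal sketch (option names and values are from the NVIDIA driver README -- this is the general shape, not my exact config):

```
Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    # Route output to the TV encoder over component (YPbPr)
    Option     "ConnectedMonitor" "TV"
    Option     "TVOutFormat"      "COMPONENT"
    Option     "TVStandard"       "HD1080i"
EndSection
```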

Only problem (as we all know) is that overscan doesn't work in tv out mode, which I'm fine with for video playback, but it makes some of the settings menus in MythTV hard to work with (as it cuts off lots of the option check boxes). I can get around the desktop issue by adding additional panels around the borders.

Your comment that I was "getting close" gave me a little more motivation to keep at it until I get DVI going, but I'm kind of sick of this whole thing now and may just sit on a relatively-working configuration for a while.

Thanks for the help...
Old 06-02-07, 06:56 AM   #8
rerushg
Registered User
 
Join Date: Mar 2007
Location: South Carolina, USA
Posts: 99
Default Re: 1080i Modeline outputs 1080p, not 1080i

Know exactly how you feel: "I've got all this expensive hardware and a $25 DVI cable, but the only way I can make it work is to hook it up like I did with my VCR 15 years ago." Been there, done that, and ran that way for a while. I couldn't detect any difference in picture quality. Makes sense: all output is via the NV video encoder anyway. Overscan was a problem, of course, so I tried S-Video. That worked surprisingly well. Max rez was 1024x768, but for viewing from about 12 ft away as I do it was good (as a computer monitor only). With 1080 you wind up enlarging fonts and icons anyway. The real issue I had with S-Video & component was dual booting with WinXP on DVI. Video out would sometimes get "confused" about which way to go.
It's interesting that your display accepts component no sweat yet is picky on the DVI side. I guess if NV and your CRT just "play dumb" they're happy campers. Put 'em on DVI and they're too smart for their own good!
Real nice idea about the border panels.
Good luck again.
