View Full Version : Force HDTV resolution/timings

09-24-05, 12:39 AM

I have an LCD TV that supports a full 1920x1080p. It's the Sharp Aquos LC-45GX6U.

The video card (6800GT), when connected to the TV via DVI, does not show me the bootup screens. However, once in Windows, it shows the display, but in the wrong mode.

The TV, I believe (but I can't be certain), is reporting that it supports 1920x1080 interlaced, and the video card happily displays that resolution. It doesn't work correctly because I'm bypassing the Sharp AVC box and plugging directly into the panel: I see two copies of the screen displayed one on top of the other, the interlaced fields being displayed progressively.
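To see why that produces two stacked copies, here's a toy sketch (nothing to do with the actual driver, just an illustration): an interlaced signal sends the even scanlines as one field and the odd scanlines as the next. If a panel naively treats those two half-height fields as the top and bottom of a single progressive frame, you get two vertically squashed copies of the picture.

```python
def split_fields(frame):
    """Split a progressive frame (a list of scanlines) into two interlaced fields."""
    even_field = frame[0::2]  # scanlines 0, 2, 4, ...
    odd_field = frame[1::2]   # scanlines 1, 3, 5, ...
    return even_field, odd_field

def naive_progressive_display(even_field, odd_field):
    """Mimic a panel that stacks the two fields as one progressive frame."""
    return even_field + odd_field

# Stand-in for a 1080-line frame, scaled down to 8 lines for readability.
frame = [f"line {n}" for n in range(8)]
even_field, odd_field = split_fields(frame)
shown = naive_progressive_display(even_field, odd_field)
# The top half is the even-line image, the bottom half the odd-line image:
# two squashed copies of the same picture, one on top of the other.
```

With a real 1080i signal, each "copy" is the full picture at half vertical resolution, which matches the symptom described above.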

Anyhow, if I go into the "Screen Resolutions and Refresh Rates" section of the nVidia display settings and then select Advanced Timing, I can pick mode 861B (1920x1080 @ 59.94) -P, and the screen looks great: nice, sharp, non-interlaced 1080p at 60 Hz.

But the damned thing keeps going back to the interlaced mode. When I reboot, it's back to 1080i. If I run some games, they revert the display back to 1080i. It's damned annoying.

Isn't there any way to force the video card to display my requested mode, no matter what? Is there a registry hack? What? It seems really strange that I can't just tell it "Use this mode, these timings, and forget about doing anything else!"
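For what it's worth, Windows does expose a documented way to request a specific mode and write it to the registry: the Win32 `ChangeDisplaySettingsEx` call with the `CDS_UPDATEREGISTRY` flag, clearing the `DM_INTERLACED` bit in `dmDisplayFlags`. Here's a hedged sketch of that (using Python's ctypes as a stand-in; the constants and struct layout are from the Win32 headers, but whether the ForceWare driver keeps honoring the request after a game switches modes is exactly the problem in this thread, so treat it as a workaround attempt, not a fix):

```python
import ctypes
import sys

# Win32 constants (from wingdi.h / winuser.h)
DM_PELSWIDTH = 0x00080000
DM_PELSHEIGHT = 0x00100000
DM_DISPLAYFLAGS = 0x00200000
DM_DISPLAYFREQUENCY = 0x00400000
DM_INTERLACED = 0x00000002
CDS_UPDATEREGISTRY = 0x00000001
ENUM_CURRENT_SETTINGS = -1
DISP_CHANGE_SUCCESSFUL = 0

class DEVMODEW(ctypes.Structure):
    """Flattened DEVMODEW layout (display branch of the union)."""
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", ctypes.c_ushort),
        ("dmDriverVersion", ctypes.c_ushort),
        ("dmSize", ctypes.c_ushort),
        ("dmDriverExtra", ctypes.c_ushort),
        ("dmFields", ctypes.c_ulong),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", ctypes.c_ulong),
        ("dmDisplayFixedOutput", ctypes.c_ulong),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", ctypes.c_ushort),
        ("dmBitsPerPel", ctypes.c_ulong),
        ("dmPelsWidth", ctypes.c_ulong),
        ("dmPelsHeight", ctypes.c_ulong),
        ("dmDisplayFlags", ctypes.c_ulong),
        ("dmDisplayFrequency", ctypes.c_ulong),
        ("dmICMMethod", ctypes.c_ulong),
        ("dmICMIntent", ctypes.c_ulong),
        ("dmMediaType", ctypes.c_ulong),
        ("dmDitherType", ctypes.c_ulong),
        ("dmReserved1", ctypes.c_ulong),
        ("dmReserved2", ctypes.c_ulong),
        ("dmPanningWidth", ctypes.c_ulong),
        ("dmPanningHeight", ctypes.c_ulong),
    ]

def force_progressive(width=1920, height=1080, hz=60):
    """Ask Windows for a progressive mode and persist it in the registry."""
    if sys.platform != "win32":
        return "not on Windows"
    user32 = ctypes.windll.user32
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    # Start from the current mode so the rest of the struct is sane.
    user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS, ctypes.byref(dm))
    dm.dmPelsWidth = width
    dm.dmPelsHeight = height
    dm.dmDisplayFrequency = hz
    dm.dmDisplayFlags &= ~DM_INTERLACED  # clear the interlace bit -> progressive
    dm.dmFields = (DM_PELSWIDTH | DM_PELSHEIGHT |
                   DM_DISPLAYFREQUENCY | DM_DISPLAYFLAGS)
    rc = user32.ChangeDisplaySettingsExW(None, ctypes.byref(dm), None,
                                         CDS_UPDATEREGISTRY, None)
    return "ok" if rc == DISP_CHANGE_SUCCESSFUL else f"failed ({rc})"
```

If the driver rejects the mode or reverts it on the next EDID probe, this won't stick any better than the control panel does, but at least it's a scriptable thing you can rerun at login.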

Any help would be *greatly* appreciated. I really like my nVidia card but it seems like ATI has better HDTV support these days.

09-24-05, 02:22 AM
This is a problem with the nVidia cards. For instance, I've been using the component output on a 6600 GT at 576i. It has amazing picture quality compared to S-Video or composite. However, on bootup the overscan compensation is all wrong, and you need to fiddle with the settings on every single boot. I think part of the problem is that the drivers can't detect that it's the same display connected every time the machine boots up, so they fall back to some default value.

09-24-05, 02:28 AM
At least I'm not the only one with the problem, but I have to think there's some way to tell the driver to stop trying to detect anything and just use the settings I provide.

Sheesh, this is basic stuff that used to be easy in Windows 95 and NT4. I guess they need to protect us from ourselves at all costs!