Old 12-05-09, 03:33 PM   #5
kleptophobiac
Re: Artificially limiting usable resolution in X or WM to compensate for overscan?

Quote:
Originally Posted by AaronP
You can try configuring your display for a smaller resolution using nvidia-settings, and then going into the properties page for the display, enabling "Force Full GPU Scaling", and setting the scaling method to "Centered".
Huh, I never noticed that option before. It seems to work pretty well when I have an LCD connected to DVI, but when it's the CRT television it doesn't seem to do anything. I'll bet the TV does some scaling as well.

I ended up biting the bullet and taking the custom modeline approach. It worked out quite well, though it took some time. For anyone who is curious: I started from the standard 1280x720 modeline that matched the pixel clock reported by my television's EDID, then increased the front and back porches while shrinking the active resolution by the same total amount. Both the horizontal and vertical totals stay constant that way, so the scan timing and refresh rate are unchanged; only the visible area gets smaller.
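To put numbers on it, assuming the usual CEA 720p60 timings as the starting point (1280 1390 1430 1650 / 720 725 730 750): going from 1280 to 1160 active pixels removes 120 per line, and in my modeline below the front porch grows from 110 to 174 and the back porch from 220 to 276, so the line total stays at 1650. Here's a rough Python sketch of that arithmetic for anyone who wants to redo it for a different set; the even front/back split of the removed pixels is just a choice I made for the example, so it lands a few pixels away from the exact numbers I ended up using:

# Shrink the active area of a mode while keeping the totals (and therefore
# the pixel clock and refresh rate) constant, by letting the porches absorb
# the removed pixels.  Base timings assumed: standard CEA 1280x720@60.

def shrink_axis(axis, new_active):
    """axis is (active, sync_start, sync_end, total) for one direction."""
    active, sync_start, sync_end, total = axis
    removed = active - new_active
    pad_front = removed // 2          # grow the front porch by half...
    # ...the back porch picks up the rest automatically, since 'total' is kept
    return (new_active,
            sync_start - removed + pad_front,
            sync_end - removed + pad_front,   # sync pulse width unchanged
            total)

def shrink_mode(clock, h, v, new_h, new_v):
    nh = shrink_axis(h, new_h)
    nv = shrink_axis(v, new_v)
    return 'Modeline "%dx%d" %.2f %s %s +hsync +vsync' % (
        new_h, new_v, clock,
        ' '.join(str(x) for x in nh),
        ' '.join(str(x) for x in nv))

# Standard 1280x720@60 timings, shrunk to 1160x672 to hide the overscan:
print(shrink_mode(74.25, (1280, 1390, 1430, 1650), (720, 725, 730, 750),
                  1160, 672))

Running that prints a modeline with the same 74.25 MHz clock and the same 1650x750 totals, so the TV still thinks it is being fed ordinary 720p.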

Now my display is perfectly sized and centered. In case anybody has a 37" Toshiba HF85 television and is trying to use Linux with DVI... the modeline is:

Modeline "1160x672" 74.25 1160 1334 1374 1650 672 696 701 750 +hsync +vsync

But I'm guessing nobody else in the world has this combination.

Thanks guys!