08-30-08, 03:52 PM   #2
Armedblade

Re: disable fallback to nvidia-auto-select

I finally got some working modelines (using http://www.tkk.fi/Misc/Electronics/f...2rgb/calc.html), but the end result is the exact same picture on screen. My guess is that the nvidia driver is determining the "best fit" resolution and scaling to that instead.
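
For reference, the relevant bits of my xorg.conf look roughly like the sketch below, reconstructed from the timings in the log that follows. The "TV", "Screen0" and "Videocard0" identifiers are just placeholders, and the ExactModeTimingsDVI option is only something I'm thinking of trying based on the driver README; I haven't confirmed it avoids the best-fit scaling.

Code:
Section "Monitor"
    Identifier "TV"
    # 21.00 MHz pixel clock, 632x444 active, 720x480 total, -hsync -vsync
    ModeLine   "632x444_nos" 21.00  632 636 716 720  444 461 463 480  -hsync -vsync
EndSection

Section "Screen"
    Identifier   "Screen0"
    Device       "Videocard0"
    Monitor      "TV"
    DefaultDepth 24
    # Untested guess: supposed to make the driver use the exact ModeLine
    # timings instead of picking the closest backend mode.
    Option       "ExactModeTimingsDVI" "TRUE"
    SubSection "Display"
        Depth 24
        Modes "632x444_nos"
    EndSubSection
EndSection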

Code:
(II) NVIDIA(0):   Validating Mode "632x444_nos":
(II) NVIDIA(0):     632 x 444 @ 61 Hz
(II) NVIDIA(0):     Mode Source: X Configuration file ModeLine
(II) NVIDIA(0):       Pixel Clock      : 21.00 MHz
(II) NVIDIA(0):       HRes, HSyncStart :  632,  636
(II) NVIDIA(0):       HSyncEnd, HTotal :  716,  720
(II) NVIDIA(0):       VRes, VSyncStart :  444,  461
(II) NVIDIA(0):       VSyncEnd, VTotal :  463,  480
(II) NVIDIA(0):       H/V Polarity     : -/-
(II) NVIDIA(0):     BestFit Backend for "632x444_nos": 640x480
(II) NVIDIA(0):     Mode is valid.

(II) NVIDIA(0): Requested modes:
(II) NVIDIA(0):     "632x444_nos"
(II) NVIDIA(0): Validated modes:
(II) NVIDIA(0): MetaMode "632x444_nos":
(II) NVIDIA(0):     Bounding Box: [0, 0, 632, 444]
(II) NVIDIA(0):     DFP-0: "632x444_nos"
(II) NVIDIA(0):         Size          : 632 x 444
(II) NVIDIA(0):         Offset        : +0 +0
(II) NVIDIA(0):         Panning Domain: @ 632 x 444
(II) NVIDIA(0):         Position      : [0, 0, 632, 444]
(II) NVIDIA(0): Virtual screen size determined to be 632 x 444
What I don't understand is this: if HTotal and VTotal are 720 and 480, how can the best fit be 640x480, when the whole point of adding the edge buffer to the modeline (bringing the total timings up to 720x480) was to compensate for overscan?
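
Just to double-check my own numbers, here is how the timings from the validation output above break down:

Code:
horizontal: active 632, blanking 720 - 632 = 88
  front porch: 636 - 632 =  4
  sync width:  716 - 636 = 80
  back porch:  720 - 716 =  4
vertical:   active 444, blanking 480 - 444 = 36
  front porch: 461 - 444 = 17
  sync width:  463 - 461 =  2
  back porch:  480 - 463 = 17
refresh: 21,000,000 / (720 * 480) = 60.76 Hz (reported as 61 Hz)
So the active picture is 632x444 inside a 720x480 total scan, which is why the 640x480 best fit surprises me.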