Old 11-22-06, 11:55 AM   #23
radu.c
Registered User
 
Join Date: Sep 2006
Posts: 21
Default Re: (beta) nvidia-settings not detecting VGA monitor correctly

I solved my issue (Modelines being filtered out) with this:

Option "NoDDC" "true"
Option "UseEDID" "false"
Option "ExactModeTimingsDVI" "true"
Option "NoBandWidthTest" "true"
Option "ModeValidation" "NoDFPNativeResolutionCheck, NoEdidMaxPClkCheck, NoMaxPClkCheck, AllowNon60HzDFPModes, AllowInterlacedModes, NoHorizSyncCheck, NoVertRefreshCheck, NoWidthAlignmentCheck"

The first 3 options don't seem to do much, since I get the same results either way, but I keep them around.

As you can see, I told the driver to be as dumb as possible about resolutions (which is what I needed in the first place). Disabling the use of EDID drops the "maximum resolution the monitor says it can do" to 640x480; leaving EDID enabled filters out my Modelines. The "ModeValidation" option is what lets me specify whatever timings I want, and I found the "NoBandWidthTest" option along the way. Used individually, either one still filters resolutions. Used together: pow, monitor overdrive
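With validation relaxed like this, custom Modelines in the "Monitor" section actually get used instead of filtered out. A sketch, assuming a standard VESA 1280x1024@60 timing; the Identifier and sync ranges are examples only, so substitute values your monitor can really handle:

```
Section "Monitor"
    Identifier   "Monitor0"        # assumed name; match your config
    HorizSync    30.0 - 81.0       # example range, check your monitor's manual
    VertRefresh  56.0 - 75.0
    # Standard VESA 1280x1024@60 timing. With NoHorizSyncCheck etc. enabled,
    # the driver accepts this even when it disagrees with (or lacks) EDID data.
    Modeline "1280x1024" 108.00 1280 1328 1440 1688 1024 1025 1028 1066 +HSync +VSync
EndSection
```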

I guess I should also post a disclaimer: THIS MAY FRY YOUR MONITOR. It hasn't fried mine (just yet), but who knows how the die-exactly-the-day-after-the-warranty-expires component was designed to behave.