Originally Posted by portwolf
On a side note, my pixel clocks ect are reading the same as before... and if you remember when you did the math, you suggested the output being 47.6hz
which probably rounds up to 48
not a coincidence i think
Not very likely, indeed :-)
It appears as if the pixel clock calculation is done incorrectly by the
nvidia control panel. That would be a bug in the nvidia driver/panel, IMHO.
But on the other hand, nvidia officially only supports a maximum resolution of
2560x1600 pixels per output, so filing a bug report with nvidia may not be effective,
because nvidia may not recognize it as a problem they need to fix.
Maybe it helps to increase the "Desired refresh rate" underneath "Advanced
Refresh Rate Parameters"? Does changing the "Desired refresh rate" have
any effect on the pixel clock value displayed? If so, maybe you can trick the
setting into creating the right pixel clock by applying a deliberately wrong refresh rate.
That is: at the moment, a setting of 57Hz obviously generates a 48Hz output
(as actually expected when looking at the pixel clock). Assuming the error is
linear, a setting of 68Hz might generate a 57Hz output (57*327.27/273.45 ≈ 68.2).
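A quick sketch of that compensation idea, assuming the error really is linear and using the two pixel clock values quoted in this thread (treat them as placeholders if your numbers differ):

```python
# Hypothetical helper: if the panel programs a pixel clock that is too low
# by a constant factor, scale the requested refresh rate up to compensate.
GENERATED_CLOCK = 273.45   # MHz - clock the panel actually produces (from this thread)
REQUIRED_CLOCK = 327.27    # MHz - clock the target mode really needs (from this thread)

def compensated_refresh(target_hz):
    """Refresh rate to request so the (linearly) wrong pixel clock
    calculation ends up producing roughly target_hz on the wire."""
    return target_hz * REQUIRED_CLOCK / GENERATED_CLOCK

print(round(compensated_refresh(57), 1))  # ~68.2, so try entering 68 Hz
```

Of course this only works if the panel's error scales linearly with the requested rate; if it clamps or rounds somewhere internally, the trick may fail.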
BTW: underneath "Custom display mode values", there is "Bit per pixel: 16".
Does it make any difference when you set this value to 32?
Other than that, I can only recommend installing Windows XP just to make
sure everything works fine there. I fear installing Linux with the modelines
published earlier in this thread wouldn't help you for testing, because Linux
isn't officially supported by Matrox, so reporting "but it works on Linux"
wouldn't impress Matrox support engineers much, I assume :-)