[Sorry this got soo long. I'm hoping it's remotely helpful and who knows? Maybe the search indexes will make it useful for others too...]
I too am trying to get my nVidia-based graphics card to work at 1920x1200 with DVI. I too am running into some difficulties.
I'll tell you where I'm at and maybe we can help each other...
I have a Samsung SyncMaster 243T. It is a 24" LCD with both DVI and VGA inputs. Its native resolution is 1920x1200. It came with a DVI-D Single Link cable (and a VGA cable, but where's the fun in using that?)
My graphics card is an nVidia FX5700 Ultra-based card. I don't have the actual card manufacturer name and manual handy at the moment, but the card has a DVI output, a VGA output, and an S-Video output. The card has 128MB of texture memory.
My PC is currently set up to dual boot XP and Fedora Core 3 (Linux). Under Windows, using relatively recent nVidia drivers, I cannot get 1920x1200 to work via DVI. Things work fine over DVI if I use 1600x1200. Since I'm not a Windows guru, I'm not very good at diagnosing the problem there. Instead, I tried to get things working in Linux first. This is where I made a few potentially interesting discoveries...
First, I can get the whole mess to work at 1920x1200 via DVI in Linux...*but* only if I use the Xorg distributed 'nv' driver. If I use the nVidia supplied, closed-source driver, called 'nvidia', it won't work. I would prefer to use nVidia's driver, so the current arrangement isn't really acceptable for me. (Not to mention that the current Linux setup doesn't help me at all with Windows...)
Here's what I figured out while mucking around with my xorg.conf file. My monitor is fairly persnickety about the timings that it will accept. It will only see 60Hz timings. What's more, it seems to be particular about the sync pulse timings as well. Perhaps if your monitor is less touchy, you might be able to work around the problems that I am seeing by trying a timing with a smaller sync/porch profile. This might help you avoid what I think is the problem (see below).
If you examine your monitor specifications, via the manual or maybe the vendor's web site, you can probably find the exact recommended timings. Alternatively, in Linux with Xorg, you can examine the Xorg log file (in /var/log/Xorg.0.log for me) to see what the X server and driver are able to deduce about your monitor from its EDID responses. In my case, the EDID data from the monitor in these logs said exactly what timing the monitor wanted for its native resolution (1920x1200). This seemed great. I should just plug those numbers in and I'm good to go. Right? Apparently not.
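If you want to peek at what the server gleaned from the monitor yourself, grepping the log is the quickest route. (The log path is where Fedora puts it; yours may differ, and the driver prefix in the matching lines will vary too.)

```shell
# Show the EDID- and mode-related lines the X server logged for the monitor.
# Adjust the path if your distribution logs somewhere else.
grep -iE 'edid|modeline' /var/log/Xorg.0.log
```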
The problem that I seem to be running into is that the monitor's preferred timing for 1920x1200 resolution is actually 2080x1235 pixels once you add in the "invisible" sync times. That size multiplied by the monitor's required 60Hz refresh works out to a total pixel clock of about 154 MHz. "So what?", you might ask.
Welp, it seems that DVI-D Single Link cabling is only rated up to 150 MHz. The nVidia drivers, on both Windows and Linux, seem to know about this. They refuse to even try driving the card at that timing. Ironically, the Xorg 'nv' driver doesn't pretend to be nearly as clever, and it seems to just work.
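For the arithmetic-inclined, the pixel clock falls straight out of the total frame size. Here's the sum in Python, using the totals from my monitor's preferred timing:

```python
# Total frame size including the "invisible" blanking, per the 243T's EDID.
h_total = 2080   # 1920 visible + sync/porch overhead
v_total = 1235   # 1200 visible + sync/porch overhead
refresh_hz = 60  # the only vertical rate my 243T accepts

# Pixel clock = total pixels per frame * frames per second.
pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
print(pixel_clock_mhz)  # ~154.1 MHz, just over the driver's 150 MHz cutoff
```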
I honestly don't know if the nVidia driver limitation is due to the card's TMDS capabilities or its assumptions about the capabilities of Single Link DVI cables. If anyone knows, do tell!
I'm trying to get my hands on a Dual Link DVI cable to try it, but it seems silly that I need one when I know that the current cable *can* work, since I've seen it!
Now some more technical mumbo-jumbo if you want to try some of this yourself under Linux. My monitor wants the following timing...
# Modeline reported by the Samsung SyncMaster 243T EDID...
# 1920x1200 @ 59.95 Hz (GTF) hsync: 74.04 kHz; pclk: 154.00 MHz
ModeLine "1920x1200" 154.00 1920 1968 2000 2080 1200 1203 1209 1235
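For completeness, here's roughly where that line lives in xorg.conf. (The section names are the stock Xorg ones; your Identifier strings and the rest of your config will obviously differ.)

```
Section "Monitor"
    Identifier "SyncMaster243T"
    # The EDID-reported native timing from above.
    ModeLine "1920x1200" 154.00 1920 1968 2000 2080 1200 1203 1209 1235
EndSection

Section "Screen"
    Identifier "Screen0"
    Monitor    "SyncMaster243T"
    SubSection "Display"
        Depth 24
        Modes "1920x1200"
    EndSubSection
EndSection
```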
I tried making up some timings that were under 150 MHz but still 1920x1200 visible. Here are a few of them. For me, they didn't work: the nVidia driver would actually accept them and drive the card, but the monitor wouldn't sync to them. Perhaps other monitors are more forgiving with their timing ranges.
# None of the funny (half-baked) modes below are loved by my 243T.
ModeLine "1920x1200" 162.00 1920 1984 2176 2480 1200 1201 1204 1250
ModeLine "1920x1200" 148.99 1920 1968 2000 2080 1200 1203 1209 1235
ModeLine "1920x1200" 148.20 1920 1936 1984 2000 1200 1203 1209 1235
ModeLine "1920x1200" 138.60 1920 1968 2000 2080 1200 1203 1209 1235
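Here's a little checker along the lines of what I was doing by hand, computing the refresh rate each experimental modeline actually produces. Note that only the third one lands on 60 Hz (and that one crams the sync pulse into a tiny blanking interval), which squares with my 60Hz-only, sync-picky monitor refusing them all:

```python
# (pixel clock in MHz, h_total, v_total) from each experimental modeline above.
modes = [
    (162.00, 2480, 1250),
    (148.99, 2080, 1235),
    (148.20, 2000, 1235),
    (138.60, 2080, 1235),
]

for clock_mhz, h_total, v_total in modes:
    # Refresh rate = pixel clock / total pixels per frame.
    refresh = clock_mhz * 1e6 / (h_total * v_total)
    print(f"{clock_mhz:6.2f} MHz over {h_total}x{v_total} -> {refresh:5.2f} Hz")
```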
So, anywho, I feel kinda stuck. Maybe one of those modes above helps you. For me, I wonder the following...
2005/06/18 9:30pm - Ok, I've come back to edit this message with my newer understanding of things, based on what I learned from all the great comments and support from others here on the boards. Thanks for the help everybody! I think I can answer some of my own questions now, so I'm updating this message for posterity's sake.
o Is the problem with the nVidia drivers fundamentally a DVI Single Link bandwidth issue or a TMDS issue with my card?
It seems most likely that the problem is a limit in my graphics card. Single Link DVI is actually rated up to 165 MHz (according to this SiliconImage white paper), and since my monitor's preferred (only!) 1920x1200 mode is 154 MHz, it should be able to work with the current cabling. What's more, the 150 MHz limit that I was seeing is most likely imposed by the driver based on its understanding of the limits of my graphics card, not the DVI cabling. Other people (running the more advanced (and more expensive!) 6800 cards) have seen the driver limit them to 165 MHz instead of the 150 MHz limit that I see.
o Is there a way to tell the nVidia drivers to "just do it"? Will that work? It seems to work with the 'nv' driver...
I have not been able to figure out a way to tell the nVidia 7664 build driver to "just do it". *But* the 7174 build and the Xorg 'nv' driver both seem to lack the offending checking code, so they can be made to work.
o Is anyone else able to use an nVidia driver at a pixel clock above 150 MHz on Single Link DVI?
Yes. See above. Additionally, Single Link DVI is really only rated for 1600x1200 with CRT-sized sync timings. *But* with reduced-blanking (smaller sync profile) timings, Single Link DVI can drive 1920x1200 or 1920x1080 (HDTV-like) resolutions.
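To put numbers on that: a CRT-style GTF timing for 1920x1200@60 needs roughly 193 MHz, which blows Single Link's 165 MHz budget, while the reduced-blanking timing squeaks under. A quick sketch (the GTF frame totals are what the `gtf 1920 1200 60` tool reports for me; the reduced-blanking totals are from my monitor's EDID):

```python
# Total frame sizes (h_total, v_total) for 1920x1200 @ 60 Hz
# under two different blanking schemes.
gtf_total = (2592, 1242)  # CRT-style blanking, per `gtf 1920 1200 60`
rb_total  = (2080, 1235)  # reduced blanking, per the 243T's EDID

for name, (h, v) in [("GTF", gtf_total), ("reduced", rb_total)]:
    clock_mhz = h * v * 60 / 1e6
    fits = clock_mhz <= 165.0  # Single Link DVI's rated TMDS clock limit
    print(f"{name:8s}: {clock_mhz:6.2f} MHz, fits Single Link: {fits}")
```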
o Even if I could get this to work under Linux, how do you do equivalent timing tweaking under Windows?
I saw recommendations to look into a tool called PowerStrip. I haven't tried it myself...yet.
o Would a Dual Link cable just solve all my problems? (If so, why the heck didn't Samsung just ship one in the box with this product that is obviously intended to run at 1920x1200!?!?)
Unlikely. Given the supposition above that the problem is really the capabilities of my graphics card's TMDS signalling chip, a better cable would probably have no effect whatsoever.
Any of you big brained people out there willing to help clue me in?