05-08-08, 06:47 AM   #1
bj1
Registered User
 
Join Date: May 2008
Posts: 2
GeForce FX 5200 & Dual Monitors

Hi All,

Sorry that my first post is a "help me!", but I've searched through these forums for an hour and haven't found a similar scenario with an answer.

I have a computer that I had built for me a number of years ago, and I asked for a video card that I would be able to run 2 monitors on once I could eventually afford a 3rd monitor (for my wife's computer).

That ended up being about 3 years later (i.e. now), and I now have two LG 1750SQ monitors.

The card is a GeForce FX 5200, which has a VGA out, an S-Video out, and a DVI out. My computer at work has the same outputs (different card, though) and uses the VGA and the DVI to run 2 monitors. I got a spare adapter and brought it home, only to discover it didn't plug into my FX 5200, since the FX 5200 has a DVI-D out, as opposed to the DVI-A outs at work.

So I got a DVI-D -> VGA adapter off eBay; it arrived today, and I've spent the past 4 hours trying to get it to work.

I have:
- Looked for a 'run 2 monitors' option in the BIOS
- Downloaded & installed the latest ForceWare drivers, version 169.21
- Played around with the nView software, trying to get it to work.

What happens:
- the 2nd monitor (plugged into the DVI-D -> VGA adapter) gets no signal, nor is it detected as either a monitor or a TV by the nVidia software.
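In case it helps with diagnosis, below is a rough Python/ctypes sketch (assuming Windows; the attached-to-desktop flag check is just illustrative) that lists every display device Windows itself enumerates, so I can see whether the second head shows up at all outside the nVidia panel:

# List the display devices Windows reports, independent of the nView panel.
# Rough diagnostic sketch - assumes Windows and Python with ctypes available.
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001

def list_display_devices():
    """Print each display device and whether it is attached to the desktop."""
    i = 0
    while True:
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(dev)
        if not ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
            break
        attached = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
        print("%s: %s (attached=%s)" % (dev.DeviceName, dev.DeviceString, attached))
        i += 1

if __name__ == "__main__":
    list_display_devices()

If the adapter-connected head never appears in that list (or never shows as attached), the card/adapter combination probably isn't presenting a second output to Windows at all, rather than it being an nView configuration issue.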

So, I'm prepared for an "It just won't work", in which case I'll get a different card. But I did specify that I wanted to be able to run 2 monitors when I got the computer, and the people who built it are pretty switched on; they also did our computers at work, which all work fine.

Any ideas, nVidia experts? Do I need to post any more information?

Thanks in advance.