
View Full Version : no dual-dvi nv35 5900 model?


imemine
06-01-03, 09:07 PM
It's the strangest thing. Why is it that cards with dual-monitor support don't just ship with dual DVI outright, so anyone can always step down to the analog 15-pin with an adapter? With LCDs dropping under $500 for 17- and 18-inch panels, wouldn't anyone running two monitors opt for two LCDs, for the desk space saved, not to mention less eyestrain, heat dissipation, and electricity? I have a Gainward 4600 with dual DVI, and I figured at the time that future NVIDIA reference designs would include it. Now the 5900 is about to drop within a month, I've seen seven vendors' NV35 models, and every one offers one digital and one analog output. God knows, if you have two DVI LCDs it's hard to live with one sharp screen and one looking analog-crinkly.
Is it that it's easier at first to release the initial reference design board so you're the first vendor out of the gate, and then within a couple of months they follow up with a higher-end dual-DVI model? I'm hoping Gainward will release one, or ASUS, but nobody has released a high-end dual-DVI card since the Gainward 4600 AGP 4x, 700 Golden Sample, have they?
I eagerly await the first news clip announcing that Ultra dual-DVI NV35 card. Alas.

Solomon
06-01-03, 09:11 PM
Originally posted by imemine
its the strangest thing. why is it that all cards with dual monitor support just dont outright have dual-dvi support so that someone can always downgrade to the analog 15-pin with the adapter? ... god knows if you have two dvi lcds its hard to live with one sharp and one looking analog crinklely.

Man, I hear ya... You would think their top-of-the-line models would get dual DVI. It kinda pisses me off that they don't. I mean, you're already spending $499.99 for it, so tack on a measly $3.00 for the second DVI connector on the PCB. Hell, all the retail boxes already come with a DVI-to-VGA adapter, so why not include two?

I'm waiting for ATi or nVidia to step it up. I have the goods! Why can't they deliver? :p

http://www.*********.com/temp/split.jpg

They have it on their workstation cards. I don't understand why they don't at least do this for their top-of-the-line consumer versions.

Regards,
D. Solomon Jr.
*********.com

Solomon
06-01-03, 09:14 PM
Originally posted by imemine
God knows if you have two dvi lcds its hard to live with one sharp and one looking analog crinklely.

I'm glad I'm not the only one who notices DVI vs. analog. It's like night and day, isn't it? I was using a Radeon 9700 Pro, but damn, having one sharp DVI screen and one with "built-in FSAA" on the analog side was just annoying... It sucks, and I'm with you. Step up, ATi or nVidia. It's a market you could capitalize on.

Regards,
D. Solomon Jr.
*********.com

omv
06-02-03, 11:31 AM
I think the DVI / analog difference depends on how good the monitor's ADCs and the video card's RAMDAC are. With my T210 on a GF4 the differences are subtle; the T210 on a 9500 Pro gives me slightly worse quality.

However, when the noise on the DVI link gets high enough to corrupt the signal, it looks pretty bad. My 9500 Pro would throw random green sparkles into the image when running 3D. Very distracting.

Here's a good test pattern ftp://ftp.pc.ibm.com/pub/pccbbs/visuals/testpat.exe

Using the analog connection, I get 1-bit noise in the middle section; on a high-end CRT at max resolution it just looks like grey. Over DVI to an LCD I can clearly see the distinct alternating bars.
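A homebrew version of that alternating-bar pattern is easy to generate; the short script below is my own sketch, not the IBM testpat tool linked above. It writes a binary PGM of vertical 1-pixel black/white bars: on a clean DVI link the bars stay crisply separated, while through a marginal analog VGA path they tend to smear toward uniform grey.

```python
# Generate a 1-pixel alternating-bar test pattern as a binary PGM file.
# P5 = binary greyscale PGM; one byte per pixel, 0 = black, 255 = white.

WIDTH, HEIGHT = 640, 480

def make_pattern(width=WIDTH, height=HEIGHT):
    header = b"P5\n%d %d\n255\n" % (width, height)
    # Every other column toggles between black and white.
    row = bytes(255 if x % 2 else 0 for x in range(width))
    return header + row * height

if __name__ == "__main__":
    with open("testpat.pgm", "wb") as f:
        f.write(make_pattern())
```

Most image viewers can open the resulting testpat.pgm full-screen; compare the middle of the image on each monitor.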

ricercar
06-02-03, 04:15 PM
waiting for ATi or nVidia to step it up. ... They have [dual DVI] on their workstation cards. Don't understand why they don't at least do this for their top of the line consumer version....Step up ATi or nVidia. It's a market you could capitalize

Port configuration of video cards is the decision of the board manufacturers, not the chip maker. For what it's worth, ASUS has released dual-DVI consumer card models for GeForce4 MX and Ti4600; they're likely to continue with FX cards, eventually.

Chip makers such as NVIDIA and <ATI's-chipmaking-division> don't make the decision regarding ports on the actual boards. Both companies' graphics chips offer support for dual DVI, even with their consumer versions. Both companies allow for such a configuration in their reference designs.

Board makers like <ATI's-boardmaking-division>, Sapphire, Tyan, ASUS, Leadtek, MSI, Visiontek, etc. are the companies that decide which ports go on their cards. (Note that NVIDIA itself was a board maker only for the FX 5800 Ultra; all other NVIDIA-designed boards have been manufactured by third parties.)

So for best results, direct your ire and email to board manufacturers, not the chip makers.

imemine
06-02-03, 10:41 PM
Really, I was wondering that myself: did NVIDIA include dual DVI in the reference design and let the manufacturers choose whether to use it?
Because I figured that if NVIDIA really had included dual DVI in the reference design, they would show it on the reference boards at the initial international launch, in the ads and so on. Why spend hundreds of millions on design and then show the world a board with one VGA and one DVI, skimping on a $10 part that would have made it a dual-DVI card and wowed the world a bit more?
It's just weird. Why not spend a few dollars more and beat the competition to it, especially if it were in the original reference design anyway...
I'm using two VX-2000s with a dual-DVI Gainward 4600, with the silent Zalman fanless heatsink letting me clock it to 725/325 with no problems, on a P4C800 board with 1024MB of PC3700 and a 2.4 HT overclocked to 3.02 under an SLK-900, plus two SATA WD Raptors. I'm just itching for a dual-DVI 5900 to complete it...

omv
06-02-03, 11:02 PM
Spinning a new board from an existing design is pretty cheap (<<$10k) compared to creating a new board design (>>$100k). NVIDIA probably has separate board layouts for DVI/VGA and dual DVI.
As for why manufacturers don't just do dual DVI: they have to add a DVO-to-DVI transmitter chip for each DVI port, so there is extra expense, and in my experience board designers pinch whatever pennies they can, even when it seems odd to me. There just aren't enough dual-LCD people to make a noticeable market-share impact for them. In fact, I'm surprised the board designers don't go single-VGA or single-DVI to cut costs even further.

NOTE: The FX 5200 and FX 5600 drive a DVI port right out of the chip and don't need the external transmitter chips (ATI is doing this as well). However, I don't think they have two integrated DVI transmitters.

DSC
06-02-03, 11:40 PM
Asus has Dual DVI 5200 and 5600 cards.... would they step up to save everyone with a dual DVI 5900? :p :D

Deimos
06-03-03, 02:45 AM
Originally posted by omv
I think the DVI / analog difference depends on how good the monitor's ADC's are & the video card's ramdac.

Don't forget the cable connecting the two.

I'd kill/frag for a dual-dvi 5900U :)

/Deimos

wilson
06-03-03, 08:35 AM
Originally posted by Deimos
Don't forget the cable connecting the two.

I'd kill/frag for a dual-dvi 5900U :)

/Deimos

Actually, you should rob for a dual-dvi 5900U. It's called the Quadro FX 3000.

ricercar
06-03-03, 12:44 PM
Originally posted by omv
The FX 5200 and FX 5600 have a DVI port right out of the chip, and don't need the external adapters chips

Every NVIDIA GPU since the NV17 has dual-link TMDS/LVDS transmitters, supporting 18-bit or 24-bit single-pixel or dual-pixel mode panels. Does this mean no external logic is required for dual DVI displays?
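As background on the single- vs dual-pixel modes mentioned above: TMDS serializes each 8-bit byte into 10 bits on three data channels, and a dual-pixel interface moves two pixels per link clock, halving the clock the transmitter must sustain. A rough sketch of the arithmetic (illustrative numbers, not vendor specs):

```python
# Back-of-envelope TMDS numbers for the transmitters discussed above.
# Single-link DVI carries one pixel per clock on 3 data channels, each
# encoded 8b -> 10b, so the serial bit rate per channel is 10x the clock.
# In dual-pixel operation, two pixels move per clock, halving the clock.

def tmds_bit_rate_mbps(link_clock_mhz):
    """Per-channel serial bit rate: TMDS sends 10 bits per 8-bit byte."""
    return link_clock_mhz * 10

def required_link_clock_mhz(pixel_clock_mhz, pixels_per_clock=1):
    """Clock the panel interface must run at for a given mode."""
    return pixel_clock_mhz / pixels_per_clock

# A 165 MHz single-link runs 1.65 Gbit/s per channel; a 162 MHz mode
# driven in dual-pixel mode needs only an 81 MHz interface clock.
```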

GlowStick
06-03-03, 02:39 PM
Originally posted by wilson
Actually, you should rob for a dual-dvi 5900U. It's called the Quadro FX 3000.

Ha, good one! :p

wilson
06-03-03, 03:33 PM
Originally posted by ricercar
Every NVIDIA GPU since the NV17 has dual-link TMDS/LVDS transmitters, supporting 18-bit or 24-bit single-pixel or dual-pixel mode panels. Does this mean no external logic is required for dual DVI displays?

The quality of the on-chip transmitters is inconsistent: some GPUs are good to 165MHz and beyond, some fall short of it. The external Silicon Image transmitters are validated to 165MHz. If you're pushing close to a 165MHz pixel clock, read the spec before you buy.

--wilson
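The 165MHz figure above is easy to check against a given mode: the pixel clock is the total frame size including blanking intervals times the refresh rate. A quick sketch (the htotal/vtotal figures in the comments are the standard VESA DMT totals for those modes):

```python
# Check a display mode against the 165 MHz single-link TMDS limit.
# Pixel clock = total pixels per frame (including blanking) x refresh rate.

SINGLE_LINK_LIMIT_MHZ = 165.0

def pixel_clock_mhz(htotal, vtotal, refresh_hz):
    return htotal * vtotal * refresh_hz / 1e6

def fits_single_link(htotal, vtotal, refresh_hz):
    return pixel_clock_mhz(htotal, vtotal, refresh_hz) <= SINGLE_LINK_LIMIT_MHZ

# 1280x1024@60 (DMT totals 1688x1066): ~108 MHz -- plenty of margin.
# 1600x1200@60 (DMT totals 2160x1250): 162 MHz -- right at the edge, which
# is exactly the "read the spec before you buy" situation.
```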

imemine
06-07-03, 03:25 AM
someone just wake me up, or email me, when someone finally announces a dual-dvi 5900U.....yawn.....thanks!