
Card BIOS fails to set correct sync values on POST


skwid
06-25-05, 01:57 AM
I own an SGI Multilink and an SGI 1600SW flat panel, and have been running them through an analog-to-DVI cable with various cards for some time.

Recently I tried a DVI-to-DVI connection with my 6600 GT and found that the card BIOS does not set up the sync values properly for the display device.

The display appears to be drawn beyond the width of the framebuffer, and artifacts appear on the left side. Once the system boots far enough to set a different display mode (changing the sync values), the screen displays properly. However, it can be unstable at certain resolutions.
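To illustrate what I think is happening (hypothetical numbers, not the 1600SW's actual timings): the panel has a fixed raster width, so if the BIOS programs an active width larger than that, the extra columns have nowhere to go. A quick sketch in Python:

# Hypothetical sketch of the overflow; these are illustrative values,
# not anything measured from my card. The 1600SW is a fixed 1600x1024
# panel; suppose the card BIOS programs a wider active area at POST.
panel_width  = 1600   # fixed native horizontal resolution of the panel
bios_hactive = 1688   # hypothetical wrong active width set by the BIOS
overflow = bios_hactive - panel_width
print(f"{overflow} pixel columns drawn beyond the panel edge")  # -> 88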

SGI has posted more information about this issue here (http://www.sgi.com/products/legacy/multilink/nvidia_faq.html).

Can this be fixed? I have not seen my card's manufacturer supply any kind of card BIOS update for this problem. Could this be a limitation of the GF6 chipset?

subbo
06-25-05, 08:15 AM
Or could it be that the SGI display doesn't properly support the POST-level resolution and 70 Hz refresh rate? Some Apple displays are known to only show a picture once you reach the desktop, as they don't support old-school PC resolutions.

My TFT supports DOS resolutions but not the refresh rate, so it's all OK during bootup and in full-screen DOS, but play a game like Doom (your avatar) that wants to run at 75 Hz and it will be all stuttery and jumpy because the display forces the mode to 60 Hz. Which, incidentally, they didn't bother to fix for the Xbox versions of Doom 1 and 2 that come with the Xbox version of Doom 3. They too stutter about twice a second because the engine is synced to 75 Hz and the TV only does 60 Hz.

skwid
06-26-05, 07:09 PM
Again, when the card is using the analog 'VGA' connector it displays every mode properly; the Multilink module will scale the standard VESA modes. The problem is when the digital 'DVI' connection is used: the card does not set proper sync values on POST. I get artifacts because the card is drawing beyond the right side of the monitor.

I believe the reason for this is that the card is not communicating with the monitor properly during POST the way it does over the analog connector. Could this mean the DVI protocol used is an older standard not supported by the nVidia chipset? Could I hack the video BIOS ROM to set the defaults to my monitor's values?
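For reference, the sync values the BIOS ought to program are all in the panel's EDID. Here is a rough sketch that pulls the native timing out of an EDID dump so it can be compared against whatever the card sets at POST. It assumes a raw EDID blob in a file I'm calling edid.bin (dumped with something like get-edid from the read-edid package); the byte offsets follow the standard EDID detailed timing descriptor layout.

# Parse the first detailed timing descriptor (offset 54) of an EDID
# blob and print the timing the panel actually asks for.
def parse_dtd(edid, off=54):
    d = edid[off:off + 18]
    pclk_khz  = (d[0] | d[1] << 8) * 10          # pixel clock, stored in 10 kHz units
    hactive   = d[2]  | (d[4]  & 0xF0) << 4
    hblank    = d[3]  | (d[4]  & 0x0F) << 8
    vactive   = d[5]  | (d[7]  & 0xF0) << 4
    vblank    = d[6]  | (d[7]  & 0x0F) << 8
    hsync_off = d[8]  | (d[11] & 0xC0) << 2      # front porch
    hsync_w   = d[9]  | (d[11] & 0x30) << 4      # sync pulse width
    vsync_off = d[10] >> 4   | (d[11] & 0x0C) << 2
    vsync_w   = d[10] & 0x0F | (d[11] & 0x03) << 4
    return dict(pclk_khz=pclk_khz,
                hactive=hactive, htotal=hactive + hblank,
                hsync_start=hactive + hsync_off,
                hsync_end=hactive + hsync_off + hsync_w,
                vactive=vactive, vtotal=vactive + vblank,
                vsync_start=vactive + vsync_off,
                vsync_end=vactive + vsync_off + vsync_w)

edid = open("edid.bin", "rb").read()
print(parse_dtd(edid))

On my 1600SW that should come back as a 1600x1024 mode; if the mode the card programs at POST differs from those numbers, that would confirm the BIOS is ignoring the EDID on the DVI path.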

subbo
06-26-05, 09:04 PM
It could also be down to the fact that the DVI "chip" is no longer a separate quality component on the PCB but is integrated into the GPU. I had, and still have, some "blue noise" problems with my GT at the black end of the signal, but later drivers fixed it almost completely (obviously some sort of adjustment was made to the DVI output at the driver level), and it only ever appeared once the driver was loaded.

You could be right; the DVI "emulation" just isn't up to spec like it used to be on the GF4 and FX series with a dedicated chip.