08-11-03, 02:25 PM   #163
nutiu

Xavyer,

Thanks for more theory on why the flickering is happening. A ground loop is definitely on the list of possible causes.

It seems that if they don't decouple the power coming in from the Molex connector from the power provided by the AGP slot, a ground loop is inevitable. I doubt nvidia has decoupled the supplies (isolating the card's ground completely from the PC's ground), because the NV35 ASIC consumes so much power that designing the power supply that way would be both inefficient and expensive. It seems we will have to wait for PCI-X or whatever standard finally allows the slot to supply 100+ watts to the peripheral card.
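To put a rough number on why that matters (just a back-of-envelope sketch; the current draw and wiring resistance below are assumed illustrative figures, not measurements): if the card returns, say, 5 A of its supply current through a Molex ground path with about 10 mΩ of wiring resistance, Ohm's law gives

V = I x R = 5 A x 0.010 Ω = 50 mV

of offset between that ground and the ground seen through the AGP slot. Against the 0.7 V swing of an analog video signal, tens of millivolts of ground wander is easily enough to show up on screen.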

I cannot see any flicker in 2-D mode either, except when I do that autodetect thing. I didn't find this out myself; I read on this forum that if you want to see what the "flickering" problem looks like, you should run autodetect from the clock adjustment menu. In regular 2-D use, my FX5900 Ultra puts out solid output. But at 1600x1200, with Nokia's monitor testing utility available right here at NVnews, I see some "flicker" whose source I cannot trace. It may be my monitor (Sony G-520, not the best in the world) or the FX.

The autodetect button is grayed out in 2-D clock adjustment mode, at least on the Detonator 45.20 drivers I am using.

A little story about nvidia and ATI that you may find interesting (or perhaps you already know it):

The ATI team responsible for the R9700 onward is not the old ATI that has been around for so long.

I almost joined SGI's AGD (Advanced Graphics Division) in late '97, after I finished school. At that time they were working on "Bali," the follow-up to the famous InfiniteReality series of graphics supercomputers. One of the strengths of Bali was to be floating-point color representation, just like what we have on the FX and ATI's 9x00 series today.

By late '97 SGI had been in trouble for quite some time, so I decided to turn their job offer down, but I did get to know some SGI ASIC designers. Most of SGI's AGD staff was laid off in mid-'99, and these engineers with super-duper graphics experience went three routes: some went to nvidia, some went on to found ArtX (responsible for the graphics ASIC in Nintendo's GameCube), and some left the graphics industry completely to work in telecom/networking (a wise choice back in '99).

ArtX was based in Silicon Valley when it was acquired by ATI. So when the 9700 was announced, I was not surprised that it was so good; after all, it was designed by top-notch ex-SGI engineers at ArtX. The only missing "feature" was a robust Windows driver, and it took their team some time to reach driver maturity comparable with nvidia's, who have been doing PC-based products for much longer than they have.

I believe it will be interesting to see what comes out of the nvidia vs. ATI battle. Each has its strengths and weaknesses, and it's possible that the "winner" won't win on technical merit alone. Good marketing and cash management can get a company very far in this economic downturn.