Originally posted by ZenOps
Anything less than a 50 percent yield is nothing to be proud of; it would be shameful for a CPU maker like Intel or AMD to release a chip with a yield that low.
If ATi is getting around 90 to 95 percent yield on the very mature .15 micron process, Nvidia would have to get at least 75 percent on .13 to end up with the same number of working chips per wafer (the feature size is, after all, only 0.02 microns smaller). At the end of the day, it comes down to how many wafers you end up throwing in the garbage.
With a .15 process that is mass-manufacturable, you could easily get 30 times more chips than on .13 when demand kicks in. (I'm not sure of the real numbers, but 30 percent of TSMC's capacity on .15 would be reasonable, while <1 percent on .13 looks positively anemic.) Mass production = less expensive, always has been.
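To make the chips-per-wafer argument above concrete, here is a rough sketch of the arithmetic, using the standard dies-per-wafer approximation. The wafer size (200 mm) and die areas are my own illustrative assumptions, not figures from the post; only the yield percentages come from the discussion above.

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    # Common approximation: gross dies = wafer area / die area,
    # minus an edge-loss term proportional to the wafer circumference.
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def good_dies(wafer_diameter_mm, die_area_mm2, yield_fraction):
    # Working chips per wafer = gross dies scaled by the process yield.
    return int(dies_per_wafer(wafer_diameter_mm, die_area_mm2) * yield_fraction)

# Hypothetical numbers for illustration only: 200 mm wafer,
# ~215 mm^2 die on .15 vs ~200 mm^2 on .13 (the shrink is small),
# with the yields from the post.
print(good_dies(200, 215, 0.90))  # mature .15 process
print(good_dies(200, 200, 0.75))  # newer .13 process
```

With these made-up die sizes, the small .13 shrink buys only a handful of extra gross dies per wafer, so the lower yield nearly cancels it out, which is the point being made: the shrink alone doesn't compensate for immature yields.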
One NV30 on the shelf for every 30 Radeon 9700s... As I mentioned elsewhere, even though *a lot* of Radeon 9700 Pros were made in the first run, they have already sold out.
DDR2 is *very* expensive and rare, and from what I have heard the NV30 requires it.
There is no doubt in my mind that the NV30, if released as currently specified, will easily cost $100 more than the Radeon 9700 Pro.
You're forgetting that at the speeds required for a high end vidcard, even the Radeon 9700 is a low-yield part, especially with so many transistors on the die.
So can the 'if'...they aren't.
Also, hard as it is for us to believe, high end vidcards aren't actually in high demand. While that is undoubtedly due to the prices, you can't both beat the competition and worry too much about price.