
View Full Version : GeForce FX mainstream and 0.15u



GamblerFEXonlin
03-04-03, 04:41 AM
http://www.nvnews.net/#1046754998


The 0.13u aim really blew up in their face, it seems. They aimed for extreme clock rates to live up to their 0.13u hype and get back the "speed king" status, but had to design a central vacuum cleaner (fresh air from outside is very good cooling) to achieve it, and now yields turn out to be too low.

I can only guess they didn't go for a mainstream 0.15u FX card before the great FX ULTRA came out, since it would disappoint.


The excerpt volt provided (http://jooh.no/gf3/15u.mp3) is very interesting (and convenient). It's funny too, though; he mentions things like:

00:54
"dissapointed with how long it took to get FX to market"
no kidding.

01:02
"arms around 0.13 challenge"
and a challenge it was; even nvidia couldn't make it. I'm glad they tried that leap, though.

01:05
"clever things we did, was dedicated our lowest end GFFX, mainstram, that is likely to have an opportunity of tens millions a year, target that chip to 0.15u because of huge amounts of capacity"
Wow, great idea! ATI are good at keeping market secrets to themselves, so they get a head start of over half a year, and nvidia didn't get it until now. Why didn't we see a 0.15u FX until now? Well, because then they'd lose face. They've lost more today.

01:40
"0.15u on bottom, 0.13u on top, I think, I think that makes sence"
sounds like a politician. he goes on:

"you want your highest performance technology to be the highest performance technology..."
omg, here he preaches to the converted.

"...and mainstream will require millions of millions of chips ... in a very short period of time, my expectations is we will have a tenfold advantage when it comes to number of units and marketplace"
Borrowing from ATI's recipe for success while they themselves are lying down; good that they've learned?



Don't you love marketing monkeys? Are "the people" really so stupid that he feels comfortable saying this? A pity.

Call me biased, but there is no mercy here. When they fail and then pull this "damage control", I get pissed off; it's so cheap. The incredibly noisy "cooling solution" pisses me off even more, I hate noisy computers. High-poly tech demos that have nothing to do with real games also piss me off.


I wish everybody were more critical of hype, marketing-monkey speeches and companies' damage control.


If only they could make a watercool-ready GFFX (with both RAM and GPU under the same aluminium block or something); I'm not confident in ATI's drivers.

Lezmaka
03-04-03, 04:46 AM
There are already a few threads over in the rumor mill with the same info, but thanks for trying.

volt
03-04-03, 07:01 AM
There is no actual thread on this, so you guys can continue :)

Evildeus
03-04-03, 07:22 AM
Originally posted by GamblerFEXonlin
01:05
"clever things we did, was dedicated our lowest end GFFX, mainstream, that is likely to have an opportunity of tens millions a year, target that chip to 0.15u because of huge amounts of capacity"
Wow, great idea! ATI are good at keeping market secrets to themselves, so they get a head start of over half a year, and nvidia didn't get it until now. Why didn't we see a 0.15u FX until now? Well, because then they'd lose face. They've lost more today.

01:40
"0.15u on bottom, 0.13u on top, I think, I think that makes sense"
Sounds like a politician. He goes on:

"you want your highest performance technology to be the highest performance technology..."
omg, here he preaches to the converted.

"...and mainstream will require millions of millions of chips ... in a very short period of time, my expectations is we will have a tenfold advantage when it comes to number of units and marketplace"
Borrowing from ATI's recipe for success while they themselves are lying down; good that they've learned?
Yes, ATI has done a good job on the R300 part, whereas Nvidia did a poor one on NV30.

But as for the mainstream, perhaps ATI is making the wrong move today (i.e. going to 0.13) and Nvidia the right one. All this depends on the quantity of chips TSMC can deliver using each of these processes.

If 0.15 capacity >> 0.13 capacity, then Nvidia takes a point.
If 0.13 capacity >= 0.15 capacity, then ATI got it.

Knowing that this is the part of the market where the money is, it's a really important move.

On second thought, NV34 is in the same market as the 9000-9200 parts, isn't it? All 0.15 parts, so never mind, Nvidia is still behind :D
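
To put that capacity argument into rough numbers, here is a back-of-the-envelope sketch (Python). Every input below - wafer size, die areas, defect densities - is invented for illustration; none of them are real NV3x or RV3xx figures:

import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    # Classic approximation: gross dies minus edge losses.
    radius = wafer_diameter_mm / 2.0
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

def yield_rate(die_area_mm2, defects_per_mm2):
    # Simple Poisson defect model; a mature process has a lower defect density.
    return math.exp(-die_area_mm2 * defects_per_mm2)

def good_dies(wafer_diameter_mm, die_area_mm2, defects_per_mm2):
    return dies_per_wafer(wafer_diameter_mm, die_area_mm2) * yield_rate(die_area_mm2, defects_per_mm2)

# Same hypothetical mainstream chip, built on two processes.
mature_015 = good_dies(200, die_area_mm2=120, defects_per_mm2=0.004)  # mature 0.15u line
young_013 = good_dies(200, die_area_mm2=90, defects_per_mm2=0.012)    # young 0.13u line

print("good dies per wafer, 0.15u: %.0f" % mature_015)
print("good dies per wafer, 0.13u: %.0f" % young_013)

With these made-up numbers the mature 0.15u line delivers more good chips per wafer than the young 0.13u shrink despite the bigger die, which is exactly the "huge amounts of capacity" point: whoever can book more wafer starts at a decent yield wins the volume game.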

jbirney
03-04-03, 07:50 AM
"clever things we did, was dedicated our lowest end GFFX, mainstram, that is likely to have an opportunity of tens millions a year, target that chip to 0.15u because of huge amounts of capacity"

What I find highly ironic is that ATI has gone the other way. They are basing their mainstream parts, the ones with the highest volumes, on the new .13u process and the high end on the older tech. I'm also not sure why ATI seems not to have much trouble with the RV350 on .13u, while nV sure was scared enough to move their NV31/34 to the older process. Things that make you go hmmmmm.

borntosoul
03-04-03, 08:24 AM
It makes sense to me that ATI got their mainstream chip on .13 rather than on .15. I thought nV would've used .13 across the board; it seems logical to use the best technology for the card that sells the most and makes the most money. Just a thought.

Richthofen
03-04-03, 08:29 AM
Well, the question is: did the CEO clearly say that NV31 is on 0.15?

Or is the mainstream/low-cost part he is talking about the NV34?
As far as I understand this, he is talking about NV34.
If that is true, this is not bad news at all.
The point is there is no similar product from ATI where NV34 is concerned.
NV34 will be DX9 but absolutely low cost.
It will have about 45 million transistors.
Pretty comparable to the XABRE chips, but from Nvidia this time.
That would mean pretty high volume. Given that, and the current situation on the manufacturing side, 0.15 for NV34 would not be bad at all.
Demand for 0.15 will fall over the next quarters, but demand for 0.13 will rise.
In the end that means reduced prices on all 0.15 wafers.
With only 45 million transistors this chip is still pretty cheap on 0.15, and they need very high volume.
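
A rough sketch of that cost argument (Python again; the wafer prices, die counts and yields below are hypothetical placeholders, not actual TSMC or NV34 numbers):

def cost_per_good_die(wafer_price_usd, dies_per_wafer, yield_rate):
    # Wafer cost spread over the dies that actually work.
    return wafer_price_usd / (dies_per_wafer * yield_rate)

# Falling demand for 0.15u -> cheaper wafers, mature yield.
small_die_015 = cost_per_good_die(wafer_price_usd=1500, dies_per_wafer=220, yield_rate=0.85)
# Newer 0.13u capacity: pricier wafers, yield still ramping.
small_die_013 = cost_per_good_die(wafer_price_usd=2500, dies_per_wafer=300, yield_rate=0.55)

print("0.15u: $%.2f per good die" % small_die_015)  # about $8 with these inputs
print("0.13u: $%.2f per good die" % small_die_013)  # about $15 with these inputs

If the cheap, mature process wins on cost per good die, a low-transistor-count chip like NV34 is exactly the kind of part you leave on 0.15.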

We can't compare RV350 or NV31 to NV34 because NV34 has a lot fewer transistors. We could compare RV280 to NV34, but RV280 is DX8 only, while NV34 will be DX9, no matter how they achieve that.
I think SiS, Trident and so on did a good job, giving Nvidia an example of how to build an up-to-date product feature-wise with only a few transistors.
In the end this will hurt SiS a lot, because NV34 will be cheap too and Nvidia still sells better than SiS.
ATI, on the other hand, has no product comparable to NV34 this time.

We still have to wait and see, but if he is talking about NV34 only, this is not bad news.
If he is talking about NV31 too, then it is bad news.

Uttar
03-04-03, 08:49 AM
The 0.13 NV31 is to compete with the 0.13 RV350, so I don't quite understand what Jen-Hsun Huang wants to say there...
The RV280 is 0.15, and is supposed to compete with the 0.15 NV34.

IMO, nVidia's plan two months ago was:
NV30->R300 - On Par
NV31->RV350 - On Par
NV34->RV280 - Win

The only problem?
Well, this is what is likely going to happen:

NV30->R350 - Lose
NV31->RV350 - Lose
NV34->RV280 - On Par

But they've still kinda "won" with the NV34, because OEMs would love to show DX9 everywhere. Even though it's only "DX9 compliant".
Really, the NV34 is not "DX9 Compatible" - because it doesn't support hardware VS. And maybe even some other things, who knows :(


Uttar

gstanford
03-04-03, 09:00 AM
What you meant to say, Uttar, is that NV34 is DX9 compatible only, not DX9 compliant (compliance implies hardware capable of DX support; compatible is software-enabled).

NV34 should be a success for nVidia. They will have a cheap part on a cheap process, won't have to waste 0.13 micron capacity on NV34 while production ramps, and NV34 will become cheaper again when it does migrate to 0.13 micron.

It will also be selling to the mainstream/lowend market which is IMO more important to hold than the highend.

Joe DeFuria
03-04-03, 09:39 AM
Originally posted by gstanford
It will also be selling to the mainstream/lowend market which is IMO more important to hold than the highend.

IMO, you can't simply "separate" the low-end from the high-end markets. They are deeply intertwined.

Typically, the "high end" products get all the press. That's what gets the customers excited. Products like the Radeon 9500-9800 and the Geforce FX 5800 and higher. Whoever wins those battles tends to win the mind-share battle. REGARDLESS of how good the lower end products are.

Assuming the ATI products win the high-end battles this round (which is a good assumption, IMO), the result is, you have customers "demanding ATI / Radeons", and that's what drives sales of the value parts.

The Radeon 9000 is CLEARLY a superior product vs. the GeForce4 MX. But the 9000 really didn't start taking off until the GeForce FX was unveiled and met with criticism, and everyone realized how late it was, which damaged the GeForce brand IMO. The 9700 and 9500 products continued to blow the reviewers and public away, and while OEMs added those products to replace the GeForce4 Ti because of demand....at the same time they began to replace the GeForce4 MX with the 9000 to carry the "complete Radeon line-up."

So again, while it is indeed very important to have a good grasp on the low-end market, it's just not the low-end product itself that does that. It's customer demand, and that is influenced by the demand / success of the higher-end products.

Assuming the 9200 is COST competitive with the NV34, I see a tough road for the NV34....

Uttar
03-04-03, 09:45 AM
Originally posted by gstanford
What you meant to say Uttar is that NV34 is DX9 compatible ony, not DX9 compliant (compliance implies hardware capable of DX support, compatible is software enabled).

Yes, sorry, my mistake.

Joe: Agreed. But the GeForce is still an excellent brand. It's as good, if not better, than the Radeon brand.
If nVidia didn't deliver with the NV35, then they're in big trouble for their brand. But for now, they're fine.


Uttar

Joe DeFuria
03-04-03, 10:18 AM
Originally posted by Uttar
Joe: Agreed. But the GeForce is still an excellent brand. It's as good, if not better, than the Radeon brand.

Probably true at this time. However, the 3dfx / Voodoo Brand used to be the top-notch brand too. ;) Brands certainly help, but they don't work miracles....especially in this rather unforgiving market...

Things change, and it's quite possible that the GeForce brand isn't enough to prop up this round of nVidia's products...on the contrary, this round of products might ruin the GeForce brand.

As far as I can tell, a good product brand might keep you in good graces for 1 product cycle of inferior products....MAYBE 2.

If nVidia didn't deliver with the NV35, then they're in big trouble for their brand. But for now, they're fine.

First of all, nVidia HASN'T delivered with the NV35. Where is it? Until the NV35 is out, and we get to judge it against whatever ATI has at the time, who knows.

Secondly, the NV35 could be 1 single solitary 3DMark point slower than the competition....but that's all it takes for the "public" to blow it way out of proportion and decry it as a failure. ;)

Right now, I agree that the GeForce brand is pretty strong....but as far as the trends go, it's the Radeon brand that's going in the right direction.

Paul
03-04-03, 11:05 AM
Originally posted by Joe DeFuria
First of all, nVidia HASN'T delivered with the NV35. Where is it? Until the NV35 is out, and we get to judge it against whatever ATI has at the time, who knows.

He didn't word it correctly. What he actually means is that "If nVidia doesn't deliver with the NV35, then they're in trouble" - Which is true.

It's going to come out after the R350 by a number of months, so they really need to be quicker, rather than just about on a par, as it is with NV30-R300. However, it's looking like ATi are being very aggressive with the R350, certainly with regard to their memory selection.


Right now, I agree that the GeForce brand is pretty strong....but as far as the trends go, it's the Radeon brand that's going in the right direction.

Currently, the GeForce brand is incredibly strong in the mainstream market, where the users are going to be less tech-savvy than people on forums like these. It's only in places like this that nVidia are getting slated - the general public (the core revenue) don't know any better just yet, and won't for quite a while, as was the case when 3dfx started going down the pan - they had a number of successive problems before news filtered out to the average Joe on the street that they should steer clear.

nVidia need to concentrate now more than ever, and deliver. They can't wait until NV40 or NV45 - They have to act now.

Solomon
03-04-03, 11:25 AM
A reader suggested the following: they are probably going to .15 micron with their main line because they could be anticipating a "substantial" loss of revenue with their .13 micron line. Sounds to me like they aren't expecting to make a lot of money with their .13 micron products and are resorting back to .15 micron to save face. Or I could be just talking out of my arse. Heh.

Regards,
D. Solomon Jr.
*********.com

Uttar
03-04-03, 11:32 AM
Originally posted by Paul
He didn't word it correctly. What he actually means is that "If nVidia doesn't deliver with the NV35, then they're in trouble" - Which is true.

Thanks :) I didn't even realize that wasn't exactly the same thing. English isn't exactly my native language, even though I can type English fairly fast...

It's going to come out after the R350 by a number of months, so they really need to be quicker, rather than just about on a par, as it is with NV30-R300. However, it's looking like ATi are being very aggressive with the R350, certainly with regard to their memory selection.

Sorry to correct you on your wording, since you're obviously better at that than me, but anyway...
"A number of" should not be used when launch difference is 3 months or less - a few is better ;) Of course, that is if nVidia is on schedule...

nVidia need to concentrate now more than ever, and deliver. They can't wait until NV40 or NV45 - They have to act now.

Agreed. The NV35 *will* beat the R350. That's not the question.
The question is whether ATI's 0.13 R350 can beat the NV35...


Uttar

Skynet
03-04-03, 11:45 AM
It is pretty obvious to me that Nvidia followed the .13 micron path for the wrong reasons. They knew that the NV30 was not going to beat up the Rad9700 like they had promised and a great many people assumed it would. In a desperate attempt to increase the performance they upped the clock speed to dangerous levels and we all know the results of that strategy.

If Nvidia had not felt the desperate need to outperform the R300 at all costs, they would have followed the process-shrink cycle that has proven successful in the past. Both AMD and Intel have consistently introduced the lower-speed or less complex processor on the new process first. This makes so much sense because it allows for refinement of the process on a slower, less complex, and most importantly much lower-volume part.

It cannot be overstated how much it hurt Nvidia not to follow this logical course of action. I foresee very, very dark days for Nvidia this year. It is not looking good AT ALL. As for NV35, what makes anyone think Nvidia can pull off a miracle and release this thing in a few months? You can barely even get your hands on a GFX as it is.

muzz
03-04-03, 12:01 PM
I have not seen the NV35 or the R350, so I cannot say for CERTAIN that the NV35 WILL beat the R350, until they ARE released........
To say anything else is pure DELUSION IMO; if you have benchmarks from both cards, then please do share. :D

Solomon
03-04-03, 12:14 PM
I have to agree with Muzz; there is no way anyone can say the NV35 will beat the R350, especially given the problems obviously foreseen in Nvidia's announcement of the .15 micron process being used on mainstream GeForce FXs. There is no safe assumption that the NV35 will be what everyone "thinks" it's going to be. As for the memory, I hope that a 256-bit bus is used on the NV35. But the core speed is not set in stone, as seen by the .13 hardships of the Ultra's 500MHz clock being so fricken hot. Heh. I love reading these "rumor specs" as they always list 600MHz and 700MHz or whatnot for core speeds! Hehe.

Regards,
D. Solomon Jr.
*********.com

tamattack
03-04-03, 02:02 PM
Originally posted by Joe DeFuria
So again, while it is indeed very important to have a good grasp on the low-end market, it's just not the low-end product itself that does that. It's customer demand, and that is influenced by the demand / success of the higher-end products.

Exactly. Remember that ATI used to be the volume and market-share leader based almost solely on the low end, but when they missed their ship-date targets for the Rage 128, they steadily began to lose market share over the following year.

Paul
03-04-03, 02:18 PM
Originally posted by Uttar
"A number of" should not be used when launch difference is 3 months or less - a few is better ;) Of course, that is if nVidia is on schedule...

The NV35 isn't coming out during (or before) June. You're looking at late July/early August as a more feasible deadline.


Agreed. The NV35 *will* beat the R350. That's not the question.
The question is whether ATI's 0.13 R350 can beat the NV35...

Not entirely sure what your point here is. The main ATi chip is being manufactured at 0.15, with the lower end products being 0.13. The RV products won't beat the NV35, although the R350 might be able to match it in most benchmarks.

Originally posted by Skynet
It is pretty obvious to me that Nvidia followed the .13 micron path for the wrong reasons. They knew that the NV30 was not going to beat up the Rad9700 like they had promised and a great many people assumed it would. In a desperate attempt to increase the performance they upped the clock speed to dangerous levels and we all know the results of that strategy.

Using a new process and ramping up the clock speed are two separate issues. The 0.13 micron switch can't be done at the drop of a hat, and will have been planned for quite a while, no doubt before the 9700 benchmarks were around.

The clock speeds were probably pushed to the max on the Ultra as a result of the 9700, yes. That is shown by the need for something like the Dustbuster. If the 9700 hadn't been the challenge it is, nVidia would have clocked lower and shipped with a standard fan.

As for NV35, what makes anyone think Nvidia can pull off a miracle and release this thing in a few months? You can barely even get your hands on a GFX as it is.

I've pointed this out in other threads - the NV35 has been in development for longer than most people realise. They didn't start the moment the NV30 shipped to manufacturers - it's been in development for a number of months already. They've got a lot of fixes (hardware fog, for example) and additions in there (not least the 256-bit bus, Solomon), which should help them fix the problems that are proving so troublesome for the NV30 right now.

They really could have done with low-k, but copper interconnects should help a little with regard to the heat issues. I don't expect it will be clocked much higher, and I don't think it needs to be. It's problems with the architecture that are holding the chip back, not clock speed.

Solomon
03-04-03, 02:35 PM
Well, I know one thing: I ain't believing the hype. Look at the NV30, the GeForce FX. So many sites hyped the hell out of it. Now look how it's actually performed. I'll wait until we see "real numbers" and not PR smoke. You can say all this was fixed, added, etc... The fact remains it's vaporware until it's released. Just my two cents.

Regards,
D. Solomon Jr.
*********.com

Paul
03-04-03, 02:59 PM
Sure, but we're also speculating about performance of the R350. We may know the specifications, but there are no performance results. That is as much vaporware as the NV35.

Cotita
03-04-03, 03:00 PM
Originally posted by Skynet

It cannot be overstated how much it hurt Nvidia not to follow this logical course of action. I foresee very, very dark days for Nvidia this year. It is not looking good AT ALL. As for NV35, what makes anyone think Nvidia can pull off a miracle and release this thing in a few months? You can barely even get your hands on a GFX as it is.

ATI managed to pull off a miracle with the R300, so it's possible for nvidia to pull off their own.

When the Radeon 8500 was released, it was a huge disappointment: not only was it slower, it also had driver issues, and when nvidia released the NV25 things just got worse for ATI. Many people thought that the R300 would be sort of an NV25 + DX9, almost reaching Ti 4600 performance with the added bonus of DX9. Now we all know how that story turned out. So maybe nvidia can pull it off too; only time will tell.

Cotita
03-04-03, 03:15 PM
The fact that nvidia will be making high-end NV3x at .13 and low end at .15 makes sense somewhat, if nvidia's statements regarding TSMC are correct.

TSMC has a huge .15 capacity that has been running for years, delivering high yields, especially for low-end parts.

Whether TSMC's .13 process is mature enough to provide .13 parts in high volume for ATI remains to be seen, especially when nvidia claims to have most of the .13 manufacturing capacity for their own chips.

Again, only time will tell who made the right choice. Still, ATI has the advantage here, because they already have the 9500 line, so there is really no need to rush.

digitalwanderer
03-04-03, 03:25 PM
Originally posted by Cotita
ATI managed to pull off a miracle with the R300, so it's possible for nvidia to pull off their own.

Yes, but their track record as of late doesn't exactly encourage confidence... I'd say the odds are pretty high AGAINST a miracle out of the NV35 squad. :)