Just let the NV30 RIP please...

DMA
04-14-03, 01:58 PM
So, are you tired of hearing how bad that chip is or what?
Ever since the first previews of the NV30 it has been painful to visit hardware forums. Everywhere you go you hear about it. I bet we all know by now that the FX 5800 Ultra can't beat the 9700 Pro, but why go on and on about it? :)

We need something new to discuss. We need the NV35... now!! :D
Please please please god, don't let nV's new chip blow too, I can't take another six months of this :p

And no, I have nothing against ATi (or NV for that matter); I have a 9700 Pro myself. But it will be more fun with a hard fight for the top spot.

Have a nice day


:bye:

marcocom
04-14-03, 02:04 PM
What's amazing to me is how the NV30 just suddenly becomes crap because of some benchmark scores (which are changing with every new driver revision).

Criticizing my GeForce FX makes about as much sense as telling me that, for example, my Ferrari is a piece of crap because it's just a bit slower than a Lamborghini.

fukn nonsense, and each of you sounds like a total newb when you say it.

ATi is having a good year, but they are not the innovators that nvidia are. nvidia is taking all the risks to improve gaming, introduce new methods and code, and revolutionize 3D gaming, and all you people do is criticize the work because some other manufacturer just speed-tweaked their hardware a little further (9800 Pro), as if that's all that matters... speed.

Uttar
04-14-03, 02:14 PM
The NV30 ain't all that bad. That is, if everything is made with it in mind.
The NV30 is horribly bad and slow with the OGL 2.0 & DX9 standards. It is good, though, if you use Cg & some integer processing. It still ain't great, but it ain't bad.

The problem is that overall, even with that, it's still bad :(

Okay, the NV35... Well, this ain't the rumor mill, but anyway...
Just posted a summary of the rumors over at GPU:RW ( http://www.notforidiots.com/GPURW.php )

The model at $399:
500MHz core on TSMC's 0.13u process (it's just too early for IBM yet - rest assured they're not going to remain idle until the NV40, though!)
128MB of 400MHz DDR-I (might even be 256MB! :) ) on a 256-bit memory bus
True 8-pipeline architecture (8 FP / 8 FX / 8 TMUs instead of 4 FP / 8 FX / 8 TMUs)
Most of the design is optimized in both speed & transistor count (mostly transistor count, I think), but nothing major.
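
For reference, a quick back-of-the-envelope check on what that rumored memory setup would mean for bandwidth. This is only a sketch: the NV30 comparison figure assumes the shipping FX 5800 Ultra's 500MHz DDR-II on a 128-bit bus (not stated above), and it counts peak theoretical bandwidth only.

def bandwidth_gb_s(mem_clock_mhz: float, bus_width_bits: int, transfers_per_clock: int = 2) -> float:
    """Peak memory bandwidth in GB/s; DDR-style memory transfers data twice per clock."""
    bytes_per_transfer = bus_width_bits / 8
    return mem_clock_mhz * 1e6 * transfers_per_clock * bytes_per_transfer / 1e9

print(bandwidth_gb_s(500, 128))  # assumed NV30 / FX 5800 Ultra: 16.0 GB/s
print(bandwidth_gb_s(400, 256))  # rumored NV35 above:           25.6 GB/s

So even with slower DDR-I, the 256-bit bus would be a sizeable bandwidth jump over the 5800 Ultra.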

(The GPU:RW update is planned for about a week from now, BTW - but please don't spread that around too much; I'm not at all sure I can make that deadline.)

Please note that $299 & $499 models are both quite likely to appear.


Uttar

marcocom
04-14-03, 02:31 PM
Firstly, the NV30 didn't fail. It's a beautiful design that is just slow to take off because of how powerful it is. Designing a faster GeForce4 (the ATi method) would have been much easier. They are taking risks, and I like risks and pioneering.

What failed was DDR2. nvidia made a huge mistake (and really the only true mistake, the one that caused all the others) when they excitedly jumped on an exotic new RAM that had a buzz around it. Next thing they knew, they were in a contract for a RAM that was way too unstable without extra cooling (hence the need to build a custom, expensive cooling solution), that needed a special PCB (meaning nvidia was going to have to hold this product's hand through production and build it all themselves because of these exotic needs), and that then finally had sourcing issues.

Next thing they knew, they couldn't make their schedule, they were way over budget, and so they got even stupider and went for a cheaper 128-bit RAM interface to cut costs. People have lost their jobs and careers are ruined over the millions lost when a very expensively developed product was totally screwed up by some final details that spoiled the soup.

The DDR2 supplier has lost their contract now. They totally screwed up and couldn't fulfill their commitments. ATi will not deal with them, and nvidia is dropping the standard for the NV35. Problem solved.

DDR2 sucked, but nvidia is the single most innovative entity in this business. **** nvidia, and you're ****ing yourselves, because ATi will never do anything without nvidia doing it first. (If you don't know their track record, check it out over the past 5 years. The ATi Rage shipped at full price for 6 years without upgrading. If ATi can sell it, they will.) ATi is very shrewd and very smart, but innovators they are not, and they will never be the company or supporter of this industry that nvidia is.

Moose
04-14-03, 02:46 PM
I would beg to differ with ATI not being innovative. I'd say that the R9700 was quite innovative and they made some very well thought out decisions on what to put into it. So much so that Nvidia got caught with their pants down with the NV30 and have been scrambling ever since to try to compete with a 6 month old card.

They didn't pick DDR2 at the start. They only picked it when they realized that ATI's 256 bit bus would have the NV30 for lunch without it (and with it for that matter as it turns out).

Truform is another ATI innovation which, if it wasn't for Nvidia controlling developers the way they do, would be much bigger.

For what it's worth, until Nvidia "innovates" some usable image quality, I'm staying with ATI.

Speed at the price of IQ just doesn't cut it any more.

DMA
04-14-03, 02:53 PM
Hmm, yeah, bad choice of words by me there; of course the NV30 doesn't blow. But people like to compare, and man... they have done that a lot these past months ;)
That was what my post was about. People gotta stop ranting about how bad the NV30 is day in and day out.

And reading "Uttar's" post about the NV35 made me warm inside :D
I can't wait to see some benchies at E3 (hopefully).

256-bit with good old (and cool) DDR... thank god :cool:

Uttar
04-14-03, 02:58 PM
Marcocom: Well, there are the fragment processing speed problems too. It's possible to get good performance out of it, but it's ridiculously hard. The design just has too many restrictions, IMO. I don't mean it's a bad design, though - it simply isn't amazing due to a few quite important problems. Besides that, though, I've got to agree the NV30 is *very* innovative. The 9700 isn't a "big GF4" either, though - the 9700 is quite innovative, too. And considering the timeframe, it might even be considered more innovative. Compared to the GF4, though, the NV30 is still the winner, IMO. Too bad it's too ambitious a design and has so many problems... Let's hope most of them are fixed with the NV35 :)

Originally posted by Moose
They didn't pick DDR2 at the start. They only picked it when they realized that ATI's 256 bit bus would have the NV30 for lunch without it (and with it for that matter as it turns out).

Wrong, wrong, wrong. Please get your facts straight.
The NV30 was *always* supposed to have DDR-II. But it was originally supposed to have GDDR-II faster than the core clock. That has been confirmed in several forums by several sources, IIRC.

I'd guess the original NV30 was more like 400/500, with low-k. But they had to drop low-k and increase the clock speed to remain kinda competitive on the Vertex Shading & Pixel Shading front, as well as to keep the texturing advantage. Which explains the whole heat problem.
This part is speculation, though.


Uttar

marcocom
04-14-03, 02:58 PM
Originally posted by Moose
[quoted post snipped - see Moose's post above]

Good points. I am forgetting TruForm, and ATI was wise to keep the color calibration and image quality standards from their Apple OEM supplier days.

Steppy
04-14-03, 03:13 PM
Originally posted by marcocom
Look guys, I'm hearing some sensible minds on this thread... so I'm gonna try and talk some sense here. I worked for years in Silicon Valley for a competing graphics card company, and honestly you guys aren't quite right.

[rest of quoted post snipped - see marcocom's post above]

ATI went to a 256-bit memory interface before NV did. They tried using more TMUs back in the Radeon days than NV did. They had a working hardware-accelerated tessellation engine out before NV did. The Radeon was a much more advanced GPU than the GF2 feature-wise (it had a LOT of DX8 features already, just not the PS and VS). They've had multichip solutions out before NV (who hasn't had one). ATI focused on 32-bit color before nvidia did (the Rage Fury was the first to have very little performance hit in 32-bit, followed by the Radeon).

ATI and NV have BOTH done their fair share of innovation. NV had the on-die T&L unit and the programmable T&L unit. Most of NV's other innovations were stolen from 3dfx, so I really don't see how you arrive at the conclusion you did. The NV30 also failed because it HAS to be clocked as high as it is to match the R300, which brought on the fan and the complex PCB. BTW, NV said the "ATI method" was impossible (a complex, high-transistor-count GPU clocked reasonably high on 0.15u). I think your post is giving WAY too much credit to NV, and way too little to ATI.

marcocom
04-14-03, 03:37 PM
I'm afraid that perhaps I am a bit biased towards nvidia and had my blinders on in those days, because I don't think I was paying much attention to ATI's work then.

I guess I just remember the way 3dfx and nvidia nurtured the development of speedy new API standards like Glide and OpenGL, while all I see is ATi focused only on what works for them to win, like their focus on Microsoft's DirectX.

TruForm is ATI doing what I want them doing with my $400.

Cg gets me hot too. I like anything that takes us forward, especially feature sets. I feel like ATi is focusing on benchmarks because that's all they are really focused on.

I run all of my nvidia products on Quadro-certified drivers (for daytime work usage on this machine) and in SoftQuadro mode, and frankly, the image quality has always been top-notch for me in both 2D and 3D. IQ only gets cut for speedy drivers and is a non-issue if one steers clear of that whole rat race.

I'm stoked, dude! I have an nvidia card (try 3ds max without one) and it's ALMOST AS FAST AS THE LATEST ATI CARD in its first edition. Wow, what a great start! Why is everyone looking at this like sink or swim? (Ferrari/Lamborghini argument again.)

marcocom
04-14-03, 03:39 PM
I guess I should mention this, but I do not own a Ferrari. Sorry if that example was misleading; it was just an illustration.

Sorry, gentlemen.

Steppy
04-14-03, 03:44 PM
Being the market leader allowed both 3dfx and Nvidia more freedom to innovate, since most software is geared for the market leader and deviating from that is usually fruitless. Case in point: the 3 TMUs of the original Radeon. Since NV was the market leader at the time and only had 2 TMUs, most games would only make use of 2 texture layers, leaving the Radeon's third TMU idle 95% of the time. Had a game made use of a third texture layer, you'd have seen the Radeon still run that game at the same speed as if it had two texture layers, since it could do 3 in a single pass, but the GF2 would have lost a lot of speed since that third texture would have required a second pass. Now ATI is in the driver's seat, so you may in fact see the opposite happen... software geared for ATI hardware, and Nvidia having to stick, for the most part, with ATI's way of doing things.
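
A minimal sketch of the pass arithmetic behind that point. The clock and pipe counts below are my own illustrative assumptions (roughly an original Radeon DDR vs. a GeForce2 GTS), not numbers from this thread, and real games wouldn't scale this cleanly:

import math

def passes(layers: int, tmus_per_pipe: int) -> int:
    # each rendering pass can blend at most tmus_per_pipe texture layers
    return math.ceil(layers / tmus_per_pipe)

def mpixels_per_sec(core_mhz: float, pipes: int, tmus_per_pipe: int, layers: int) -> float:
    # one pixel per pipe per clock, divided by the number of passes needed
    return core_mhz * pipes / passes(layers, tmus_per_pipe)

for layers in (2, 3):
    radeon = mpixels_per_sec(183, 2, 3, layers)   # 2 pipes x 3 TMUs: same speed at 2 or 3 layers
    gf2 = mpixels_per_sec(200, 4, 2, layers)      # 4 pipes x 2 TMUs: a 3rd layer forces a 2nd pass
    print(layers, radeon, gf2)

With two layers the GF2-style part is well ahead in this toy model (800 vs. 366 Mpixels/s); add a third layer and it halves to 400 while the Radeon-style part stays at 366.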

marcocom
04-14-03, 04:03 PM
That's pretty insightful.

Lately, though, it seems as if many developers are signing their allegiance to nvidia. Keeping in mind the completely different schedule that game developers see technology on (we're talking a year ahead of CeBIT, usually), is it possible that nvidia has shown what they have in mind for the future, and that it's more impressive and all-encompassing to these developers than we estimate?

Because this is confusing lately, with EA and others jumping onto nvidia's side in this.

StealthHawk
04-14-03, 04:50 PM
Originally posted by Steppy
They've had multichip solutions out before NV(who hasn't had one).

How is that innovative? The MAXX needed 2 Rage Fury chips just to compete with the GF256, which it was on par with at best. The GeForce DDR was by far superior.

ATI focused on 32-bit color before nvidia did(the rage fury was the first to have very little performance hit with 32-bit, followed by the radeon)

While this is true, it is also true that the TNT was faster in 16-bit than the Rage Fury. Go figure. So in that regard, looking solely at the performance hit is misleading. The Rage Fury was still a lot faster in 32-bit than the TNT, though. It should also be pointed out that the TNT was supposed to be clocked at the same speed as the TNT2, but the clocks had to be throttled down because of heat.

ATI and NV have BOTH done their fair share of innovation.

i'll agree to that :)

marcocom
04-14-03, 05:58 PM
Originally posted by Onde Pik
:lol:



Hierarchical Z
Z-Compression
Fast Z-Clear
Shadow Buffer
3D Textures
etc.


Hmm funny though how your resume doesn't mention working for either Diamond or S3. :hmm:

OK, anyways... I guess you're going to my portfolio site. You Brits (err... Danish) are pretty intense, I guess... heh.

Those listed above are technologies that ATI brought to the table? Because those sound a lot more like internal operations with catchy marketing names. I appreciate the thread response, but isn't this kind of a stretch when compared to nvidia's contributions to OpenGL and to DirectX and to just about every developer that wants to try something NEW?

Is this the way forward? A LAN party I attended a few weeks ago, in a guy's garage in south-central Los Angeles, was sponsored by nvidia with t-shirts and stickers and posters and whatever they could send after just one phone call to their marketing dept. ATI didn't even return the phone call. And that's besides how erect I get when thinking about 128-bit color and new dimensions of CG-rendered goodness.

Great, ATI runs without needing extra cooling. WTF. The FX chip is piping hot because they're building themselves an entirely independent, programmable GPU there! Piping hot with possibilities, and I'm totally amazed they managed to get it to such competitive speeds too! Poor Parhelia...

I mean, are we just saying 'nvidia is 5% slower, so let's just change direction and focus and forget the push to cinematic-effect rendering'? Because I have to tell ya... I don't see a lot of incentive for ATi to ever do a damn thing but use their French-Canadian money and endless resources to keep beating nvidia at just one game... speed. And it's working for them.

Congrats on getting that extra 7fps on that one feature there... but I'm interested in developers getting access to next-level dynamic capabilities and software potential.

Maybe it's because nvidia is a California company... :)
But I'm trying to look at this as a gamer, not as some techie. I'm a creative artist and a consumer, and I've seen a lot from nvidia, but I have not seen jack from ATi.

I prefer ATi following nvidia into the 3D world and not impeding the evolution with another speed race.

Moose
04-14-03, 06:30 PM
Originally posted by Uttar

Wrong, wrong, wrong. Please get your facts straight.
The NV30 was *always* supposed to have DDR-II. But it was originally supposed to have GDDR-II faster than the core clock. That has been confirmed in several forums by several sources, IIRC.

That may well be, but I didn't hear Nvidia say anything about DDR2 until after ATI announced they would use a 256-bit bus, at which point all the PR hype was about how 256-bit was unnecessary. I followed the whole GFFX saga pretty closely up until they missed the holiday season and I saw the FX Flow fan; then I kind of lost interest, for obvious reasons. Got a link for any of those discussions???

Steppy
04-14-03, 07:03 PM
Originally posted by StealthHawk
[quoted post snipped - see StealthHawk's reply above]

The Rage Fury came out between the TNT and the TNT2, and with it ATI was able to get a card out that was nearly on par with a part almost two generations newer. I'd call that innovative (at least the AFR method). This was before ATI really focused on the 3D market (with the Radeon).
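
For anyone who never saw the MAXX: AFR (alternate frame rendering) just hands alternating frames to the two chips so they work on different frames in parallel. A minimal sketch of the idea, with render_frame() as a hypothetical stand-in rather than ATI's actual driver interface:

def render_frame(chip_id: int, frame_number: int) -> str:
    # hypothetical placeholder for "chip_id draws frame_number"
    return f"chip {chip_id} rendered frame {frame_number}"

def afr_schedule(num_frames: int, num_chips: int = 2):
    # frame N goes to chip N mod num_chips; even frames to chip 0, odd frames to chip 1
    return [render_frame(frame % num_chips, frame) for frame in range(num_frames)]

print(afr_schedule(4))
# ['chip 0 rendered frame 0', 'chip 1 rendered frame 1', 'chip 0 rendered frame 2', 'chip 1 rendered frame 3']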

The Baron
04-14-03, 07:16 PM
Uttar, I disagree with you on the NV35 specs. I think the clocks will be at least as high as the NV30's, simply because of how it would look otherwise. If a card is released with SLOWER memory than the 5800 Ultra, you are going to get reviews along the lines of, "What the ****--where's my DDR2?! My memory speeds! Oy vey!"

I think we'll see either 550/550 (not a big increase) or 500/500 again with a 256-bit bus. Some improvements here and there, nothing too major besides the 256-bit bus. IQ improvements, FSAA/AF speed bumps, that kind of thing.

But, I don't think clocks will go down.

shmall
04-14-03, 07:17 PM
Originally posted by marcocom
[quoted post snipped - see marcocom's posts above]


Great post, and very true (IMHO)...

I have been an Nvidia fan since the first day I got a TNT card, and I was longing for the FX to be awesome so it could replace my much-loved GF4 card.

However, I have gone over to the other side (ATi 9700 Pro) for the first time in a long time, but I am hoping the NV35 will bring me back :)

Simon.

Steppy
04-14-03, 07:17 PM
Originally posted by marcocom
those listed above are technologies that ATI brought to the table? because those sound alot more like internal operations with catchy marketing names. i appreciate the thread response but isnt this kind of a stretch when compared to nvidia's contributions to OpenGL and to directX and to just about every developer that wants to try something NEW.

Last time I checked, EVERY feature of a video card is "internal operations"...there ain't no little green men inside my computer painting pictures on my monitor screen.
How about an example of these contributions to OGL and DX here?

is this the way forward? i had a LAN party i attended a few weeks ago in a guys garage in south-central los angeles that was sponsored by nvidia with t-shirts and stickers and posters and whatever they could send after just one phone call to their marketing dept. ATI didnt even return the phonecall. besides how erect i get when thinking about 128bit color and new dimmensions of cg rendered goodness.

So, because NV sent you a few T-shirts, they're more innovative? It sounds like their marketing department made you happy, so you put on the NV sunglasses. BTW, it's not 128-bit color; it's 128 bits used for calculating colors.

great, ATI runs without needing extra cooling. wtf. the FX chip is piping hot because theyre building themselves an entirely independent, programmable GPU there! piping hot with possibilities and im totally amazed they managed to get it to such competitive speeds too! poor parhelia...

i mean are we jsut saying 'nvidia is 5% slower and so lets just change direction and focus and forget the push to cinematic effect rendering? because i have to tell ya...i dont see alot of incentive for ATi to ever do a damn thing but just using their french-canadian money and endless resource to keep beating nvidia at just one game...speed. and its working for them.

Running software is the main thing these cards do... 'innovation' doesn't mean much of anything if the end result is subpar. How things get done isn't very important if the performance and quality aren't as good as the competitor's. ATI has not only the speed, but arguably the better 'cinematic' rendering too. If you worked for S3, I can see why they're no longer a player in the video card market, because your view of the industry over the past 5 years seems nearly the opposite of what actually transpired over that time. ATI focusing on speed and NV on IQ??? Not quite.

congrats on getting that extra 7fps on that one feature there...but im interested in developers getting access to next-level dynamic capabilities and software potential.

maybe its because nvidia is a california company... :)
but im trying to look at this as a gamer, not as some techie . im a creative artist and a consumer and ive seen alot from nvidia, but i have not seen jack from ATi

I prefer ATi following nvidia into the 3D world and not impeding the evolution with another speed race.

I don't see how that's looking at it from the gamer's perspective and not a techie's, because your comments paint the exact opposite picture. As a gamer, you're gonna want more FPS regardless of how the hardware is doing it. Buying more 'innovative' tech that is slower at the task it's designed to do IS what a 'techie' would do.

threedaysdwn
04-14-03, 07:48 PM
Just to clear up...

You may remember that ATI *intentionally* slowed the 16-bit performance of the Rage 128 chip in order to make it appear as though it lost no ground when going to 32-bit. The TNT2 offered about the same 32-bit performance and better 16-bit performance... go figure (while ATI's wonder-graphs showed a 3-5% loss in 16 vs. 32-bit, and they made a huge deal about it).
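
The arithmetic behind that accusation, with made-up fps numbers purely to illustrate the argument (these are not benchmarks of any card):

def drop_percent(fps_16bit: float, fps_32bit: float) -> float:
    # how much slower 32-bit looks relative to the 16-bit baseline
    return (fps_16bit - fps_32bit) / fps_16bit * 100

print(drop_percent(90, 60))  # uncapped 16-bit: 32-bit looks ~33% slower
print(drop_percent(62, 60))  # 16-bit held back: the same 32-bit speed now looks like a ~3% "hit"

Same 32-bit performance either way; only the 16-bit baseline changes the marketing graph.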

NV30 was supposed to be out last year. Nvidia did the work and had the specs ready on time. The problem was TSMC. They couldn't deliver.

And now, because they (TSMC) could only produce the NV30 as non-low-k (and it was never designed for a non-low-k process), it requires a leaf blower to cool it.


I certainly hope the NV35 is produced by IBM. Their 0.13u process is far ahead of TSMC's and will benefit Nvidia (and us) greatly. It might be more expensive for them... not sure about that.

marcocom
04-14-03, 08:20 PM
Ya, Steppy, I guess you're right. nvidia sucks.

And the NV30 scores mid-5000s in 3DMark and it's still slow and lame and stupid. Hasn't even shipped, and it's clocked at 1GHz and is totally software-configurable, so the drivers could possibly uncover upwards of 25% increases over just the next year, but that's all totally thin. Rubbish. Nope, ATi owns 'em.

And their support for this industry (t-shirts and much more: talk to Carmack about Doom 3, talk to Lithtech about Tron 2.0, and you see that nvidia mark because nvidia is holding their hand through each vision, over the long haul) is totally inconsequential because of some specs you can read off the back of the ATi box and a 500-point score increase in 3DMark?

http://www.rage3d.com

threedaysdwn
04-14-03, 09:03 PM
Originally posted by CoWBoY


Yes, we are aware of this... welcome to the forum. :D lol


Maybe you didn't notice that I've been registered much longer than you :P

And actually much longer than that, but for some reason I had to create a new login (I don't post often; it must've been deleted). Anyway, yeah, I've been here since it was nvnews.telefragged.net (or whatever it was).
=D

Steppy
04-14-03, 09:33 PM
Originally posted by marcocom
[quoted post snipped - see marcocom's post above]

First, I did not say Nvidia sucks; I just countered YOUR viewpoint that ATI sucks and is not innovative. With this reply, I guess you really want both barrels, huh? I'm not the one who says one thing, then types a paragraph saying the EXACT opposite.

1. Software-configurable? Yeah, if they optimize for Nvidia they can get a 25% speed boost... but if they did the same thing for ATI, that should also be the case. And those optimizations are on WHQL drivers without sacrificing IQ, right? Optimizations also have what, exactly, to do with innovation? I'm not the one who seems to have no clue what HSR, Z-culling, and a myriad of other "fancy marketing names" are, am I? But yeah, I'M the one reading specs off a box. :rolleyes:

I did see that when using industry standards (which, if EVERYBODY would use them, would mean developers optimize ONCE and ALL cards benefit), the NV30 was running at HALF the speed of the R300, and once it uses the NV30 path it performs slightly faster at the cost of some IQ. Boy, how silly I am in wishing that the time he HAD to use to 'optimize' for the NV30 could have been spent optimizing STANDARD paths (which, if NVidia had made the NV30 to perform well on them, would benefit it just as much) and/or getting OTHER work on the game done. As for Tron 2.0, if the NV30 stuck to standards, they could just code for the standard and wouldn't NEED a developer holding their hand. I don't want EITHER company straying from standards, because all that does is add work for the developer and lengthen production times.

Now 3DMark... it's funny you should mention that, as it is probably one of the things that shows the LEAST disparity between the two, and I never even mentioned 3DMark - but obviously this more 'innovative' tech has an issue that YOU know about, to assume I was talking about it... go look at some damn game benchmarks. Enjoy the fog in NFS... oh wait, there isn't any.

YOU are saying the NV30 is more innovative, but didn't bring up anything specific. Even with these problems, according to you we should buy it because it's 'more innovative', though so far your idea of innovation seems to be in the marketing arena. I brought up specific examples of things that both companies innovated on; you brought up some that NV did, some they didn't, and completely dismissed anything ATI did. I said it before and I'll say it again (since you seem to think it gives you 'special' opinions): if YOU are an example of how S3 watched its competitors, it is most definitely not a shock they are a small-time player in the market.

ChrisW
04-14-03, 09:42 PM
I'm sorry, but just what is so innovative about the NV30? It has the same 4x2 architecture as the GeForce4, the same anisotropic filtering, the same FSAA modes (which can be enabled in software), and the same 128-bit memory bus. OK, it has really long and slow pixel shaders and is built on 0.13u technology, but what else? It had to be way overclocked just to compete with the six-month-old 9700 and had to use brand-new technology (DDR-II) that didn't exist when the card was supposed to be released. It has to use two slots and a giant, loud fan to cool itself. If it overheats while playing a game, it will underclock itself and slow down your game. For DirectX 9 games, it has to use hacked drivers that drop the floating-point precision below the minimum requirements of DirectX 9 just to compete in speed with the 9700, and in doing that, it throws image quality out the window. They even had to tell reviewers to use settings that put the GFFX at much lower image quality than the 9700 just to compete in speed on benchmarks. I can go on and on...
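
The raw numbers behind the "way overclocked just to compete" point. The clock, pipeline, and memory figures below are the commonly quoted specs for the FX 5800 Ultra and Radeon 9700 Pro (my assumption, not taken from this thread), and these are peak theoretical rates only:

def peak_rates(core_mhz: float, pipes: int, tmus_per_pipe: int, mem_mhz: float, bus_bits: int):
    pixel_rate = core_mhz * pipes                      # Mpixels/s
    texel_rate = pixel_rate * tmus_per_pipe            # Mtexels/s
    bandwidth = mem_mhz * 2 * (bus_bits / 8) / 1000    # GB/s, DDR = 2 transfers per clock
    return pixel_rate, texel_rate, bandwidth

print("FX 5800 Ultra:", peak_rates(500, 4, 2, 500, 128))  # (2000, 4000, 16.0)
print("9700 Pro:     ", peak_rates(325, 8, 1, 310, 256))  # (2600, 2600, 19.84)

Even at 500MHz, the 4x2 design trails the 9700 Pro in single-textured pixel rate and memory bandwidth, which is the point about the overclock.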

Basically, the way I see it, the GFFX is nothing more than a highly overclocked GeForce4 with super-long pixel shaders and floating-point precision, and nVidia has had to use every trick in the book to make it compete with the 9700.

Back to the original topic, I seriously hope the NV35 is everything it is being hyped up to be. That would be one great card if it is. The NV30, however, deserves a nice funeral.