
View Full Version : ATI slams NVIDIA!



druga runda
11-21-02, 08:48 AM
A bit of PR talk


The focus of our conversation with ATI was dealing with the misconceptions brought about by NVIDIA during the GeForce FX launch. ATI essentially feels that the RADEON 9700 is a more balanced solution than GeForce FX, which doesn't have the bandwidth to perform many of the operations it's boasting at an acceptable frame rate.

For instance, NVIDIA is proud to claim that GeForce FX boasts pixel and vertex shaders that go beyond DirectX 9.0's specs, yet a 400-500MHz chip with 8 pixel pipelines running very long shaders would spend all of its time in geometry, bringing frame rate to a crawl. ATI feels that with RADEON 9700's multi-pass capability, having native support for thousands of shader instructions is useless, as the RADEON 9700 can loop back to perform those operations. ATI ran a demonstration of a space fighter that was rendered using this technique.

As far as NVIDIA's claims of GeForce FX's 48GB/sec memory bandwidth go, ATI states that the color compression in their HYPERZ III technology achieves the same thing today, and with all of the techniques they use in RADEON 9700, they could claim bandwidth of nearly 100GB/sec, but if they did so no one would believe them; hence they've stood by quoting just shy of 20GB/sec of bandwidth.

One other clarification is in regards to DDR2 memory support. Late last week rumors were floating around that ATI's DDR2 demonstration wasn't actually running as DDR2 memory. ATI reiterated that the RADEON 9700 memory controller does indeed support DDR2 and that was the memory type used in the demonstration board.

from http://firingsquad.gamers.com/features/comdex2002/page3.asp

huh... 100GB/sec :D... now that is nice, and may explain that Nature score where the R300 scores better than the NV30 despite a 175MHz slower clock.

So perhaps we should stick to the real numbers: 16GB/sec for NV30 and 19GB/sec for R300. Anyway... they mention the R350 on a 0.15-micron process as a spring refresh to combat NV30:


Finally, it will be interesting to see what ATI is up to come February. Word on the street is that ATI's follow-up to RADEON 9700 PRO (codenamed R350) has taped out recently and will be marketed under the name RADEON 9900. We've heard conflicting reports on its manufacturing process; we naturally assumed it would be a 0.13-micron part, but we've also been informed by another source that the design is 0.15-micron.

In any case, R350 will certainly boast higher clock speeds and performance, so GeForce FX could be in for quite a battle if it slips any further.





That was on the page before; plus, a few overclocked R300 solutions are coming soon from partners.

All in all the battle continues. Which can only be good for us ;)

Bigus Dickus
11-21-02, 09:54 AM
Originally posted by druga runda
For instance, NVIDIA is proud to claim that GeForce FX boasts pixel and vertex shaders that go beyond DirectX 9.0's specs...

The GeForce FX does not go beyond DX9.0 specs. Microsoft may not expose VS/PS 3.0 initially, but they're in there.

Uttar
11-21-02, 12:58 PM
Thank you very much, ATI PR, for claiming things which, in practice, put them in the mud.

You claim between 8:1 and 4:1 Z compression in ALL your docs. As does nVidia. So, let's suppose it's an average of 5:1.
And, err, 20*5 = 100
Which is where that number comes from.

Which means either that:
1. ATI's Z compression will never do more than 4:1 in practice. And that means their color compression techniques only save about 25%, since 80 + 25% = 100.
2. This is nothing more than stupid, dumb and useless PR. Those numbers are invented and don't make any kind of sense.

If 1 were true, and if nVidia's claim of 4:1 ( 400% ) were true, then...
Since the NV30 also has 4:1 compression...
16*4*4 = 256GB/s

Let's even suppose that, in practice, NV30's color compression only does 2:1 with AA and 1.5:1 without AA.

That would give us...
16*4*1.5 = 96GB/s without AA ( pretty much the same as the R300's 100 )
16*4*2 = 128GB/s with AA ( roughly 28% more than the R300 )


Now, I'd suggest ATI fans agree with proposition 2 ( that this is PR crap ) - otherwise, they're agreeing that R300 color compression barely has any effect, and that if nVidia delivers, ATI has lost the performance crown and won't be able to win it back with higher-clocked memory alone.

And if you think those calculations aren't so bad, then I'd suggest you remember that with a 256-bit bus, waste will always be higher. Maybe ATI worked miracles, but 90% efficiency is really a maximum IMHO.


Uttar

EDIT: After rethinking this, I've realised that there are also frame buffer writes. But considering *4 for Z buffer writes ( while the average might rather be 5 ) and that games often render pretty much front-to-back, I'd say those numbers are still fairly correct.
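For what it's worth, the whole "effective bandwidth" argument above is just raw bandwidth multiplied by assumed compression ratios. A minimal sketch (the function name is mine; every ratio is a PR claim or a guess from the posts, not a measurement):

```python
# "Effective bandwidth" as debated in this thread: raw bandwidth scaled by
# assumed compression ratios. All ratios are marketing claims, not measurements.

def effective_bandwidth(raw_gb_s, *ratios):
    """Multiply raw bandwidth (GB/s) by each assumed compression ratio."""
    bw = raw_gb_s
    for r in ratios:
        bw *= r
    return bw

# ATI's reconstructed claim: ~20 GB/s raw at an assumed overall 5:1 ratio.
print(effective_bandwidth(20, 5))      # 100 GB/s
# NV30 counter-case: 16 GB/s raw, assumed 4:1 Z plus 2:1 color compression (AA).
print(effective_bandwidth(16, 4, 2))   # 128 GB/s
```

The sketch makes the point of the post concrete: the multiplication is the same no matter whose chip you plug in, so the headline number depends entirely on which ratios you are willing to assume.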

-=DVS=-
11-21-02, 01:18 PM
And if you think those calculations aren't so bad, then I'd suggest you remember that with a 256-bit bus, waste will always be higher. Maybe ATI worked miracles, but 90% efficiency is really a maximum IMHO

Well Uttar, a 90% efficient 256-bit bus will have more bandwidth than a 100% efficient 128-bit bus, that's for sure :p it's like having an 80% bandwidth advantage at the same clock :D IMHO




But "yet a 400-500MHz chip with 8 pixel pipelines running very long shaders would spend all of its time in geometry, bringing frame rate to a crawl"

this statement might be very legit. Just think for a moment: even if the NV30 does better in current low-level PS/VS games, it would do badly in a game that uses all the eye candy the NV30 can do, because of the shader calculations and the need for big bandwidth beyond a 128-bit bus. You can't compress everything, you know ;)

Nutty
11-21-02, 01:33 PM
yet a 400-500MHz chip with 8 pixel pipelines running very long shaders would spend all of its time in geometry, bringing frame rate to a crawl

It's a bit vague. Do they mean pixel shaders, or vertex shaders? If they mean pixel shaders, then the NV30 won't spend all its time in geometry. If they mean vertex shaders, then I'm pretty sure the NV30 can run long shaders on a fair few vertices before performance becomes a problem. I could run the longest vertex shader ever and still achieve a few thousand frames a second, if the number of vertices was sufficiently low.

At the end of the day, nvidia's vertex processing is faster than ATI's, so the same situation applies to them.

Nvidia already said stupidly long pixel shaders won't work at acceptable frame rates. But the limits are large to allow you to do, say, offline rendering on the card instead of on the CPU.

Uttar
11-21-02, 01:38 PM
Originally posted by -=DVS=-
Well Uttar, a 90% efficient 256-bit bus will have more bandwidth than a 100% efficient 128-bit bus, that's for sure :p it's like having an 80% bandwidth advantage at the same clock :D IMHO




But "yet a 400-500MHz chip with 8 pixel pipelines running very long shaders would spend all of its time in geometry, bringing frame rate to a crawl"

this statement might be very legit. Just think for a moment: even if the NV30 does better in current low-level PS/VS games, it would do badly in a game that uses all the eye candy the NV30 can do, because of the shader calculations and the need for big bandwidth beyond a 128-bit bus. You can't compress everything, you know ;)

Of course 90% of 256 is higher than 100% of 128. But it still reduces the R300's memory bandwidth lead quite significantly, since 20GB/s - 10% = 18GB/s. That's a lot closer to the NV30's raw 16GB/s.

Also, it was my understanding that memory isn't used for vertex shader instructions/temps or for pixel shader temps. I think small internal caches handle that, but I'm not quite certain... That's something I should try to verify.


Uttar

-=DVS=-
11-21-02, 02:16 PM
We also don't have any proof that the NV30 uses its 128-bit bus at 100% efficiency, but in any case, in today's games, with a 500MHz core clock it should do far better, don't you think?
Just look at the GF4: if it had a 500MHz core and DDR at 500/500 = 1000, or DDR2 at 4x250 = 1000,
it would score better in games than the Radeon 9700, I guess :p without AA/aniso and PS/VS
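The raw-bandwidth figures being traded here all come from bus width times effective data rate. A quick sketch (function name is mine; the clock figures are the values quoted in this thread, not official specs):

```python
# Raw memory bandwidth = bus width in bytes x effective transfer rate.
# Clock figures below are the thread's quoted/assumed values, not official specs.

def raw_bandwidth_gb_s(bus_bits, effective_mhz):
    """Bus width in bits, effective (post-DDR) rate in MHz -> GB/s (10^9 B/s)."""
    return bus_bits / 8 * effective_mhz / 1000

# NV30: 128-bit bus, 500 MHz DDR2 -> 1000 MHz effective.
print(raw_bandwidth_gb_s(128, 1000))   # 16.0 GB/s
# R300: 256-bit bus, ~310 MHz DDR -> ~620 MHz effective.
print(raw_bandwidth_gb_s(256, 620))    # 19.84 GB/s ("just shy of 20")
```

This reproduces the thread's 16GB/s and ~20GB/s numbers, and shows why the 256-bit bus matters: at equal memory clocks the wider bus doubles the raw figure before any compression tricks are counted.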

Spectral
11-21-02, 02:20 PM
Of course ATI is gonna smack-talk Nvidia... and vice versa. That's what competition is all about.

I think we all know the GeForce FX will outperform ATI's 9700 Pro.

I don't think there is any doubt about that. ATI reminds me of 3dfx back in the day... Always claiming what Nvidia is doing is wrong, and what they are doing is right... Look what happened to 3dfx.

Remember when they said 32-bit color was useless? What about FSAA before there was enough fillrate to really use it?

ATI makes crap cards IMO. This 9700 Pro has given me more problems than any piece of hardware I've ever had: stutter problems galore, constant crashes to desktop... It's just a piece of crap. I can hardly wait till the GeForce FX comes out so I can have a real video card again.

saturnotaku
11-21-02, 02:27 PM
Originally posted by Spectral
ATI makes crap cards IMO. This 9700 Pro has given me more problems than any piece of hardware I've ever had: stutter problems galore, constant crashes to desktop... It's just a piece of crap. I can hardly wait till the GeForce FX comes out so I can have a real video card again.

Oooh, take that ATI! :D

Seriously, hardware has never been ATI's problem. It's (still) drivers. I had a 9700 myself for a time. I put my Ti4600 back in my machine when I was getting weird OpenGL rendering errors (dynamic lights shining through walls) as well as non-functional FSAA in 16-bit color. Games like Half-Life and Carmageddon 2 use 16-bit, and FSAA works wonderfully on my GF4. It may not be the fastest card available now, but it's sure worked a lot better for me.

Holy poop, I'm over 1000 posts now. Hadn't noticed that until just now. :D

-=DVS=-
11-21-02, 02:28 PM
Originally posted by Spectral
Of course ATI is gonna smack-talk Nvidia... and vice versa. That's what competition is all about.

I think we all know the GeForce FX will outperform ATI's 9700 Pro.

I don't think there is any doubt about that. ATI reminds me of 3dfx back in the day... Always claiming what Nvidia is doing is wrong, and what they are doing is right... Look what happened to 3dfx.

Remember when they said 32-bit color was useless? What about FSAA before there was enough fillrate to really use it?

ATI makes crap cards IMO. This 9700 Pro has given me more problems than any piece of hardware I've ever had: stutter problems galore, constant crashes to desktop... It's just a piece of crap. I can hardly wait till the GeForce FX comes out so I can have a real video card again.


Well, in your opinion it's crap, but there are many users who have them, enjoy them, and don't have such problems :p
And ATI is far from being 3dfx. ATI promotes new tech, not downgrading it like 3dfx did with the Voodoo; right now it's Nvidia making the downgrade, from a 256-bit bus to ---> 128 :eek: :p

But sure, that's just my opinion. I too am pissed when something doesn't work, but there's a reason for it; it's not always the video card's problem ;)

-=DVS=-
11-21-02, 02:33 PM
Originally posted by saturnotaku
Oooh, take that ATI! :D

Seriously, hardware has never been ATI's problem. It's (still) drivers. I had a 9700 myself for a time. I put my Ti4600 back in my machine when I was getting weird OpenGL rendering errors (dynamic lights shining through walls) as well as non-functional FSAA in 16-bit color. Games like Half-Life and Carmageddon 2 use 16-bit, and FSAA works wonderfully on my GF4. It may not be the fastest card available now, but it's sure worked a lot better for me.

Holy poop, I'm over 1000 posts now. Hadn't noticed that until just now. :D


Hehe, yeah, ATI screwed up on this one; I miss my 16-bit AA. I don't play old games too much, so it's not a problem :)
You must run Half-Life in 32-bit mode with some -32bit command, and AA works just fine and fast :D

Dunno about Carmageddon 2. Sweet game; Carma 3 TDR was buggy as hell and did not run so well on my GeForce4 Ti. I didn't try it on the Radeon 9700, but they stopped making that game :(
A new version with DX8 features would look so much better :rolleyes:

jbirney
11-21-02, 03:46 PM
Uttar,

you're missing the point. Read what ATI said again:

, and with all of the techniques they use in RADEON 9700, they could claim bandwidth of nearly 100GB/sec,

Note the ALL OF THE TECHNIQUES part. They are using more techniques than you're giving them credit for. But the whole point is that they are not claiming effective bandwidth: they have stated raw bandwidth, whereas NV provided NDA documents that quoted 48GB/sec instead of raw bandwidth. I really suggest you get a 9700 and see what it can do, as you really have no idea what it can or cannot do until you try it for yourself.



Saturnotaku,
you could have added -32bpp to the command line to force HL games to use 32-bit color and thus gotten FSAA to work in all HL games. While having no 16-bit support is not a good thing, I really don't think this will be an issue much longer. After all, 16-bit games are so 2001 :)

OT
It's kind of funny how some (not saying you) cried that 3dfx was holding back the industry with no 32-bit color support in the V3 days, and now these same people are critical of ATI for not having 16-bit FSAA support. It's always funny to see fanboys flip the arguments to favor their IHV (again, not saying you did).

Spectral,
sorry you had issues; however, lots of other people, including me, have had nothing but success with their 9700. Win some, lose some, I guess.

ReDeeMeR
11-21-02, 03:48 PM
I agree the NV30 is a total waste; you people just don't see it through the cloud of hype, but it's your loss anyway :D

The NV30 won't have enough power for what it brings now. I respect Nvidia for this new technology, and I'm not a fan of any company except maybe AMD :D, but ATi has the bigger advantage here and now. I'm not sure why they still haven't reached Nvidia's class in their drivers, but still, ATi is far superior and Nvidia is late. I hope it won't end the 3Dfx way for them, as I've had 4 of Nvidia's cards and they were/are very good products.

Uttar
11-21-02, 03:55 PM
Originally posted by jbirney
Uttar,

you're missing the point. Read what ATI said again:



Note the ALL OF THE TECHNIQUES part. They are using more techniques than you're giving them credit for. But the whole point is that they are not claiming effective bandwidth: they have stated raw bandwidth, whereas NV provided NDA documents that quoted 48GB/sec instead of raw bandwidth. I really suggest you get a 9700 and see what it can do, as you really have no idea what it can or cannot do until you try it for yourself.


Let's suppose it gets nearly 100GB/s with all of the techniques. Now, yes, this does include fast color clear, fast Z clear, early Z...
But the NV30 *also* has all of that. So what's the point of even considering them?

Now, ATI claims it can do compression, according to that PR piece. If it can, that would obviously push its bandwidth *higher* than 100GB/s, since all of those other techniques already put it at 100GB/s effective.

My point is that if ATI can calculate 100GB/s, nVidia can calculate 256GB/s using the exact same logic. Might as well compare it to the original Voodoo :P

The better comparison would be against the GF4's bandwidth. Remember that the GF4 did have early Z. The only two bandwidth optimizations it lacks, compared to the R300, are:
1. Color compression when doing AA
2. Fast color clear

I'd be surprised if ATI could get more than 32GB/s effective out of that comparison. Not saying they couldn't, but they'd impress me.


Uttar

vitocorleone
11-21-02, 04:12 PM
ATI has also stated (sorry, don't remember where; saw it on Rage3D) that the 9700 Pro is running at about 66% of its potential right now and that, through driver improvements, that is expected to reach 85%+ by the time the NV30 hits the shelves.

Yes, Nvidia will also have driver improvements over time (probably faster than ATI's), but ATI will then also have a new card out...

And the cycle continues.

saturnotaku
11-21-02, 04:35 PM
Originally posted by jbirney
Saturnotaku,
you could have added -32bpp to the command line to force HL games to use 32-bit color and thus gotten FSAA to work in all HL games. While having no 16-bit support is not a good thing, I really don't think this will be an issue much longer. After all, 16-bit games are so 2001 :)


Yes, but that wasn't my point. There's a difference between supporting a feature in 16-bit color and supporting nothing but 16-bit. I could still play Carmageddon 2 on the 9700 with no problems, but what's the point of having ATI's excellent FSAA if I can't use it in some of my applications? Granted, there are some circumstances where FSAA on my Ti4600 isn't practical, but since Carma 2 only runs at 640x480 max resolution, the maximum degree of FSAA causes no performance problems.

The 9700 is more than capable of FSAA, and darn good FSAA at that. The Voodoo3 simply couldn't do 32-bit color at all. Unless someone tells me otherwise, I can't imagine it's that difficult to have a driver update that includes 16-bit FSAA support. Back in the 3dfx days, no one expected them to release a driver that all of a sudden magically gave true 32-bit color support.

Bigus Dickus
11-21-02, 04:54 PM
Originally posted by Spectral
ATI reminds me of 3dfx back in the day... Always claiming what Nvidia is doing is wrong, and what they are doing is right...

I hope that is a joke. nVidia has been continually badmouthing every design choice ATi made. They've even put their chief scientist David Kirk up front as a PR guy, doing mostly mudslinging against the 9700.

Your statement above is ass backwards. It is nVidia who is always claiming what they do is right, and what ATi does is wrong.

ATi very rarely makes comments about the competition... nothing like nVidia does. And what they did above was simply point out that the "effective bandwidth" claims were PR BS.

Uttar... I can't believe you're actually trying to argue over PR numbers pulled from thin air. For all we know, the PR guy interviewed just made up the 100GB/s number to illustrate the point. He could have said 70, or 200, and his point would have been the same. That you are trying to analyze what this says about ATi's bandwidth reduction schemes is quite amusing.

SavagePaladin
11-21-02, 05:17 PM
The nvnews pissing match about PR pissing matches...come one, come all....
:rolleyes:
hey saturnotaku :rolleyes: (ahem, shut up Craig)
I don't know if I agree. I think if their hardware was great, they'd be ABLE to write good drivers...
it can't be that freakin hard.

Now as to the pissing match... folks, nobody cares what either company claims; they care about what works better for them and is faster.

PR has to mention things, or people might be tempted to buy someone else's card...

Engineers can trash-talk whenever they want, because they know exactly what they're talking about. But I doubt you'll see one talking about effective bandwidth or things they don't even know about in the other company's cards.

Bigus Dickus
11-21-02, 05:23 PM
Originally posted by SavagePaladin
Engineers can trash-talk whenever they want, because they know exactly what they're talking about. But I doubt you'll see one talking about effective bandwidth or things they don't even know about in the other company's cards.

David Kirk is chief scientist at nV, and he does more trash talking than anyone else there it seems.

StealthHawk
11-21-02, 05:28 PM
Originally posted by Spectral
ATI makes crap cards IMO.. This 9700 Pro has given me more problems than any piece of hardware Ive ever had. Stutter problems galore, crashes to desktop constantly... Its just a piece of crap. I can hardly wait till the GeForceFX comes out so I can have a real videocard again.

perhaps this is your problem: Radeon 9700 Pro (core370/mem666)

After all 16 bit games are so 2001

16bit was so 2000 :p

StealthHawk
11-21-02, 05:29 PM
Originally posted by SavagePaladin
Now as to the pissing match...folks, nobody cares what either company claims, they care about what works better for them and is faster

obviously they do, or else we wouldn't even have this thread.

Plus, the fanboys always tout "features" of their brand of cards.

Smokey
11-21-02, 05:30 PM
Originally posted by ReDeeMeR
I agree nv30 is a total waste, you people just dont see it in the cloud of hype, but it's your loss anyway :D

Nv30 wont have enough power for what it brings now, I respect Nvidia for this new technology and I'm not fan any company except maybe AMD :D , but ATi has a bigger advantage here and now, I'm not sure why they still havent reached the Nvidia class in theyr drivers, but still ATi is far superior and Nvidia is late, but I hope it wont end ala 3Dfx way for them as I had 4 of Nvidias cards and they were/are very good products.

You seem to be forgetting some things. The GF3 was well hyped; it was a success, everyone that got one or still has one is/was very happy, and it was also the fastest card out at the time. GF3 Ti500: also hyped, also the fastest card out, lots of happy users. GF4 Ti4600: lots of hype, fastest card out at the time, lots of happy users. Do you see the trend here? GF-FX: lots of hype, fastest card out at the time (it will be), lots of happy users. Remember that it has been Nvidia leading the way; ATI has done this once, and only in between Nvidia's cards. If, and I say if, the GF-FX doesn't match and surpass the 9700 Pro, I will create a thread here and eat my own words, admit that Nvidia is no longer the market leader and that ATI is, and ask my folks to buy me the 9700 Pro instead of the GF-FX ;)

SavagePaladin
11-21-02, 05:40 PM
Originally posted by StealthHawk
obviously they do, or else we wouldn't even have this thread.

plus the fanboys always tout "features" from thier brand of cards.
too true. I wish I had better things to do than hang around here

SavagePaladin
11-21-02, 05:41 PM
Originally posted by Bigus Dickus
David Kirk is chief scientist at nV, and he does more trash talking than anyone else there it seems.
I don't remember all that much of it. I have a bad memory, true, but still.

ReDeeMeR
11-21-02, 06:05 PM
Originally posted by Smokey
You seem to be forgetting some things. The GF3 was well hyped; it was a success, everyone that got one or still has one is/was very happy, and it was also the fastest card out at the time. GF3 Ti500: also hyped, also the fastest card out, lots of happy users. GF4 Ti4600: lots of hype, fastest card out at the time, lots of happy users. Do you see the trend here? GF-FX: lots of hype, fastest card out at the time (it will be), lots of happy users. Remember that it has been Nvidia leading the way; ATI has done this once, and only in between Nvidia's cards. If, and I say if, the GF-FX doesn't match and surpass the 9700 Pro, I will create a thread here and eat my own words, admit that Nvidia is no longer the market leader and that ATI is, and ask my folks to buy me the 9700 Pro instead of the GF-FX ;)

Well, sure it will be faster than the GeForce4 Ti4600 and a bit faster than the Radeon 9700, but just like the Geforce3 was very inefficient in vertex and pixel shader operations, and the Geforce1 had a weak T&L engine, the Geforce3dfx will be inefficient in the tech it brings, so all the demos and hype are useless. Btw, the Geforce3 didn't get any serious competition; the Radeon 8500 was late, which is why so many bought the GF3, and when the Radeon came, Nvidia quickly started hyping the NV25 (GF4), so ATi got pissed, and here we are with Nvidia behind :rolleyes: :D