
5870/5970 vs. GF100 (Fermi)



scubes
01-20-10, 03:47 AM
I'm an Nvidia fanboy. I've always had their cards, and that's not gonna change. :)

LydianKnight
01-20-10, 07:40 AM
Couldn't agree more really.

I bought a 5970 for my secondary rig (and 2x 5870 before that) and I have not been particularly happy with any of them. I haven't really used ATI as a mainstay card since the 9700, and I must say I really hope Fermi is a winner against (or at least on par with) ATI's current offerings, as I'm not happy with the prospect of going fully ATI for lack of a comparable product from Nvidia.

I'm not listening to any of these so-called "insiders" either. I'm only interested in seeing "proper" testing from trusted sites. Everyone else can go fish with their conjecture.

After looking at/reading/comparing every possible review site regarding any piece of software/hardware, the real conclusion is that the only review worth the time, the effort and the value... is your own; the rest can go down the drain...

Because, in the end, the hardware you're going to run those future/present games on is yours :P

Slytat
01-20-10, 08:15 AM
After looking at/reading/comparing every possible review site regarding any piece of software/hardware, the real conclusion is that the only review worth the time, the effort and the value... is your own; the rest can go down the drain...

Because, in the end, the hardware you're going to run those future/present games on is yours :P

This is true, and having tested ATI, I'm really hoping for Nvidia to pull out a winner.

Obviously, we have to rely on reviews, etc., to some extent to form opinions, but I hear what you are saying.

kaptkarl
01-20-10, 08:45 AM
You NVidia fanboys really trip me out... funny stuff. I'm not a fanboy of ANY brand... I just buy the best at the time. I have a GTX 260 216-core video card and have been very happy with it for 8 months now. But let's face it... ATI's 5800 series kicks ass... period... end of story. I would buy one in a heartbeat if my budget allowed. They are the BEST right now... period... end of story. Hopefully NVidia's new GF100 series will be competitively priced and perform better than ATI's best... we'll see. Right now I'm not so sure that's going to happen. NVidia's ego, along with its fanboys', has been hurt. You guys hate playing second fiddle to ATI. HA! Isn't competition great!! :D It keeps the market in check.

Let's all be patient and see some real-world benches before we make such silly claims about performance and price. Try to set your NVidia fanboy egos aside and let the facts speak for themselves. ;)

My 2 cents......

Kapt

Sowk
01-20-10, 09:06 AM
You NVidia fanboys really trip me out... funny stuff. I'm not a fanboy of ANY brand... I just buy the best at the time. I have a GTX 260 216-core video card and have been very happy with it for 8 months now. But let's face it... ATI's 5800 series kicks ass... period... end of story. I would buy one in a heartbeat if my budget allowed. They are the BEST right now... period... end of story. Hopefully NVidia's new GF100 series will be competitively priced and perform better than ATI's best... we'll see. Right now I'm not so sure that's going to happen. NVidia's ego, along with its fanboys', has been hurt. You guys hate playing second fiddle to ATI. HA! Isn't competition great!! :D It keeps the market in check.

Let's all be patient and see some real-world benches before we make such silly claims about performance and price. Try to set your NVidia fanboy egos aside and let the facts speak for themselves. ;)

My 2 cents......

Kapt

The fact that you can buy 2x GTX 285s in SLI for less money and get roughly the same performance as the ATI 5970 doesn't show me a major win for ATI. Plus I get to use PhysX without a hack that I'm sure Nvidia will break in a future driver... :(

I wish they would make PhysX available to anyone who just owns one of their cards and wants to use PhysX.

You can get GTX 285s for about $300 each new if you know where to look.

Ninja Prime
01-20-10, 09:30 AM
The fact that you can buy 2x GTX 285s in SLI for less money and get roughly the same performance as the ATI 5970 doesn't show me a major win for ATI. Plus I get to use PhysX without a hack that I'm sure Nvidia will break in a future driver... :(

I wish they would make PhysX available to anyone who just owns one of their cards and wants to use PhysX.

You can get GTX 285s for about $300 each new if you know where to look.

You must be smoking something powerful if you think 2x285 equals a 5970...

Not to mention, they are about the same price.

Sowk
01-20-10, 09:34 AM
You must be smoking something powerful if you think 2x285 equals a 5970...

Really?

http://www.guru3d.com/article/radeon-hd-5970-review-test/12

Go page by page...

SLI GTX285 wins some

ATI 5970 wins some

About equal in my book.

Ninja Prime
01-20-10, 09:39 AM
Really?

http://www.guru3d.com/article/radeon-hd-5970-review-test/12

Go page by page...

SLI GTX285 wins some

ATI 5970 wins some

About equal in my book.

GJ choosing the one spot and a game that doesn't matter, because it can be run at full settings on a 3-year-old card anyway. Your book must be drawn by 3-year-olds with crayons, I guess...

In most of the others it's winning by ~25%, and that's on the beta drivers they had back then, mind you.

Edit: In fact, upon further review, looking at Newegg, 2x GTX 285 is actually more expensive too.

Slytat
01-20-10, 09:50 AM
GJ choosing the one spot and a game that doesn't matter, because it can be run at full settings on a 3-year-old card anyway. Your book must be drawn by 3-year-olds with crayons, I guess...

In most of the others it's winning by ~25%, and that's on the beta drivers they had back then, mind you.

I've got both and I've tested them side by side in many games. My findings basically mimic G3D's, apart from HAWX, where the 5970 was still faster, but by a smaller margin.

If you look at the G3D review, the 5970 is at most ~10-15% faster in the cases where it is faster.

http://www.guru3d.com/article/radeon-hd-5970-review-test/

The fact is that 285 SLI keeps up (and in some cases beats the 5970) and is essentially 18-month-old technology; just get over it.

2 x 5870 are a more definitive win for ATI but also more expensive.

Blacklash
01-20-10, 10:17 AM
The nVidia GPU is going to be multi-GPU by default? If so, I'll never own one.

I would be interested in how much more powerful a single GF100 will be than a single HD 5870. I'm done with multi-GPU. I largely got into that due to hype and following the crowd.

Slytat
01-20-10, 10:20 AM
The nVidia GPU is going to be multi-GPU by default? If so, I'll never own one.

I would be interested in how much more powerful a single GF100 will be than a single HD 5870. I'm done with multi-GPU. I largely got into that due to hype and following the crowd.

Huh, where did you hear that?

kam03
01-20-10, 10:31 AM
The fact that you can buy 2x GTX 285s in SLI for less money and get roughly the same performance as the ATI 5970 doesn't show me a major win for ATI. Plus I get to use PhysX without a hack that I'm sure Nvidia will break in a future driver... :(

I wish they would make PhysX available to anyone who just owns one of their cards and wants to use PhysX.

You can get GTX 285s for about $300 each new if you know where to look.

LOL, who in their right mind would buy 2x GTX 285 and be limited to DX10 when they can buy a future-proof DX11 graphics card with performance >= 2x GTX 285 and cheaper than 2x GTX 285?

2x Nvidia GTX 285 costs $560 ($280 each)
1x ATi 5970 costs $530
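
A quick back-of-the-envelope sketch of that price math in Python. The prices are the ones quoted above; the relative performance number is only an assumed stand-in for the ~10-25% figures argued over elsewhere in this thread, not a measurement:

    # Price comparison using the figures quoted above; the perf ratio is an assumed knob, not a measurement.
    gtx285_price = 280      # per card, as quoted
    hd5970_price = 530      # as quoted

    sli_cost = 2 * gtx285_price
    print(f"2x GTX 285: ${sli_cost} | 1x HD 5970: ${hd5970_price} | delta: ${sli_cost - hd5970_price}")

    # Assumed performance of the HD 5970 relative to GTX 285 SLI (1.0 = equal); adjust to taste.
    hd5970_rel_perf = 1.10
    print(f"Dollars per unit of performance: SLI {sli_cost:.0f}, HD 5970 {hd5970_price / hd5970_rel_perf:.0f}")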

Slytat
01-20-10, 10:34 AM
LOL, who in their right mind would buy 2x GTX 285 and be limited to DX10 when they can buy a future-proof DX11 graphics card with performance >= 2x GTX 285 and cheaper than 2x GTX 285?

2x Nvidia GTX 285 costs $560 ($280 each)
1x ATi 5970 costs $530

HOLD ON.

I never said you should buy them; I simply said that they are roughly the same speed.

At this time you are obviously better off buying a 5970/5870 CF or holding out for Fermi.

I wouldn't purchase GTX 28x GPUs at this point unless I already had one and was adding SLI or whatever.

Of course, "limited" by DX10 is a rather moot point, as there is little in the way of anything significant in terms of DX11 right now or in the immediate future.

Yeah, yeah, I know: AvP, BF, etc., but will we actually be able to see the difference, a la DIRT 2?

LydianKnight
01-20-10, 11:15 AM
The nVidia GPU is going to be multi-GPU by default? If so, I'll never own one.

I would be interested in how much more powerful a single GF100 will be than a single HD 5870. I'm done with multi-GPU. I largely got into that due to hype and following the crowd.

Hold on... who said NVIDIA's GPU is multi-GPU by default? The news says the contrary: GTX 380, GTX 360 and mainstream variations as single-GPU cards (GPU as in chip, not the whole card), with the GTX 395 (or whatever the name ends up being) as the dual-GPU model...

Where did you read/hear it's gonna be multi-GPU by default?

Sowk
01-20-10, 11:30 AM
The nVidia GPU is going to be multi-GPU by default? If so, I'll never own one.

I would be interested in how much more powerful a single GF100 will be than a single HD 5870. I'm done with multi-GPU. I largely got into that due to hype and following the crowd.

They will have single-GPU units.

hell_of_doom227
01-20-10, 01:00 PM
I would not compare Nvidia's 200 series with ATI's 5000 series because it's stupid to do so. The 200 series are DX10 cards and the 5000 series are DX11 cards. Speaking of DX10 and DX11, the current state of the market says that we don't need it. Pretty much all games are still on DX9. Maybe the reason behind it is the consoles, and considering that Sony and Microsoft won't replace their consoles before 2014, that tells me enough. We aren't going to see DX11 fully utilized before 2013, and I don't think Microsoft will push any new DirectX with Windows 8.
I am glad Doom 4 is going to be using OpenGL; it's time we revisit OpenGL again :). Even if you already bought a Crossfire ATI 5000 series setup and Fermi turns out to be faster than ATI's counterpart, it's not worth replacing it. SSD drives, USB 3.0, SATA 6Gb/s, a 6-core CPU, or a bigger monitor sounds like a better deal.

Slytat
01-20-10, 01:16 PM
I would not compare Nvidia's 200 series with ATI's 5000 series because it's stupid to do so. The 200 series are DX10 cards and the 5000 series are DX11 cards. Speaking of DX10 and DX11, the current state of the market says that we don't need it. Pretty much all games are still on DX9. Maybe the reason behind it is the consoles, and considering that Sony and Microsoft won't replace their consoles before 2014, that tells me enough. We aren't going to see DX11 fully utilized before 2013, and I don't think Microsoft will push any new DirectX with Windows 8.
I am glad Doom 4 is going to be using OpenGL; it's time we revisit OpenGL again :). Even if you already bought a Crossfire ATI 5000 series setup and Fermi turns out to be faster than ATI's counterpart, it's not worth replacing it. SSD drives, USB 3.0, SATA 6Gb/s, a 6-core CPU, or a bigger monitor sounds like a better deal.

How is it "stupid to do so" (compare the two) if we "don't need it" (DX11)? You've totally contradicted yourself.

The fact is that 285 SLI offers comparable performance (vs 5970). If DX11 doesn't count (which it doesn't really at the moment as you quite rightly pointed out), then it's a completely legitimate comparison.

I would still buy a 5970 over 285 SLI at present, but that doesn't change the performance.

Yet another person trying to justify their expenditure while making absolutely no sense whatsoever.

Enthusiasts want the fastest cards; why would we stop at ATI if Nvidia turns out to be faster, or vice versa? I'll take the cards AND the SSD, USB, SATA, Gulftown, etc., not either/or.

shadow001
01-20-10, 02:45 PM
How is it "stupid to do so" (compare the two) if we "don't need it" (DX11)? You've totally contradicted yourself.

The fact is that 285 SLI offers comparable performance (vs 5970). If DX11 doesn't count (which it doesn't really at the moment as you quite rightly pointed out), then it's a completely legitimate comparison.

I would still buy a 5970 over 285 SLI at present, but that doesn't change the performance.

Yet another person trying to justify their expenditure while making absolutely no sense whatsoever.

Enthusiasts want the fastest cards; why would we stop at ATI if Nvidia turns out to be faster, or vice versa? I'll take the cards AND the SSD, USB, SATA, Gulftown, etc., not either/or.


I need a much faster CPU to unlock even more speed out of the HD 5970s I have to begin with, yet such a CPU doesn't exist yet, and the same goes for Fermi. So even when the eventual reviews happen where both ATI's latest and Nvidia's latest are tested, that's something to keep in mind, and the fact that most software is a joke to run on such powerful hardware is another consideration.

Given how Nvidia implemented triple-display support, regardless of whether you use 3D glasses or not, it'll require two cards, since each card only has two outputs. Triple-display comparisons, which would increase the workload on the GPUs by quite a bit even with current games and reduce CPU limitation concerns, can be run with a single HD 5870/HD 5970 card but can't be run with a single Fermi card.

It's also possible to run three displays with a single HD 5970 card, of course, and that also can't be done with a single-GPU Fermi card. So will reviewers be forced to use a single display when running benchmarks, and ignore something that single ATI cards can do while single Fermi cards can't do at all? The reviews should be quite interesting, to say the least.

And before I hear the "triple displays are a small minority of users" argument: we are talking enthusiast-level cards here, and three LCD displays can actually cost less than one of these high-end cards does, especially if we're talking about triple 22" LCDs, or maybe even triple 24" LCDs if you shop carefully.

Slytat
01-20-10, 02:56 PM
I need a much faster CPU to unlock even more speed out of the HD 5970s I have to begin with, yet such a CPU doesn't exist yet, and the same goes for Fermi. So even when the eventual reviews happen where both ATI's latest and Nvidia's latest are tested, that's something to keep in mind, and the fact that most software is a joke to run on such powerful hardware is another consideration.

Given how Nvidia implemented triple-display support, regardless of whether you use 3D glasses or not, it'll require two cards, since each card only has two outputs. Triple-display comparisons, which would increase the workload on the GPUs by quite a bit even with current games and reduce CPU limitation concerns, can be run with a single HD 5870/HD 5970 card but can't be run with a single Fermi card.

It's also possible to run three displays with a single HD 5970 card, of course, and that also can't be done with a single-GPU Fermi card. So will reviewers be forced to use a single display when running benchmarks, and ignore something that single ATI cards can do while single Fermi cards can't do at all? The reviews should be quite interesting, to say the least.

And before I hear the "triple displays are a small minority of users" argument: we are talking enthusiast-level cards here, and three LCD displays can actually cost less than one of these high-end cards does, especially if we're talking about triple 22" LCDs, or maybe even triple 24" LCDs if you shop carefully.

I respect your opinions, guy, but you're not making any points with me as far as benching on multiple monitors goes.

I don't care about benching or playing on Eyefinity/multiple monitors and never have. I freely admit that people do like and use that feature but I am not one of them.

You have a habit of responding to my posts in a defensive manner (as in defending your purchases) and you shouldn't feel the need to.

If you are happy with your choices, then don't worry what anyone else thinks. I've had several 5xxx series ATi cards and they aren't for me. For a start, they are problematic in one of the games I play the most, and then there is nHancer and PhysX and plenty of other things that make me lean towards Nvidia in the final analysis.

I'll be going Fermi SLI/Tri SLI as soon as they are available (providing they don't turn out to be total garbage, which doesn't look likely).

One thing I will say: I spent a fair bit on the ATi 5 series, so I am no fanboy; I've tried them and am now eager to see what Nvidia has to offer.

PS: You can buy Gulftown ES on eBay as we speak if you really want a faster CPU :) - Gigabyte offers a beta BIOS for compatibility, as I'm sure most other mobo manufacturers do as well.

shadow001
01-20-10, 03:16 PM
I respect your opinions, guy, but you're not making any points with me as far as benching on multiple monitors goes.

I don't care about benching or playing on Eyefinity/multiple monitors and never have. I freely admit that people do like and use that feature but I am not one of them.

You have a habit of responding to my posts in a defensive manner (as in defending your purchases) and you shouldn't feel the need to.




Nothing to do with defending purchases at all, as I know I bought a pair of monsters. But the sad reality is that benchmarking cards as powerful as these, whether HD 5970s or Fermi cards, on a single display, even if it's a 30" LCD at 2560x1600 with 8X AA, will result in at least some cases that are CPU-limited. In those cases we're not really testing the video cards; it becomes more the video cards testing the rest of the system... Bank on it.

Not to mention that a single 30" LCD is over $1000 easy, and three displays can be not much over half that amount, depending on what you choose. The extra workload reduces the chances of being CPU-limited, so it's not a matter of liking or disliking triple displays; it's a matter of wanting to show what these GPUs can really do with current games.

ATI can do it with a single card... Nvidia needs at least two Fermi cards in SLI, making it a more expensive option, and since you'll be running dual or triple SLI Fermi cards, you're definitely going to need those three displays to uncork the speed potential of those cards.
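
For what it's worth, the usual way to check whether a benchmark run is CPU-limited is to crank the GPU-side load (resolution, AA) and see whether the frame rate actually moves. A minimal Python sketch of that idea, with placeholder FPS numbers standing in for real benchmark results:

    # Crude CPU-bound check: if FPS barely drops as GPU load rises, the GPU isn't the bottleneck.
    # The FPS values below are placeholders, not measurements from any real card.
    runs = [
        ("1920x1200, 4x AA",   118.0),   # baseline load
        ("1920x1200, 8x SSAA", 116.5),
        ("2560x1600, 8x SSAA", 115.0),
    ]

    baseline_fps = runs[0][1]
    for setting, fps in runs[1:]:
        drop_pct = (baseline_fps - fps) / baseline_fps * 100
        verdict = "likely CPU-bound" if drop_pct < 5 else "GPU-bound"
        print(f"{setting}: {fps:.1f} fps ({drop_pct:.1f}% below baseline) -> {verdict}")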

Slytat
01-20-10, 03:31 PM
Nothing to do with defending purchases at all, as I know I bought a pair of monsters. But the sad reality is that benchmarking cards as powerful as these, whether HD 5970s or Fermi cards, on a single display, even if it's a 30" LCD at 2560x1600 with 8X AA, will result in at least some cases that are CPU-limited. In those cases we're not really testing the video cards; it becomes more the video cards testing the rest of the system... Bank on it.

Not to mention that a single 30" LCD is over $1000 easy, and three displays can be not much over half that amount, depending on what you choose. The extra workload reduces the chances of being CPU-limited, so it's not a matter of liking or disliking triple displays; it's a matter of wanting to show what these GPUs can really do with current games.

ATI can do it with a single card... Nvidia needs at least two Fermi cards in SLI, making it a more expensive option, and since you'll be running dual or triple SLI Fermi cards, you're definitely going to need those three displays to uncork the speed potential of those cards.

I have plenty of monitors, I just don't game on multiple monitors. I may well set something up for BlackShark but that's a future project.

As far as NEEDING multiple monitors to exploit the cards fully, well, that's your opinion, and again, it sounds more like a justification, but you are entitled to it.

We've agreed before that we have entirely too much GPU power and nothing to exploit it with, so talking about ATI's ability to use more monitors may be a selling point for you, but it isn't for me.

I'm not happy with ATi's offerings, you are.

The bottom line is that enthusiasts are the kings of overkill!

/end of :)

Sowk
01-20-10, 04:07 PM
Well, I'm buying a dual-GPU Fermi for graphics and a Fermi for PhysX. :)

shadow001
01-20-10, 04:26 PM
I have plenty of monitors, I just don't game on multiple monitors. I may well set something up for BlackShark but that's a future project.

As far as NEEDING multiple monitors to exploit the cards fully, well, that's your opinion, and again, it sounds more like a justification, but you are entitled to it.

We've agreed before that we have entirely too much GPU power and nothing to exploit it with, so talking about ATI's ability to use more monitors may be a selling point for you, but it isn't for me.

I'm not happy with ATi's offerings, you are.

The bottom line is that enthusiasts are the kings of overkill!

/end of :)


True, setups like these are overkill, since they're well ahead of the curve in terms of software no matter which side you pick. But as for the triple-monitor issue, it's very much real. I've been benchmarking my setup for well over a month now, in many different games and benchmarks, and even at the highest and craziest settings, at the max resolution my single monitor supports (1920x1200), I simply can't make the cards slow down to the point where I'm fairly sure I'm actually hitting their hardware limits.

I've run benchmarks in Crysis at absolute max settings with 4X antialiasing and got the same result when benching with 8X antialiasing, using supersampled AA, not the lesser-quality multisampling method... The cards simply don't care, and they are running at stock clocks; imagine when I overclock them.

So in that scenario, and given that a 30" LCD isn't exactly cheap as a way to raise the resolution even higher, there's the triple-display option, which could extend resolutions to as high as 5760x1200 using three cheap 24" displays. Right off the bat, that's about 65~70% harder than running on a single 30" LCD when both are trying to sustain the same frame rate:

2560x1600 = 4.096 megapixels per frame.
5760x1200 = 6.912 megapixels per frame.

Yup, the cards will definitely be working harder in a pretty substantial way.
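
The same arithmetic as a tiny Python sketch, for anyone who wants to plug in their own panel sizes. The resolutions are the ones quoted above; the only assumption is that GPU workload scales roughly with the number of pixels rendered per frame:

    # Per-frame pixel counts: single 30" panel vs. a 3x1 wall of 24" panels.
    def megapixels(width, height):
        return width * height / 1_000_000

    single_30 = megapixels(2560, 1600)       # 4.096 MP
    triple_24 = megapixels(3 * 1920, 1200)   # 5760x1200 -> 6.912 MP

    extra = (triple_24 / single_30 - 1) * 100
    print(f"Single 30\": {single_30:.3f} MP per frame")
    print(f"Triple 24\": {triple_24:.3f} MP per frame")
    print(f"Extra pixels per frame: {extra:.1f}%")   # ~68.8%, i.e. the ~65-70% figure above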

LydianKnight
01-20-10, 04:34 PM
Speaking of DX10 and DX11, the current state of the market says that we don't need it. Pretty much all games are still on DX9. Maybe the reason behind it is the consoles, and considering that Sony and Microsoft won't replace their consoles before 2014, that tells me enough. We aren't going to see DX11 fully utilized before 2013, and I don't think Microsoft will push any new DirectX with Windows 8.

You don't know what you're talking about, it seems...

1. While DirectX 10 hasn't had too much of an impact (my guess is Microsoft used it as a next-gen graphics architecture relative to DirectX 9, while DirectX 11 improves and extends DirectX 10 in some ways), DirectX 11 clearly looks like a winner.

2. In contrast to previous DirectX versions, DirectX 10 & 11 get SDK updates on a timely basis; there is no more need to wait more than a year for a new revision. As soon as new features are created, they are incorporated, so programmers just have to upgrade their DirectX SDK to be able to play with the features: DirectX 11, DirectCompute, XAudio, you name it...

Saying DirectX 11 won't be fully utilized by 2013 is nonsense; by that time we'll probably have DirectX 12, or further revisions of DirectX 11 like the a/b/c revisions DirectX 9 had.

I am glad Doom 4 is going to be using OpenGL; it's time we revisit OpenGL again :)

There's no reason for Carmack (et al.) to change their API; he's always been with OpenGL. But you're right... it's good to see OpenGL in action :P

Even if you already bought a Crossfire ATI 5000 series setup and Fermi turns out to be faster than ATI's counterpart, it's not worth replacing it. SSD drives, USB 3.0, SATA 6Gb/s, a 6-core CPU, or a bigger monitor sounds like a better deal.

Now we have a clear idea about your priorities, but that has nothing to do with ours :P

Slytat
01-20-10, 04:37 PM
True, setups like these are overkill, since they're well ahead of the curve in terms of software no matter which side you pick. But as for the triple-monitor issue, it's very much real. I've been benchmarking my setup for well over a month now, in many different games and benchmarks, and even at the highest and craziest settings, at the max resolution my single monitor supports (1920x1200), I simply can't make the cards slow down to the point where I'm fairly sure I'm actually hitting their hardware limits.

I've run benchmarks in Crysis at absolute max settings with 4X antialiasing and got the same result when benching with 8X antialiasing, using supersampled AA, not the lesser-quality multisampling method... The cards simply don't care, and they are running at stock clocks; imagine when I overclock them.

So in that scenario, and given that a 30" LCD isn't exactly cheap as a way to raise the resolution even higher, there's the triple-display option, which could extend resolutions to as high as 5760x1200 using three cheap 24" displays. Right off the bat, that's about 65~70% harder than running on a single 30" LCD when both are trying to sustain the same frame rate:

2560x1600 = 4.096 megapixels per frame.
5760x1200 = 6.912 megapixels per frame.

Yup, the cards will definitely be working harder in a pretty substantial way.

Well, we play different games and at different resolutions, so as much as you may be maxing out your cards, I'm not sure I'll be so lucky, but we will see soon enough.

I can assure you that my mainstay game can cripple 3 x 280 @ 2560 x 1920, and equally the 5970 at the same settings, on a 4GHz i7, so that's why I stand by my earlier comments.

I don't really want to get into a game specific discussion here as it is totally off topic.

Enjoy :)