
View Full Version : gtx590 release?



Beavermatic
02-01-11, 06:42 PM
Just bought a P67 mobo, an i7 2600K CPU, and 12GB of RAM over the weekend, and went to purchase the EVGA Superclocked GTX580. I actually made the card purchase on Newegg, and as I was glancing at other stuff, I glimpsed something about a GTX590.

I noticed it said it would be out sometime in February, and I keep following whatever new articles I can find... basically, they're GTX580s with hand-picked cores, it will be a limited run, and Nvidia is certain of a February launch.

Just wondering if anyone else knows anything about that card? What price should I expect (I'm hoping $700 or less will cover one, due to my budget, lol), and is there any reason to think Nvidia may delay it longer?

I'd just buy the EVGA GTX580 right now, but I don't want to deal with the "Step-Up" hassle if it's only a couple more weeks' wait.

Any thoughts?

john19055
02-01-11, 09:47 PM
I heard it should be released sometime this month. Should be a monster.

grey_1
02-02-11, 10:38 AM
With probably not enough VRAM :o

2GB ought to be the minimum for such a high-end card, I would think. I'd love to see 3GB, but that would drive the cost too high imo.

Can't wait to see final specs and what this thing can do.

grey_1
02-02-11, 10:55 AM
It'll probably be 1.5GB or 1.28GB as with the normal 580/570. That's of course per GPU so they can sell it as a 3GB card...

That'll work for most but those cards aren't really targeted for the 1920x1200 sector.

I just found out that the 3GB 580s will be available here in 2 weeks or so. Maybe we'll know more about the 590 by then but otherwise I'm getting the 3GB 580s.


Doubling the VRAM per GPU would be nice, for example two 570s with 2560MB per GPU. But something like that would probably require a dual-PCB card. I wouldn't mind that at all though :p

Yep, that's what I meant. I hate the marketing crap regarding how much vram the card has. It's very misleading. :)

Rollo
02-02-11, 12:33 PM
It'll probably be 1.5GB or 1.28GB as with the normal 580/570. That's of course per GPU so they can sell it as a 3GB card...

That'll work for most but those cards aren't really targeted for the 1920x1200 sector.

I just found out that the 3GB 580s will be available here in 2 weeks or so. Maybe we'll know more about the 590 by then but otherwise I'm getting the 3GB 580s.


Doubling the VRAM per GPU would be nice, for example two 570s with 2560MB per GPU. But something like that would probably require a dual-PCB card. I wouldn't mind that at all though :p

I have some thoughts on your posts in this thread. I have a 25X16 panel, a 5040X1050 surround set, and a 5760X1080 surround set, so VRAM and limitations have been on my mind lately.

http://www.widescreengamingforum.com/wiki/NVIDIA_GTX460_in_Surround:_1GB_vs._2GB_-_Featured_Review

This is by far the best resource I've found for seeing how much VRAM is used in three-panel gaming. As you can see, at 5760X1200 with 4X AA, the rumored 1.5GB per GPU of the GTX 590 is enough 95% of the time. This covers almost EVERYONE. Anyone with $3000 to buy three 25X16 panels should spend a bit more and get 2 X 3GB GTX580s.

Kyle at H's review is "OK" as well, but I've never liked his "A-ha! One of these cards can run 8X AA at 24fps minimum! Superior!" stance. 8X AA IQ improvements are marginal at best, and his "playable" is where I'd turn off AA in exchange for more fps.

The other thing is the vast, vast majority of people aren't using multiple panels, and for anyone who isn't, 1.5GB is more than enough. I'd like someone who thinks otherwise to show me a review where even a relatively rare 25X16 panel runs out of VRAM at 1.5GB.

I just paid $580 to try out two GTX560 2GB cards, and I already have a lot of video cards, so I am interested in this. However, I think the issue has been WAY overblown since the release of the 6970 cards, usually by people who have no need of more than a GB of VRAM.

I know you have three 25X16 panels, and personally, I wouldn't buy anything but those 3GB 580s to drive them. 6970s are weak at DX11 comparatively.

The 3GB GTX580s are on sale at newegg now, and at $600 each your choice is clear and one dimensional in my opinion.

Beavermatic
02-02-11, 01:49 PM
Rollo,

I have just a 22-inch screen, I think it maxes out at 1680x1050.

So you're saying it would be dumb for me to get a GTX590, and I should just go with the GTX580?

That the GTX590, 6970s, etc. are all for hi-res or multi-screen support?

Roadhog
02-02-11, 02:53 PM
I know you have three 25X16 panels, and personally, I wouldn't buy anything but those 3GB 580s to drive them. 6970s are weak at DX11 comparatively.

I'm sorry, 6970s weak? Put them in Crossfire and see who is faster then, and for much cheaper.

Madpistol
02-02-11, 03:01 PM
I'm sorry, 6970s weak? Put them in Crossfire and see who is faster then, and for much cheaper.

I agree. 6970's by themselves aren't weak, and in Crossfire, they're pretty frikin good.

The GTX 580 is better, and it's more expensive. The GTX 570 is about the same as the 6970, but it can't drive more than 2 monitors by itself (compared to the 4 monitors a 6970 can drive). Also, 2GB framebuffer ensures that the cards can perform at high resolutions, especially in crossfire.

Come on Rollo... try harder.

Roadhog
02-02-11, 03:04 PM
I agree. 6970's by themselves aren't weak, and in Crossfire, they're pretty frikin good.

The GTX 580 is better, and it's more expensive. The GTX 570 is about the same as the 6970, but it can't drive more than 2 monitors by itself (compared to the 4 monitors a 6970 can drive). Also, 2GB framebuffer ensures that the cards can perform at high resolutions, especially in crossfire.

Come on Rollo... try harder.

Agreed. They also aren't $500 by themselves. People keep expecting them to compete against a 580. But hey, if you want the absolute fastest single gpu card right now, that is easily nvidia.
http://www.hardwareheaven.com/reviews/1084/pg17/xfx-radeon-6970-and-radeon-6950-graphics-card-review-crossfire-eyefinity-vs-sli.html

Rollo
02-02-11, 05:33 PM
Rollo,

I have just a 22-inch screen, I think it maxes out at 1680x1050.

So you're saying it would be dumb for me to get a GTX590, and I should just go with the GTX580?

That the GTX590, 6970s, etc. are all for hi-res or multi-screen support?

I think GTX580 is overkill for you and a GTX570 would give you amazing performance for almost $200 less.

A GTX560 would be kick ass at that resolution and be cheaper yet.

Bah!
02-02-11, 05:54 PM
The GTX 570 is about the same as the 6970, but it can't drive more than 2 monitors by itself (compared to the 4 monitors a 6970 can drive).

Come on Rollo... try harder.

This is such marketing BS. Just because the 6970 can drive four monitors doesn't make it viable. You guys need to quit with this multi-monitor/single-card stuff because that dog won't hunt. Everyone who plays modern, graphics-intensive games is going to have multiple cards anyway, so the argument is meaningless. Unless you buy high-end video cards to stare at your desktop.

Roadhog
02-02-11, 05:59 PM
This is such marketing BS. Just because the 6970 can drive four monitors doesn't make it viable. You guys need to quit with this multi-monitor/single-card stuff because that dog won't hunt. Everyone who plays modern, graphics-intensive games is going to have multiple cards anyway, so the argument is meaningless. Unless you buy high-end video cards to stare at your desktop.

Who says they need to play games with 4 monitors? lol

Rollo
02-02-11, 06:00 PM
I agree. 6970's by themselves aren't weak, and in Crossfire, they're pretty frikin good.

The GTX 580 is better, and it's more expensive. The GTX 570 is about the same as the 6970, but it can't drive more than 2 monitors by itself (compared to the 4 monitors a 6970 can drive). Also, 2GB framebuffer ensures that the cards can perform at high resolutions, especially in crossfire.

Come on Rollo... try harder.

Actually I think you guys are the ones who should try harder. Why on Earth would a guy who spent $3000-plus to have the best-of-the-best monitor setup settle for any cheap ATi stuff?

He's obviously not poor, so that rules out the "Jinkies I could save a few hundred bucks!" argument.

We all know that the 6970s aren't just slower at DX11, but they're slower at everything else, and it could well be that running 75X16 taxes a video card more than anything else would.

There are no VRAM use graphs for his res that I know of, but as higher res means higher VRAM use, it's probably safe to say that having 50% more VRAM per GPU on the 580s could conceivably have benefits.

No hardware accelerated physics for him with the 6970s, worse multicard drivers.

If you think Slawter cares about saving $500 on graphics and is willing to compromise on performance and all of the above to do it, after spending $3K+ on monitors, I don't know what else I can say.

A guy I work with put it well: When you buy the best thing of whatever it is you're buying, you never want anything else.

Ninja Prime
02-02-11, 06:09 PM
No hardware accelerated physics for him with the 6970s, worse multicard drivers.

Didn't you just post a thread about how you had a bunch of problems with SLI/multicard NV...? LOL.:o

Madpistol
02-02-11, 06:23 PM
Actually I think you guys are the ones who should try harder. Why on Earth would a guy who spent $3000-plus to have the best-of-the-best monitor setup settle for any cheap ATi stuff?

Lets start with this article: http://www.hardwareheaven.com/reviews/1084/pg1/xfx-radeon-6970-and-radeon-6950-graphics-card-review-introduction.html

I'm thinking I would buy a couple of HD 6970's based on this article, regardless of the price. The fact that each card is roughly $120 cheaper is simply icing on the cake.

Yea, you can throw out the whole "1 powerful card is better than 2 mediocre cards" argument. But when you throw in 2 of each card, the HD 6970 earns its stripes.


He's obviously not poor, so that rules out the "Jinkies I could save a few hundred bucks!" argument.

Not saying he's poor, but why spend money when you don't need to? Why buy 2 GTX 580's when you can buy 2 HD 6970's, save $240, and have a faster setup?

We all know that the 6970s aren't just slower at DX11, but they're slower at everything else, and it could well be that running 75X16 taxes a video card more than anything else would.

Read the article. As a single card, they're slower, but as soon as you saturate that VRAM with a crossfire setup at higher resolutions and multi-monitor, the HD 6970 pulls ahead. For the Uber enthusiast, AMD has hit a home run.

There are no VRAM use graphs for his res that I know of, but as higher res means higher VRAM use, it's probably safe to say that having 50% more VRAM per GPU on the 580s could conceivably have benefits.

WTF? The 6970's have more VRAM. At lower resolutions and in single card setups, yea, the GTX 580 is going to win. No one in their right mind uses a lower resolution monitor for a single GTX 580, though.

No hardware accelerated physics for him with the 6970s, worse multicard drivers.

Hardware physics is STILL a gimmick. About 2% of games out there use hardware-accelerated physics. Even fewer use it well. I'm pretty sure the 11.1s fixed those outstanding issues too. Now you're just digging, searching for a bone.

If you think Slawter cares about saving $500 on graphics and is willing to compromise on performance and all of the above to do it, after spending $3K+ on monitors, I don't know what else I can say.

Compromises? I guess... in your mind. I'm not seeing any compromises from that article.

A guy I work with put it well: When you buy the best thing of whatever it is you're buying, you never want anything else.

Too bad the best (GTX 580) doesn't scale the best in multi-card setups. I guess that makes the HD 6970 the best in that regard. Nothing like getting better value out of a high-end card. Who says an extra 512MB of VRAM doesn't help?

Keep trying. You've got to earn that paycheck from nvidia. ;)

Roadhog
02-02-11, 06:28 PM
I bet if someone made a thread asking if a gt240 was better than a 6970 Rollo would say get the gt240.

Madpistol
02-02-11, 06:34 PM
Yep. The 6970 might be faster, but the GT 240 can do PhysX and CUDA. That makes the GT240 better. ;)

Rollo
02-02-11, 06:41 PM
I don't comment on articles by Hardware Heaven, SemiAccurate, or AMDZone- because the bias always has been, and always will be there.

I'm talking about the 3gb GTX580s by Gainward and Palit, not the 1.5GB ones. If you think a pair of 6970s will out perform them, well, sucks to be you I guess.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814261098&cm_re=Palit_GTX580-_-14-261-098-_-Product

You're basically comparing a Hyundai to a BMW, and yelling "He might want to save money!".

:headexplode:

Rollo
02-02-11, 07:06 PM
I bet if someone made a thread asking if a gt240 was better than a 6970 Rollo would say get the gt240.

Instead of making ridiculous suppositions, why not try to disprove something I've posted?

Facts are facts: the 3GB GTX580s are simply a more reasonable buy for a person running at 75X16 resolution, on the basis of their 50% higher VRAM alone.

http://www.widescreengamingforum.com/wiki/NVIDIA_GTX460_in_Surround:_1GB_vs._2GB_-_Featured_Review_-_Page_3

AVP's VRAM use at 60X12 with 4XAA is 1.8GB, and that's a much lower res than Slawter is running. Still thinking 2GB is the way to go for the guy I was speaking to?

http://www.widescreengamingforum.com/wiki/NVIDIA_GTX460_in_Surround:_1GB_vs._2GB_-_Featured_Review_-_Page_13

Wow, Stalker uses 1.8GB of VRAM at 60X16 with 4X too. Gee, those "great" 6970s have a whole 0.2GB to spare to run that much higher res, but those magic 6970s can pull it off!

http://www.widescreengamingforum.com/wiki/NVIDIA_GTX460_in_Surround:_1GB_vs._2GB_-_Featured_Review_-_Page_12

And Need for Speed Shift at 1.7GB, what a trend!

http://www.widescreengamingforum.com/wiki/NVIDIA_GTX460_in_Surround:_1GB_vs._2GB_-_Featured_Review_-_Page_11

JC2 at 1.9GB at the lower res with higher AA, but that 0.1GB the 6970s are packing would surely have him covered at 75X16.

http://www.widescreengamingforum.com/wiki/NVIDIA_GTX460_in_Surround:_1GB_vs._2GB_-_Featured_Review_-_Page_8

Huh, even FC2 hits 1.8GB of VRAM use at the lower res, and at 8X AA it hits the full 2GB a lowly 6970 has.

http://www.widescreengamingforum.com/wiki/NVIDIA_GTX460_in_Surround:_1GB_vs._2GB_-_Featured_Review_-_Page_7

And Dirt 2 at the same 1.8GB and 2GB VRAM use at the much lower res.

Here's the thing: I'm just RIGHT. You guys can yell about "value" and link to the review sites that go out of their way to find a game and setting where the ATi card wins, but not one thing you can say trumps the above.

The guy I was talking to (Slawter) has a 75X16 rig. 6970s have 2GB of VRAM. I just proved beyond a shadow of a doubt that if he wants to use some AA, 6970s don't have enough VRAM to do the job.

So what's next? Are you going to post "Maybe he wants to run the monitors he recently spent over $3000 on at lower resolutions so he can save $500 on his graphics!"

:retard::retard::retard::retard::retard:
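For what it's worth, the VRAM figures being argued over do track pixel count. A back-of-the-envelope sketch (hypothetical numbers, assuming 32-bit color, 32-bit depth/stencil, and uncompressed per-sample MSAA storage; real drivers use compression, and the measured totals quoted above are dominated by textures) shows why surround resolutions strain a fixed framebuffer:

```python
def framebuffer_mb(width, height, msaa=1, bytes_per_pixel=4):
    """Rough render-target footprint in MiB: front + back color buffers
    plus multisampled color and depth/stencil storage. Textures, geometry,
    and driver overhead (the bulk of real VRAM use) are ignored."""
    pixels = width * height
    front_back = 2 * pixels * bytes_per_pixel        # displayed + swap buffer
    color_samples = pixels * bytes_per_pixel * msaa  # MSAA color samples
    depth_samples = pixels * bytes_per_pixel * msaa  # MSAA depth/stencil samples
    return (front_back + color_samples + depth_samples) / 2**20

for name, (w, h) in {
    "1920x1200 single panel": (1920, 1200),
    "5760x1200 surround": (5760, 1200),
    "7680x1600 surround": (7680, 1600),
}.items():
    print(f"{name}: ~{framebuffer_mb(w, h, msaa=4):.0f} MiB of render targets at 4X AA")
```

Render targets alone grow more than fivefold going from one 1920x1200 panel to three 2560x1600 panels, and texture working sets grow with resolution too, which is why measured figures of 1.8-1.9GB at 60X12 leave so little headroom on a 2GB card at 75X16.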

Madpistol
02-02-11, 10:11 PM
I don't comment on articles by Hardware Heaven, SemiAccurate, or AMDZone- because the bias always has been, and always will be there.

I'm talking about the 3gb GTX580s by Gainward and Palit, not the 1.5GB ones. If you think a pair of 6970s will out perform them, well, sucks to be you I guess.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814261098&cm_re=Palit_GTX580-_-14-261-098-_-Product

You're basically comparing a Hyundai to a BMW, and yelling "He might want to save money!".

:headexplode:

You're comparing a $600 card to a $380 card? What have you been smoking lately?

Discrediting my evidence simply because you think those sites are AMD-biased? What about all the other sites that are Nvidia-biased? It works both ways, bud. You seriously think that because you don't like the site you can suddenly throw the evidence out? Nope, doesn't work that way. The site is credible, and the evidence is the result of testing in a controlled setting that is designed NOT to play favoritism toward one brand or another. You want to play Nvidia fanboy? That's fine. I couldn't care less about all the hardware you get for painting your face green, and then getting on these forums and making yourself look like God's gift to the internet simply because you get free hardware. What a sorry life you live.

And then you use evidence from widescreengamingforum.com as your rebuttal? And you're STILL comparing a $600 card to a $380 card. If the $600 card doesn't win, something is wrong. Too bad Nvidia's reference board doesn't have 3GB of memory. Otherwise, it might have been the champ in multi-GPU. Instead, it ties in some cases, and flat out loses in others, to a less expensive card from a company that you like to call the Hyundai of video card manufacturers. News flash... there are only 2 major competitors in this segment. Your logic is flawed to the point of near insanity if you think that comparison is accurate. A more reasonable one might be Audi vs. BMW. Funny enough, Audi's new A4s seem to have stolen the show from BMW's 3 Series.

http://store.steampowered.com/hwsurvey/videocard/

If AMD is so much LOWER than Nvidia at the moment, why does AMD have the 2 top cards in the DX11 category? In fact, AMD has a lot more DX11 cards than Nvidia. From what I can tell, that means AMD controls the market at the moment. Funny how times change. Nvidia won DX10, but AMD is winning DX11. Sales numbers don't lie, and Nvidia is still recovering from their GF100 fiasco.

Keep touting a $600 card... It's ok. That's a lot of money for a single-GPU video card. You can buy a couple of HD 6950's for that price and absolutely CRUSH a 3GB GTX 580. Who in their right mind would EVER buy a 3GB GTX 580? What a waste of money.

Here's the thing: I'm just RIGHT. You guys can yell about "value" and link to the review sites that go out of their way to find a game and setting where the ATi card wins, but not one thing you can say trumps the above.

Here's one thing that trumps what you say: Value dictates the market.
Here's something else: Your "evidence" is using a 2GB GTX 460. The 2GB HD 6950 and 6970 are both faster and use their resources better.

All you're proving is that a couple HD 6970's in Crossfire, with 2GB of framebuffer each, is a better deal than a couple GTX 580's in SLI, each with 1.5GB of framebuffer. I don't think the 3GB GTX 580 falls into this category because it's $600. That's roughly 40% more expensive than the HD 6970 2GB, yet the performance difference between the 1.5GB GTX 580 and the 3GB GTX 580 in high resolution game play is marginal. It's like saying that a Geforce 9500 GT with 1GB framebuffer is superior to a GTX 460 with 768mb framebuffer. Memory means NOTHING unless you've got the architecture to handle it. Your evidence shows that games benefit from 2GB of framebuffer at super high resolutions. Sweet! I'll have 2 HD 6970's please. They're the clear winner in multi-monitor situations. I'll gladly take the faster system than a single 3GB GTX 580. :D

If everybody was willing to pay $600 for a 3GB GTX 580, Nvidia would be owning the market.... IF... but that doesn't matter. All that matters is what people ARE going to buy, and a 3GB GTX 580 will always be an uber enthusiast product that will only sell a handful of units. Get it? This is simple:

3GB GTX 580: $600
2x HD 6950 2GB: $600 - this one is A LOT faster.

Keep trying tRollo. You might have a breakthrough soon.

Roadhog
02-02-11, 10:30 PM
Meh, nothing wrong with Hyundai anymore. They actually make some nice cars now.

Rollo
02-03-11, 06:31 AM
You're comparing a $600 card to a $380 card? What have you been smoking lately?
I'm smoking the stuff that makes me able to see the only reasonable recommendation for a guy who is trying to run three 25X16 monitors. You're apparently smoking the stuff that makes you think 6970s are a good solution for a best-of-best rig that needs more VRAM than they actually have. I'll stick with my brand.


Discrediting my evidence simply because you think those sites are AMD biased?
Let's not forget that your "evidence" used an NVIDIA card I wasn't recommending, and you didn't realize NVIDIA cards exist that make the 2GB framebuffer on the 6970 seem small.


The site is credible, and the evidence is the results of testing in a controlled setting that is designed to NOT play favoritism toward one brand or another.
Back when I used to read Driver Heaven, they were doing things like having contests to decide how to flamboyantly destroy high end NVIDIA cards to amuse the ATi fanboys who make up their readers.
On the top of their site is a banner that says "Click here to visit our AMD center!" Yeah, they are not biased.:barf:


You want to play Nvidia fanboy? That's fine. I could care less about all the hardware you get for painting your face green, and then getting on these forums and making yourself look like God's gift to the internet simply because you get free hardware.
Why would you care who I have press privileges with? In the context of this thread, your only concern should be whether my advice was valid. Given my links to VRAM usage at resolutions far below what the buyer has, it's pretty obvious it was. You yelling "But that core mated to 1.5GB of VRAM can be slower than an ATi card with 2GB of VRAM, those NVIDIA cards with 3GB are too expensive!" isn't helping. People willing to spend $3K on monitors aren't pinching pennies and making sacrifices.


What a sorry life you live.
I'm not going to go into what my life is, but my guess is most people would trade me.


And then you use evidence from widesceengamingforum.com as your rebuttal? And you're STILL comparing a $600 card to a $380 card. If the $600 card doesn't win, something is wrong. Too bad Nvidia's reference board doesn't have 3GB of memory.
Too bad you fail at "logic". NVIDIA's reference design targets 99% of buyers and wins for them. For the person I was talking to, the 3GB version of the GTX580 is really the only option he has for best performance.


Otherwise, it might have been the champ in multi-GPU. Instead, it ties in some cases, and flat out loses in others to a less expensive card from a company that you like to call the Hyundai of video card manufacturers.
But we're not talking about their reference design. What's next from you- "The 6850 beats a GT240 so he should buy that!"
As far as reference designs go, I'd take NVIDIA's PhysX, 3D Vision, new games launching with SLI profiles instead of waiting, and the ability to write my own profiles over ATi's ability to run 8X AA better at 57X10 in some games, any day. ATi tries to follow in NVIDIA's footsteps with their high-school-science-project 3D and by begging devs to use OpenCL without their support, but falls short as always.


News flash... there are only 2 major competitors in this segment. Your logic is flawed to the point of near insanity if you think that comparison is accurate. A more reasonable one might be Audi vs. BMW. Funny enough, Audi's new A4s seem to have stolen the show from BMW's 3 Series.
Being one of two competitors doesn't mean you have a good product. ATi survives like their owner AMD does- they're cheap.

If AMD is so much LOWER than Nvidia at the moment, why does AMD have the 2 top cards in the DX11 category? In fact, AMD has a lot more DX11 cards than Nvidia. From what I can tell, that means AMD controls the market at the moment. Funny how times change. Nvidia won DX10, but AMD is winning DX11. Sales numbers don't lie, and Nvidia is still recovering from their GF100 fiasco.
You see a lot more Kias and Hyundais on the road than BMWs and Lexuses- doesn't mean they're better cars. (Besides, ATi had a 7-8 month head start selling "DX11" cards that can barely run DX11.)


Keep touting a $600 card... It's ok. That's a lot of money for a single-GPU video card. You can buy a couple of HD 6950's for that price and absolutely CRUSH a 3GB GTX 580. Who in their right mind would EVER buy a 3GB GTX 580? What a waste of money.
The guy I was talking to needs a 3GB card due to his native resolution; you keep ignoring this in your attempt to sell the Kmart Blue Light Special 6970. The waste of money would be if he listened to you and watched his video card thrash as it ran out of VRAM.


Here's one thing that trumps what you say: Value dictates the market.
Here's something else: Your "evidence" is using a 2GB GTX 460. The 2GB HD 6950 and 6970 are both faster and use their resources better.
Not everyone has to pinch pennies; some people don't care about saving a couple hundred per card. I recommended the best solution, and you've posted no evidence that 2GB is enough for 75X16. Unless you can post benches that show 2GB is plenty for 75X16, your guesses mean nothing.


LOL, you do love to go on and on and on. Sorry, I have to go be the team lead at a software company this morning, then I'm taking the afternoon off to enjoy my hobbies. Carry on yelling about "value!" to people who are talking about "best of best".

grey_1
02-03-11, 07:28 AM
Rollo,

I have just a 22-inch screen, I think it maxes out at 1680x1050.

So you're saying it would be dumb for me to get a GTX590, and I should just go with the GTX580?

That the GTX590, 6970s, etc. are all for hi-res or multi-screen support?

See what you started here? :lol: Just joking... I gamed at 1680x1050 up until a little over a month ago. Some games utilize the GPU much more heavily than others, and then there's AA support etc., which varies between games and which GPU you buy.

My humble recommendation is grab either the GTX 570 (preferred simply because I like Nvidia's drivers better) or the 6970 and call it a day. Either card will give you good support and a great gaming experience, even if you should move up to 1920x1080 with AA.

While a 580 would be more future proof, the future is never far off in the gaming/hardware world. I grabbed a 480 not long after release and it will last me just fine for at least a year at my gaming rez of 1920x1080 with a second 22" screen hooked up as well.

Buy one, enjoy it or sell and get the other if you aren't happy.

Just my humble $.02

wheeljack12
02-03-11, 07:48 AM
If I were to buy AMD, I would buy 1st-gen DX11 AMD (5870 2GB or 5970 4GB), or possibly a GTX 590. I don't like how there are fewer stream processors in gen 2 of AMD, along with less memory bandwidth. The higher clocks aren't making up for it. I thought new generations of cards are supposed to supersede the last. That AMD is taking a step back makes me a little worried that they can't outdo themselves. All Nvidia does is make things faster and more efficient in their next gen, that's all. Yes, the GTX 580 is premium-priced, but premium cards deserve their value for what they can do.

Madpistol
02-03-11, 09:59 AM
I'm smoking the stuff that makes me able to see the only reasonable recommendation for a guy who is trying to run three 25X16 monitors. You're apparently smoking the stuff that makes you think 6970s are a good solution for a best-of-best rig that needs more VRAM than they actually have. I'll stick with my brand.

The 6970 scales better on higher resolutions for less money, especially in Crossfire. Period. End of discussion. Keep arguing about it. It doesn't change the facts.


Let's not forget that your "evidence" used a NVIDIA card I wasn't recommending and you didn't realize NVIDIA cards exist that make the 2GB framebuffer on the 6970 seem small.

Really now? I'd love to take a poll and see how many people would get a 3GB GTX 580. Then I would love to show said people what you can get for $600. Even a pair of GTX 560 Tis in SLI would trounce it. Honestly, I don't understand why you keep using that example. It's pathetic really. Try using something that is actually endorsed by Nvidia... oh wait. 2 HD 6970s > 2 GTX 580 1.5GB cards. Guess that's why. Nothing Nvidia offers can compare without spending an extra $100 on said product.


Back when I used to read Driver Heaven, they were doing things like having contests to decide how to flamboyantly destroy high end NVIDIA cards to amuse the ATi fanboys who make up their readers.
On the top of their site is a banner that says "Click here to visit our AMD center!" Yeah, they are not biased.:barf:

Keep trying dude. Do you want me to go find every single Nvidia biased site? I bet there's a lot more of them. All you have to do is read the conclusion of a review to figure that one out. Kind of like what you keep writing...


Why would you care who I have press privileges with? In the context of this thread, your only concern should be whether my advice was valid. Given my links to VRAM usage at resolutions far below what the buyer has, it's pretty obvious it was. You yelling "But that core mated to 1.5GB of VRAM can be slower than an ATi card with 2GB of VRAM, those NVIDIA cards with 3GB are too expensive!" isn't helping. People willing to spend $3K on monitors aren't pinching pennies and making sacrifices.

It's not about making sacrifices, bud. Getting a couple of HD 6970's in crossfire isn't considered a sacrifice when it matches and exceeds performance of a couple cards that cost hundreds of dollars more. Fact. Your press privileges simply make you open to attack from somebody that actually lives in the real world, reads about other products, and is able to come up with a reasonable conclusion. In this case, that conclusion isn't as cut and paste as you would like it to be, so you fight about it. It's funny really.


I'm not going to go into what my life is, but my guess is a most people would trade me.

Material goods don't dictate a good life. The fact that you troll these forums continuously proves that.


Too bad you fail at "logic". NVIDIA's reference design targets 99% of buyers and wins for them. For the person I was talking to, the 3GB version of the GTX580 is really the only option he has for best performance.

It only wins in single-card performance, and the market they sell to doesn't include 99% of buyers. The segment they sell to is a mid-to-high-end segment covering only about 10% of the market. AMD controls the segment below that. Steam's hardware survey confirms that.

As for single card performance, Nvidia wins. Of course, that's to be expected of a product that COSTS MORE! When you add a couple cards together, though, it gets matched and beaten by the cheaper solution. That's the nail in the proverbial coffin. Nvidia's hit a limit with their 1.5GB design GTX 580 at higher resolutions. It took OEM's adding an extra 1.5GB of memory and an extra $100 for them to overcome that limitation. What a mess.


But we're not talking about their reference design. What's next from you- "The 6850 beats a GT240 so he should buy that!"
As far as reference designs go, I'd take NVIDIA's PhysX, 3D Vision, new games launching with SLI profiles instead of waiting, and the ability to write my own profiles over ATi's ability to run 8X AA better at 57X10 in some games, any day. ATi tries to follow in NVIDIA's footsteps with their high-school-science-project 3D and by begging devs to use OpenCL without their support, but falls short as always.

See, your title makes you susceptible to preference on the side of green. Got it. No need to go flaunting it around. Everybody already knows. The way I see it, I suggest a GTX 570 to anyone that's looking for a new card in that price range. Why? Because it performs better. As soon as you jump to a higher solution involving SLI or Crossfire, my opinion changes to a couple of HD 6950 2GBs or HD 6970s. Why? Because they perform just as well as or better than a couple of GTX 580s. Again, fact. Your arguments are somewhat convincing, but flawed. I can tell you're an intelligent person, but your stance makes people hate you, and it makes them take your recommendations with a grain of salt. When you get it right, people notice, and when you don't, crap like this thread happens. Keep it up.


Being one of two competitors doesn't mean you have a good product. ATi survives like their owner AMD does- they're cheap.

If cheap works, it works. It makes people buy their products. The fact that their products compete well with Nvidia doesn't mean they're cheap. I could call you cheap. Doesn't mean it's right. It's my opinion. The little club you belong to doesn't help your credibility in this situation. Throwing titles on companies that you consider "competition" hurts you a lot more than it helps.


You see a lot more Kias and Hyundais on the road than BMWs and Lexuses- doesn't mean they're better cars. (Besides, ATi had a 7-8 month head start selling "DX11" cards that can barely run DX11.)

Barely run DX11, you say? My 5870 runs DX11 titles just fine. There's no barely to it. And you're right, AMD got a massive head start on Nvidia in DX11, which is why they control the market at the moment. Fact. The better product has nothing to do with it. It's all about your position in the market. That little 2/3 of a year head start was Nvidia's biggest thorn. Not the product... the time it took for them to compete.

Your comments about Hyundai are pretty laughable, though. Hyundai is a respectable car manufacturer now. So is Samsung. Both are made in Korea. Funny how South Koreans are getting it right so often these days. If only Germany was that good on the electronics front.


The guy I was talking to needs a 3GB card due to his native resolution; you keep ignoring this in your attempt to sell the Kmart Blue Light Special 6970. The waste of money would be if he listened to you and watched his video card thrash as it ran out of VRAM.

So? If he were to buy a couple of 6950 2GBs or 6970s, it would be roughly the same price as a GTX 580 3GB, and it would perform better. Blue Light Special, eh? You mean you actually shop at Kmart? 10 points from Slytherin for that one. Decent analogy, though. Too bad it's nothing but a half-assed insult.


Not everyone has to pinch pennies; some people don't care about saving a couple hundred per card. I recommended the best solution, and you've posted no evidence that 2GB is enough for 75X16. Unless you can post benches that show 2GB is plenty for 75X16, your guesses mean nothing.

It's not about pinching pennies. It's about getting the best solution for the best money. In this case, AMD wins it. If you want to spend a ton of money on a solution that isn't as good, go for it. But the fact remains: it's not as good, especially when money is a factor. To some people it's not, but you don't always get what you pay for. AMD wins in high-end multi-card solutions. The fact that it costs less is simply icing on the cake.


LOL, you do love to go on and on and on. Sorry, I have to go be the team lead at a software company this morning, then I'm taking the afternoon off to enjoy my hobbies. Carry on yelling about "value!" to people who are talking about "best of best".

Here's some more of it too:

"Value dictates the market."
"Value dictates the market."
"Value dictates the market."
"Value dictates the market."
"Value dictates the market."


As soon as you swallow that pill of yours and admit that statement is correct, perhaps we can start moving you back to reality. It just so happens that in this case, the value solution wins. Therefore, I guess that makes the "value" the "best of best" as well. Thanks for pointing that out. :)


Now I remember why I don't reply to your comments. It always ends up wasting a lot of my time. What a joke. I'm going to go enjoy my life now, rather than trying to disprove the spinach you seem to regurgitate in this section of the forums.