
View Full Version : NVIDIA GF100 Previews



shadow001
03-21-10, 02:28 PM
Fool! Microstutter is a myth!

Heh - microstutter is likely one of the reasons JHH says it's always better to achieve performance with one GPU if possible.


Is that why Nvidia has already released three dual-GPU cards in the past 3 1/2 years or so (7950GX2, 9800GX2, GTX 295), and might even release a dual-Fermi-based card to beat the HD 5970?


Let's just say that what he says and what his company ends up doing in the end aren't quite the same thing, to say the least.

Rollo
03-21-10, 02:28 PM
If the GTX has a roughly 10% performance lead over the HD 5870 when the ATI card wasn't using these drivers, it might not be the case anymore... They might just break pretty much even, in fact.


Even the older HD 4000 series sees some performance speedups, and those cards are almost two years old now.

I've seen many situations where ATi PR claims huge gains in performance and independent tests show their driver performance the same across many revisions. I'll wait for real benches.

Even if the performance was the same or slightly better the cards would be a worse deal due to the lack of features, the horrible microstuttering, and dodgy drivers for multi GPU.

Rollo
03-21-10, 02:30 PM
Is that why Nvidia has already released three dual-GPU cards in the past 3 1/2 years or so (7950GX2, 9800GX2, GTX 295), and might even release a dual-Fermi-based card to beat the HD 5970?


Let's just say that what he says and what his company ends up doing in the end aren't quite the same thing, to say the least.

Multi-GPU is used to get higher performance with tradeoffs; a single GPU is optimal. No mystery there.

Xion X2
03-21-10, 02:30 PM
I posted this already but I guess it's the most recent test (about a week ago). It even includes the Hydra solution, which does a better job than CF.

http://www.computerbase.de/artikel/hardware/mainboards/2010/test_msi_big_bang_fuzion_hydra-chip/5/#abschnitt_hydra_und_die_mikroruckler

Can't read German, and when I try to translate I get this:

http://img401.imageshack.us/img401/6126/computerbase.jpg (http://img401.imageshack.us/my.php?image=computerbase.jpg)

Vardant--I'll check out those links that you posted.

Vardant
03-21-10, 02:31 PM
I think that really was enough :)

Now for something completely different. I got mail a few seconds ago.

http://i40.tinypic.com/33p3cc0.png

Seems like GPU-Z isn't reading the clocks properly, but you should be able to figure out where the numbers are supposed to be.

lee63
03-21-10, 02:48 PM
I think that really was enough :)

Now for something completely different. I got mail a few seconds ago.

http://i40.tinypic.com/33p3cc0.png

Seems like GPU-Z isn't reading the clocks properly, but you should be able to figure out where the numbers are supposed to be.

So 20,000-plus for a 480... pretty good.

shadow001
03-21-10, 02:54 PM
Multi GPU is used to get higher performance with tradeoffs, single GPU is optimal. No mystery there.


Then he shouldn't even consider it if there are tradeoffs involved, since according to him we want the best PC experience possible. Yet his company has released as many dual-GPU cards as ATI has so far, with all the potential microstuttering, constant SLI driver updates, and higher power/cooling requirements such cards demand. And when they are released on the market, he has the audacity to claim each one is the best card ever.


Such tradeoffs rarely get mentioned when cards are being launched, do they? And let's not forget that, apart from 3dfx in the late '90s, it was Nvidia who introduced multi-GPU setups with their version of SLI back in 2004, quite a bit before ATI did with CrossFire (about two years before, if I remember correctly).


When viewed over a long enough period of time, having seen many cards released by both companies, interesting patterns emerge that marketing departments would rather people not remember.

Xion X2
03-21-10, 02:59 PM
HD 5970 (http://translate.google.de/translate?js=y&prev=_t&hl=de&ie=UTF-8&layout=1&eotf=1&u=http%3A%2F%2Fwww.pcgameshardware.de%2Faid%2C6995 99%2FTest-Radeon-HD-5970-Hemlock-Die-schnellste-DirectX-11-Grafikkarte-der-Welt%2FGrafikkarte%2FTest%2F&sl=de&tl=en)

Thanks for providing something more recent. Despite your rather childish designation of "pwnage" coming my way, I think I'll reply back to you like an adult.

I think this here captures the point that I'm making--from your link--

Here one advantage of the Radeon HD 5970 comes into play: thanks to its abundant processing power, it rarely drops into those fps regions at typical settings.

Even the review sites admit that the card will "rarely" enter these microstuttering regions that you all claim are such a problem. Therefore, this doesn't represent daily usage of this card.

How many apps do you see stress the 5970 enough to bring it down to 20-30 fps? Not even Crysis does it with 8xAA. Now, if you go in and start enabling 24x edge-detect on a game as demanding as Crysis, sure, you're bound to run into problems. But you wouldn't think about doing that on a single-GPU card, either. In this case, you're adapting your approach just to try to prove a point to yourself.

From one of Vardant's links, again, on XS:

http://plaza.fi/s/f/editor/images/grid_difference_ati.png

Again, I'm wondering what some of you have to say about tests like this that show the same or more variation on a single GPU. My belief is that a lot of these claims of MS are due to normal variations in performance.
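For what it's worth, the distinction being argued here (AFR frame-pacing irregularity versus ordinary FPS variation) can be made concrete from a frametime log. This is a minimal hypothetical sketch, not code from any of the linked tools; the `stutter_ratio` metric and the sample logs are made up for illustration:

```python
def stutter_ratio(frame_times_ms):
    """Mean absolute difference between consecutive frame times,
    normalized by the mean frame time. AFR microstutter shows up as
    alternating short/long frames, so this ratio stays high even
    when the average FPS looks perfectly healthy."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_ft = sum(frame_times_ms) / len(frame_times_ms)
    return (sum(deltas) / len(deltas)) / mean_ft

# Both hypothetical logs average 25 ms/frame (40 FPS)...
single_gpu = [25.0, 24.0, 26.0, 25.0, 24.0, 26.0]   # steady pacing
afr_pair   = [10.0, 40.0, 10.0, 40.0, 10.0, 40.0]   # alternating AFR pacing

print(stutter_ratio(single_gpu))  # small ratio: normal variation
print(stutter_ratio(afr_pair))    # large ratio: microstutter signature
```

The point of the normalization is exactly the dispute above: an FPS counter reports both logs identically, so only consecutive frame-to-frame deltas reveal whether the variation is ordinary noise or a regular short/long pattern.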

Anyways. I'm not going to argue with 10 of you at once. I've said my piece on the subject and will leave it at that. Have fun.

Iruwen
03-21-10, 03:00 PM
Nvidia managed to improve the situation so that the GTX 295 was an excellent card though (except for the fan noise). The HD 5970 unfortunately still sucks ;(

/e: sorry, but even the 5970 is definitely in FPS ranges where microstuttering matters when it's used the way a card like that should be used. You don't buy a high-end graphics card to see >100 FPS constantly; you buy it to max out everything. And that's possible with any game, not just Crysis. Also, the perceived loss of FPS brings it closer to the HD 5870, especially in games that don't scale well, so the card is pretty much useless.

Xion X2
03-21-10, 03:12 PM
/e: sorry, but even the 5970 is definitely in FPS ranges where microstuttering matters when it's used the way a card like that should be used. You don't buy a high-end graphics card to see >100 FPS constantly; you buy it to max out everything. And that's possible with any game, not just Crysis. Also, the perceived loss of FPS brings it closer to the HD 5870, especially in games that don't scale well, so the card is pretty much useless.

No, you simply establish an optimal area.

It's not black or white. It's not 100 fps or 30. There is a range of settings in between that can keep the card well out of MS range while still having lots of eye candy.

Just because you may have bought the strongest card on the market does not mean you should check off all the extreme settings in your control panel for every game. I would call that the sledgehammer approach. Use a little common sense: if you're running a game that has DX11 effects and some intense lighting, for example, then it's probably not a good idea to enable 24xAA, alpha AA, etc.

shadow001
03-21-10, 03:15 PM
Nvidia managed to improve the situation so that the GTX 295 was an excellent card though (except for the fan noise). The HD 5970 unfortunately still sucks ;(

/e: sorry, but even the 5970 is definitely in FPS ranges where microstuttering matters when it's used the way a card like that should be used. You don't buy a high-end graphics card to see >100 FPS constantly; you buy it to max out everything. And that's possible with any game, not just Crysis. Also, the perceived loss of FPS brings it closer to the HD 5870, especially in games that don't scale well, so the card is pretty much useless.


That's me playing Crysis cranked to its maximum quality settings with 4X to 8X antialiasing on top, on a single 24" display. Here's my benchmark result with Crysis, using older drivers though.

http://i765.photobucket.com/albums/xx298/Superfly101_02/Crysis%20results%2016X%20supersample/Crysis8XAAmultisample.jpg


At those crazy settings (and 8X AA is complete overkill, btw), I still manage a 60 FPS average, and closer to 80 FPS at 4X AA.


The only time I might be anywhere near 20-30 FPS and worry about potential microstuttering is when driving three 24" LCDs at their native resolution of 1920x1200 each (5760x1200 total in Eyefinity) and cranking a demanding game's graphics settings to maximum, which makes the cards work a lot harder.


Driving a single display, there isn't a game out there this setup won't eat alive, performance-wise.

Enrico_be
03-21-10, 03:37 PM
http://www.we-wi.be/Images/ontopic.gif

Naaaah, j/k guys ;):p I just wanted to post that smiley :)

--

Btw Vardant, that 20,000+ 3D mark vantage score looks quite good :cool: My guess : 22,500 :)

shadow001
03-21-10, 03:46 PM
http://www.we-wi.be/Images/ontopic.gif

Naaaah, j/k guys ;):p I just wanted to post that smiley :)

--

Btw Vardant, that 20,000+ 3D mark vantage score looks quite good :cool: My guess : 22,500 :)


3DMark Vantage at the Performance preset has been CPU-limited for quite a while now, so for high-end cards at least, it's the Extreme preset that is the most revealing (1920x1200 resolution, 4X AA, and post-processing effects maxed out).

Toss3
03-21-10, 04:06 PM
Guys, seriously, read the thread on microstuttering over at XtremeSystems. Several members have done their own benchmarks that have proven beyond doubt that there's little to no difference between a single 5870 and a 5970. Most of the sites you linked to are either using old drivers or have turned Catalyst AI off, neither of which helps the situation.

So unless you have any graphs of your own that refute what has been said, could we get back on subject please?

shadow001
03-21-10, 06:01 PM
Guys, seriously, read the thread on microstuttering over at XtremeSystems. Several members have done their own benchmarks that have proven beyond doubt that there's little to no difference between a single 5870 and a 5970. Most of the sites you linked to are either using old drivers or have turned Catalyst AI off, neither of which helps the situation.

So unless you have any graphs of your own that refute what has been said, could we get back on subject please?


Then it's a matter of waiting another week for actual benchmark reviews, as it looks like having game performance results the same day the GTX 470/480 cards are officially unveiled next Friday isn't going to happen.

Rollo
03-21-10, 07:21 PM
Then it's a matter of waiting another week for actual benchmark reviews, as it looks like having game performance results the same day the GTX 470/480 cards are officially unveiled next Friday isn't going to happen.

Why do you say that Shadow? Source?

shadow001
03-21-10, 09:06 PM
Why do you say that Shadow? Source?


Just another rumor floating around: websites only received their cards a couple of days before the 26th and won't have enough time to actually do the reviews and publish the articles by the time PAX happens this Friday... I'll check where I saw it, but it was mentioned a few days ago anyhow.


Obviously by websites I mean the vast majority of the better-known ones, including Anandtech, HardOCP, The Tech Report, and Xbit Labs, so it's a fair number of cards to supply if all of them are to have reviews up by the time the 26th rolls around, meaning they have to have the cards right now.

CaptNKILL
03-21-10, 11:28 PM
:nono:

Will you quit stirring the pot shadow? Seriously man...

lee63
03-21-10, 11:38 PM
Only four more days til this is all over :headexplode:

shadow001
03-21-10, 11:38 PM
:nono:

Will you quit stirring the pot shadow? Seriously man...


I haven't stated anything as fact, just another rumor among the endless ones out there. I'm still hoping to see actual GTX 470/480 reviews from all the major sites when the 26th of March rolls in, using as many games and settings as possible, with both Nvidia and ATI cards on the latest drivers, that's all.

CaptNKILL
03-22-10, 12:16 AM
I haven't stated anything as fact, just another rumor among the endless ones out there. I'm still hoping to see actual GTX 470/480 reviews from all the major sites when the 26th of March rolls in, using as many games and settings as possible, with both Nvidia and ATI cards on the latest drivers, that's all.

You are the one starting the rumors on this website.

Give it a rest already.

shadow001
03-22-10, 01:15 AM
You are the one starting the rumors on this website.

Give it a rest already.


Did you actually check what the major tech sites have been saying about Fermi over the last six months, by any chance? The vast majority of it isn't good news, and I've simply reported some of it, with links to the sites that wrote the articles.


With March 26th being the official launch for Fermi, the time for more talk is over. It's time to see the major review sites put up articles that day testing the cards extensively, just as has been the case with pretty much every new hardware release so far.


4 days and counting down.

MUYA
03-22-10, 01:41 AM
Can you not take this to PM with Capt? I think he would prefer that.

scubes
03-22-10, 03:02 AM
You are the one starting the rumors on this website.

Give it a rest already.



What happened to BANNING HIM FOR 2 WEEKS? This is getting silly now; it's pointless for me to even read this anymore. Can't believe it's still going on, and it's just a piece of hardware.

halduemilauno
03-22-10, 03:22 AM
5970, 5870, 5870 2GB, 5850 vs. 480, 470 final scores:
http://bbs.pczilla.net/attachments/month_1003/1003221608ade6b763b9b97a63.jpg