
View Full Version : NVIDIA GF100 Previews


Pages : 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 [39] 40 41 42 43 44 45 46 47 48 49 50 51 52

Toss3
03-21-10, 10:32 AM
I agree that no multi GPU is free of microstutter, it is good NVIDIA has diminished it though.

I have no confirmed pricing information. I'm only referencing online leaks I've seen. I've honored NDA for 6 years, won't violate it for this launch or any other. :)

I'll just do things like say "IMO, do not buy a 5850/5870 at this point in time. My hunch is better deals are coming soon!" or "Buy in haste, repent in leisure".

You can say a whole lot without saying anything at all. :)

Yeah, now wouldn't be the best time to buy a 5000-series card, but I got mine back in November and have been enjoying it since then, and I'm now ready to join the green team if they bring the goods! :)

Btw, according to a guy on EVGA's forums (http://www.evga.com/forums/fb.ashx?m=121702), those MS graphs aren't accurate, as he wasn't able to reproduce the results and he's running 3x5870 in CF.

EDIT: Apparently he was running it with VSYNC on, which eliminates MS totally, so if you want to go multi-GPU, VSYNC is definitely a must! 120Hz Samsung/Acer + SLI/CF = <3.

EDIT2: More results comparing one 5870 vs two (http://www.xtremesystems.org/forums/showpost.php?p=4200325&postcount=96)

EDIT3: MS on nvidia one vs two (http://www.xtremesystems.org/forums/showpost.php?p=4201593&postcount=146)
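To make the VSYNC point concrete, here's a rough sketch (my own illustration, with made-up frame times, not data from any of the linked benchmarks) of how AFR's short/long frame alternation shows up in the frame intervals, and how a vsync-style refresh quantizer evens them out:

```python
import math

REFRESH = 1000.0 / 60.0  # ~16.7 ms per refresh at 60 Hz

def intervals(timestamps):
    """Frame-to-frame deltas in ms."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def vsync(timestamps, refresh=REFRESH):
    """Delay each presentation to a refresh tick, at most one frame per tick."""
    out, tick = [], 0.0
    for t in timestamps:
        tick = max(tick + refresh, math.ceil(t / refresh) * refresh)
        out.append(tick)
    return out

# An AFR pair often delivers frames in short/long alternation: 5 ms after the
# previous frame, then 25 ms, then 5 ms again...
afr_times = [0, 5, 30, 35, 60, 65, 90]

raw = intervals(afr_times)             # [5, 25, 5, 25, 5, 25] -- uneven cadence
synced = intervals(vsync(afr_times))   # every gap is one ~16.7 ms refresh
```

The average FPS barely changes, but the presentation cadence becomes uniform, which matches the reports that vsync masks the effect.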

Iruwen
03-21-10, 10:55 AM
Were they 197.17?

Should be 256.xx because of games like Sims 3.

shadow001
03-21-10, 01:09 PM
No AFR setup is free from microstutter, and it's most noticeable as inter-frame rendering time variation when the FPS is fairly low (think 30~40 FPS). Both Nvidia and ATI suffer from that, and it's one of the reasons I went for a full quad-Crossfire setup: basically, there isn't a game out there that can drop the FPS that low, even with some pretty insane settings enabled in game.


As for the GTX470 and GTX480 cards, as far as the official reviews are concerned, it seems ATI took a page from Nvidia's playbook and released the 10.3a preview drivers 4 days ago. Depending on the HD5000 card in question and the game being tested, performance went up anywhere from 5 to 30%....Here's a partial list:


In addition to the key new features in ATI Catalyst™ 10.3 listed below, the ATI Catalyst™ 10.3 Preview Update can provide the following performance improvements.

3DMark Vantage

* Overall scores increase by up to 8% with ATI Radeon HD 5970 graphics products
* Overall scores increase by up to 4% on ATI Radeon HD 5800 Series products and up to 3% on ATI Radeon HD 5700 Series products

Aliens vs. Predator

* Overall performance increases 5% on ATI Radeon HD 5000 Series products

Battleforge

* Improves up to 8% on ATI Radeon HD 5000 Series products
* Improves up to 3% on ATI Radeon HD 4800 Series products

Call of Duty: World at War

* Improves up to 2% on ATI Radeon HD 5800 Series products
* Improves up to 6% on ATI Radeon HD 4800 Series products

Company of Heroes

* Improves up to 6% on ATI Radeon HD 5000 Series products
* Improves up to 3% on ATI Radeon HD 4800 Series products

Crysis and Crysis Warhead

* Improves up to 6% on ATI Radeon HD 5000 Series products
* Improves up to 2% on ATI Radeon HD 4800 Series products

Devil May Cry 4

* Improves up to 10% on ATI Radeon HD 5000 Series products
* Improves up to 6% on ATI Radeon HD 4800 Series products

DiRT 2

* Improves up to 30% on ATI Radeon HD 5970 graphics products
* Improves up to 20% on ATI Radeon HD 5800 Series and ATI Radeon HD 5700 Series products
* Improves up to 10% on ATI Radeon HD 4800 Series products

Enemy Territory: Quake Wars

* Improves up to 5% on ATI Radeon HD 5800 Series products
* Improves up to 3% on ATI Radeon HD 5700 Series products
* Improves up to 2% on ATI Radeon HD 4800 Series products

Far Cry 2

* Improves up to 6% on ATI Radeon HD 5000 Series products
* Improves up to 4% on ATI Radeon HD 4800 Series products

Left 4 Dead and Left 4 Dead 2

* Improves up to 3% on ATI Radeon HD 4800 Series products

S.T.A.L.K.E.R. – Call of Pripyat Benchmark

* Improves up to 10% with Anti-Aliasing enabled on ATI Radeon HD 5000 Series products

S.T.A.L.K.E.R. – Clear Sky

* Improves up to 2% with ATI Radeon HD 5970 graphics products
* Improves up to 2% on ATI Radeon HD 5800 Series products

Resident Evil 5

* Improves up to 5% on ATI Radeon HD 5000 Series products
* Improves up to 3% on ATI Radeon HD 4800 Series products

Tom Clancy’s H.A.W.X.

* Improves up to 15% with ATI Radeon HD 5970 graphics products
* Improves up to 20% on ATI Radeon HD 5800 Series products and ATI Radeon HD 5700 Series products
* Improves up to 3% on ATI Radeon HD 4800 Series products

Unigine Tropics

* Improves up to 5% on ATI Radeon HD 5000 Series products

World in Conflict

* Improves up to 5% on ATI Radeon HD 5800 Series products
* Improves up to 3% on ATI Radeon HD 5700 Series products
* Improves up to 5% on ATI Radeon HD 4800 Series products

Wolfenstein

* Improves up to 4% on ATI Radeon HD 5000 Series products
* Improves up to 4% on ATI Radeon HD 4800 Series products

Note: This list is not exhaustive and more improvements are possible


Key new features in ATI Catalyst™ 10.3:

• ATI Catalyst™ Mobility - monthly updates for the mobile crowd
• ATI Eyefinity Technology Bezel Correction - compensate for your bezels so objects in your game move from one display to the other smoothly
• ATI Eyefinity Technology Per Display Control - adjust for differences in each individual displays brightness, color and contrast
• ATI Eyefinity Technology Multiple Display Groups - more display groups to give further control of your desktop with three displays or more connected
• ATI Eyefinity Technology Display Configuration Switching - easily switch from one display mode to another
• 3D Stereoscopic 3D driver hooks - enables 3rd party middleware vendors to bring more 3D stereoscopic gaming options


If the GTX480 had a roughly 10% performance lead over the HD5870 when the ATI card wasn't using these drivers, that might not be the case anymore... They might just break pretty much even, in fact.


Even the older HD4*** series sees some performance speedups, and those cards are almost 2 years old now.

shadow001
03-21-10, 01:37 PM
Hate to break it to you but Quad Crossfire is the worst according to Computerbase. Microstuttering is present all the time using 4 ATI GPUs, even in situations above 30 FPS.

http://www.computerbase.de/artikel/hardware/grafikkarten/2010/kurztest_ati_radeon_hd_5970_crossfire/8/

Translated by Google :p :


Guess you're lucky by not being picky about it.


Trust me, I am, and well, I've been living with the setup for over 4 months. I'm not much into racing games overall, which is the type of game web sites use most often to show it, the exception being Dirt 2, but that's flying at 100+ FPS on average anyhow.


Think about that for a while: 100 FPS means each frame is rendered in 1/100th of a second, or 10 milliseconds, and you'd have to have a pretty mean set of eyeballs to notice any microstuttering at that rate, the kind of eyeballs that would make an eagle proud....:D
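The arithmetic above can be sketched in a few lines, along with the caveat that an average hides the frame-to-frame spread (the numbers below are invented for illustration, not from any benchmark in this thread):

```python
# Average FPS is frames divided by total time, so 100 frames in one second
# *averages* 10 ms per frame -- but an AFR-style alternation can hit the same
# average with a very different cadence.

def avg_fps(frame_ms):
    """Average FPS from a list of per-frame render times in ms."""
    return 1000.0 * len(frame_ms) / sum(frame_ms)

even   = [10.0] * 10        # steady 10 ms per frame
uneven = [2.0, 18.0] * 5    # short/long alternation, same total time

assert avg_fps(even) == avg_fps(uneven) == 100.0
# Same counter reading, but the longest gap differs: 10 ms vs 18 ms. The 18 ms
# gaps set the perceived smoothness (a ~55 FPS "feel") despite the 100 FPS average.
```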

Xion X2
03-21-10, 02:12 PM
Regarding microstuttering...

Crossfire 5870:

http://img100.imageshack.us/img100/8927/microxfireresized.jpg (http://img100.imageshack.us/my.php?image=microxfireresized.jpg)

Single 5870:

http://img100.imageshack.us/img100/7199/microsingleresized.jpg (http://img100.imageshack.us/my.php?image=microsingleresized.jpg)

http://www.xtremesystems.org/forums/showpost.php?p=4200325&postcount=96

Crysis 64bit DX10 1920x1200 4xAA running on an AMD X4 965 @ 4.4GHz Core, 2.88GHz NB, 2.88 GHz HTT Link, 1600MHz DDR3 RAM. I have two ATI 5870's at 950/1250. I'm running Windows 7 64bit.


The graphs are nearly identical. This was run by a guy within that thread that Rollo linked earlier. While no one else used a control measure (single GPU), he did, so that an accurate comparison could be drawn. As you can see, the performance is nearly identical between a single 5870 and Crossfire 5870.

Microstuttering is, for all intents and purposes, a myth. If it was a legitimate thing, then you would see more hardware review sites talking about it. But you don't, and that should speak volumes.

At the very least, it should never be anything that someone bases their purchase on.

I wish that those of you (you know who you are) who are trying to influence people into making decisions based on such fickle things would stop. It's a serious issue when you mess with people's hard earned money. If anyone wants to prop Fermi up as being a better card than 5870, then they should do it based on its real world advantages such as its GPGPU capability or 3D Vision and leave it at that.

Ninja Prime
03-21-10, 02:16 PM
K007:

I think there's a GTX470 in your future. Time to get some AA in UE3, PhysX, and CUDA going ol buddy!

Sell your fine 4870 for $100, and we're talking $250 for a big upgrade!

Did you just break NDA? OOOPS.

Toss3
03-21-10, 02:32 PM
Regarding microstuttering...
The graphs are nearly identical. This was run by a guy within that thread that Rollo linked earlier. While no one else used a control measure (single GPU), he did, so that an accurate comparison could be drawn. As you can see, the performance is nearly identical between a single 5870 and Crossfire 5870.

Microstuttering is, for all intents and purposes, a myth. If it was a legitimate thing, then you would see more hardware review sites talking about it. But you don't, and that should speak volumes.

At the very least, it should never be anything that someone bases their purchase on.

I wish that those of you (you know who you are) who are trying to influence people into making decisions based on such fickle things would stop. It's a serious issue when you mess with people's hard earned money. If anyone wants to prop Fermi up as being a better card than 5870, then they should do it based on its real world advantages such as its GPGPU capability or 3D Vision and leave it at that.

Already linked to those graphs in my earlier post. :) Yeah, what we perceive as microstuttering is present on all GPUs on the market today. AFR just makes it a bit worse in certain games (on the GTX 295 it seemed to make it better). Also, it's important to keep in mind that MS is different on each setup, since it's a result of all the components in your PC.
Vsync should eliminate all traces of it, though, and this has been proven many times.

Xion X2
03-21-10, 02:39 PM
As a closer, the thread that was linked earlier by Rollo was started by annihilator over on XS. Well, after some more research, this is what he had to say about microstuttering:

Indeed. Sorry for this, guys, I never thought a single GPU system could have so much microstuttering. I did a test myself with Crysis and can confirm similar results. Should have done this BEFORE opening the thread.


http://www.xtremesystems.org/forums/showpost.php?p=4200971&postcount=137


:beer3:

Enrico_be
03-21-10, 02:40 PM
Were they 197.17?

Could be :)
Awesome GTX 480 pictures you posted btw :p They have this nice industrial kind of look :cool:

--

Only 5 days left now till launch :cool:!

--


I found something an hour ago. I tried to sign up on their site (Pcinlife) to see what the attachment of his post was, but my Chinese is not so good :D I like the 20%~40% title ^^ I hope these are new benchmarks, but it could also be the FC2 and Dirt2 benchmarks that were leaked yesterday (if those are true).

http://www.we-wi.be/Images/rumor.png

Vardant
03-21-10, 02:45 PM
Microstuttering is, for all intents and purposes, a myth. If it was a legitimate thing, then you would see more hardware review sites talking about it. But you don't, and that should speak volumes.
You're kidding right?
There's at least one article about microstuttering on every major HW site. Some people don't see it, some can live with it, some can't stand it. If anyone is spreading myths, it's you.

If you don't see it, good for you. But just because of that, you can't say it doesn't exist.

Xion X2
03-21-10, 02:56 PM
You're kidding right?
There's at least one article about microstuttering on every major HW site.

Link them, then (and from more than one website, please.) Talk is cheap.

And tell me why users who test this themselves (like in the graphs that I just showed) show nearly identical performance between a single/dual GPU setup on the most demanding games on the market? Are these sources that you guys mention actually using a control measure (single GPU) in their testing to confirm that it's not just a standard delay seen on all setups?

At the very least you guys need to admit that this is a highly unresolved issue with little definitive proof in its favor at the moment. There are mixed results coming from a few different directions, so chances are that the issue is more complex than people think. There are a number of things that can cause latency in a system.

Enrico_be
03-21-10, 02:56 PM
* Zotac GTX480 countdown clock for launch (http://topic.pcpop.com/newtopics/zotac1003/) :)

Iruwen
03-21-10, 03:03 PM
As for the GTX470 and GTX480 cards,as far as the official reviews are concerned,it seems ATI pulled a page off Nvidia's rule book and released the 10.3a preview drivers 4 days ago,and depending on the HD5000 card in question,the performance went up anywhere from 5 to 30% faster depending on the game being tested...

This simply shows that ATI needed at least six months to fix just the worst driver bugs (many of which still exist, though) and couldn't manage to actually use the power of their own chips. This is especially bad when looking at Dirt 2, which was developed in close cooperation with ATI. They had to release at least two hotfixes to make it actually run after launch, and now, six months later, they finally fix the performance problems.
Also, ATI obviously focuses on benchmark (= review) relevant applications. And I wonder if they didn't sacrifice some image quality for that...
Finally, Nvidia constantly (!) improves their driver performance, not only once in six months when it's review time. And ATI's 5000 series is just a recycled architecture they've been working with for a long time; Fermi is a completely new architecture developed from scratch, so we'll see tons of performance improvements once the driver developers get more used to its specific quirks.

Iruwen
03-21-10, 03:07 PM
Think about that for a while,100 FPS means each frame is rendered at 1/100th of a second,or 10 miliseconds,and you'd have to have a pretty mean set of eyeballs to notice any microstuttering,the kind of eyeballs that would make an Eagle proud....:D

You cannot actually "see" microstuttering in that range; it just makes the FPS feel a lot lower than it actually is.
You also cannot "hear" overtones, yet they are what makes a conventional audio CD sound worse than a good analogue recording (because of the limited sampling frequency).

Iruwen
03-21-10, 03:12 PM
Microstuttering is, for all intents and purposes, a myth. If it was a legitimate thing, then you would see more hardware review sites talking about it.

Every single MGPU review mentions this, no matter what brand. At least in Germany. It's not a myth. Here's even a video from the biggest German hardware mag:
http://www.pcgameshardware.de/aid,631607/Videobeweis-Mikroruckler-zerstoeren-den-leistungssteigernden-Effekt-von-Multi-GPU-Loesungen/Grafikkarte/Test/
("Microstuttering destroys the performance advantage of MGPU solutions")

Xion X2
03-21-10, 03:13 PM
Finally, Nvidia constantly (!) improves their driver performance, not only once in six months when it's review time.
You must not have been around during the early launch days of G80. Nvidia held out on updating drivers for a little over four months until right before 2900 XT was released.

Boy, was that fun.

Rollo
03-21-10, 03:15 PM
Link them, then (and from more than one website, please.) Talk is cheap.



http://www.pcgameshardware.de/aid,631668/Video-proof-Micro-stuttering-may-destroy-the-performance-gains-from-current-multi-GPU-technologies/Grafikkarte/Test/

http://www.ngohq.com/news/14380-micro-stuttering-on-radeon-hd-4870-x2.html

Here's two- I know HardOCP has looked at this as well.

Xion X2
03-21-10, 03:16 PM
Every single MGPU review mentions this, no matter what brand. At least in germany. It's not a myth. Here's even a video from the biggest german hardware mag:
http://www.pcgameshardware.de/aid,631607/Videobeweis-Mikroruckler-zerstoeren-den-leistungssteigernden-Effekt-von-Multi-GPU-Loesungen/Grafikkarte/Test/
("Microstuttering destroys the performance advantage of MGPU solutions")

A review of 3870X2s? Got anything more recent? I don't think we can compare that to current solutions as quite a lot has changed.

Iruwen
03-21-10, 03:19 PM
A review of 3870X2s? Got anything more recent? I don't think we can compare that to current solutions as quite a lot has changed.

HD 5970 (http://translate.google.de/translate?js=y&prev=_t&hl=de&ie=UTF-8&layout=1&eotf=1&u=http%3A%2F%2Fwww.pcgameshardware.de%2Faid%2C6995 99%2FTest-Radeon-HD-5970-Hemlock-Die-schnellste-DirectX-11-Grafikkarte-der-Welt%2FGrafikkarte%2FTest%2F&sl=de&tl=en)

Even with the Radeon HD 5970, it is more the rule than the exception that the displayed frame rate does not correspond to the perceived outcome.

Vardant
03-21-10, 03:20 PM
Link them, then (and from more than one website, please.) Talk is cheap.
http://www.anandtech.com/printarticle.aspx?i=3518
http://www.pcgameshardware.de/aid,653475/PCGH-beweist-Mikroruckler-auf-der-Radeon-HD-4870-X2/Grafikkarte/Test/
http://www.tomshardware.com/reviews/msi-fuzion-lucidlogix-hydra,2526-3.html?xtmc=microstutter&xtcr=1
http://www.computerbase.de/artikel/hardware/grafikkarten/2008/test_nvidia_3-way-sli_triple-sli/7/#abschnitt_das_multigpuproblem_mikroruckler
http://www.xtremesystems.org/Forums/showthread.php?t=194808

They all mention it in one way or another.

Iruwen
03-21-10, 03:22 PM
You must not have been around during the early launch days of G80. Nvidia held out on updating drivers for a little over four months until right before 2900 XT was released.

Boy, was that fun.

I guess they never did that again because the 2900 sucked so hard that it was completely unnecessary ;(

/e: I guess that was enough pwnage for one page :p

Enrico_be
03-21-10, 03:23 PM
lololol guys, are we really talking about microstuttering here in a GF100 preview thread, hehe :p

--

Personally I don't think that microstuttering is a myth :)

http://forums.guru3d.com/showthread.php?t=306424#post3306267
(HD5870 CF)
"Man, I have been playing today with the CF, but microstuttering is very bad in 2142, Crysis and STALKER for example. I tried running Crysis and all games with one GPU, and despite much lower average fps I actually enjoyed it more and the smoothness was there. Thinking of selling one of the cards. It's not worth it. I'll just buy one of those new ATIs when they come out late next year if I need more performance. :/"

Rollo
03-21-10, 03:23 PM
http://www.anandtech.com/printarticle.aspx?i=3518
http://www.pcgameshardware.de/aid,653475/PCGH-beweist-Mikroruckler-auf-der-Radeon-HD-4870-X2/Grafikkarte/Test/
http://www.tomshardware.com/reviews/msi-fuzion-lucidlogix-hydra,2526-3.html?xtmc=microstutter&xtcr=1
http://www.computerbase.de/artikel/hardware/grafikkarten/2008/test_nvidia_3-way-sli_triple-sli/7/#abschnitt_das_multigpuproblem_mikroruckler
http://www.xtremesystems.org/Forums/showthread.php?t=194808

They all mention it in one way or another.

Fool! Microstutter is a myth!

Heh- microstutter is likely one of the reasons JHH says it's always better to achieve performance with one GPU if possible.

Xion X2
03-21-10, 03:24 PM
Link them, then (and from more than one website, please.) Talk is cheap.

http://www.pcgameshardware.de/aid,631668/Video-proof-Micro-stuttering-may-destroy-the-performance-gains-from-current-multi-GPU-technologies/Grafikkarte/Test/

http://www.ngohq.com/news/14380-micro-stuttering-on-radeon-hd-4870-x2.html

Here's two- I know HardOCP has looked at this as well.

Where is the single GPU control measure? As has been shown in those graphs from XtremeSystems when users have tested this, there is frame delay with single GPU as well. So you can't just take a graph with frame delay and say "Hey, look, MS!"

You need a side-by-side analysis of a single GPU vs multi-GPU running the same bench at the same settings. Otherwise, these tests are useless.
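The side-by-side analysis being asked for can be sketched like this (the metric and the sample frame times below are my own picks, purely illustrative, not from the linked threads):

```python
import statistics

def stutter_index(frame_ms):
    """Median absolute frame-to-frame change, normalised by median frame time.
    Near 0 means even pacing; large values mean alternating short/long frames."""
    deltas = [abs(b - a) for a, b in zip(frame_ms, frame_ms[1:])]
    return statistics.median(deltas) / statistics.median(frame_ms)

# Fraps-style per-frame render times in ms, one capture per configuration,
# same game and settings -- the single-GPU run is the control.
single_gpu = [16, 17, 16, 18, 16, 17, 16, 17]   # fairly even pacing
multi_gpu  = [8, 24, 9, 23, 8, 25, 9, 24]       # AFR-style alternation

stutter_index(single_gpu)   # ~0.06 -- baseline frame-time jitter
stutter_index(multi_gpu)    # ~0.94 -- pronounced short/long alternation
```

Comparing the same statistic for both captures is exactly the control-vs-test comparison the post argues for: a raw frame-time graph alone can't distinguish AFR microstutter from the jitter every system shows.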

shadow001
03-21-10, 03:25 PM
This simply shows that ATI needed at least six months to fix just the worst driver bugs (many of which still exist, though) and couldn't manage to actually use the power of their own chips. This is especially bad when looking at Dirt 2, which was developed in close cooperation with ATI. They had to release at least two hotfixes to make it actually run after launch, and now, six months later, they finally fix the performance problems.
Also, ATI obviously focuses on benchmark (= review) relevant applications. And I wonder if they didn't sacrifice some image quality for that...
Finally, Nvidia constantly (!) improves their driver performance, not only once in six months when it's review time. And ATI's 5000 series is just a recycled architecture they've been working with for a long time; Fermi is a completely new architecture developed from scratch, so we'll see tons of performance improvements once the driver developers get more used to its specific quirks.


Drivers are always being improved in some way, be it squashing bugs, increasing performance or adding new features (if the hardware can support them, obviously), and that's done throughout the life of a video card, not just the first few months. If either Nvidia or ATI were able to release the perfect driver, one that extracts the maximum performance out of the hardware, with no bugs, and supports every feature the hardware is technically capable of, you'd see both companies release a single driver and that would be the end of it for the life of that card.


There are simply too many applications, running at different settings, with way too much hardware variation, from the CPU used to motherboard chipset differences, to ever release the perfect driver in one shot, and that's not even counting when new OSes are released to boot.... It's dozens if not hundreds of different combinations.


In this particular case, ATI did the same thing Nvidia did with past Detonator driver releases whenever ATI was getting ready to launch new hardware: release a driver whose primary concern is extracting more performance out of the hardware, and not just in synthetic benchmarks but in many games, a lot of which are used in benchmark reviews.


You didn't think ATI was going to do nothing to put a damper on Nvidia's cards finally getting released after all this time? And I'm still sure not all the performance has been uncorked yet. Look at the HD4*** series results with that driver, a card not far from 2 years old: it wasn't just the latest generation getting a boost, but the previous one too.