
View Full Version : Toms Hardware Farycry Performance Review


Socos
07-06-04, 08:28 AM
Now I usually don't even visit Tom's, but I had to look at their Far Cry patch comparison just to see how they compared. I had already reviewed Anandtech's, and the performance difference between SM2.0 and SM3.0 wasn't all that significant, although I couldn't argue that there is a performance gain in some situations.

Now with Tom's comparison it looks as though there is a sizeable gain on some of the Nvidia-supplied benches, although the 5950 looks like the POS everyone has said it was.

My question is why did Tom's only check performance at 4xAA, 4xAF??? Why did they limit testing to this setting?

Anyone know??? I think I have an idea, but I was curious what everyone else thought.. :lame:

OWA
07-06-04, 08:32 AM
Isn't 4xAF the max if you set it from in the game (i.e. don't edit the files)?

JayrodTremonki
07-06-04, 08:40 AM
Why did other sites test at 4xAA and 8xAF? Why didn't they go for 16xAF? Why didn't Tom's test 1600x1200? Just because something is the norm doesn't mean every site has to do it that way. I'd hate to see it if every site did everything the same. Just regurgitating the same tests and results. They've just decided that the most playable rates/quality ratios are achieved like that.

MUYA
07-06-04, 08:53 AM
BTW Firingsquad have updated their Far Cry analysis..by including previous-generation cards such as the NV38s/NV35s and R3xx ;) Gives you what you are missing out on, really.

jbirney
07-06-04, 09:22 AM
Why did other sites test at 4xAA and 8xAF? Why didn't they go for 16xAF? Why didn't Tom's test 1600x1200? Just because something is the norm doesn't mean every site has to do it that way. I'd hate to see it if every site did everything the same. Just regurgitating the same tests and results. They've just decided that the most playable rates/quality ratios are achieved like that.

Because if they had tested those cases, you would have seen a change in the scores showing the two cards are much closer, with the ATI cards still being faster:

http://techreport.com/etc/2004q3/farcry/index.x?pg=4

Look at the volcano scores with and without AA/AF, and notice that at 1600x1200 one is faster until AA/AF is applied. All of the other sites have similar results. Not knocking any result or saying PS3.0 doesn't help (it does). It just makes you ask why they had always included these settings in every review but this one. Interesting question....

Templar
07-06-04, 11:31 AM
Well it's more than fast enough. I'm sure we'll see more speed increases, and then we'll see some nice effects later on.

Simon

Draconis
07-06-04, 12:00 PM
I <3 Tom's Hardware, this review is awesome

Socos
07-06-04, 12:02 PM
Well I guess I agree. It mostly depends on what side of the fence you're on. I have historically preferred ATI, but just because that is what I am used to. I have owned a couple of Nvidia cards and they were relatively problem free, and I can say the same for my ATI cards, except for my 8500 purchase when it first came out.
I am still using my trusty 8500. It has served me well.

Just the whole 5800/5900 thing leaves a very bad taste in my mouth. How they tried to make a POS out to be so good. I don't remember ATI doing that with the 8500. They just kind of kept quiet and fixed the drivers over time. Nvidia shouted from the hilltops how great that thing was, and only through hacks and rewriting driver code were they able to make it look good. Now with these Far Cry performance numbers, Nvidia is hoping everyone will forget the 5800/5900 and just run out and buy a 6800.

I see a lot of people doing just that and I cannot understand why... Screw me once, shame on you. Screw me twice, shame on me...

OWA
07-06-04, 12:18 PM
I don't think anyone is rushing out to buy the 6800NU just because of Tom's review. It has looked pretty good in other reviews/previews, especially at 1024x768, which is probably the res it should be used at. If you want more info about the 6800NU, check ChrisRay's thread in this forum. He provides real-world results and they seem pretty good.

Nv40
07-06-04, 12:28 PM
I don't remember ATI doing that with the 8500. They just kind of kept quiet and fixed the drivers over time.


You're forgetting a big part of ATI history...

In the 8500 days: Quack, 3DMark2001, and the bilinear angle-dependent AF hack they came up with (a "hardware limitation" that four years later still exists in today's "new" low-end hardware), versus full trilinear + non-angle-dependent AF. Low-end Nvidia hardware doesn't have those problems.

With the R3xx, they began to hide HQ settings from the control panel in some Catalyst drivers; you need game registry hacks or third-party tools to enable them.

And now with their new X800 line, they have been caught doing something fishy behind the scenes when trilinear was requested, and not allowing the AF optimizations to be disabled.

IMHO I'm very skeptical about any game benchmark that shows the X800 XT PE higher in performance than the 6800 Ultra. Why? Because the Ultra wins in pure performance, wins at the highest resolutions, and wins with 4xAA, sometimes by a significant lead. It's *only* when AF and trilinear are used that ATI manages to keep pace and even come out ahead in benchmarks, and we already know about ATI's gazillion tricks there: trilinear/stage optimizations, LOD tricks, and "smart drivers" that switch on the fly from 16xAF to 8x or lower when ATI thinks 16x is "not needed" (this has been said by ATI themselves). Add to that reviewers (http://www.hothardware.com/viewarticle.cfm?articleid=550) who don't explain what "optimization" settings they are using. It's very easy for ATI to cut corners if reviewers are not responsible with their job.
Benchmarking full trilinear and non-optimized AF vs ATI "optimizations" still happens in many reviews.

I'm not saying the XT PE is not a fast card, just that there are so many "optimizations" going on behind the scenes (and reviewers benchmarking apples-vs-oranges settings) that I'm skeptical about it being any faster in any benchmark than similarly priced Nvidia cards. If ATI were *so confident* about their performance, they would not be doing their questionable "optimizations", and they would allow reviewers to use all the high-quality settings from the control panel instead of disabling them and giving no choice but to use the optimized ones. IMHO.

Arioch
07-06-04, 12:30 PM
Ok, before this turns into a flamefest, let's just get it out of the way: both companies are sneaky and do some questionable things that can be considered cheats.

Templar
07-06-04, 03:00 PM
ATI's tricks are the reason I went with NV this round. I prefer IQ over insane FPS.
Most reviews never even say if they're using the "quality" or "performance" setting in the AF tab. HUUUUGE performance difference.

Simon