AnandTech's Part 2 Video Shootout

Evan Lieb
10-07-03, 05:54 PM
Part 2 (http://www.anandtech.com/video/showdoc.html?i=1896)

Enjoy. :)

I started this thread because the other one was getting a bit long.

The Baron
10-07-03, 06:13 PM
Meh... could be better. Helpful hint: use custom UT2003 demos rather than the flybys. Icetomb, Face3, and Rustatorium are the best benchmark maps. In addition, you never mentioned what a lot of the benchmark scores represent (for example, which flyby in UT2003?).

PreservedSwine
10-07-03, 06:19 PM
The screenies are tiny

Are you sure AA is on in F1 on the NV cards here (http://www.anandtech.com/video/showdoc.html?i=1896&p=12)?


EDIT:

I am unable to view full-size screenies when I click on the screenies.. I just get a page that says something like "This page is accessed from articles only", and no screenie. :confused:

Another EDIT:

No numbers for Tomb Raider, only a percentage of lost fps with 2.0 shaders? When you're finished with whatever it is you're smoking, can I have some? :smoking2:

nForceMan
10-07-03, 06:29 PM
Quote:

"With the 52.14 drivers we have been using, we didn't notice any image quality issues on a visual inspection"

Damn it! Where are those cheats? :rofl

Hellbinder
10-07-03, 06:42 PM
Ok.. but I am still not convinced, simply because it's *impossible*, based on sheer engineering, for them to have miraculously improved DX9 performance the way they have. I am honestly EXTREMELY suspicious and will continue to be until many more sites have the opportunity to put these through the wringer.

I have issues with the entire last page of the Conclusion. But I'll zero in on this statement:

If you made it all the way through the section on TRAOD, you'll remember the miniboss named compilers. The very large performance gains we saw in Halo, Aquamark3, X2 and Tomb Raider can be attributed to the enhancements of NVIDIA's compiler technology in the 52.xx series of drivers. Whether a developer writes code in HLSL or Cg, NVIDIA's goal is to be able to take that code and find the optimum way to achieve the desired result on their hardware. Eliminating the need for developers to spend extra time hand-optimizing code specifically for NVIDIA hardware is in everyone's best interest.

Complete BS marketing CRAP. Enhancements in "compiler" technology. Bull***. It's called completely hijacking a developer's code and replacing it at runtime with the closest approximation they can get away with.

I want to see these drivers on levels that are not well-known benchmark levels. I want to see multiple custom demos and FRAPS results. I also want to see a new game come out in a few days and watch these drivers *tank* in performance. Why? Because they have to be engineered to perform PER GAME.

I admit that I may be a bit overzealous on this. But I'm sticking to my guns here until clearly proven wrong.

Let's get some really detailed IQ analysis going and see what washes out and what sticks.
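
For what it's worth, the compiler claim is at least architecturally testable: a DX9 shader is compiled twice, once from HLSL (or Cg) down to generic ps_2_0 bytecode, and again inside the driver from that bytecode down to native microcode. Both the 52.xx "compiler technology" claim and the shader-replacement suspicion live in that second stage. A minimal D3D9 sketch of where each stage sits (the helper name and shader are made up for illustration; the D3DX/D3D9 calls themselves are the standard ones):

#include <d3d9.h>
#include <d3dx9.h>
#include <string.h>

// Stage 1: HLSL source -> generic DX9 ps_2_0 bytecode. This is done by
// Microsoft's HLSL compiler (or by NVIDIA's cgc for Cg source) and is
// identical no matter whose card is installed.
static const char* g_src =
    "float4 main(float2 uv : TEXCOORD0) : COLOR"
    "{ return float4(uv, 0, 1); }";

IDirect3DPixelShader9* CompileAndCreate(IDirect3DDevice9* dev)
{
    ID3DXBuffer* bytecode = NULL;
    if (FAILED(D3DXCompileShader(g_src, (UINT)strlen(g_src), NULL, NULL,
                                 "main", "ps_2_0", 0,
                                 &bytecode, NULL, NULL)))
        return NULL;

    // Stage 2: bytecode -> native microcode, inside the driver. This is
    // where a driver can reorder instructions, re-allocate registers,
    // choose fp16 vs. fp32... or match the bytecode against a hand-tuned
    // per-game replacement. The API gives no way to tell which happened.
    IDirect3DPixelShader9* ps = NULL;
    dev->CreatePixelShader((const DWORD*)bytecode->GetBufferPointer(), &ps);
    bytecode->Release();
    return ps;
}

From the API side, a generic optimizer and a per-application shader swap are indistinguishable, which is why custom demos, non-benchmark levels, and brand-new titles are the usual way to tell them apart.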

Hellbinder
10-07-03, 06:47 PM
Originally posted by nForceMan
Quote:

"With the 52.14 drivers we have been using, we didn't notice any image quality issues on a visual inspection"

Damn it! Where are those cheats? :rofl
hahahahha... oh yeah, that's right, the card still lost every DX9 test and some other DX tests by 50%. hahahaha.. yeah, that's really funny.

The few wins it did pull were mostly in the single digits, or 49.5 vs. 49.2 type results. This again is without the XT getting to use Overdrive, which you will see tomorrow, and then you won't be laughing at the results.

As I already said, let's see what B3D, Tech Report and other sites have to say about the IQ. Also be advised that these are application-specific driver enhancements, not general driver enhancements. Honest runtime compiler enhancements do not double scores.

nForceMan
10-07-03, 06:49 PM
Quote:

It is important to note that anyone out there running the current 45.23 Detonators will get something along the lines of a 65% performance increase with no loss in image quality for the FX series of cards with the new 52 series of drivers due out late this month.

Sssssmmokinn! :headbang:

The Baron
10-07-03, 06:49 PM
Indeed, ATI Satchmo. In fact, how many of those games even use PS2.0 shaders?

skoprowski
10-07-03, 06:52 PM
Nvidia will never win now, will they? Hellbinder, want to buy my Radeon 9800 Pro since you are so convinced? I see your posts all over different forums. The thing that impresses me the most is how much of an authority you are on hardware you don't even own.

This is getting out of hand. THESE ARE JUST VIDEO CARDS! Anyone that wants to buy a Radeon 9800 Pro, let me know before it goes into my wife's computer. Oh, BTW, some people swore Nvidia couldn't fix the flicker problem with a driver fix either.

PreservedSwine
10-07-03, 07:08 PM
Originally posted by nForceMan
Quote:

"With the 52.14 drivers we have been using, we didn't notice any image quality issues on a visual inspection"



Other than the jerkiness/motion issues of the NVIDIA card, we didn't notice any visual quality problems on either card for this test.

The first thing that needs to be mentioned about this game is that NVIDIA has some known issues that we saw pop up. Again, we have to note that NVIDIA has known issues with this game, but we will clarify this in a few pages. Since NVIDIA can't do floating point textures, PS2.0 shadows were left off.

There are some differences between ATI and NVIDIA on this one. While scrolling, one way Maxis handled all the data that SimCity has to deal with was to not render everything completely as it entered the screen (if you hadn't seen it recently), but to kind of approximate the detail. This leads to a bit of blockiness while scrolling. This isn't a big deal because as soon as the scrolling stops or changes direction, the entire screen is drawn as it should be. Of course, ATI antialiased this blockiness like crazy while the NVIDIA card left it alone for the most part. I'm not really sure which one is right or better, so I'm going to leave that decision up to the user. For the purpose of comparing image quality during normal gameplay, we took screenshots while stationary.

For our tests, the only really important information is that we use the NVIDIA Cg compiler rather than the DX9 HLSL default compiler (there was no performance difference between the two on NVIDIA cards for the most part, only image quality improvements).

We used this demo in Part 1 as well and mentioned that there were some smoothness issues with NVIDIA's cards. The problem seems to be that every couple of seconds, when something overly graphically intensive is going on, the card will drop a few frames and move on to the next ones.

To be fair: "This time we have to point out that ATI has known image quality issues. There have been many complaints, from problems with AA to missing/flickering shadows and a lack of shiny water." But later... "We didn't notice the problems with water... but we did see some shadow flickering occurring. There weren't any missing shadows as we can see from the screenshots. In reading some forums across the web..." So, apparently, if they read about a problem with ATI, it's a known issue, but they must experience any issue with Nvidia first-hand for it to be mentioned........ :eek:

I am looking forward to the actual release of this driver set. It seems weird to publish a review and comparison based on a set of drivers that may never be released, for some hardware that is still some time away......

reever2
10-07-03, 07:17 PM
I'm sorry, but that IQ comparison was just weak; one of the worst I've seen.

digitalwanderer
10-07-03, 07:19 PM
It is also nice to see that there isn't any image quality difference between NVIDIA and ATI cards with this game. (http://www.anandtech.com/video/showdoc.html?i=1896&p=17)
Funny how Anand failed to notice that the FX cards don't render the predator effect at all while ATi does... it's a slight image difference in my opinion. :rolleyes:

Evan Lieb
10-07-03, 07:24 PM
Originally posted by reever2
I'm sorry, but that IQ comparison was just weak; one of the worst I've seen.

Really, how?

Funny how Anand failed to notice how the FX cards don't render the predator effect at all while ATi does...it's a slight image difference in my opinion. :rolleyes:

Huh? What are you referring to?

CtB
10-07-03, 07:24 PM
Interesting stuff; it seems Nvidia managed to pull this rabbit out of the proverbial hat.

I have a question in case Evan or Anand reads this thread: Gabe Newell made mention of drivers intercepting screen-capture calls and improving IQ on the fly just for screenshots. Did you test for this in your article? Are your IQ comparisons done mostly by comparing screenshots, or do you also try to compare screenshots with in-game visuals?

If those improvements are all genuine (which I doubt considering the past months), then I'm very impressed.
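
For context on why that question matters: anything captured through the API still passes through the driver, so a driver that special-cases capture calls could hand back cleaner pixels than it actually displays. A rough sketch of the standard D3D9 front-buffer grab (the helper name is made up; this is just the common capture path, not necessarily what Anand or HyperSnap uses):

#include <d3d9.h>
#include <d3dx9.h>

// Grab whatever the driver reports is on screen and save it to disk.
// Every step here is answered by the driver itself, which is exactly
// the interception opportunity Gabe Newell described.
HRESULT SaveFrontBuffer(IDirect3DDevice9* dev, UINT width, UINT height,
                        const char* path)
{
    IDirect3DSurface9* surf = NULL;
    HRESULT hr = dev->CreateOffscreenPlainSurface(
        width, height, D3DFMT_A8R8G8B8, D3DPOOL_SYSTEMMEM, &surf, NULL);
    if (FAILED(hr))
        return hr;

    hr = dev->GetFrontBufferData(0, surf);  // driver copies "the screen"
    if (SUCCEEDED(hr))
        hr = D3DXSaveSurfaceToFileA(path, D3DXIFF_BMP, surf, NULL, NULL);

    surf->Release();
    return hr;
}

The only capture path that bypasses the driver entirely is external, i.e. a DVI capture card or a camera pointed at the monitor, so comparing screenshots against in-game visuals by eye is a reasonable sanity check but not proof.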

reever2
10-07-03, 07:34 PM
Really, how?

1. You pick odd sections.
2. You have tiny pictures.
3. There is no option for me to see the full-size picture.
4. 90 percent of the pictures do not give the readers any idea of how the FX cards are filtering the ground, which has been the root of a lot of controversy (the usual test for this is sketched below).
5. You really should include full-size Flash comparisons if you want to do any real comparisons.
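
On point 4, some background for anyone who missed the filtering controversy: the usual way reviewers expose filtering tricks is to tint each mipmap level a distinct color, so that full trilinear filtering shows wide, smooth color transitions on the ground while reduced ("brilinear") filtering shows narrow, hard bands. A rough D3D9 sketch of that test setup (helper names made up, illustrative only):

#include <d3d9.h>
#include <d3dx9.h>

// Build a texture whose mip levels are solid, distinct colors, so the
// mip transition bands become visible when drawn on a floor plane.
IDirect3DTexture9* CreateColoredMipTexture(IDirect3DDevice9* dev)
{
    static const D3DCOLOR kColors[5] =
        { 0xFFFF0000, 0xFF00FF00, 0xFF0000FF, 0xFFFFFF00, 0xFFFF00FF };

    IDirect3DTexture9* tex = NULL;
    if (FAILED(D3DXCreateTexture(dev, 256, 256, 5, 0, D3DFMT_A8R8G8B8,
                                 D3DPOOL_MANAGED, &tex)))
        return NULL;

    for (UINT level = 0; level < tex->GetLevelCount(); ++level)
    {
        D3DSURFACE_DESC desc;
        D3DLOCKED_RECT lr;
        tex->GetLevelDesc(level, &desc);
        if (FAILED(tex->LockRect(level, &lr, NULL, 0)))
            continue;
        for (UINT y = 0; y < desc.Height; ++y)
        {
            DWORD* row = (DWORD*)((BYTE*)lr.pBits + y * lr.Pitch);
            for (UINT x = 0; x < desc.Width; ++x)
                row[x] = kColors[level % 5];
        }
        tex->UnlockRect(level);
    }
    return tex;
}

// Ask for full trilinear; whether the driver honors this on every
// texture is exactly what the colored mips make visible.
void RequestTrilinear(IDirect3DDevice9* dev)
{
    dev->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
    dev->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    dev->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
}

Drawn on a textured floor, this makes it obvious whether the driver delivers the requested trilinear or quietly substitutes something cheaper.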

PreservedSwine
10-07-03, 07:38 PM
Were 3 pages (the TR section) really needed to say that Cg was needed for the graphics to look like ATI's? No performance increase, just "improved" graphics?

Then what is David Kirk talking about here? (http://www.nvnews.net/vbulletin/showthread.php?s=&threadid=18843)

Evan Lieb
10-07-03, 07:40 PM
Originally posted by reever2
1. You pick odd sections

Like?

2. You have tiny pictures

Yeah, that would have been nice. But if you need a full screen shot to actually find any differences, well, that tells you something right there.

3. There is no option for me to see the full size picture

Right, that would be the same as #2.

4. 90 percent of the pictures do not give the readers any idea of how the FX cards are filtering the ground, which has been the root of a lot of controversy

Which screen shots are you looking at?

5. You really should include full-size Flash comparisons if you want to do any real comparisons

OK, this seems to be the same complaint as #2 and #3, but again there's not that much of a difference, if any, even when you're able to maximize the screen shots (AM3 for example).

EDIT: Actually, Anand took out the AM3 full size screen shot option for some reason. But the same logic still applies.

Evan Lieb
10-07-03, 07:41 PM
Originally posted by CtB
Interesting stuff, it seems Nvidia managed to pull this rabbit out of the proverbial hat.

I have a question in case Evan or Anand reads this thread : Gabe Newell made mentions of drivers intercepting screen capture calls and improving IQ on the fly just for screenshots. Did you test this in your article ? Are your IQ comparisons done mostly by comparing screenshots, or do you also try to compare screenshots with in game visuals ?

If those improvements are all genuine (which I doubt considering the past months), then I'm very impressed.

I'm not sure if Anand used HyperSnap or not; I think he did.

Originally posted by PreservedSwine
Were 3 pages (the TR section) really needed to say that Cg was needed for the graphics to look like ATI's? No performance increase, just "improved" graphics?

Then what is David Kirk talking about here? (http://www.nvnews.net/vbulletin/showthread.php?s=&threadid=18843)

The first 1.5 pages gave background on compilers in general (with Intel and AMD as just one competitive comparison enthusiasts can relate to).

dexiter
10-07-03, 07:47 PM
Originally posted by Hellbinder
hahahahha... oh yeah, that's right, the card still lost every DX9 test and some other DX tests by 50%. hahahaha.. yeah, that's really funny.

The few wins it did pull were mostly in the single digits, or 49.5 vs. 49.2 type results. This again is without the XT getting to use Overdrive, which you will see tomorrow, and then you won't be laughing at the results.

As I already said, let's see what B3D, Tech Report and other sites have to say about the IQ. Also be advised that these are application-specific driver enhancements, not general driver enhancements. Honest runtime compiler enhancements do not double scores.

On the 9800XT, I doubt Overdrive will help much. Well, maybe you can get a really nice chip and get it OC'ed to 450 and beyond, but I honestly don't think that'll be the case for every 9800XT out there (I mean, if they could have raised the clock in such a way, they would have released it at 420-425 or higher). The 9600XT, on the other hand, may see more universal improvement with Overdrive, IMO.

Not really knowing the NV3x architecture or what's gone on with Det50 development, I think it's kind of baseless to say all the improvement from the Det50s is application-specific optimization/cheats. I thought one of the major reasons Anand expanded the benchmark suite and tested different games (some of which aren't the usual suspects in video card benchmarking) was to see if the driver enhancements are universally beneficial or application-specific. Judging from that review, I think it's fair to say that at least some of the improvement is due to general driver enhancement. Maybe Anand coordinated the selection of benchmark titles with NVIDIA (ah.. the never-ending conspiracy theories..), but still, methinks Anand's probably a tad less biased than you are... Besides, what's not to like? ATI still leads NVIDIA in most cases, but NVIDIA closed the performance gap without sacrificing IQ. ATI owners can continue to claim bragging rights and NVIDIA owners can be happy with the improvement.

TheTaz
10-07-03, 07:48 PM
Well.... after reading that whole thing... it seems fairly obvious to me that games that aren't "TWIMTBP" or don't have specific driver optimizations still run like dawg crap on an FX card.

Granted... *maybe* nVidia can squeeze the performance it needs in DX9 with compiler technology... but I still have my doubts.

I'd rather spend my money on a card that will run all games decently, out of the box, where I don't have to wait for a compiler fix, a driver optimization, or a game patch.

I mean... the only reason I never bought an ATi card before was cuz the drivers sucked (past tense). People had to constantly wait for driver fixes or game patches. Well... that has changed. The shoe is on the other foot now, and nVidia is in that boat. Sooooooo, it seems more logical to me to buy ATi, since I don't like spending money and then having to wait for the fricken product to work right.

Taz

digitalwanderer
10-07-03, 07:58 PM
Originally posted by Evan Lieb
Huh? What are you referring to?
The predator effect does not work on FX cards and does work on ATi cards... the invisible/camouflage mode.

FX doesn't render it properly, ATi does...go check it yourself. :cool:

ntxawg
10-07-03, 08:03 PM
Question: was Tomb Raider: Angel of Darkness benched with the 52 patch or with the 49? If it was run with the 52, could that be why the rendering in it was off? Have you checked it with the 49 patch to see if there are issues there?

gmontem
10-07-03, 08:09 PM
Originally posted by Evan Lieb
Yeah, that would have been nice. But if you need a full screen shot to actually find any differences, well, that tells you something right there.
It tells us you could be hiding something by not providing us the full image. :p

Socos
10-07-03, 08:17 PM
Originally posted by Evan Lieb
Like?



Yeah, that would have been nice. But if you need a full screen shot to actually find any differences, well, that tells you something right there.



EDIT: Actually, Anand took out the AM3 full size screen shot option for some reason. But the same logic still applies.

Ummm, yeah... okay, Kyle... whatever you say. I usually do play my games in something other than full-screen mode, so you must be right: if I need a full-size screenshot to see differences, I must be the one who is stupid. :rolleyes:

How much did [N] pay you for this? Don't bother replying... I know you'll double-talk the post all political-like. Glad I didn't give you any advertising revenue by reading this trash.... :mad:

reever2
10-07-03, 08:26 PM
Yeah, that would have been nice. But if you need a full screen shot to actually find any differences, well, that tells you something right there.

Hey where have I heard that line before? :rolleyes:

So is Anandtech strapped for bandwidth, or are you going to tell us what the IQ is like without providing a full-sized picture and expect everybody to believe what you are seeing when they can't see it for themselves?