
NVIDIA vs ATI Image Quality - Firingsquad



Ruined
10-27-03, 03:06 PM
Looks like the new drivers have created a draw between Nvidia and ATI in terms of IQ. Check out the FiringSquad image quality analysis:

http://www.firingsquad.com/hardware/imagequality2/default.asp

Firingsquad also repeated that the optimized UT2003 Nvidia filtering makes no difference in actual gameplay image quality (except for making the game run faster), but acknowledged those who object on principle to the fact that it still exists. They also stated that ATI was rendering smoke darker than the Nvidia cards, which sounds similar to the ATI AM3 (AquaMark3) optimizations that were being kicked around in another thread. Lastly, they commented that Nvidia's AA seems to be better for far-away objects while ATI's seems to be better for closer ones, and that ATI's rendering seems to be blurrier than Nvidia's in some areas. In the end, they concluded that they liked ATI's AA better overall but didn't like that ATI had blurrier rendering in some areas, and gave kudos to Nvidia for improving their IQ... No real winner, in other words.

I'm very satisfied with the FW52.16 drivers - they seem to have given a real shot in the arm to both IQ and performance. It will be interesting to see how Nvidia actually ends up performing and looking against ATI in Half-Life 2 and Doom 3 with the NV3x path once all is said and done with these new drivers... It would be ironic if Nvidia pulled off better performance with no noticeable IQ differences.

digitalwanderer
10-27-03, 03:17 PM
Originally posted by Ruined
Firingsquad also repeated that the optimized UT2003 Nvidia filtering makes no difference in actual gameplay image quality (except for making the game run faster), but acknowledged those who object on principle to the fact that it still exists.
GRRRRRRR!!!!

That drives me nuts, it strikes me as a BIG visual difference! :mad:

Ruined
10-27-03, 03:23 PM
Originally posted by digitalwanderer
GRRRRRRR!!!!

That drives me nuts, it strikes me as a BIG visual difference! :mad:

Nvidia stated they tested the optimization with a focus group of gamers, and that none could discern a difference between the two while playing... The only time I've seen it become apparent is when walk speed is set to 1 (200x slower than the default walk speed, making the game unplayable), and that honestly makes no difference in my book. It looks like a big visual difference with the mip maps colored, but by A/Bing screenshots and in actual gameplay (at actual gameplay speed) it apparently is not noticeable, at least to those who have tackled comparing the two on professional websites... Not to say I don't believe that you think you can see the difference, but I also know many who claimed they could hear a huge difference between high-end audio cables and regular ones, yet when confronted with blind A/B tests could not tell them apart.
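If you want to take eyeballing out of it entirely, the screenshot A/B could even be scored objectively. Here's a minimal sketch in Python of what I mean, assuming two same-sized captures of the identical frame (the filenames are made up for illustration):

```python
# Objective A/B screenshot comparison: mean absolute difference and PSNR.
# Requires Pillow and NumPy; both screenshots must be the same resolution.
import numpy as np
from PIL import Image

def compare(path_a, path_b):
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
    assert a.shape == b.shape, "screenshots must match in size"
    mad = np.abs(a - b).mean()              # average per-channel difference
    mse = ((a - b) ** 2).mean()
    psnr = float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
    return mad, psnr

# Hypothetical captures of the same frame with and without the optimization.
mad, psnr = compare("full_trilinear.png", "optimized.png")
print(f"mean abs diff: {mad:.2f}/255, PSNR: {psnr:.1f} dB")
```

A high PSNR (say, above ~40 dB) would back up the "no visible difference" claim; a low one would back up the people who say they can see it.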

And honestly, if it did make a big visual difference, there would be no reason not to remove it because the FPS gain from having it is minor.

The darker and/or blurrier rendering by the ATI cards in parts of some scenes, and the weaker AA exhibited by the Nvidia cards, shown in some of the FS screencaps, are far bigger issues to me than the filtering. I can actually see where the ATI cards are rendering darker or blurrier in the screenshots, and where the Nvidia AA isn't quite as good, but I can't detect the filtering differences at all.

Razor04
10-27-03, 03:52 PM
Oh come on...a focus group can tell you anything you want it to when you use some clever statistics and all. Besides...does anyone remember the vids that MikeC posted a while back where he didn't tell us which was which...and a lot of people could pick out the NV card easily.

Ruined
10-27-03, 03:56 PM
Originally posted by Razor04
Oh come on...a focus group can tell you anything you want it to when you use some clever statistics and all. Besides...does anyone remember the vids that MikeC posted a while back where he didn't tell us which was which...and a lot of people could pick out the NV card easily.

Right, but as I stated above, he recorded those at unplayable settings, with the movement speed set 200x slower than the default. Hence the videos really don't show anything except that you might see a difference if you set the game to move 200x slower than it's supposed to (which makes it 100% unplayable). If he had recorded videos at normal walking speed (i.e. actual gameplay) and people could pick out the differences, that would be a different story - but he did not. In other words, it was a nice test, but not a meaningful one.

creedamd
10-27-03, 04:08 PM
Some people will fall for anything. I don't even care if people waste their money anymore, as long as they help keep the prices on Ati hardware down.

ChrisW
10-27-03, 04:10 PM
It's weird that he does the image quality article using different settings from the ones he used for the same cards in the performance article. If he is going to change the settings used for each card, he could at least include some fps charts to show how each card performs at these settings. He could at least enable the on-screen fps counter or something.

Also, if he is going to do reviews, he could at least purchase a license for Hypersnap!

digitalwanderer
10-27-03, 04:15 PM
Originally posted by Razor04
Oh come on...a focus group can tell you anything you want it to when you use some clever statistics and all.
I agree, there just ain't no way the pseudo-trilinear wouldn't be noticeable!

dan2097
10-27-03, 05:56 PM
A review solely discussing image quality can be misleading, as the settings used may be unplayable, which IMO invalidates the review.



The thing about pseudo-trilinear is that maybe going from trilinear to pseudo-trilinear isn't a big difference, but then going from there all the way to bilinear may not seem like one either. If you keep degrading IQ in small steps, people may not notice unless you compare against an image at full quality.

The point with pseudo-trilinear is that if you want to semi-support Nvidia's use of it, you should at least reduce the quality of AF on ATI cards to try to make it more comparable - that is, if you can't notice the difference.

If you can notice it, you could bench with different settings and then comment on the IQ.

StealthHawk
10-27-03, 05:58 PM
I am going to go out on a limb here and say that the difference between trilinear (40 series) and "brilinear" (50 series) may or may not be noticeable to some people in some games.

We even had Brent admit on B3D that when he switched over to application AF he saw an improvement in UT2003 on an ATI card.

To say that the UT2003 texture stage optimization is unnoticeable is rather absurd. It can range from no difference at all to the difference between 2x AF and 8x AF, which is huge.
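For anyone unclear on what "brilinear" actually changes: full trilinear blends linearly between adjacent mip levels across the entire transition, while brilinear shrinks that blend to a narrow band and falls back to plain bilinear (nearest mip only) everywhere else. Here's a rough sketch of the idea in Python - the band width and placement are my own assumptions, not NVIDIA's actual (unpublished) numbers:

```python
# Blend weight between two adjacent mip levels, as a function of the
# fractional part of the computed LOD (0.0 = finer mip, 1.0 = coarser mip).

def trilinear_weight(frac):
    # Full trilinear: linear blend across the whole range between mips.
    return frac

def brilinear_weight(frac, band=0.25):
    # "Brilinear" sketch: plain bilinear from the nearest mip for most of
    # the range, with only a narrow blend band around the midpoint.
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if frac <= lo:
        return 0.0                       # finer mip only (pure bilinear)
    if frac >= hi:
        return 1.0                       # coarser mip only (pure bilinear)
    return (frac - lo) / (hi - lo)       # short blend across the band
```

The smaller the band, the fewer blended samples per pixel and the faster the filtering - and the more visible the mip transition lines can become.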

Clevor
10-27-03, 06:05 PM
I just bought a Chaintech 5900FX card (128 MB) and installed it last night. I also have 9800/9700 Pros and a Ti500. My complaint with the ATI cards is the washed-out colors and blurry aniso. I swear they are cheating with aniso somehow, and that's why they get such high frame rates.

Well, the 5900FX doesn't look as good as the ole Ti500. Even with DVC on a bit, it's still not as vibrant, nor is the aniso as sharp as the Ti500's. Unfortunately, the Ti500 is really slow for current games.

Is it possible Nvidia is doing the same tricks ATI is doing to boost speed, either in hardware or in the drivers for their latest video cards?

I will say I was impressed with the FSAA on the 5900. I like it better than ATI's. Must be the ex-3dfx engineers at work. Not impressed with the aniso compared to the Ti500, though.

Razor04
10-27-03, 07:26 PM
/me gives Clevor a voucher to use at the eye doctor

If you think that NV FSAA looks better, then I think you need a pair of glasses :)

Clevor
10-27-03, 07:40 PM
Maybe it's more like it wasn't as bad as I thought it would be. On the Ti4600 it wasn't very good.

Ruined
10-27-03, 08:09 PM
Originally posted by StealthHawk
To say that the UT2003 texture stage optimization is unnoticeable is rather absurd. It can range from no difference at all to the difference between 2x AF and 8x AF, which is huge.

Well, I guess this is why I think it's not absurd. Two unrelated major hardware review sites took on the task of doing an A/B comparison of UT2003 between Nvidia and ATI cards, with the specific intention of finding any differences that Nvidia's filtering optimization may cause in IQ during gameplay. Both sites reached the same conclusion: though the optimization is obvious using colored mip maps, during actual gameplay they could not detect the difference. These are individuals whose job it is to conduct these tests, and who have the spare equipment to do direct, controlled, accurate A/B comparisons. Yet both major sites could not detect any noticeable differences in the filtering during gameplay, even after studying the two side by side, taking screenshots, etc...

If two unrelated major hardware review sites conducted UT2003 IQ investigations specifically in an attempt to find differences between NV and ATI filtering during gameplay and *could not*, even after studying the two side by side, how exactly is a person actually playing the game (not studying it) going to detect the differences? That's a pretty difficult point to argue against.

digitalwanderer
10-27-03, 08:19 PM
Originally posted by Ruined
If two unrelated major hardware review sites conducted UT2003 IQ investigations specifically in an attempt to find differences between NV and ATI filtering during gameplay and *could not*, even after studying the two side by side, how exactly is a person actually playing the game (not studying it) going to detect the differences? That's a pretty difficult point to argue against.
Not when I know one of the sites in question was [H]ard's still-shameful nVidia infomercial. Is this one the other one?

You can take your major sites & shove 'em up your arse, for they are full of ****. :)

gokickrocks
10-27-03, 08:20 PM
Originally posted by Ruined
Well, I guess this is why I think it's not absurd. Two unrelated major hardware review sites took on the task of doing an A/B comparison of UT2003 between Nvidia and ATI cards, with the specific intention of finding any differences that Nvidia's filtering optimization may cause in IQ during gameplay. Both sites reached the same conclusion: though the optimization is obvious using colored mip maps, during actual gameplay they could not detect the difference. These are individuals whose job it is to conduct these tests, and who have the spare equipment to do direct, controlled, accurate A/B comparisons. Yet both major sites could not detect any noticeable differences in the filtering during gameplay, even after studying the two side by side, taking screenshots, etc...

If two unrelated major hardware review sites conducted UT2003 IQ investigations specifically in an attempt to find differences between NV and ATI filtering during gameplay and *could not*, even after studying the two side by side, how exactly is a person actually playing the game (not studying it) going to detect the differences? That's a pretty difficult point to argue against.

You assume that a hardware site review is done by thousands of people when in actuality it is most likely done by one, maybe two... so those two sites come down to around 2-4 people who did not notice it... don't know about you, but to me that's a small sample.

I would agree that some people may not notice it, and then there are those who will... just because a hardware review site that is "dedicated" to reviews can't see it does not mean its conclusion is the end-all for everyone.

As for how I can argue against it... very easily... I have my own eyes.

fivefeet8
10-27-03, 10:28 PM
I've actually done my own comparisons.

http://f1.pg.briefcase.yahoo.com/bc/fivefeet8@sbcglobal.net/lst?.dir=/My+Documents/PC&.view=l

Full trilinear vs. pseudo bi/tri isn't very noticeable at all, even in still shots.

particleman
10-27-03, 10:42 PM
Originally posted by ChrisW
It's weird that he does the image quality article using different settings from the ones he used for the same cards in the performance article. If he is going to change the settings used for each card, he could at least include some fps charts to show how each card performs at these settings. He could at least enable the on-screen fps counter or something.

Also, if he is going to do reviews, he could at least purchase a license for Hypersnap!

I agree. Why is he using 8X AA for the 5900 Ultra? He even admits at the end of the article that 8X AA is unplayable. What is the point of using a setting that no one uses, except maybe for some older games? I always play at 4X AA on my 5900 Ultra (and ATi's 4X AA does look A LOT better than nVidia's 4X AA), so using 8X AA seems pretty pointless.

StealthHawk
10-28-03, 03:04 AM
Originally posted by Ruined
Well, I guess this is why I think it's not absurd. Two unrelated major hardware review sites took on the task of doing an A/B comparison of UT2003 between Nvidia and ATI cards, with the specific intention of finding any differences that Nvidia's filtering optimization may cause in IQ during gameplay. Both sites reached the same conclusion: though the optimization is obvious using colored mip maps, during actual gameplay they could not detect the difference. These are individuals whose job it is to conduct these tests, and who have the spare equipment to do direct, controlled, accurate A/B comparisons. Yet both major sites could not detect any noticeable differences in the filtering during gameplay, even after studying the two side by side, taking screenshots, etc...

If two unrelated major hardware review sites conducted UT2003 IQ investigations specifically in an attempt to find differences between NV and ATI filtering during gameplay and *could not*, even after studying the two side by side, how exactly is a person actually playing the game (not studying it) going to detect the differences? That's a pretty difficult point to argue against.

You posted the thread about [H]'s UT2003 filtering article, and I debunked it back then.

I don't know how many times I have to say it before it sinks in. This is what [H] compared: ATI's AF with texture stage optimizations against NVIDIA's AF with texture stage optimizations.

[H] did not do any of the following: they did not look at ATI's application-selected filtering, where there are no texture stage optimizations.

They did not look at NVIDIA's full trilinear filtering with no texture stage optimizations (there are various ways to enable this).

They looked at two cases of less-than-best-case filtering, and they judged them to be relatively equal. Fair enough. I don't agree with their assessment (again, I point to the famed 3DCenter article), and I am absolutely positive that it would be utterly futile and erroneous to say that ATI's filtering in UT2003 is always going to be at the same level as NVIDIA's, if it ever is.


Think about it this way: if you are trying to prove an equation is true for all values of X, you care most about the values of X where the equation fails. You do not simply stop at one value where the equation holds true. It is always possible that your equation really is true for every X - or, in this case, that NVIDIA's filtering optimizations have absolutely no effect on IQ. Which is crazy, since that has been proven false. Anyway, stay tuned; this should all be put to rest soon.
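To make the analogy concrete, here's a toy Python sketch of that "search for where it fails" logic. The claim being tested is a deliberately false one (Euler's n^2 + n + 41 produces primes for a long run but fails at n = 40), just to show why sampling a few passing cases proves nothing:

```python
# To refute "claim(x) holds for all x", one failing case is enough;
# a handful of passing cases proves nothing by itself.
def find_counterexample(claim, candidates):
    for x in candidates:
        if not claim(x):
            return x      # first value where the claim fails
    return None           # none found among the candidates tried

def is_prime(n):
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

# Passes for n = 0..39, then fails at n = 40 (1681 = 41 * 41).
cx = find_counterexample(lambda n: is_prime(n * n + n + 41), range(100))
print("counterexample:", cx)  # -> counterexample: 40
```

The review sites checked one game at one setting and found no failure; that's a couple of passing values of X, not a proof.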


Again, I must reiterate that you will probably be hard-pressed to tell any difference between NVIDIA's filtering in the 52.xx series with no texture stage optimizations and full trilinear in UT2003 with AF on. Which is hardly the point. Not every game is as fast-paced as UT2003. And there may be those gifted (cursed?) among us who can actually tell the difference between trilinear and brilinear even in UT2003.

scott123
10-28-03, 05:33 AM
The problem with the Nascar 2003 test is that the auto-config in Nascar 2003 usually sets the texture size wrong for the DirectX renderer. You usually have to go in after the auto-config runs and edit it manually to ensure it's set high enough.

If it's not set high enough, you guessed it, the textures look blurry, just as the review describes.

Doesn't look blurry to me (ATI)

EDIT: I can bring in the exact pic on my 5900 Ultra if you like.

Scott
http://members.cox.net/lena372/post.jpg

jbirney
10-28-03, 08:31 AM
They updated their article and removed some of the blurry comments :)

digitalwanderer
10-28-03, 08:46 AM
Originally posted by jbirney
They updated their article and removed some of the blurry comments :)
I wonder if anyone from ATi pointed out their errors to them... :angel2:

StealthHawk
10-28-03, 11:42 AM
Originally posted by scott123
The problem with the Nascar 2003 test is that the auto-config in Nascar 2003 usually sets the texture size wrong for the DirectX renderer. You usually have to go in after the auto-config runs and edit it manually to ensure it's set high enough.

If it's not set high enough, you guessed it, the textures look blurry, just as the review describes.

Doesn't look blurry to me (ATI)

EDIT: I can bring in the exact pic on my 5900 Ultra if you like.

Scott
http://members.cox.net/lena372/post.jpg

Is it just me, or didn't they (FS) do the same thing with the same game in an earlier IQ comparison? Oops.

Hanners
10-28-03, 12:17 PM
Originally posted by StealthHawk
We even had Brent admit on B3D that when he switched over to application AF he saw an improvement in UT2003 on an ATI card.

This is pretty much my take on it. As a standalone issue, it isn't all that noticeable, but if you are used to playing the game with application AF on an ATi card, it sticks out like a sore thumb when you switch to the nVidia card.

Having said that though, I personally found the poor AA on my 5900 far more distracting than the AF issue.

Gouhan
10-28-03, 03:00 PM
OMG, I can't believe some people. There are those who never actually play games but rather study them, rub their chests, and grin at how great the IQ is on ATI hardware (which it is).

This is pretty simple, actually: whatever the 52.16 drivers do, they provide a better image and better speed than the older drivers. Whether they're on equal footing with ATI's drivers or not is another issue, and an irrelevant one if you're using an FX card, as you can't use ATI drivers anyway.

Say I bought a 5900 Ultra and have been using the 4x.xx drivers, happily playing my games at nice resolutions, etc. I hear that the 52.16 drivers are available for download, so I d/l them and install. After a reboot I load up my fav game and whoo, it looks even better and plays even faster. HOW THE F@#K IS THAT A BAD THING? I don't care how they did it; the point is it works and it's improving my game experience, which is what I bought the GFX card for, isn't it? And more importantly, it's what drivers are supposed to do: improve the user's experience.

Now there seem to be these few individuals who keep telling me that I'm getting cheated even though they don't own the same GFX card I own. They keep forcing their opinions and their concern for my apparently 'inferior IQ' down my throat at every chance they get.

This is how I see it.
In street drag racing, or in any other motorsport, you do what you must to get the quickest car. So if Mazda uses two 654cc rotors in their RX-8 and does the 0-62mph dash in 6 seconds or so, and Honda uses one 1997cc engine in their S2000 to get 7 seconds flat, Honda can't cry foul that Mazda is cheating or that they're at a disadvantage because they have one engine, etc... The point is, no one is stopping you from using any design or config you want; just get the job done and quit bitching. Don't try to change the competition if you feel you're at a disadvantage - you change instead!

This is how I feel about this whole IQ situation. There it is: nVidia does more with less... and so what does that have to do with you, who uses an ATI GFX card? NOTHING! ABSOLUTELY NOTHING! I'm sure you like your purchase and feel like Einstein for picking the greatest GFX card. Yes, you can stop holding your @#ick and put that Hercules manual down; that's enough.

Kudos to nVidia for improving performance and IQ. I look forward to the future with some renewed interest. And from Mike's review, things are looking a little bit better for nVidia. :)