
View Full Version : pics of the 5900 beating a 9800 in DX8 IQ



serAph
10-14-03, 02:27 PM
They're 3MB BMPs - if anyone can host them, I'll gladly upload them (T1 here @ work)

~ Rory K

Hanners
10-14-03, 02:45 PM
If you can e-mail them, feel free to send them my way. I'll PM you my e-mail address now.

Kamel
10-15-03, 12:45 AM
why not just compress them into jpg? -- if quality loss is a problem, at least you can use .png format.

just suggesting, i could host some for you but i've only got 5MB of webspace, so i'd only be able to host one of them, lol

particleman
10-15-03, 01:33 AM
What game is it? If it is UT2003, ATi has a problem with the default detail texture setting and you need to edit the ini file (which is what all review sites do when they benchmark and compare IQ anyway). Depending on the game, nVidia may look better in DX8, but more often than not ATi's AA method seems to make the difference.

Hellbinder
10-15-03, 01:34 AM
As long as the settings and images are reproducible on someone else's system.

betterdan
10-15-03, 02:19 AM
What do you have to change in the UT 2003 ini for ATI cards to look better? I didn't know about this.

betterdan
10-15-03, 02:24 AM
I saw this on the AnandTech website:

As you can expect, we didn't have any problems with NVIDIA's latest Detonator drivers (29.42) and the latest build of UT2003.

Surprisingly enough, Matrox's latest Parhelia drivers actually worked better than ATI's CATALYST drivers under UT2003. The problem with ATI's publicly available CATALYST drivers is that Detailed Textures aren't properly supported, meaning they won't be rendered, and anywhere they are used you'll run into annoying flashing textures. ATI has fixed the issue internally, and it looks like the latest 7.73 drivers that have been leaked contain the fix as well. We're hoping that they'll make this fix into an official CATALYST release before the official release of the UT2003 demo.


This is pretty old, so I guess ATI did fix this already. It seems to look fine to me, but I didn't thoroughly test UT2003.

Gaal Dornik
10-15-03, 06:58 AM
As you can expect, we didn't have any problems with NVIDIA's latest Detonator drivers (29.42)

We're hoping that they'll make this fix into an official CATALYST release before the official release of the UT2003 demo.

:lol2: :lol2: :lol2:

@betterdan
Dude, next time post some newer news, OK? :)

@particleman
Where did you get that rumor from? Or is it just your own theory?

jimbob0i0
10-15-03, 07:20 AM
Originally posted by Gaal Dornik
As you can expect, we didn't have any problems with NVIDIA's latest Detonator drivers (29.42)

We're hoping that they'll make this fix into an official CATALYST release before the official release of the UT2003 demo.

:lol2: :lol2: :lol2:

@betterdan
Dude, next time post some newer news, OK? :)

@particleman
Where did you get that rumor from? Or is it just your own theory?

He is, I'd assume, referring to the trilinear filtering 'issue' in UT2003...

ATI does trilinear filtering on texture stage 0 and bilinear filtering on the rest if you force AF in the control panel for all games.... Normally this would probably not be an issue (since the other stages are used for lightmaps etc., where the level of filtering doesn't affect IQ so much)... however, the 'base' texture in UT2K3 is texture stage 1 (I think... loads of detail at B3D if you search), which results in incorrect levels of AF and noticeable mip-map transitions... however, if you set the ATI control panel to 'application preference' and change the UT2K3 ini to set the AF manually there, trilinear filtering is applied to all stages correctly....

NV does not do trilinear filtering in UT2K3 at all (and, allegedly, in any D3D app from 52.12 onwards), no matter what you set in the game or control panel; instead, it appears to apply a bi/tri pseudo-filter to all levels.
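
For reference, the ini change being described would look something like this. The section is the same one the mip bias settings live in; UseTrilinear and LevelOfAnisotropy are from memory, so treat the exact names as a best guess and check your own UT2003.ini:

[D3DDrv.D3DRenderDevice]
UseTrilinear=True
LevelOfAnisotropy=8

With the control panel on 'application preference', the game requests the AF level itself and trilinear gets applied across all texture stages.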

particleman
10-15-03, 11:41 AM
Originally posted by betterdan
What do you have to change in the UT 2003 ini for ATI cards to look better? I didn't know about this.

For some reason, at the default settings, certain textures in UT2003 will look very blurry on ATi cards (the extra detail textures aren't used). It wasn't always like this; somewhere along the way, I think around Cat 3.2, something happened, a bug or something (I can't see them doing it for better performance numbers, as review sites use the benchmark program/script anyway, which changes the ini away from the defaults). Anyway, it comes down to the default values for:

[D3DDrv.D3DRenderDevice]
DetailTexMipBias= X.XXX
DefaultTexMipBias= X.XXX

ATi cards produce a very blurry image on certain textures (particularly visible on the Domination SunTemple map). If you set both of these to 0.0, the R3XX and the NV3X will look the same. But if you leave them at their default settings, the NV3X will look much better. Seeing as sites set both of the above to -0.8 (the max setting) when they benchmark, it's a moot point anyway. Like I said, for some reason it only seems to affect the default settings (which is why IQ comparisons on review sites don't show this; they don't use the default ini settings).
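
To make it concrete, with -0.8 (the value the benchmark scripts set) the edited lines would look like this; the exact decimals here are just an example:

[D3DDrv.D3DRenderDevice]
DetailTexMipBias= -0.800000
DefaultTexMipBias= -0.800000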

As an example here's a link from HardOCP outlining their ini settings from the benchmark program.
http://www.hardocp.com/article.html?art=NDQ0LDY=

serAph
10-15-03, 02:03 PM
well, he ran them @ 4x/8x and 4x/16x, and nothing like Digital Vibrance or TruForm or anything... He may have altered the ini file settings, but I doubt it. Anyway, as soon as I can get a hold of Hanners on MSN Messenger, he'll post 'em.

-=DVS=-
10-15-03, 02:28 PM
why bother, it's not gonna turn the world around :lol:

The GFFX looks better in some games, it's nothing new :rolleyes:
the Radeon in others.........and so on..

Better show us the GFFX beating the crap out of the Radeon in DX9 games :D

particleman
10-15-03, 03:01 PM
I knew it was UT2003; I've known about the UT2003.ini file thing for a while. Anyway, I was thinking about it, and I think I know the exact reason why nVidia cards look better at the default ini settings in UT2003, yet look the same as ATi cards when DetailTexMipBias and DefaultTexMipBias are set to the same value (e.g. both 0.0, or both -0.8). The reason is that ATi's drivers ignore the DefaultTexMipBias setting and only seem to respond to the DetailTexMipBias setting (or the other way around; I can't remember which one is lower at default). This is why it looks better on nVidia cards when you don't edit the ini file: the default values for DefaultTexMipBias and DetailTexMipBias are different, with DefaultTexMipBias higher than DetailTexMipBias in the stock ini. But of course this doesn't change any of the benchmarks out there, because the benchmark programs set them to the same value when benching.

Anyway, unless the person who took the shots actually set DefaultTexMipBias and DetailTexMipBias to the same value, the ATi card will not look as good. It's definitely a settings thing, and I definitely would not draw conclusions about nVidia cards producing better DX8 quality because of it. If you do what I said and set DefaultTexMipBias and DetailTexMipBias to the same value (e.g. both 0.0 or both -0.8), the textures will look the same. Like I said, this is what the UT2003 benchmark programs and review sites do, so I doubt your shots will change anything, other than verifying that ATi cards ignore the DefaultTexMipBias setting in UT2003. Since review sites use the same value for DefaultTexMipBias and DetailTexMipBias, that is probably why we have never seen it brought up by any site out there.

The reason the benchmark programs and review sites set DefaultTexMipBias and DetailTexMipBias to the same value is that otherwise UT2003 will dynamically adjust image quality to achieve the minimum desired framerate (another setting in the UT2003 ini file).
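
If memory serves, the framerate setting in question is MinDesiredFrameRate, and it lives in the [WinDrv.WindowsClient] section rather than the D3D one (section and key name from memory, so double-check); the benchmark scripts zero it out so the engine never drops detail mid-run:

[WinDrv.WindowsClient]
MinDesiredFrameRate=0.000000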

Skinner
10-16-03, 07:46 PM
Originally posted by particleman
I knew it was UT2003; I've known about the UT2003.ini file thing for a while. Anyway, I was thinking about it, and I think I know the exact reason why nVidia cards look better at the default ini settings in UT2003, yet look the same as ATi cards when DetailTexMipBias and DefaultTexMipBias are set to the same value (e.g. both 0.0, or both -0.8). The reason is that ATi's drivers ignore the DefaultTexMipBias setting and only seem to respond to the DetailTexMipBias setting (or the other way around; I can't remember which one is lower at default). This is why it looks better on nVidia cards when you don't edit the ini file: the default values for DefaultTexMipBias and DetailTexMipBias are different, with DefaultTexMipBias higher than DetailTexMipBias in the stock ini. But of course this doesn't change any of the benchmarks out there, because the benchmark programs set them to the same value when benching.

Anyway, unless the person who took the shots actually set DefaultTexMipBias and DetailTexMipBias to the same value, the ATi card will not look as good. It's definitely a settings thing, and I definitely would not draw conclusions about nVidia cards producing better DX8 quality because of it. If you do what I said and set DefaultTexMipBias and DetailTexMipBias to the same value (e.g. both 0.0 or both -0.8), the textures will look the same. Like I said, this is what the UT2003 benchmark programs and review sites do, so I doubt your shots will change anything, other than verifying that ATi cards ignore the DefaultTexMipBias setting in UT2003. Since review sites use the same value for DefaultTexMipBias and DetailTexMipBias, that is probably why we have never seen it brought up by any site out there.

The reason the benchmark programs and review sites set DefaultTexMipBias and DetailTexMipBias to the same value is that otherwise UT2003 will dynamically adjust image quality to achieve the minimum desired framerate (another setting in the UT2003 ini file).

This is only the case with AF on ATI cards. I don't know if it's solved yet, as I always play with -0.8 on both settings :)

serAph
10-21-03, 12:39 PM
The 5900 @ 4x/8x quality
http://www.reflectonreality.com/images/ut2003nv.png



The 9800 @ 4x/16x quality
http://www.reflectonreality.com/images/ut2003ati.png

LiquidX
10-21-03, 01:00 PM
Heh, for the first time I can actually see a difference in a screenshot of an NV 5900 vs an ATI 9800. But also, if you look straight ahead, the two little blue strobing lights in the wall are more apparent on the ATI. ;)

Hellbinder
10-21-03, 01:16 PM
These are, IMO, invalid IQ comparisons. Why? Because they are taken from different viewpoints. There are clear differences in each frame that are obviously being caused by the different angle of view in each. The differences affect both screens in positive and negative ways: the flames look better in one, the strobe lights look better in the other, and some textures are clearer and more visible in one than the other (in both directions).

I am also finding myself wondering if this is indeed 4x FSAA on the 5900, or 4XS, or one of the other xS modes. There seems to be SSAA going on in the nVidia screens. What was the FPS at the time each frame was taken?

Just my observations...
http://instagiber.net/smiliesdotcom/otn/happy/ylsmoke.gif

pars_andy
10-21-03, 01:16 PM
And the orange flames are more apparent on the nVidia card... maybe they're variable lights and the screenshots differed by a few ms?

pars_andy
10-21-03, 01:17 PM
Originally posted by Hellbinder
These are, IMO, invalid IQ comparisons. Why? Because they are taken from different viewpoints. There are clear differences in each frame that are obviously being caused by the different angle of view in each. The differences affect both screens in positive and negative ways: the flames look better in one, the strobe lights look better in the other, and some textures are clearer and more visible in one than the other (in both directions).

I am also finding myself wondering if this is indeed 4x FSAA on the 5900, or 4XS, or one of the other xS modes. There seems to be SSAA going on in the nVidia screens. What was the FPS at the time each frame was taken?

Just my observations...
http://instagiber.net/smiliesdotcom/otn/happy/ylsmoke.gif

Just wondering what your observations would have been had the images shown the Radeon in front :rolleyes:

Zod
10-21-03, 01:31 PM
The most significant difference appears to be that the nV card is showing bump-mapping on the statue's arm, but the ATi card is not.

Kamel
10-21-03, 01:32 PM
it amazes me how many people don't know about the stat fps command in UT2003.

what you should have done is make a server, then have one of you spectate through the other. this would ensure that the screenshots were exactly the same, as opposed to "nearly the same" as seen here.

just how i feel about it.
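
for reference, the console commands I mean (names from memory, so double-check):

stat fps    (toggles the framerate counter)
shot        (saves a screenshot to the ScreenShots folder)

if one of you spectates through the other's view, both cards render the exact same frame, and the shots line up.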

euan
10-21-03, 01:55 PM
You guys are so harsh. It must have taken this guy months to find such an example. Don't hurt his intarweb feelings. :(

If you really want to show the Radeon cards in a bad light, just take any screenshot from a starting position on any map, but lean to one side (same as rolling in an aeroplane), then take a pic. Everyone knows that ATi's AF is dependent on the angle of the surface. :rolleyes:

Hellbinder
10-21-03, 03:02 PM
Originally posted by pars_andy
Just wondering what your observations would have been had the images shown the radeon in front :rolleyes:
There are clear differences in both. I hardly think this is even an example of one looking better than the other. As I pointed out, there are differences in both, where one looks better than the other depending on what you are looking at in the frame.

But judge me however you like... it makes no difference. The evidence is what it is. You are free to interpret it however you like.

http://instagiber.net/smiliesdotcom/otn/happy/ylsmoke.gif

S.I.N
10-21-03, 03:12 PM
OK, before this guy loses it and this topic goes to waste, I will say: Hellbinder is right on every count. It doesn't matter; all that matters is the big feet in the middle, which clearly look better on the 5900 (if that's what it is). If it was off by less than a hundredth of an inch, both shots would look just alike, and in that case the ATI would be the clear winner. He is right and you're wrong for taking screenshots and making us think otherwise. How dare you, and I bet nVidia paid you. :mad:

particleman
10-21-03, 03:48 PM
Is no one listening? This is not an AF thing or anything like that. I have known about this problem for ages. It's a problem with ATi cards and the default ini settings in UT2003!!! (How do you think I knew the screenshots were from UT2003 even before he said so?) You will not see this in any other game, or even in UT2003 if you modify the ini file. Like I said, if you do what the review sites do and set DefaultTexMipBias and DetailTexMipBias to the same value, the problem will disappear. Why do you think this is not visible in the IQ comparisons from reviews? Because the benchmark program automatically sets them both to -0.8!!!