
View Full Version : nv40 as fast or faster than r420 in aa & shaders?



CaiNaM
05-16-04, 08:09 PM
curious.. when looking at all the benchmarks comparing ati vs nvidia, nv40 is as fast or faster in shader tests and on par with r420 when comparing aa performance.

in synthetics nv40 appears faster in pixel shaders, while r420 seems to process vertex shaders more efficiently; this evens out most benchmarks, where both cards are for the most part pretty even at all resolutions when using comparable aa.

the real issue seems to be AF - ati takes much less of a hit when 8x aniso is applied, while in many cases on nv40 the impact is much more significant.. nv performance definitely takes a dive when high af is applied. strange considering the filtering has been getting worse with each gen from gf4 to fx and now to the 6 series.. nvidia's AF is now almost as bad as ati's, and it seems ati STILL offers much better AF performance.

which leads me to ask.. has anyone else noticed this? would this still be the case if both cards applied AF similarly (where ati only applies it to the first texture)? is this a driver or design issue?

is the way ati applies AF a "cheat", or just good optimisation? can nv optimise in the same way, or would you even want them to?
what are your thoughts?

Ruined
05-16-04, 08:12 PM
Cainam you must have missed the news. ATi is cheating with AF now, using some sort of bilinear/trilinear hybrid but disabling it when colored mipmaps are turned on to make it appear that they are doing trilinear (!). The brilinear thing isn't disturbing to me, but the fact that they disable it during mipmap tests in an attempt to deceive the user into thinking they are getting trilinear when they aren't is a bit shady. check b3d/r3d.

CaiNaM
05-16-04, 08:17 PM
Cainam you must have missed the news. ATi is cheating with AF now, using some sort of bilinear/trilinear hybrid but disabling it when colored mipmaps are turned on to make it appear that they are doing trilinear (!). The brilinear thing isn't disturbing to me, but the fact that they disable it during mipmap tests in an attempt to deceive the user into thinking they are getting trilinear when they aren't is a bit shady. check b3d/r3d.

well, i was aware of that.. ati is only applying trilinear to the first texture... but does that explain the performance difference between ati and nv with af applied?

i was pretty disappointed that nv lowered af quality yet again; it's on par with the ati now.. but given that, it's even more disappointing it doesn't appear to be as efficient as ati's method...

Ruined
05-16-04, 08:23 PM
well, i was aware of that.. ati is only applying trilinear to the first texture... but does that explain the performance difference between ati and nv with af applied?

i was pretty disappointed that nv lowered af quality yet again; it's on par with the ati now.. but given that, it's even more disappointing it doesn't appear to be as efficient as ati's method...

CaiNaM, no, it's not the same old ati thing. ATI's AF is like the FX series now, no full trilinear at all (except worse cuz it's also angle dependent). It's full brilinear that can't be shut off, not the old texture stage optimization. The ATI drivers detect mipmap color testers and disable the filter when you use them so that you can't see this, in essence deceiving the user. Was just discovered.

http://www.computerbase.de/artikel/hardware/grafikkarten/versteckspiele_atis_texturfilter/

http://www.rage3d.com/board/showthread.php?s=&threadid=33759063

http://www.beyond3d.com/forum/viewtopic.php?t=12486
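For reference, "brilinear" is just a blend between bilinear and trilinear: full trilinear weights the two nearest mip levels by the fractional LOD everywhere, while brilinear snaps to a single mip level except in a narrow band around the mip transition. A minimal sketch of the idea (the band width is a hypothetical tuning knob, not ATI's actual driver value):

```python
def trilinear_weight(lod):
    """Full trilinear: the blend weight between the two nearest mip
    levels is simply the fractional part of the LOD."""
    return lod - int(lod)

def brilinear_weight(lod, band=0.3):
    """'Brilinear': blend only inside a narrow band around the mip
    transition; elsewhere snap to pure bilinear on one mip level.
    `band` is an illustrative parameter, not a known driver value."""
    frac = lod - int(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if frac < lo:
        return 0.0  # pure bilinear on the nearer mip level
    if frac > hi:
        return 1.0  # pure bilinear on the farther mip level
    return (frac - lo) / (hi - lo)  # blend only inside the band
```

The narrower the band, the fewer texels need samples from both mip levels, which is where the fillrate savings come from - and why colored-mipmap testers (which make the transition band visible) would expose it.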

ChrisRay
05-16-04, 08:31 PM
curious.. when looking at all the benchmarks comparing ati vs nvidia, nv40 is as fast or faster in shader tests and on par with r420 when comparing aa performance.

in synthetics nv40 appears faster in pixel shaders, while r420 seems to process vertex shaders more efficiently; this evens out most benchmarks, where both cards are for the most part pretty even at all resolutions when using comparable aa.

the real issue seems to be AF - ati takes much less of a hit when 8x aniso is applied, while in many cases on nv40 the impact is much more significant.. nv performance definitely takes a dive when high af is applied. strange considering the filtering has been getting worse with each gen from gf4 to fx and now to the 6 series.. nvidia's AF is now almost as bad as ati's, and it seems ati STILL offers much better AF performance.

which leads me to ask.. has anyone else noticed this? would this still be the case if both cards applied AF similarly (where ati only applies it to the first texture)? is this a driver or design issue?

is the way ati applies AF a "cheat", or just good optimisation? can nv optimise in the same way, or would you even want them to?
what are your thoughts?


Well, it should be noted that the NV40 architecture shares one of its shader units with texture sampling when AF is applied, while the r420/r300 series keeps them independent.

CaiNaM
05-16-04, 09:34 PM
Well, it should be noted that the NV40 architecture shares one of its shader units with texture sampling when AF is applied, while the r420/r300 series keeps them independent.

ahh.. so it's hardware design? shader processing slows down because when af is applied it uses one of the shader units? somehow in all the tech journals i read i missed that.. makes sense tho why you'd see such a perf dive when af is applied... hmm. sounds like a rather flawed design - robbing peter to pay paul, so to speak...

ChrisRay
05-16-04, 09:36 PM
ahh.. so it's hardware design? shader processing slows down because when af is applied it uses one of the shader units? somehow in all the tech journals i read i missed that.. makes sense tho why you'd see such a perf dive when af is applied... hmm. sounds like a rather flawed design - robbing peter to pay paul, so to speak...


I should be more specific. It's one of the ALUs. But yes, that does seem to be the case currently. Don't know if this is a hardware "flaw" however.
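To picture why sharing an ALU with the texture unit costs shader throughput, here is a toy cycle-count model (purely illustrative numbers, not NV40's real scheduling):

```python
def pixel_cost(shader_ops, filter_ops, shared=True):
    """Toy per-pixel cycle estimate.

    shader_ops : arithmetic instructions in the pixel shader
    filter_ops : extra cycles of texture filtering work (AF adds these)
    shared     : True  -> filtering borrows one of the two shader ALUs
                          (the NV40-style trade-off described above)
                 False -> filtering runs on dedicated hardware
                          (R420-style independent units)
    All numbers are illustrative, not real hardware behavior.
    """
    if not shared or filter_ops == 0:
        # both ALUs chew through the shader; filtering overlaps in parallel
        return max((shader_ops + 1) // 2, filter_ops)
    # one ALU is busy filtering, so shader math runs on a single ALU
    return max(shader_ops, filter_ops)

# With no AF both designs cost the same; with heavy AF the shared
# design roughly halves shader throughput:
no_af_shared = pixel_cost(shader_ops=8, filter_ops=0, shared=True)   # 4
af_shared    = pixel_cost(shader_ops=8, filter_ops=4, shared=True)   # 8
af_dedicated = pixel_cost(shader_ops=8, filter_ops=4, shared=False)  # 4
```

In this sketch the shared design only pays the price when filtering work is heavy, which matches the pattern in the thread: both cards close without AF, nv40 dropping once high AF kicks in.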

CaiNaM
05-16-04, 09:41 PM
CaiNaM, no, it's not the same old ati thing. ATI's AF is like the FX series now, no full trilinear at all (except worse cuz it's also angle dependent). It's full brilinear that can't be shut off, not the old texture stage optimization. The ATI drivers detect mipmap color testers and disable the filter when you use them so that you can't see this, in essence deceiving the user. Was just discovered.

http://www.computerbase.de/artikel/hardware/grafikkarten/versteckspiele_atis_texturfilter/

http://www.rage3d.com/board/showthread.php?s=&threadid=33759063

http://www.beyond3d.com/forum/viewtopic.php?t=12486

hmmm.. i see.. so basically ati parts run "brilinear" unless it detects a mipmap tester, in which case it applies trilinear (thus hiding from the user the fact that it runs "brilinear" af optimizations to inflate benchmark scores)?

Clay
05-16-04, 09:57 PM
hmmm.. i see.. so basically ati parts run "brilinear" unless it detects a mipmap tester, in which case it applies trilinear (thus hiding from the user the fact that it runs "brilinear" af optimizations to inflate benchmark scores)?

Apparently, but I'm still reading up on it as well. It'll take a few more days for this to all shake out, I'm sure.

SuLinUX
05-16-04, 10:12 PM
AF performance was dramatically improved in the FX; I remember AF badly hammering my GF4 Ti 4400. The FX5900U performed just as well as my previous Radeon 9700Pro with AF.

Ruined
05-16-04, 10:24 PM
hmmm.. i see.. so basically ati parts run "brilinear" unless it detects a mipmap tester, in which case it applies trilinear (thus hiding from the user the fact that it runs "brilinear" af optimizations to inflate benchmark scores)?

that's what it looks like at this point.

hovz
05-16-04, 11:06 PM
yes, ati is cheating along with nvidia now, i wonder how long this has been going on tho. to answer the question, i think ati's drivers are seriously not optimized to use the new memory controller of the r420. its overall efficiency is like under 70%.

Blacklash
05-16-04, 11:07 PM
In light of recent events:

Have a look at these 1600x1200 benches, paying attention to the blue bars, or just 4xAA. In the first chart, the AoD one, look at the purple bar; it is PS2.0 and 4xAA. The blue is pure PS2.0 in that chart (AoD). The so-called extreme is just a 6800u at 450mhz. Keep in mind even the best 6800 is running a full 75mhz behind the X800XT and the first release, aka 400mhz, a full 125>


http://members.cox.net/malficar4/tr1600.png

http://members.cox.net/malficar4/ut1600.png

http://members.cox.net/malficar4/aq31600.png

I am eager to get my 6800u and kick her up to 450>>>

hovz
05-16-04, 11:08 PM
that's why i think ati's aa performance is so much slower than nvidia's. also explains why the fill rate on both the pro and xt is so low.

Clay
05-16-04, 11:10 PM
yes, ati is cheating along with nvidia now, i wonder how long this has been going on tho. to answer the question, i think ati's drivers are seriously not optimized to use the new memory controller of the r420. its overall efficiency is like under 70%.

Please guys, cut out this knee-jerk talk. Give this a few days before you start whipping out the pitchforks and torches. I've always maintained that no company is without its spots, so *shrug*, as indifferent as I was about NVIDIA's spots in the past, so am I about any new potential ATI spots.

hovz
05-16-04, 11:12 PM
knee-jerk talk? wtf are u talking about? they are C H E A T I N G. plain and simple.

Clay
05-16-04, 11:17 PM
knee-jerk talk? wtf are u talking about? they are C H E A T I N G. plain and simple.

Case in point. :) I agree that it does appear to be the case. My point is that we've all seen supposed cheating in the past (as recent as FartCry) that proved not to be the case. Nothing personal; it's just that the rash of people crying foul so quickly is just noise at first.

hovz
05-16-04, 11:21 PM
plz give me even a slightly logical explanation for this that would excuse it from being cheating. i can't even think of one in the realm of possibility

Clay
05-16-04, 11:25 PM
Oh good grief. I just said that "I agree that it does appear to be the case" (that they are cheating) ugh

I'm not trying to excuse it from being cheating. I'm just saying that everybody and their brother echoing each other about this is kind of silly (not to mention premature).

hovz
05-16-04, 11:44 PM
but if u urself agree that there's no explanation with even a shred of logic that could excuse this from being cheating, then how are we being premature

CaiNaM
05-17-04, 12:23 AM
cause it's like all the DH crap and the fartcry crap where every ati fanboy was jumping on the bandwagon screaming "cheat", when after some further discussion it turned out not to be the case.

i think max' point is that some of us weren't ready to roast nvidia over that then, and we aren't gonna make a big deal about ati just yet either (heh.. feel free to correct me if i'm wrong max).

gordon151
05-17-04, 12:49 AM
The brilinear filtering is not cheating since it's similar to what nVidia uses and it's applied in all situations. The only unscrupulous act here is the detection of colored mipmaps and disabling of the trilinear/bilinear filtering optimizations.

I know many are eager to find something to nail ATI on after feeling down about the continuous spate of unscrupulous acts being discovered done by nVidia, but I'd rather wait for an explanation from ATI. They are usually more forthcoming than nVidia whenever people have an issue with something they do.

Nv40
05-17-04, 06:19 AM
first, before people here begin with their conspiracy theories, let me clarify a few things.. a cheat is when someone does something in secret, hiding it from others, intentionally so as not to show something they would not like others to see.

second, AA is not cheatable.. at least not on NVidia cards; their AA is fixed. the lower quality AA of the GeforceFX still exists today, and it's the same since day 1. it's not a cheat, because it's openly visible, not hidden; nor is it an optimization, it's simply what they have to offer. their lower quality AA modes have been known since day 1, so no one was cheated. every gamer not satisfied with the IQ of AA on NV3x cards bought an ATI card. everyone knew all along what the real IQ was.

the same is true of the NV40's new RG modes. they are not programmable, so the pure 2x/4x AA in the NV40's main modes will be that way.. forever. they might have room for tweaks in their mixed modes, but the pure AA modes will stay the same.. nothing is hidden there. and here at NVnews there are 2 different threads where, when asked to pick the best quality AA, the majority picked the NV40 screenshots (thinking it was the Radeon9800XT). just figure that.
the website comparisons using microscope/photoshop with 900% zoom are pointless because that's not the way people play. normal screenshots are the way people play.

and third, the NV40 is overall faster than the R420 in AA; this is a fact, and all reviews clearly show this. this is not a cheat. if NVidia wins at something, they are cheating? there is -consistency- in every review here. when it comes to AA, NVidia is faster most of the time. just remember that the NV40 is completely new hardware, from the ground up, a new architecture, so they have been working on it since the NV30 release, and implemented a new AA mode to compete with ATI R300 IQ. and they have done an excellent job there. ATI on the other hand still uses their good but past generation AA and AF modes, with the exception of temporal AA. so it's natural that NVidia came up with AA/AF quality comparable to, and at times even slightly better than, ATI's, since they were targeting the R300 IQ.

and finally, in pure performance in games NVidia is very close and many times ahead of the R420xt. it's mostly when AF is used in games that NVidia loses (sometimes while leading in AA) by a good margin; they end up second even to the Pro at times when AF is used, and we already know of some interesting discussions about why ATI has such stellar performance with AF.

another thing that people are forgetting is that NVidia launched first.
so if NVidia was planning to play unfair here or cut corners (do less than what ATI was doing), i doubt they would have known where to do it and by how much, since they never knew until weeks later what the R42x performance or IQ was.

So no conspiracy theory here.. AA in the NV4x is fixed (not programmable) and will have the same quality and same performance until the end of the life of the product.

Clay
05-17-04, 08:15 AM
cause it's like all the DH crap and the fartcry crap where every ati fanboy was jumping on the bandwagon screaming "cheat", when after some further discussion it turned out not to be the case.

i think max' point is that some of us weren't ready to roast nvidia over that then, and we aren't gonna make a big deal about ati just yet either (heh.. feel free to correct me if i'm wrong max).

Yep, you pretty much nailed it on the head. :)

vitocorleone
05-17-04, 10:36 AM
"cause it's like all the DH crap and the fartcry crap where every ati fanboy was jumping on the bandwagon screaming "cheat", when after some further discussion turned out not to be the case."

Yeah. Nvidia users NEVER jump on the bandwagon and scream cheat!! :screwy: :drooling:

:angel2: