
View Full Version : Digit-life Nv35 image quality comparisons



Nv40
06-01-03, 01:54 AM
take a look at this review ...


http://www.digit-life.com/articles2/radeon/herc-r9800-r7500.html

they raise many interesting points from the user's point of view about the differences introduced by the 330 patch and the big performance penalty between the old and new patches.

Again there is no difference in quality between the patches on either card. And again FutureMark finds something invisible to an ordinary user, which slows the NV35 down and speeds the R350 up. By the way, the latter shows some artifacts again.

So, the tests have revealed the following: obviously, the programmers at NVIDIA use some cheats in the 3DMark03 test, first of all in Game1 and Game4. Whatever the test, and no matter how a given company relates to it, it is bad to accelerate cards at the expense of image quality, and it must be punished.

We noticed no quality losses in the case of the RADEON 9800 PRO on v3.20, which is why it is hard to say what the test developers accused ATI of. But ATI admitted cheating, so there must be something. Still, these tricks are unnoticeable (though they correspond to the several percent in speed ATI is accused of). It is just strange that in some tests the new patch v3.30 lifted the speed of the RADEON 9800 PRO on those drivers.

From the standpoint of an average user, such a considerable speed drop for the NV35 is not proportionate to NVIDIA's cheats and tricks, especially in Game2, Game3 and Game4. In the first two we noticed no cheats at all; in Game4 a 50% speed drop is too cruel a punishment for the tricks with rendering of the water surface.

Certainly, we have no right to consider our standpoint the only true one, because the test developers may find other cheats which are not noticeable but which can have a great effect on the scene. But still, if the programmers at ATI or NVIDIA have made optimizations which do not affect visual quality, then why not? Why hold a grudge because the FutureMark developers couldn't make certain optimizations and others could? I feel very disappointed by this situation around 3DMark03 and FutureMark; I can't believe that patch 3.30 "has put everything back on track", and I can't believe that future patches will objectively reflect hardware capabilities instead of emotions. That is why all our future tests (except for the RADEON 9600 PRO, which was tested before this article) will use 3DMark03 just for the sake of statistics, without comments or analysis.

ReDeeMeR
06-01-03, 02:13 AM
It is fairly obvious that both sides are using dirty tricks; ATi and Futuremark are obviously in bed, and Nvidia just doesn't want to give up at any cost.

indio
06-01-03, 02:27 AM
apparently digitlife (who, by the way, has a vested interest in seeing 3dmark fall, a la their own RightMark 3D) doesn't understand "off the rails".
Please consider the source. It's like Nintendo Power Magazine reviewing PlayStation 2 consoles. It's ludicrous on its face.
You can't get around the fact that games and 3dmark are supposed to render the entire scene, and the path the user will take is almost always unknown unless it's a side scroller :rolleyes: .
Saying it's a legit optimisation is a farce. Show us a game where a static clip plane was added other than for a benchmark. The only thing that's interesting about the whole affair is how whacked out they are at digitlife.

maybe Nvidia should just make an MPEG of the viewable area in their drivers and play it back at whatever frame rate they see fit. There would be no difference in quality, and you would only be able to see the difference with special tools or if you zoom.
Therefore, by that logic, using a prerecorded MPEG of the 3dmark tests instead of doing 3D rendering is not cheating.
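Sarcasm aside, the clip-plane point is easy to demonstrate. Below is a minimal C++ sketch (not actual driver code; the plane values and camera positions are invented for illustration) of why a static, hand-placed clip plane stays invisible only while the camera follows the benchmark's fixed path:

```cpp
#include <cstdio>

struct Plane { float a, b, c, d; };
struct Vec3  { float x, y, z; };

// A point is on the kept side of the plane if a*x + b*y + c*z + d >= 0.
bool isVisible(const Plane& pl, const Vec3& p) {
    return pl.a * p.x + pl.b * p.y + pl.c * p.z + pl.d >= 0.0f;
}

int main() {
    // Hypothetical plane hard-coded against the benchmark's known camera
    // path: everything with x < -10 is culled and never shaded.
    Plane staticClip = { 1.0f, 0.0f, 0.0f, 10.0f };

    Vec3 onRails  = {  5.0f, 0.0f, 0.0f };  // inside the scripted view
    Vec3 offRails = { -20.0f, 0.0f, 0.0f }; // reachable only with a free camera

    // On the scripted path the culled geometry is never on screen, so the
    // image looks identical; off the rails the same plane silently deletes
    // visible geometry, which is what the 3DMark developer version exposes.
    std::printf("on rails:  %s\n", isVisible(staticClip, onRails)  ? "drawn" : "culled");
    std::printf("off rails: %s\n", isVisible(staticClip, offRails) ? "drawn" : "culled");
}
```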

Sazar
06-01-03, 02:58 AM
:)

it is really nice to see people who have absolutely no idea what is going on, and who appear to have been sealed in a vacuum-sealed capsule for the past 2-3 weeks, pop up and give their version of events...

:D

it is rather unfortunate that no one seems to pay any attention to what was done by nvidia... and instead tries to deflect the attention onto others...

perhaps digitlife would like to say something about how the clipping planes work... oh wait, they make no mention of them...

nice article indeed... very technical and what not...

:rolleyes:

Nv40
06-01-03, 03:11 AM
Originally posted by Sazar
:)

it is really nice to see people who have absolutely no idea what is going on, and who appear to have been sealed in a vacuum-sealed capsule for the past 2-3 weeks, pop up and give their version of events...

it is rather unfortunate that no one seems to pay any attention to what was done by nvidia... and instead tries to deflect the attention onto others...

perhaps digitlife would like to say something about how the clipping planes work... oh wait, they make no mention of them...

nice article indeed... very technical and what not...

:rolleyes:


yes.. maybe we can run the discussion again for 2-3 more weeks :)
but the digitlife report is still interesting, because for mysterious reasons nobody else has done a report like that, with side-by-side screenshots of the differences gamers will actually see. the only thing in people's minds right now are screenshots of things the gamer will never see.
the people who pointed their fingers at Nvidia's cheating never posted screenshots the way the end user will see them, only zoomed in, to exaggerate and nitpick an invisible IQ drop made in the benchmark..
what can be learned from the digitlife report is that except for the clipping planes (which they can't see, since they obviously don't have the developer version of 3dmark), everything else done by Nvidia is in fact a valid optimization that could be made in games.

gokickrocks
06-01-03, 03:13 AM
digitlife needs to post reference rasterizer images if they want to do comparisons; otherwise, they shouldn't do it at all

Typedef Enum
06-01-03, 03:53 AM
I think it's pretty funny that anybody would say, "Hey, it's really hard to tell the difference...so, therefore, it's really not that bad, right?"

Give me a break... It makes no difference whatsoever. The fact of the matter is that when comparing the R3xx to the NV3x under DX9 scenarios, the nVidia chip gets completely toasted. At this point in time, there's virtually no disputing this fact.

Head on over to Beyond3D... there are now about 4-5 different threads regarding the pitiful performance of the FX in various shader demos/benchmarks... and if it's not bad performance, it's bad quality...

Why am I saying all of this? Because nVidia obviously took notice of just how good the R3xx chip is at doing the things they had been hyping all those many months... and nVidia has had no choice but to fudge as many benchmarks as possible to at least give people the perception that their FX line is competitive...

Behemoth
06-01-03, 04:08 AM
give me a break, if you compare 32-bit to 24-bit under dx9, let's also compare 16-bit to 24-bit under a doom3 scenario.

StealthHawk
06-01-03, 04:13 AM
Originally posted by Behemoth
give me a break, if you compare 32-bit to 24-bit under dx9, let's also compare 16-bit to 24-bit under a doom3 scenario.

Last I checked, without using proprietary extensions in OGL, you would get FP32 with nvidia cards and FP24 with ATI cards. A very fair comparison.
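For readers keeping score on the precision debate: the formats at issue are FP16 (10 mantissa bits, the NV3x half format), FP24 (16 mantissa bits, ATI R3x0) and FP32 (23 mantissa bits, NV3x full precision). A small C++ sketch makes the relative error of each concrete; the bit widths are the commonly published ones, so treat the figures as approximate:

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Mantissa widths: FP16 = 10 bits (NV3x half), FP24 = 16 bits (R3x0),
    // FP32 = 23 bits (NV3x full precision).
    const struct { const char* name; int mantissaBits; } formats[] = {
        { "FP16", 10 }, { "FP24", 16 }, { "FP32", 23 },
    };
    for (const auto& f : formats) {
        // One unit in the last place relative to 1.0, i.e. 2^-mantissaBits.
        double eps = std::ldexp(1.0, -f.mantissaBits);
        std::printf("%s: relative step ~%g (about %.1f decimal digits)\n",
                    f.name, eps, f.mantissaBits * std::log10(2.0));
    }
}
```

By this crude measure FP16 carries about 3 decimal digits, FP24 about 5, and FP32 about 7. Whether 5 digits versus 7 is ever visible on screen is a different question from whether 3 digits is enough, which is why the two comparisons being argued here are not symmetric.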

Eymar
06-01-03, 04:16 AM
This comparison article is lacking in technical knowledge. I'm no graphics guru either, but the main thing Digit-Life is missing is that most of the "optimizations" are just plain cheats that won't help in games: the things indio and Sazar refer to. I don't know why Nvidia resorted to this type of stuff. It seems very petty for Nvidia to cheat and then, when caught, try to make 3dmark irrelevant, especially since 3dmark helped them sell DX8 cards when no DX8 games were available. I don't think it was needed, since the NV35 seems to be a strong card in its own right, especially in DX8.1 games. Its performance in DX9 games is still in question, and the cheating deepens the mystery, but I don't see a lot of DX9 games making it to market until next year.

bkswaney
06-01-03, 04:56 AM
Originally posted by StealthHawk
Last I checked, without using proprietary extensions in OGL, you would get FP32 with nvidia cards and FP24 with ATI cards. A very fair comparison.

OK, if that is fair, then why not Nvidia's 16 to ATI's 24? :p

tertsi
06-01-03, 05:29 AM
busted!

Nv40
06-01-03, 05:29 AM
the only reason i posted the digitlife link is that all the arguments about bad IQ in 3dmark against Nvidia were exaggerated. in fact there were times when the FX in 128-bit mode with the new 330 patch got better IQ than the Radeon 9800 Pro (the missing lights on the spaceship). it means that, cheats or no cheats, all the complaints about Nvidia's IQ loss were irrelevant and meaningless to the eye of the gamer.

of the 8 cheats posted by 3DMark, only one was solid (clipping planes), not because of IQ loss, but because Nvidia was cutting corners. the rest of the accusations were simply meaningless. even FM said that Nvidia was using "more efficient shaders". what the digitlife article proves is that Futuremark does not represent the way a game developer will program games, since obviously Nvidia and ATI found other ways to optimize their shaders for better performance with no IQ loss.

just look at digitlife's conclusions:
1) Nvidia cheated.
2) ATI admitted to cheating too.
3) there is almost no IQ drop between cheats and non-cheats, between patch 220 and 330, only performance differences.
4) 3dmark is not a reliable test anymore, since the test's penalty for Nvidia cards is too big for nothing, because of the way PS1.4 is used in the test and because it doesn't simulate game conditions (which is what FM advertises on their site), since no IHV will code shaders in a game the way FM does.

i think the best things FM can do to regain confidence in their benchmark are:
1) to release the source code of the benchmark to the public (other programmers have done it with their synthetic benchmarks), including their latest patches, so that non-beta-members and people who don't pay $$ can confirm what the program is and isn't doing behind the scenes and the way it tests video cards. that way we can have a better idea whether Nvidia's claims are valid or not. :)

2) to stop asking ridiculous amounts of money from IHVs for "strategic partnership". not because big companies can't afford to pay hundreds of thousands, but because of the big conflicts of interest this can cause: whoever pays more $$ will have more influence and more opportunities with the program.

ChrisW
06-01-03, 05:30 AM
If this guy had a clue what he was talking about, he would know that 3DMark03 is deliberately not optimized. The authors of the benchmark are trying to simulate a more complex scene that will be representative of future, more complex games. By reducing the complexity, and thus the amount of work done, you are undermining the entire purpose of the benchmark. It is not about the final image but about the calculations being performed. Just what good is the pixel shader 2.0 test if one card has changed the test to int12?
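To make the int12 point concrete, here is a toy C++ sketch. The assumption (mine, not from the thread) is that fx12 behaves like signed fixed point with 10 fractional bits over roughly [-2, 2); the specular-power example is invented, but it shows how a small quantization error in an input is amplified by later shader math:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Quantize to a guessed fx12-style format: signed fixed point, 10 fractional
// bits, clamped to roughly [-2, 2). The exact NV3x layout may differ.
float toFx12(float x) {
    const float scale = 1024.0f; // 2^10 steps per unit
    float clamped = std::min(std::max(x, -2.0f), 2.0f - 1.0f / scale);
    return std::round(clamped * scale) / scale;
}

int main() {
    // An invented specular-style term: a small input error is amplified by
    // the power function, which is how reduced precision turns into visible
    // banding in lighting rather than uniform noise.
    float nDotH = 0.97f;
    float exact = std::pow(nDotH, 64.0f);          // full float math
    float cheap = std::pow(toFx12(nDotH), 64.0f);  // input quantized to fx12

    std::printf("float: %f  fx12: %f  abs diff: %f\n",
                exact, cheap, std::fabs(exact - cheap));
}
```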

ChrisW
06-01-03, 05:33 AM
Originally posted by Nv40
of the 8 cheats posted by 3DMark, only one was solid (clipping planes), not because of IQ loss, but because Nvidia was cutting corners. the rest of the accusations were simply meaningless. even FM said that Nvidia was using "more efficient shaders". what the digitlife article proves is that Futuremark does not represent the way a game developer will program games, since obviously Nvidia and ATI found other ways to optimize their shaders for better performance.

What about the cheat where the drivers do not clear the frame buffer for the space scene? Can you justify that? :rolleyes:
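For context, the skipped-clear trick stays invisible only while every pixel is guaranteed to be overdrawn each frame, which a fixed camera path can guarantee and a free camera cannot. A toy C++ sketch of the failure mode (the buffer size and colors are invented for illustration):

```cpp
#include <array>
#include <cstdio>

int main() {
    // One row of an imaginary framebuffer still holding frame N-1's colors,
    // because the driver skipped the clear.
    std::array<int, 8> framebuffer;
    framebuffer.fill(7); // leftover pixels from the previous frame

    // Frame N only covers pixels 0..5, e.g. because a free camera moved and
    // the scene geometry no longer fills the whole viewport.
    for (int x = 0; x < 6; ++x) framebuffer[x] = 1;

    // Pixels 6 and 7 show stale data: the smearing seen off the rails.
    for (int p : framebuffer) std::printf("%d ", p);
    std::printf("\n");
}
```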

ChrisW
06-01-03, 05:34 AM
Originally posted by Nv40
2)ATI admited to cheat too.
You know full well that ATI has never admitted to cheating. To continue to spread this rumor is irresponsible.

Hanners
06-01-03, 05:36 AM
It's good to have a reminder of why I never read Digit-Life reviews or articles any more, and this was a pretty good one - thanks. :p

I'm not sure anyone could actually have done a better job of misunderstanding the whole situation than Digit-Life seem to have here.

Behemoth
06-01-03, 05:40 AM
Originally posted by StealthHawk
Last I checked, without using proprietary extensions in OGL, you would get FP32 with nvidia cards and FP24 with ATI cards. A very fair comparison.
yeah, i guess you think comparing fp32 to fp24 is fair. when dx9 introduces partial precision, you will know what is fair.

Behemoth
06-01-03, 06:23 AM
Originally posted by bkswaney
OK if that is fair then why not Nvidia's 16 to ATI's 24? :p
it's not fair because there is a double standard: comparing 16-bit to 24-bit is criminal, while comparing 24-bit to 32-bit is perfectly legitimate.
writing games using nvidia's proprietary extensions is criminal.
writing games to m$'s DirectX specification is the only legitimate way to go.
making proprietary extensions must be a crime!

Nv40
06-01-03, 06:26 AM
Originally posted by ChrisW
You know full well that ATI has never admitted to cheating. To continue to spread this rumor is irresponsible.

read my post again..
it says -> digitlife's conclusions..

the fact is that i don't *know* what ATi has really done or not done, or what they have really said or not said. only ATi knows that. i only know what the internet says.

Nvidia says that what they have done are optimizations and that it was a bug.
ATI said the same thing about the Quack "bug."
but just because it is on the internet does not necessarily mean it is true.

Nv40
06-01-03, 06:59 AM
Originally posted by Behemoth
yeah i guess you think comparing fp32 to fp24 is fair. when dx9 introduce partial precison, you will know what is fair.


so DirectX 9 will support FP16 in the future?

Behemoth
06-01-03, 07:09 AM
Originally posted by Nv40
so DirectX 9 will support FP16 in the future?


Originally posted by StealthHawk

The matter is officially closed, PS_2_0 is FP24 minimum without the partial precision flag, the SDK update (coming soon) will have an optional FORCE_PARTIAL_PRECISION flag that can be passed to the HLSL compiler to force all floats into halves, but that's a developer's choice. If you use HLSL half or _pp you will get the FP16 paths on GFFX (but no difference on ATI R3x0 cards).


This was posted in April, so it is unknown to me whether or not the updated SDK is available.
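For what it's worth, here is a hedged C++ sketch of the two developer-side routes the quote describes, written against the D3DX9 interface. I am assuming the quoted FORCE_PARTIAL_PRECISION flag shipped as D3DXSHADER_PARTIALPRECISION and that ps_2_0 is the target profile; treat the details as illustrative, not confirmed by this thread:

```cpp
// Sketch only: assumes the D3DX9 SDK headers, and assumes the quoted
// FORCE_PARTIAL_PRECISION flag shipped as D3DXSHADER_PARTIALPRECISION.
#include <cstring>
#include <d3dx9.h>

// Explicit route: declare intermediates as 'half' (or use the _pp modifier);
// this maps to FP16 on GFFX and is silently ignored on R3x0.
const char* kShaderSource =
    "float4 main(float2 uv : TEXCOORD0) : COLOR {\n"
    "    half4 c = half4(uv, 0, 1); // partial precision on NV3x only\n"
    "    return c;\n"
    "}\n";

// Blanket route: a compiler flag forces every float down to half without
// touching the HLSL source; per the quote, a developer's choice either way.
HRESULT compileWithPartialPrecision(LPD3DXBUFFER* code, LPD3DXBUFFER* errors) {
    return D3DXCompileShader(kShaderSource, (UINT)std::strlen(kShaderSource),
                             NULL, NULL,   // no macros, no include handler
                             "main", "ps_2_0",
                             D3DXSHADER_PARTIALPRECISION,
                             code, errors, NULL);
}
```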

Nv40
06-01-03, 07:41 AM
Originally posted by Hanners
It's good to have a reminder as to why I never read any Digit-Life reviews or articles any more, and this was a pretty good reminder - Thanks. :p

I'm not sure anyone could actually have done a better job of misunderstanding the whole situation than Digit-Life seem to have here.

actually i'm enjoying Digitlife more than ever :)
now it deserves to be on my top list.. :D
a picture speaks better than a thousand words, and they are the reviewers with the most pictures in their reviews.
you have to admit their GIF animations at least are excellent for IQ comparisons. and they know a lot about hardware too. they discovered the ATI aniso filtering "optimizations" :)

surfhurleydude
06-01-03, 08:20 AM
I don't know about anyone else, but I find the IQ differences so small, even at this extremely zoomed level, that they don't bother me at all. It wouldn't bother me if ATi did the same thing for my current Radeon 9700 Pro either... The only reason you can even tell there are IQ differences is because you're comparing them DIRECTLY side by side with other screens. If I played on my GeForce FX 5900 Ultra with some "IQ degrading" settings, I know I wouldn't notice anything, because I wouldn't be directly comparing anything. In fact, I think it'd be great if ATi did the same thing to get some more speed out of my current Radeon 9700 Pro.

Hanners
06-01-03, 08:25 AM
Originally posted by Nv40
they discovered the ATI "optimizations" with their anisoF :)

They didn't exactly 'discover' them - in fact, they got it horribly, horribly wrong for a very long time. I remember reading review after review on their site stating that ATi use ripmapping in their aniso implementation, which is totally incorrect.

For that reason, among others, I give very little weight to any technical points they try to make, because to be honest I quite often get the feeling they don't have a clue what they're talking about.