New Digit-Life Review...With Anti-Detect =D



StealthHawk
07-12-03, 07:11 PM
http://www.ixbt-labs.com/articles2/gffx/gffx-16.html

They test Code Creatures, UT2003, Unreal2, and RightMark3D with and without the RivaTuner anti-detection patch script.

You guys can read the article and look at the numbers yourselves and draw your own conclusions, as well as consider whether or not you share the same opinions as the reviewer and want to buy into his conclusions. For example, he says this about UT2003: "However, some of NVIDIA's optimizations are useful: just compare the performance with the Anti-Detect and without it on our demo benchmark (not the standard one). Since it's impossible to make cheats for our demo version, the difference is caused by useful optimizations which were disabled by the Anti-Detect."

First of all, it could be that NVIDIA has special optimizations for commonly benchmarked demos like DM-Antalus. Or it could be that NVIDIA's global UT2003 optimizations simply affect Antalus scores more heavily. The author's conclusion that it is impossible to make "cheats" for non-standard benchmarks is a bit off the mark. We know that in Quality mode NVIDIA is not doing full trilinear filtering in UT2003, and that obviously affects ALL demos and all gameplay. Whether you consider it a cheat or not is up to you...but Digit-Life apparently isn't aware of this at all.
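To make the trilinear point concrete, here is a rough Python sketch of the difference between full trilinear filtering and a reduced blend (often called "brilinear"). It's purely conceptual - the band width and blend curve the driver actually uses aren't public, so the numbers below are assumptions for illustration only.

def full_trilinear_weight(lod):
    # Full trilinear: blend between mip level floor(lod) and floor(lod)+1
    # across the entire transition, so every sample pays for two mip lookups.
    return lod - int(lod)

def reduced_trilinear_weight(lod, band=0.25):
    # Reduced "brilinear" blend (the band width is a made-up example value):
    # only blend inside a narrow window around the mip transition and fall
    # back to cheaper single-level bilinear everywhere else.
    frac = lod - int(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if frac <= lo:
        return 0.0                      # pure bilinear from the nearer mip level
    if frac >= hi:
        return 1.0                      # pure bilinear from the next mip level
    return (frac - lo) / (hi - lo)      # short blend only near the transition

Because a filtering shortcut like this is applied per sample by the driver, it shows up in every map and every demo, standard or custom, which is why a custom demo alone can't rule this kind of optimization out.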


Anyway, personally I would have liked to see some more stuff tested, like Comanche 4 and Aquanox, to name a couple of others. From the original Digit-Life article by Unwinder we know that Aquanox is detected (but not how much performance is gained or how IQ changes).

-=DVS=-
07-12-03, 07:24 PM
Looks like Nvidia needs to optimize (or cut corners, depending on your opinion) on most games to perform well at all :o , while ATI doesn't seem to need game-specific optimizations - it runs the same with or without the script ;)

bkswaney
07-12-03, 07:31 PM
Who cares. Every good game maker uses nvidia code. Get over it.
This subject is old.

StealthHawk
07-12-03, 07:40 PM
Originally posted by bkswaney
Who cares. Every good game maker uses nvidia code. Get over it.

:rolleyes: Tim Sweeney isn't a good game maker then? As I recall, UT2003 is a TWIMTBP game. As I also recall, the game was developed primarily on NVIDIA cards. If they "use nvidia code", then why does NVIDIA need to make further optimizations/"optimizations"?

Wasn't Code Creatures also optimized for NVIDIA cards?

This subject is old.

Why, because it shows a growing pattern in popularly benchmarked software? The review is linked on the front page; how can it be "old"? None of the software tested in the review has been examined this way before, aside from UT2003.

bkswaney
07-12-03, 07:47 PM
Like I said, it's just beating a dead horse.

Carry on. :)

Rogozhin
07-12-03, 07:50 PM
You don't have to support a company that lies to its customers and shareholders just because you own one of their cards.

These guys are so full of logical fallacies that the impression I get is of 4th graders doing a science report - it's just sad.

Nvidia is doing very underhanded things to pull the wool over the eyes of their consumers and shareholders (and these idiots are so blind that they can't even use logic to clear their vision).

I am a coffee roaster who takes pride in using the highest-grade arabica beans available (they cost much more than robusta or low-grade arabica, but I use them for their flavor complexity). I could lie, buy the robusta, and advertise it as high-grade arabica, and it's likely that only a coffee review board (using the cupping method) could tell the difference - but that would be cheating and dishonest (and would relegate my status to that of a "Folgers" rep), and not in line with being a gourmet coffee roaster. And since the nvidiots who can't see their hand from their azz still populate this forum, I have to give this analogy: business should not be based on deceit. Just read The Jungle if you believe that cheaters should prosper.

rogo

reever2
07-12-03, 08:57 PM
So could anybody tell me why Digit-Life evens out the AF field by running 16x AF on the Radeons, since AF quality is better on Nvidia cards, but they do not run Nvidia's AA at a higher level even though its quality is worse than ATI's?

Ruined
07-12-03, 09:22 PM
From the article:
"Below are several pictures that prove that the quality is the same with the AD and without"

So, the optimizations aren't affecting image quality, but they are speeding up the game. This is a bad thing because...? I guess if it bothers you that you are getting more FPS, you can encode an mp3 in the background or something. ;)

Personally, I'll take the faster optimized code on all my games. If Nvidia can tweak the drivers to give me speed boosts in many of my games while maintaining image quality, that is a good thing, not a bad thing!

Also, one of the recent Digit-Life articles proved that the FX5900's 44.67 driver (which likely has 'optimizations') renders 3dmark03 3.3.0 more accurately than ATI's driver, which ATI claims has no optimizations. They compared screenshots of both to reference 3dmark03 screenshots. So let me get this straight - you'd prefer ATI's driver, which is less optimized/slower and not as accurate, over Nvidia's driver, which is more optimized/faster and more accurate? That's strange logic.

Slappi
07-12-03, 09:31 PM
Originally posted by Ruined
From the article:
"Below are several pictures that prove that the quality is the same with the AD and without"

So, the optimizations aren't affecting image quality, but they are speeding up the game. This is a bad thing because...? I guess if it bothers you that you are getting more FPS, you can encode an mp3 in the background or something. ;)

Personally, I'll take the faster optimized code on all my games. If Nvidia can tweak the drivers to give me speed boosts in many of my games while maintaining image quality, that is a good thing, not a bad thing!

Also, one of the recent Digit-Life articles proved that the FX5900's 44.67 driver (which likely has 'optimizations') renders 3dmark03 3.3.0 more accurately than ATI's driver, which ATI claims has no optimizations. So let me get this straight - you'd prefer ATI's driver, which is less optimized/slower and not as accurate, over Nvidia's driver, which is more optimized/faster and more accurate? That's strange logic.


Yah....I fail to see where this is a bad thing. IQ is the same but it is faster. How is that a bad thing?

Also, this is taken as fact because some guy made a program to detect a "cheat". Who is this guy? Is this on the up and up? Does he have a bias against nVidia? Is his program screwy?

We know nothing about his so-called anti-cheat program.

Miester_V
07-12-03, 09:32 PM
It only gets old when almost everyone acknowledges the fact that they are getting screwed by Nvidia. So far there is still some blind loyalty in here, and that is a sign that this topic won't get old any time soon.

Slappi
07-12-03, 09:33 PM
Originally posted by Miester_V
It only gets old when almost everyone acknowledges the fact that they are getting screwed by Nvidia. So far there is still some blind loyalty in here, and that is a sign that this topic won't get old any time soon.

Just how is anyone getting screwed? By making things faster?

heedory
07-12-03, 09:41 PM
It's a simple matter:

same image quality and higher fps = good thing
lower image quality and higher fps = cheat

Application detection alone doesn't mean cheating; image quality is the criterion.

Why didn't they use Photoshop to find the differences between the images? :D
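For what it's worth, you don't even need Photoshop - a few lines of Python with the PIL imaging library will difference two screenshots programmatically (the file names below are just placeholders):

from PIL import Image, ImageChops

a = Image.open("ut2003_antidetect_on.png").convert("RGB")
b = Image.open("ut2003_antidetect_off.png").convert("RGB")

diff = ImageChops.difference(a, b)      # per-pixel absolute difference
bbox = diff.getbbox()                   # None if the two images are identical
if bbox is None:
    print("Images are pixel-identical")
else:
    print("Images differ inside region:", bbox)
    diff.save("diff.png")               # bright pixels mark where they differ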

Miester_V
07-12-03, 09:44 PM
Originally posted by Slappi
Just how is anyone getting screwed? By making things faster?
No, it's HOW they make it faster. Think of it like this:

You are taking an algebra course. You take a test that involves problems with variables; however, you have a calculator that accepts variables. So instead of working out the problem completely, you just punch in the equation and get the answer. You should have been able to get that answer by working out the WHOLE problem, instead of skipping straight to the answer. It's called CHEATING, and it shows no thorough understanding of the mathematical language. What Nvidia has done, instead of properly rendering things that ALL video cards should render in equal and consistent quality, is skip those standards and try to fool the viewer into thinking the card is going through the proper rendering techniques.

Ruined
07-12-03, 10:07 PM
Originally posted by Miester_V
No, it's HOW they make it faster. Think of it like this:

You are taking an algebra course. You take a test that involves problems with variables; however, you have a calculator that accepts variables. So instead of working out the problem completely, you just punch in the equation and get the answer. You should have been able to get that answer by working out the WHOLE problem, instead of skipping straight to the answer. It's called CHEATING, and it shows no thorough understanding of the mathematical language. What Nvidia has done, instead of properly rendering things that ALL video cards should render in equal and consistent quality, is skip those standards and try to fool the viewer into thinking the card is going through the proper rendering techniques.

So Nvidia cards are like calculators and ATI is like pen and paper. I'll take the calc. If it's faster and has the same image quality, it doesn't make a difference in the world how they made it faster, so long as it's faster without losing noticeable quality. We aren't math teachers looking for the clearest, step-by-step approach to solving a problem. We are gamers looking for the fastest FPS with the best image quality. People are being far too melodramatic saying that Nvidia isn't doing it the right way, etc... Nvidia is doing it the fastest way possible without losing image quality, and that's golden for the gamer.

Personally, I think many are just bitter that Nvidia has arguably taken the lead in performance while matching or even exceeding ATI in image quality (as stated by the Digit-Life 3dmark03 article). It's bound to happen when people spend $350-$450 on a 9800 Pro and it's relegated to 2nd or 3rd best. Now that Nvidia is back in the game and ahead, all sorts of sour grapes are being spilled about optimizations that result in faster speeds with no loss of IQ. This isn't ethics class or math class or any class for that matter. This is gaming, and the fastest, best-looking card wins, period. Right now that card looks to be the FX5900 Ultra, especially in Doom 3, which will likely be *the* PC graphical masterpiece for 2003.

CaptNKILL
07-12-03, 10:14 PM
Gee I wonder where this thread is going to go....

:argue:

Ruined
07-12-03, 10:18 PM
Originally posted by CaptNKILL
Gee I wonder where this thread is going to go....

:argue:

Maybe we should all go back to Direct3D Software Emulation for the most optimization-free experience ;)

reever2
07-12-03, 10:18 PM
Originally posted by Ruined
So Nvidia cards are like calculators and ATI is like pen and paper. I'll take the calc. If it's faster and has the same image quality, it doesn't make a difference in the world how they made it faster, so long as it's faster without losing noticeable quality. We aren't math teachers looking for the clearest, step-by-step approach to solving a problem. We are gamers looking for the fastest FPS with the best image quality. People are being far too melodramatic saying that Nvidia isn't doing it the right way, etc... Nvidia is doing it the fastest way possible without losing image quality, and that's golden for the gamer.

Personally, I think many are just bitter that Nvidia has arguably taken the lead in performance while matching or even exceeding ATI in image quality (as stated by the Digit-Life 3dmark03 article). It's bound to happen when people spend $350-$450 on a 9800 Pro and it's relegated to 2nd or 3rd best. Now that Nvidia is back in the game and ahead, all sorts of sour grapes are being spilled about optimizations that result in faster speeds with no loss of IQ. This isn't ethics class or math class or any class for that matter. This is gaming, and the fastest, best-looking card wins, period. Right now that card looks to be the FX5900 Ultra, especially in Doom 3, which will likely be *the* PC graphical masterpiece for 2003.

They are both calculators. The only exception is that with ATI you don't need to do massive amounts of work rewriting shader code to make it run a little bit faster, but Nvidia makes sure developers do through the TWIMTBP program. And Nvidia still has not fixed their crap AA. Digit-Life for some reason doesn't like to handicap Nvidia cards on their AA, but they love handicapping the Radeon because of its bad AF. And Doom 3 is not the PC graphical or gameplay masterpiece of 2003. It might not even release in 2003, and pending any delays, other games will come out that look better than Doom 3, like HL2 and STALKER.

CaptNKILL
07-12-03, 10:20 PM
Originally posted by Ruined
So Nvidia cards are like calculators and ATI is like pen and paper. I'll take the calc. If it's faster and has the same image quality, it doesn't make a difference in the world how they made it faster, so long as it's faster without losing noticeable quality.

I dunno, I think a REAL pen with REAL lighting and shadows cast across the REAL paper would look a hell of a lot better than a 50x30 res, colorless LCD screen.... but that's just me. :rolleyes: :D

Ruined
07-12-03, 10:24 PM
Originally posted by reever2
They are both calculators. The only exception is that with ATI you don't need to do massive amounts of work rewriting shader code to make it run a little bit faster.

I don't need to do it with my Nvidia card either; I just plug in the card, load the drivers, and they take care of everything. If a dev chooses to rewrite shader code to be in Nvidia's program, that is their choice, but I have nothing to do with that or the work that goes along with it. Are you suggesting gamers buy a slower card out of pity for developers?


And Nvidia still has not fixed their crap AA. Digit-Life for some reason doesn't like to handicap Nvidia cards on their AA, but they love handicapping the Radeon because of its bad AF.

Personally I don't see why AA beyond 4X/Quincunx is even necessary at higher resolutions - the difference, if you are actually playing the game and not staring at screenshots, is not really noticeable. No AA to 2X and 4X/Quincunx is a big jump, but the increments after that are minor at best. As for why they haven't handicapped Nvidia, it's probably because the AA output in motion of ATI and Nvidia cards seems to be a matter of preference. Even more so, Nvidia AA often gets dogged based on improper screen captures due to their method of doing AA, while ATI's AA results can be seen with a simple screencap.
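On the screen-capture point: as I understand it, some of Nvidia's AA modes do the final downsample/blend of the sample buffer late (at scan-out), so a plain framebuffer grab shows the image before that step runs. Here is a minimal numpy sketch of what such a "resolve" step does - a toy 2x2 box filter, not Nvidia's actual filter:

import numpy as np

def resolve_2x2(samples):
    # Box-filter a (2H, 2W, 3) sample buffer down to the final (H, W, 3) image.
    h, w = samples.shape[0] // 2, samples.shape[1] // 2
    out = samples.reshape(h, 2, w, 2, 3).mean(axis=(1, 3))
    return out.astype(samples.dtype)

# Toy example: a hard black/white edge in the sample buffer only becomes a
# smoothed (grey) edge after the resolve. A capture taken before this step
# would show the un-antialiased edge.
buf = np.zeros((4, 4, 3), dtype=np.float32)
buf[:, 1:] = 1.0
print(resolve_2x2(buf)[:, :, 0])        # the 0.5 column is the smoothed edge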

Ruined
07-12-03, 10:29 PM
Originally posted by CaptNKILL
I dunno, I think a REAL pen with REAL lighting and shadows cast across the REAL paper would look a hell of a lot better than a 50x30 res, colorless LCD screen.... but that's just me. :rolleyes: :D

Can the pen be an optimized ballpoint, or is that considered cheating? ;)

reever2
07-12-03, 10:30 PM
Originally posted by Ruined
I don't need to do it with my Nvidia card either, I just plug in the card, load the drivers and they take care of everything. If a dev chooses to rewrite shader code to be in Nvidia's program, that is their choice, but I have nothing to do with that. Are you suggesting to buy a slower card just to have pity on developers?



Personally I don't see why AA beyond 4X/Quincunx is even necessary at higher resolutions - the difference, if you are actually playing the game and not staring at screenshots, is not really noticeable. No AA to 2X and 4X/Quincunx is a big jump, but the increments after that are minor at best. As for why they haven't handicapped Nvidia, it's probably because the AA output of ATI and Nvidia cards seems to be a matter of preference.

1. Devs don't choose to rewrite shader code to be in Nvidia's program. Nvidia gives the devs certain "benefits" to be in their program, which in turn MAKES them rewrite shader code in order to stay in the program, and if they don't, they are out of the program.

2. Personal preference is not something you use in reviews. You review the cards at equal visual settings whether the person reviewing them likes using those settings or not. If you use that concept, the AF output also seems to be a matter of preference, but they still handicap the cards anyway.

Ruined
07-12-03, 10:32 PM
Originally posted by reever2
1. Devs don't choose to rewrite shader code to be in Nvidia's program. Nvidia gives the devs certain "benefits" to be in their program, which in turn MAKES them rewrite shader code in order to stay in the program, and if they don't, they are out of the program.

Right, and the next logical step is that they aren't forced to be in the program, so they have the choice not to rewrite the shader code.


2. Personal preference is not something you use in reviews. You review the cards at equal visual settings whether the person reviewing them likes using those settings or not. If you use that concept, the AF output also seems to be a matter of preference, but they still handicap the cards anyway.

The only thing is that the aniso on the ATI cards at certain angles is clearly shown to be inferior in numerous tests, both still and in motion. It's not so clear with Nvidia vs. ATI AA, especially when you can't capture the Nvidia AA properly (even HyperSnap-DX doesn't capture everything).

For myself, I usually use highest quality at 1024x768x32bit with 4x AA (or Quincunx if the system has an Nvidia card) and 2x aniso. That seems to be the best-performing setting that still gives an IQ improvement I actually notice while playing a game.

reever2
07-12-03, 10:40 PM
Originally posted by Ruined

It's not so clear with Nvidia vs. ATI AA, especially when you can't capture the Nvidia AA properly (even HyperSnap-DX doesn't capture everything).


I doubt they actually take screenshots and compare the two when they have the complete image right on their screens in front of them.

CaptNKILL
07-12-03, 10:51 PM
Originally posted by Ruined
Can the pen be an optimized ballpoint, or is that considered cheating? ;)
Hah, I know what they (Nvidia) would do.... they would make it so less ink (texture filtering) would flow out (be rendered), decreasing the load on the ink well (memory bandwidth).

Hah, but after using a 500x magnifying glass (Photoshop zoom), I would see that the ink has some MAJOR (minor) lightness to it (inaccurate filtering).


Note: things in parentheses are related to the actual topic at hand....

:D :p :angel:

extreme_dB
07-13-03, 12:03 AM
Ruined, your arguments are flawed for the following reason: in a benchmark comparison, the cards have to be matched as closely as possible in image quality and/or settings for the numbers to be meaningful. If Nvidia gets to enjoy huge performance gains with little loss in quality, that's great! But why compare that performance figure against ATI, which lacks those same "optimizations"? That's what's happening in UT2003 with trilinear filtering, for example, which Nvidia's driver disables no matter what the application requests. Why should ATI take a huge performance hit for little if any noticeable quality increase? It's not apples to apples. The ATI card could turn out to spank Nvidia at both "unoptimized" and "optimized" settings in those benchmarks!

The calculator/pen-and-paper analogy is only useful in this way: Say you want to test how fast two people can work out a problem on paper. They both have calculators and pens and paper, but only one uses the paper while the other uses the calculator. The other person can use the calculator as well, but the test calls for paper! So the end result of the test is meaningless, even though we'd all rather use calculators!

Get it now? In the case of UT2003, ATI can turn off trilinear as well for a ~30% performance boost, but the test calls for trilinear!
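To put made-up numbers on it (purely hypothetical, just to show the arithmetic):

# Hypothetical FPS figures - only to illustrate why mismatched filtering
# settings can flip a benchmark ranking.
nv_reduced_trilinear = 60.0    # Nvidia score with the filtering optimization active
ati_full_trilinear = 55.0      # ATI score doing the full trilinear the test calls for

# If ATI also dropped to the cheaper filtering (~30% boost, per the figure above):
ati_reduced_trilinear = ati_full_trilinear * 1.30
print(ati_reduced_trilinear)   # 71.5 -> the ranking flips once settings actually match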

As for AA, it's most noticeable in motion, not in screenshots. The jaggies cause a crawling effect along edges as you move through a scene which can be very distracting. The impact of AA depends on the graphical content in the game, just as with AF. Racing simulations get a huge benefit from AA.