
View Full Version : Will nvidia Ever Remove Optimizations From Their Drivers?



StealthHawk
07-07-03, 01:08 AM
As we all know, nvidia has been caught "optimizing" for 3dmark03, ShaderMark, 3dmark2001, and UT2003 so far. In these cases they have either altered image quality or degraded it outright.

The question now is, can nvidia ever remove these and other "optimizations" from their drivers even if they wanted to? The short, ill thought out answer would be "yes." But let's think about this some more.

First of all, nvidia has not had much publicity over their cheats, even in 3dmark03. In fact, ATI's 3dmark03 cheats seem to have received equal or greater publicity than nvidia's. I also doubt that the ShaderMark cheats are highly publicized, let alone Unwinder's 3dmark2001 findings, which were only recently translated into English. The UT2003 "exposure" is newer still.

Which brings us to the next point. Even if nvidia wanted to get rid of the cheats, how would they explain the severe drop in performance that new drivers would bring over old drivers? nvidia has not taken responsibility for their actions even once. I don't think they have ever officially or explicitly admitted to optimizing for or changing the image quality in 3dmark03, an event that was in the limelight for all 3D enthusiasts. Beyond that, they have never admitted to any wrongdoing, and there is still a shockingly high number of people who support their actions, albeit a minority. So, if the cheats were removed, how would nvidia explain the drops in everyone's favorite benchmarks without a public statement of why performance dropped? It does not seem like they want to acknowledge what they did, take responsibility for any wrongdoing, or anything of the sort- ever.

The average person who installs new drivers but does not keep up with politics will want the question of dramatically lower performance answered. And since any answer makes nvidia look bad, I personally don't see how nvidia could ever remove the cheats. There is no possible PR spin that I can see that would end up making them look like the Good Guys. Basically, as more and more cheats are exposed, nvidia slips deeper and deeper into a hole they themselves dug. Even if they wanted to try to climb out and fill it in, at this point I think it's impossible.

Anyway, what do you guys think? Remember, this poll is not asking the question of whether or not you think nvidia should remove optimizations. It is a question of whether or not you think they ever will.

Remember, it has also been rumored that the Detonator50 drivers would get rid of most of the funny business.

-=DVS=-
07-07-03, 01:16 AM
They will not remove them from games, but they should remove them from synthetic benchmarks. I think they won't, though.

Video cards should be optimized for games without lowering quality, and benchmarks should be run without optimizations, but no company is going to play fair; that's real life. To tell the truth, I'm bored with all this stuff about cheating in drivers and optimizations :o . Optimizations are NOT BAD.

What's bad is Nvidia lowering quality and cutting big corners to gain performance.

ATI is optimizing too, as does everyone else, but they don't cut corners so badly.

Just my opinion :rolleyes:

extreme_dB
07-07-03, 03:01 AM
I think Nvidia can remove the cheats without losing face as soon as they deliver a new product that is genuinely fast in all respects. They can spin the resulting lower performance in the current GFFX line by claiming the following:

1. The new drivers aren't as optimized for older products, hence the lower performance. (just like the current situation where older drivers offer the best performance for products from previous generations).

2. The new drivers are geared for high image quality in newer (DX9) games; older games will suffer.

I wonder how consumers would react if Nvidia just came out and admitted that they cheated, and made a commitment to set things straight. For me, they would gain back a lot of respect, though I wonder about the legal ramifications of that possibility.

Perhaps Nvidia has no choice but to maintain the deceptions. However, if they were truly affected on an ethical level, then they wouldn't/shouldn't continue to tout the GFFX's superiority by directly comparing it to ATI in their marketing material, or criticize the competition's current products. At the very least, they could reduce the prices and sacrifice some profits to benefit consumers (and maybe even maintain/gain marketshare).

extreme_dB
07-07-03, 03:08 AM
Oh, and if the latest drivers offer poor performance in upcoming games, they can always spin that by saying the games aren't coded efficiently to take advantage of their architecture. :)

extreme_dB
07-07-03, 03:32 AM
I just thought of something: the whole 5800 Ultra benchmark situation! At first, many review sites showed it competing head to head against the 9800 Pro. When the 5900U started being reviewed, suddenly the 5800U's performance didn't matter, and the 9800 Pro was shown to beat it thoroughly. The 5800U's speed was no longer of any importance to the public, even when it was losing badly to the 9700 Pro. The review sites only criticized the 5800U once the 5900U arrived.

That just goes to show how people don't care about past issues as long as everything's rosy in the present.

SurfMonkey
07-07-03, 03:55 AM
They won't ever remove the optimisations from their drivers. For a start it wouldn't make sense - all the legitimate game optimisations are needed anyway.

And until ATI, SiS, S3 et al also stop optimising for benchmarks I don't see NVidia giving them the advantage for free.

extreme_dB
07-07-03, 04:25 AM
ATI did remove their optimization for 3dmark03, and the anti-detector script doesn't show ATI cheating in any games or benchmarks besides 3dmark2001, unlike Nvidia. That's not to say definitively that Nvidia is cheating elsewhere or that ATI isn't, but then there's the UT2003 trilinear controversy. Nvidia is doing things they have no legitimate reason for.

Edit: I'm baffled - isn't Nvidia's action to disable trilinear filtering in UT2003 regardless of the driver or application setting at least as bad as Quack? I guess people are just sick of discussing cheats. Nvidia gets all the breaks.

Another edit: There's no reason to try to defend or justify these types of actions. If there are ways to optimize performance by reducing image quality, they should be user-definable options in the driver. There's no excuse for misleading consumers by ignoring settings and forcing those optimizations.

Hanners
07-07-03, 04:45 AM
I don't think we'll see any of the 'optimisations' removed for the current architecture, but I'd like to think that when NV40 rolls around, nVidia will take it as an opportunity to turn over a new leaf and remove them all (Hopefully by then they won't need them). It's probably wishful thinking, but a man can dream, can't he?

Nv40
07-07-03, 04:50 AM
Originally posted by SurfMonkey
They won't ever remove the optimisations from their drivers. For a start it wouldn't make sense - all the legitimate game optimisations are needed anyway.

And until ATI, SiS, S3 et al also stop optimising for benchmarks I don't see NVidia giving them the advantage for free.

indeed.

StealthHawk:
Your premise in the question is a bit premature. We don't know yet whether the RivaTuner anti-detection patch is removing valid optimizations or not.

In almost all the applications you mention, ATI is using application detection too. Unwinder has not finished his report; he said that the latest Cat 3.5 drivers are detecting applications as well, but that he was unable to block those application detections because ATI is using more complex techniques, or "optimizations". You can be sure he will find more application detections from NVidia and more from ATI.

In this German review:

http://www.computerbase.de/article.php?id=237&page=4&sid=b4a3f395e788a0f6425e7f056f76b7ab#lautstaerke_bildqualitaet

It looks like ATI is "optimizing" in Aquamark and UT2003 too. Also, according to the German site, it looks like the ATI drivers drop to bilinear filtering when using AF, and this time the performance drops by a lot more than a reasonable 1%. So, to answer the questionable but predictable question in this thread: unless ATI drops all their optimizations, I don't think Nvidia will drop all theirs. ;)

So don't point your fingers at Nvidia too early, since the investigations haven't finished and ATI is not out of this controversy yet. Upcoming reviews and future drivers will clarify the situation further.

Hanners
07-07-03, 04:55 AM
Originally posted by Nv40
It looks like ATI is "optimizing" in Aquamark and UT2003 too. Also, according to the German site, it looks like the ATI drivers drop to bilinear filtering when using AF, and this time the performance drops by a lot more than a reasonable 1%. So, to answer the questionable but predictable question in this thread: unless ATI drops all their optimizations, I don't think Nvidia will drop all theirs. :)

I can't speak about AquaMark, but what you are saying about UT2003 is incorrect - If you want evidence, take a look at the article I wrote here (http://www.elitebastards.com/page.php?pageid=1551&head=1&comments=1).

If UT2003 and the ATi control panel are configured as per ATi's recommendations, there is no problem with aniso in the game; trilinear works perfectly with it. There's definitely no application detection going on in the case of UT2003.

Nv40
07-07-03, 05:27 AM
Originally posted by Hanners
I can't speak about AquaMark, but what you are saying about UT2003 is incorrect - If you want evidence, take a look at the article I wrote here (http://www.elitebastards.com/page.php?pageid=1551&head=1&comments=1).

If UT2003 and the ATi control panel are configured as per ATi's recommendations, there is no problem with aniso in the game; trilinear works perfectly with it. There's definitely no application detection going on in the case of UT2003.

Remember, it's not my information, it's the information from the German review. And ATI's AF works perfectly, unless you have non-conventional angles, where everyone knows ATI's AF doesn't work that well. Large outdoor maps like Antalus and many other common maps used for benchmarking have terrain with all kinds of surface inclinations.
So no, I don't think optimizations will end any time soon if both companies cut corners.

BTW, fixed the link.

Here are the specs used by the site:

CPU:
Pentium 4 3.00GHz (200MHz FSB QDR)
Motherboard:
Intel D875PBZ with BIOS B03 from 15.3.2003
Memory:
2*256MB Corsair XMS3000 CL2 DDR-RAM with 2-5-3-3 timings, mercilessly overclocked to 200MHz
Graphics cards:
nVidia GeForceFX 5900 Ultra 256MB (driver: official Detonator 44.03)
Inno3D GeForceFX 5800 128MB (driver: official Detonator 44.03)
Connect3D Radeon 9800 Pro 128MB (driver: official Catalyst 3.5)

It's in German; use your favorite site translator.

http://www.computerbase.de/article.php?id=237&page=4&sid=b4a3f395e788a0f6425e7f056f76b7ab

Hanners
07-07-03, 05:39 AM
Originally posted by Nv40
perfectly, unless you have non-conventional angles, where everyone knows ATI's AF doesn't work well. Large outdoor maps like Antalus and many other common maps used for benchmarking have terrain with all kinds of surface inclinations. So I don't think optimizations will end any time soon until both companies stop optimizing for best performance.

:rolleyes:

Not this tired old argument again. Adaptive aniso is not cheating, it's a conscious design decision.

All that article is doing with UT2003 is comparing framerate between AF not set correctly as per recommendations and with it set correctly. It's a bit like saying 'I ran one test with AA disabled and one with it set at 6x AA, therefore they are cheating with their AA implementation'.

As I mentioned in my article, the correct way to use AF in UT2003 with ATi's R3x0 cards is to set aniso to 'Application Preference' in the ATi control panel, and set aniso to your preferred level through the UT2003.ini in the game. This ensures that trilinear filtering is applied to all texture stages, not just the first. There's no way you can possibly construe it as a cheat - Besides which, ATi are looking to implement more AF settings in their control panel in the future; hopefully the ability to force trilinear filtering with AF on all texture stages will be one of those settings, to avoid this kind of confusion.
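
For anyone who wants it spelled out, the relevant part of UT2003.ini would look something like this (I'm writing the D3D renderer section from memory, so treat the exact key names and the AF level as an example rather than gospel - UseTrilinear keeps trilinear filtering on, and LevelOfAnisotropy is where you pick your AF degree):

[D3DDrv.D3DRenderDevice]
UseTrilinear=True
LevelOfAnisotropy=8

With the control panel left on 'Application Preference' and the level set in the ini like that, you should get trilinear on all texture stages along with whatever degree of AF you asked for.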

Shamrock
07-07-03, 06:20 AM
Originally posted by Hanners
I can't speak about AquaMark, but what you are saying about UT2003 is incorrect - If you want evidence, take a look at the article I wrote here (http://www.elitebastards.com/page.php?pageid=1551&head=1&comments=1).

If UT2003 and the ATi control panel is configured as per ATi's recommendations, there is no problem with aniso in the game, trilinear works perfectly with it. There's definitely no application detection going on in the case of UT2003.

I'm not saying you're wrong Hanners, but the anti-detector you used was designed for the Cat 3.4 drivers. Since then ATI "could" have built newer, harder-to-detect optimizations into Cat 3.5.

And the site Nv40 posted claims ATI is optimizing UT2k3 as well, and by a BIG difference: from 181.08 down to 145.51 (1024x768), and from 109.44 down to 88.61 (1280x960).

Maybe Unwinder needs to rip apart the Cat 3.5s, detect some more, and bring out a new version, as well as one for the NV 44.65 drivers (gosh, he's gonna be busy if he brings out versions for EACH driver revision).

On a positive note...good review you did Hanners :)

Hanners
07-07-03, 06:28 AM
Originally posted by Shamrock
I'm not saying you're wrong Hanners, but the anti-detector you used was designed for the Cat 3.4 drivers. Since then ATI "could" have built newer, harder-to-detect optimizations into Cat 3.5.

And the site Nv40 posted claims ATI is optimizing UT2k3 as well, and by a BIG difference: from 181.08 down to 145.51 (1024x768), and from 109.44 down to 88.61 (1280x960).

Maybe Unwinder needs to rip apart the Cat 3.5s, detect some more, and bring out a new version, as well as one for the NV 44.65 drivers (gosh, he's gonna be busy if he brings out versions for EACH driver revision).

On a positive note...good review you did Hanners :)

They could have changed the way applications are detected in the Cat 3.5s, but it seems very unlikely - the way 3DMark2001 is detected is obviously the same, for starters, because the score still drops with the script in use.

I think I've already pointed out why the claims of ATi optimising for UT2003 are wrong; it's all to do with ATi's quality aniso only doing trilinear on the first texture stage when it's forced in the ATi control panel. When you set aniso through the application instead, trilinear is applied to all texture stages. It's simply a case of reviewers (and users) making sure they choose all the correct settings to get full trilinear with AF.

extreme_dB
07-07-03, 06:58 AM
Originally posted by Hanners
It's simply a case of reviewers (and users) making sure they choose all the correct settings to get full trilinear with AF.

...whereas with Nvidia, there's no way to enable trilinear at all in UT2003 except through the anti-detection script. It's Quack Part 2 (or 5 or 6).

rokzy
07-07-03, 07:13 AM
Originally posted by extreme_dB
...whereas with Nvidia, there's no way to enable trilinear at all in UT2003 except through the anti-detection script. It's Quack Part 2 (or 5 or 6).

That's because UT2003 isn't MEANT to be played with trilinear AF. Blurry textures simulate the fatigue and confusion of an intense fight. nVidia cards provide a more realistic gaming experience.

Skuzzy
07-07-03, 07:15 AM
The only impetus I see for NVidia is to find ways to hide the way they detect things better.

If Nvidia is smart (I have no reason to believe they are not), they will just keep their mouths shut and continue onward. Note that none of the major business press has picked up on this news.

The only ones screaming about it are very much in the minority and even in that small percentage there are those that are still defending NVidia.

Without a consensus, and without major business press coverage (which will not happen), there is no reason for NVidia to make any changes.
From a business perspective, if I were looking to sell high-end video cards, I would do anything I could to make benchmarks look good, as that is what is important to the buyer of that product.
Even the consumer cheats to artificially increase performance. How many people have set their video cards up just to make their 3DMark scores high, without regard to visual quality? Ever stood and watched someone run 3DMark200x with the visuals trashed? All that mattered was getting a high score.
How many people, on this board alone, are buying the 5900U? Bought the 5800U? If sales are not hurt, why change?

I see no reason for NVidia to stop doing what it has been doing for so long. They are playing the same game as the consumer.

NOTE: I am not condoning what the video card manufacturers are doing. I am just simply stating the business case they have for doing it.

Cool Barn
07-07-03, 07:25 AM
Originally posted by Nv40

So don't point your fingers at Nvidia too early, since the investigations haven't finished and ATI is not out of this controversy yet. Upcoming reviews and future drivers will clarify the situation further.
Yeah, that really makes sense - just because another company might also be cheating means we shouldn't point fingers at Nvidia right now based on what we DO know :rolleyes:

StealthHawk puts forward an interesting poll relating to Nvidia, and then you come and spout unrelated crap. I don't recall him mentioning ATI and their drivers at any stage.

If it turns out ATI is cheating there will be a time and place to mention it. I will have the same disappointment with them that I currently do with Nvidia, but IMO this is unrelated to this thread.

To all fanboys of ANY company - please don't defend a company purely because of who they are. If they have good products and services say so, and if they have bad products and services then also say so. It will give your opinion much more validity.

Kruno
07-07-03, 07:44 AM
Originally posted by Cool Barn

StealthHawk puts forward an interesting poll relating to Nvidia, and then you come and spout unrelated crap. I don't recall him mentioning ATI and their drivers at any stage.

If it turns out ATI is cheating there will be a time and place to mention it. I will have the same disappointment with them that I currently do with Nvidia, but IMO this is unrelated to this thread.


I could not have said it better myself. :)

StealthHawk
07-07-03, 08:09 AM
Originally posted by SurfMonkey
They won't ever remove the optimisations from their drivers. For a start it wouldn't make sense - all the legitimate game optimisations are needed anyway.

Yeah, I never said that legitimate game optimizations should be removed. Just illegitimate "optimizations" - in this case, benchmark-related ones, or IQ-reducing ones in games (e.g. UT2003).

And until ATI, SiS, S3 et al also stop optimising for benchmarks I don't see NVidia giving them the advantage for free.

ATI seems to be leading the way by removing their 3dmark03 optimizations, while we have basically seen the opposite attitude from nvidia: 44.65 re-enables the optimized 3dmark03 scores, which seems to suggest one thing...

The Baron
07-07-03, 08:34 AM
No company ever fully will.

1. The enthusiast market is the ONLY market that cares about benchmark cheating, and it's the only market where people might know.

2. The enthusiast market is 2% at best.

3. Many manufacturers base the video cards they bundle with their systems on benchmark scores.

4. The OEM market is what matters financially, and since they don't care about cheating but do care about benchmarks, the cheats will still be present from every company.

Hanners
07-07-03, 08:40 AM
Originally posted by The Baron
4. The OEM market is what matters financially, and since they don't care about cheating but do care about benchmarks, the cheats will still be present from every company.

Isn't that a bit of a contradiction? I would imagine OEMs want accurate benchmarks to choose products with, and the IHVs' constant cheating prevents them from attaining that. Although they obviously won't want to get their hands dirty in all these cheating fiascos, I'd be very surprised if they weren't taking some notice of current goings-on.

The Baron
07-07-03, 08:42 AM
Originally posted by Hanners
Isn't that a bit of a contradiction? I would imagine OEMs want accurate benchmarks to choose products with. Although they obviously won't want to get their hands dirty in all these cheating fiascos, I'd be very surprised if they weren't taking some notice of current goings-on.
You know, I used to think that. But it's not like you're going to have soccer moms debating shader precision (no offense, Dig ;) ). They want scores as high as possible for the price, so that on the off chance the home user runs a benchmark for whatever reason, the result is as high as possible. Sure, they notice. But I get the feeling they don't care in the least.

gokickrocks
07-07-03, 08:46 AM
Whatever happened to the release of the Det 50s? Weren't they targeted to be out before the start of the whole 3DMark controversy?

Bopple
07-07-03, 08:47 AM
I just don't get it.
While the subject is a poll about nvidia optimizations, this thread has become a silly one - "ATI's AF method is a cheat" (which is apparently false and has already been disproved so many times), and defending nvidia by saying "ATI and others are cheating, so nvidia won't remove its cheats."
It's nvidia who's the biggest cheater. But you ppl say they cheat because others cheat?
Oh, please. :banghead: