Old 11-11-03, 04:55 PM   #109
lukar
 
Join Date: Jan 2003
Posts: 163
Default

Quote:
Originally posted by ChrisW
You guys are assuming that ATI can't also gain performance from optimizing for 3DMark03. Do you really think ATI can't also gain performance by replacing shaders that look "close enough" or some other things?
You're right. ATI could replace the shaders and gain up to 1000 points if they wanted to, and leave Nvidia in the dust!
But that's not the point, and ATI knows that. Their hardware proves the superiority of their ideas and technical skills over Nvidia's.

Well, I guess Nvidia has to release a new driver set and withdraw their new yearly driver policy lol

They are really retarded...
lukar is offline   Reply With Quote
Old 11-11-03, 04:56 PM   #110
Rogozhin
Registered User
 
Rogozhin's Avatar
 
Join Date: Jul 2002
Location: oregon
Posts: 826
Default

The whole reason for Nvidia to cheat in 3DMark03 is that it is the benchmark that holds the most impact across all forms of advertising.

Their sole intent is to inflate scores (these scores get published all over the world) so that Joe Sixpack associates Nvidia with superior benchmark numbers and buys Nvidia's card.

This has nothing to do with power users who know better. I run a small computer business, and people almost always tell me they want an Nvidia card because they've heard or read that they are fast in "this 3dbenchmark2000 thingy."

And I have to explain what is going on and why the scores aren't valid.

It's just spurious business, and it's something that really irks me as a small business owner (coffee house and computer shop).

rogo
Rogozhin is offline   Reply With Quote
Old 11-11-03, 05:26 PM   #111
StealthHawk
Guest
 
Posts: n/a
Default

Quote:
Originally posted by Ady
That's a good point. I thought it used to kick out some awful PS2.0 scores with older drivers, or am I just remembering the horrid NV30 PS2.0 performance?

Looking at the results at Nordic Hardware, it seems the 5950 doesn't have a problem beating the 9800 Pro in PS2.0 performance.
You're not imagining things. You can see a whole bunch of NVIDIA 3DMark03 scores in this post I made a while ago.

I only wonder whether 3DMark really caught and disabled all the detection with the 340 build. When they introduced the 330 build, it didn't catch everything.
  Reply With Quote
Old 11-11-03, 05:32 PM   #112
Voudoun
Registered User
 
Voudoun's Avatar
 
Join Date: May 2003
Posts: 106
Default

Quote:
Originally posted by digitalwanderer
A much healthier attitude, you'll live longer.

Trust me.
Yes, embrace the power of apathy. If you can be bothered.

I have an XP2000 (mildly overclocked via system acceleration mode) and a Ti4600 on the 44.03 drivers. I ran it pre- and post-patch, and my score dropped from 1816 to 1751. Insignificant, really. However, due to my setup I couldn't run GT4, Pixel Shader 2.0, or sound test 3 (don't know why on that one; I don't have the latest drivers installed for the Audigy yet). That seems to confirm that the optimisation is in the shaders.

Thought this might help isolate the problem still further.

Voudoun

Last edited by Voudoun; 11-12-03 at 10:33 AM.
Voudoun is offline   Reply With Quote
Old 11-11-03, 05:36 PM   #113
StealthHawk
Guest
 
Posts: n/a
Default

Quote:
Originally posted by Nutty
I see what you're getting at, but it's not valid anymore. _ALL_ future games will have their shaders tinkered with by a much more complex compiler than we currently have. This will be true for all IHVs, not just the ones that ball up their hardware. Putting out a patch that deliberately circumvents these optimizations is wrong.
If the driver's shader compiler were "inflating" 3DMark scores (correctly and fairly), then a new patch shouldn't lower them.

If you look at the history of 3DMark scores with NVIDIA drivers, I think the obvious answer is that application-specific optimization in the drivers is still occurring, and the shader compiler is not responsible for their performance in any meaningful way. The scores have been quite static all along, build 330 lowering scores on certain drivers notwithstanding.
  Reply With Quote
Old 11-11-03, 05:37 PM   #114
ChrisW
"I was wrong", said Chris
 
Join Date: Jul 2002
Location: standing in the corner!
Posts: 620
Default

I remember there was talk of ATI replacing all the shaders in 3DMark03 with ATI logos, but that idea was scrapped after everyone threatened to protest against ATI for doing it. Personally, I wanted them to do it, as I thought it would be funny.
__________________
AIW 9700 Pro | 1.3GHz CeleronT | 512MB PC133 SDRAM | ECS PCIPAT Mobo (Intel 815EP)
RadLinker/RadClocker
ChrisW is offline   Reply With Quote
Old 11-11-03, 06:05 PM   #115
lukar
 
Join Date: Jan 2003
Posts: 163
Default

Nvidia will update their drivers now? Right?

But they established a new driver policy, which says that driver updates come only once or twice per year. Right?

We shouldn't set a new policy

We hate Futuremark, hate hate hate

What the hell... make it an 8000 score in 3DMark03 now

Futuremark response:

Do not even think about releasing patch 350
We own you....

The Futuremark team is confused...

ATI folks
lukar is offline   Reply With Quote
Old 11-11-03, 06:13 PM   #116
Nutty
Sittin in the Sun
 
Nutty's Avatar
 
Join Date: Jul 2002
Location: United Kingdom
Posts: 1,835
Send a message via MSN to Nutty
Default

Quote:
Exactly right - and if the task is 'run this piece of code, which I wrote in a standard programming language' then the driver/compiler/processor should do just that. It shouldn't decide that 16 or 12 bits is enough precision. That's not what I asked it to do.
No, it's not what you asked it to do, but if the end result is the same, does it matter how it got there?

I can write something like this in C:
Code:
{
     /* j is assigned inside the loop but never read afterwards,
        so the whole block is dead code. */
     int j = 0;
     for (int i = 0; i < 100; i++)
     {
          j = 2;
     }
}
Now if I ask the compiler to compile that into machine code, it will just strip the whole thing out. That's not what I asked it to do, but the block does nothing anyway, so the compiler just removes it.


Quote:
If some enterprising hard drive manufacturer decided that all the reads/writes to a temporary file during a benchmark didn't do any meaningful work (after all, there's nothing left on the drive, is there?) and decided to just skip the whole thing and report 'done', would we call it a cheat, or congratulate them for an aggressive optimization?
It's not exactly the same, is it? NV aren't skipping everything; they're just trying to get to the final result a bit more easily.

And in actual fact, what you said _does_ happen. With write caching, if I write a small file to disk and then immediately delete it, provided there was space in the cache, the file would never have touched the HD, thanks to the optimizations brought about by the cache.
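
Something like this, roughly (just a sketch; the file name is made up, and whether the bytes ever reach the platter depends entirely on the OS and its cache):
Code:
#include <stdio.h>

int main(void)
{
    char buf[4096] = {0};   /* small payload that fits comfortably in the cache */

    FILE *f = fopen("scratch.tmp", "wb");
    if (!f)
        return 1;
    fwrite(buf, 1, sizeof(buf), f);   /* the write completes against the OS cache */
    fclose(f);

    /* Deleted before the cache is flushed, so these bytes may never touch
       the physical disk, yet every call above reported success. */
    remove("scratch.tmp");
    return 0;
}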

Quote:
If I benchmark MP3 encoding on a CPU, is performing the encoding at 128kbps instead of the requested 256kbps acceptable if I can't hear the difference? Personally, I think not.
Again, it's not the same thing, as the end result is different, whereas the end result with NV's drivers is not.

Quote:
It's all about equal work. Who cares if the work being requested is inefficient? Just do the work (all of it) in the most efficient way you can. No sweeping things under the rug. No cutting corners. Just do it.
But what if you can't do it in the most efficient way without changing it? Are you gonna just let your product look like crap, or change it so the result is the same but shows your product in a better light?

You see, for a race to be fair, the competitors have to have equal capabilities, and this is the problem. One IHV designed the hardware one way, another the other way. I'm not saying this was good on NV's part; it wasn't. But the fact is they're stuck with it. The hardware is out there, and the only thing they can do is play to its strengths.

What if I write a benchmark that does 100 sin functions in the fragment program? We all know NV's sin function is about 4 times faster, so would that be fair?
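
(To make that concrete, here's a rough CPU-side sketch of such a workload. It's only an illustration, not an actual fragment program, and all the numbers are made up; compile with -lm.)
Code:
#include <math.h>
#include <stdio.h>
#include <time.h>

int main(void)
{
    const int pixels = 1 << 20;     /* pretend each iteration is one fragment */
    volatile float sink = 0.0f;     /* keeps the compiler from stripping the loop */
    clock_t start = clock();

    for (int p = 0; p < pixels; p++)
    {
        float x = (float)p * 0.001f;
        for (int i = 0; i < 100; i++)   /* 100 sin evaluations per "fragment" */
            x = sinf(x);
        sink += x;
    }

    double secs = (double)(clock() - start) / CLOCKS_PER_SEC;
    printf("score: %.2f Mfrag/s\n", (double)pixels / secs / 1e6);
    return 0;
}
The "score" here is dominated by a single operation, so whichever chip (or libm) evaluates sin fastest wins, regardless of how the two compare at anything else.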

Quote:
Is this too much to ask?
Depends on what exactly you're asking. If you're asking for a true apples-to-apples benchmarking system, then probably yes.

I'm a tired devil's advocate now... I'm off to bed. I'll argue more tomorrow.
Nutty is offline   Reply With Quote

Old 11-11-03, 06:16 PM   #117
StealthHawk
Guest
 
Posts: n/a
Default

Quote:
Originally posted by Nutty
Again, it's not the same thing, as the end result is different, whereas the end result with NV's drivers is not.
What about the algorithm replacement they are doing in GT4, for example? Replacing the real algorithm with an approximation (they did this in 3DMark2001 too). Surely the end result there is clearly "different."
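
(For what it's worth, a generic illustration of the point; the polynomial below is an arbitrary stand-in, not whatever was actually substituted in GT4.)
Code:
#include <math.h>
#include <stdio.h>

/* A crude low-order approximation of sin(x), standing in for "the real
   algorithm replaced by something cheaper". */
static float approx_sin(float x)
{
    return x - (x * x * x) / 6.0f;
}

int main(void)
{
    float max_err = 0.0f;
    for (int i = 0; i <= 1000; i++)
    {
        float x = -3.14159f + (float)i * (2.0f * 3.14159f / 1000.0f);
        float err = fabsf(sinf(x) - approx_sin(x));
        if (err > max_err)
            max_err = err;
    }
    /* The per-sample error is small, but the output is measurably not
       what the benchmark asked for. */
    printf("max error over [-pi, pi]: %f\n", max_err);
    return 0;
}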
  Reply With Quote
Old 11-11-03, 06:25 PM   #118
Ruined
Registered User
 
Ruined's Avatar
 
Join Date: Jul 2003
Posts: 2,447
Default

This does not at all prove Nvidia is cheating. It proves that Nvidia was using application-specific optimizations for 3DMark03 that may have been perfectly legit.

And everyone and their mother knows that NV3x cards need application-specific optimizations for DX9 games that lack partial-precision hints or have complex shaders that were not optimized for the NV3x architecture.
__________________
We're all in it together.

Intel Core 2 Quad Q6700 2.66GHz CPU | Intel G965WH mobo | 8GB (4x2GB) DDR2-667mhz CAS5 RAM (1066MHz FSB) | BFG GeForce 285 GTX OC 1GB | Dell E228WFP 22" DVI-HDCP LCD Monitor | 1TB Western Digital RE3 SATA2 Main Drive | 500GBx2 Western Digital RE3 SATA2 Scratch Drives in RAID0 | Western Digital RE3 1TB SATA2 Media Drive | External 2TB Western Digital MyBook Backup Drive | Adaptec eSATA 3.0gbps PCI-E interface | Sandisk External 12-in-1 Flash Card Reader | LG GGC-H20L HD DVD/BD reader, DVD writer | LG GGW-H20L HD DVD/BD reader, DVD/BD writer | Microsoft E4000 Ergonomic Keyboard | Logitech Trackman Wheel | Antec P182 ATX Case | Thermaltake ToughPower XT 850w modular PSU | KRK RP-8 Rokit Studio Monitors | Windows Vista Ultimate x64
Ruined is offline   Reply With Quote
Old 11-11-03, 07:05 PM   #119
digitalwanderer
 
digitalwanderer's Avatar
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Default

Quote:
Originally posted by Nutty
But what if you cant do it in the most efficient way without changing it? Are you gonna just let your product look like crap, or change it, so the result is the same, but shows your product in a better light.
According to that kind of logic, there is nothing wrong with inserting clip planes, since they won't alter the output.
__________________
[SIZE=1][I]"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy[/I] [/SIZE]
digitalwanderer is offline   Reply With Quote
Old 11-11-03, 07:17 PM   #120
Razor04
Registered User
 
Join Date: Jan 2003
Posts: 205
Default

I would just like to take a minute to remind all of you that this is a benchmark.

Some of you have suggested that these may in fact be valid optimizations to improve performance on the NV3X. There is one problem with that: this is a benchmark, where each card is supposed to take the same path and do the same amount of work. Any time you leave that path (i.e., with application-specific optimizations) you are making things easier for your card and defeating the benchmark's purpose, which is to compare two or more cards running the same path.

And for those of you who say that 3DMark isn't representative of DX9 performance: I think it is, and it has been shown time and time again that the NV3X blows at DX9. The same people who suggest this also suggest using real games to benchmark. But it has been shown that NV will lower quality globally (as in the case of AF) to thwart people who don't use the default benchmark paths, while those who do use the default paths are benching something that is completely inaccurate with regard to in-game performance (*cough* UT2003 *cough*). So using a very popular game is only useful when they haven't made negative global optimizations, or when you aren't using a set benchmark path.

I am not saying that real-world game benchmarks aren't useful; they are. I am saying that synthetic benchmarks have their place in the world too.
Razor04 is offline   Reply With Quote