Old 11-12-03, 08:47 AM   #169
John Reynolds
Registered User
 
Join Date: Aug 2002
Posts: 365
Default

Quote:
Originally posted by jimmyjames123
I absolutely cannot believe that you are trying to paint yourself as impartial!

Were I to review an Nvidia-based board, I would use the exact same test standards that I have for ATI hardware. It's the reviewers who suddenly change their standards, sometimes from review to review, who should be of more concern.

That said, I will always be vocal about blatant cheating regardless of who's doing it. If that makes me biased, so be it.

Quote:
Last I saw, Intel integrated and NVIDIA GeForce FX 5200 cards were used on most of Dell's configurations, almost certainly being used in the vast majority of the computers that they sold over the past year or so.
You don't know what I was referring to, and I'm not going into details. Let's just say Dell put some smack down earlier this year, and that's one company large enough to make even Nvidia jump (and they did). Besides, read B3D again. Dave just posted that one journalist suggested to him that it's Dell's fault FM updated 3DMark, because all their high-end PCs are using ATI hardware.
John Reynolds is offline   Reply With Quote
Old 11-12-03, 09:07 AM   #170
jbirney
Registered User
 
jbirney's Avatar
 
Join Date: Jul 2002
Posts: 1,430
Default

What is a benchmark supposed to do? It's supposed to give a task, with a certain workload and a certain result (in this case an image with high IQ), and then see who is faster. Any time a company optimizes, it is changing the workload and thus "cheating". It does not matter that games are optimized this way. Why? Benchmark != game. Games are not designed solely to measure performance; benchmarks are. They're two vastly different things. Both can be used to measure certain aspects of performance, both are useful, and yes, games are of more interest here. It does not matter if IQ is changed in a benchmark; it's still "cheating" relative to what that benchmark was supposed to do. Whether 3DMark is a useful tool or not is beside the point. It's sad to see people think it's OK to inflate scores on a benchmark that is still one of the most widely used.
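
To make that concrete, here's a minimal, hypothetical sketch in plain Python (no graphics API, nothing from any real benchmark) of the contract a benchmark relies on: a fixed workload, a result that must match the reference, and only then a timing comparison. Change the workload or the result and the score stops meaning anything.

[code]
import hashlib
import time

def run_fixed_workload(compute):
    """Run the agreed-upon workload and return (elapsed_seconds, result_digest)."""
    workload = list(range(100_000))            # the fixed task every contender must run
    start = time.perf_counter()
    result = compute(workload)                 # implementation under test
    elapsed = time.perf_counter() - start
    digest = hashlib.sha256(str(result).encode()).hexdigest()
    return elapsed, digest

def reference_compute(data):
    # The reference implementation defines the expected output (the "reference render").
    return sum(x * x for x in data)

def score(candidate_compute):
    ref_time, ref_digest = run_fixed_workload(reference_compute)
    cand_time, cand_digest = run_fixed_workload(candidate_compute)
    if cand_digest != ref_digest:
        # A different result means the workload was changed: scores are no longer comparable.
        return "invalid: workload/result was altered"
    return f"valid: {ref_time / cand_time:.2f}x the reference speed"

if __name__ == "__main__":
    print(score(reference_compute))
[/code]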
jbirney is offline   Reply With Quote
Old 11-12-03, 09:15 AM   #171
The Baron
Guest
 
Posts: n/a
Default

I don't get why people are saying, "Well, the NV3x needs special optimizations to reach optimal performance, and 3DMark does not contain such optimizations. So, it's perfectly all right to have NVIDIA optimize for it." No, it's not. 3DMark is measuring (well, in large part) shader performance at whatever the hardware uses as minimum full precision. For ATI, this is FP24; for NVIDIA, FP32. It's the Absolute Worst Case Scenario for NV cards since there are not any _pp hints (well, not as far as I know).

The question I ask you--so what? You recognize this, I recognize this, so WHY DO WE CARE?! It's a SINGLE BENCHMARK. It is not the Omega of 3D Performance Measurements. Recognize 3DMark for what it does--measure performance where DX9 specifications are (or should be) followed, without the use of _pp hints, with PS2.0 shaders. So yes, NVIDIA is at a disadvantage. Oh well. There are probably plenty of other benchmarks where ATI is at a disadvantage.
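
As a rough illustration of why the precision question matters, here's a small numpy sketch of my own (obviously not the actual 3DMark shaders): the same chain of multiply-adds evaluated at FP16-style "partial precision" drifts away from the FP32 "full precision" result as the shader gets longer, which is exactly the trade-off _pp hints expose.

[code]
import numpy as np

def shader_like_chain(dtype, steps=64):
    """Evaluate a chained multiply-add (the bread and butter of PS2.0 shaders)
    at a given floating-point precision. Rounding error accumulates one MAD at a time."""
    x = np.full(1024, 0.123, dtype=dtype)
    scale = np.asarray(1.01, dtype=dtype)
    bias = np.asarray(0.001, dtype=dtype)
    for _ in range(steps):
        x = x * scale + bias
    return x

full_precision = shader_like_chain(np.float32)     # stand-in for FP32 (or FP24 on ATI)
partial_precision = shader_like_chain(np.float16)  # stand-in for _pp / FP16

diff = np.max(np.abs(full_precision - partial_precision.astype(np.float32)))
print(f"max divergence after 64 steps: {diff:.6f}")
[/code]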

If you don't like it, just don't use 3DMark.

(oh, and I'm not an NVidiot, nor am I a fanATIc. don't even try. most people here know me well enough to know that.)
  Reply With Quote
Old 11-12-03, 09:59 AM   #172
Joe DeFuria
Registered User
 
Join Date: Oct 2002
Posts: 236
Default

Quote:
Originally posted by jimmyjames123
My point stands. I don't even know how you can argue about that. The 3dmark tests show that NVIDIA can gain about 15-30% in performance with virtually no difference in image quality.
Which means exactly nothing with respect to real games. The 3DMark tests show that unless nVidia detects the application and has hand-tuned optimizations for that specific app, you're not getting squat.

If their "compiler" were so magical, they wouldn't need to detect the application or rely on empirical data. If nVidia's compiler were genuine, the patch would not impact their scores.
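
To spell out the distinction, here's a purely illustrative Python sketch of my own (no real driver is written like this, and every name in it is made up): a genuine compiler transforms whatever shader it's handed, while an app-detection path looks a recognized shader up in a table of hand-tuned replacements, which is exactly why shuffling the shaders, as build 340 effectively does, makes those gains vanish.

[code]
# Hypothetical illustration of the two approaches being argued about.

HAND_TUNED_REPLACEMENTS = {
    # keyed by a fingerprint of a known benchmark shader, filled in by hand
    "known_3dmark_gt4_shader": "<pre-optimized replacement shader>",
}

def generic_compile(shader_source: str) -> str:
    """A real compiler path: the same transformations apply to any input."""
    # stand-in for register allocation, instruction reordering, etc.
    return shader_source.strip()

def driver_compile(shader_source: str, fingerprint: str) -> str:
    """App-detection path: a recognized shader gets a hand-tuned replacement."""
    if fingerprint in HAND_TUNED_REPLACEMENTS:
        return HAND_TUNED_REPLACEMENTS[fingerprint]   # only known apps benefit
    return generic_compile(shader_source)             # everything else gets the generic path

# After a patch reorders or changes the shaders, the fingerprints no longer match:
print(driver_compile("mov r0, c0", "unknown_shader_after_patch_340"))
[/code]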
Joe DeFuria is offline   Reply With Quote
Old 11-12-03, 10:00 AM   #173
Joe DeFuria
Registered User
 
Join Date: Oct 2002
Posts: 236
Default

Quote:
Originally posted by jimmyjames123
Are you an ATI employee, by any chance? Or just someone who loves to argue semantics on the internet?
No, I'm just someone that prefers

1) The truth be known

so that

2) Consumers make appropriate buying decisions.
Joe DeFuria is offline   Reply With Quote
Old 11-12-03, 10:02 AM   #174
The Baron
Guest
 
Posts: n/a
Default

The fact that it didn't drop as much as it did when Patch 330 came out makes me think the compiler is doing SOMETHING, but that they are still using reduced precision for a lot of shaders.
  Reply With Quote
Old 11-12-03, 10:17 AM   #175
digitalwanderer
 
digitalwanderer's Avatar
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Default

Has anyone done a before/after on a GF4 Ti? I'd be interested to hear if the earlier-generation cards lost performance too. (Heck, has anyone run a GF3 on it?)
__________________
[SIZE=1][I]"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy[/I] [/SIZE]
digitalwanderer is offline   Reply With Quote
Old 11-12-03, 10:27 AM   #176
Uttar
Registered User
 
Uttar's Avatar
 
Join Date: Aug 2002
Posts: 1,354
Default

Quote:
Originally posted by The Baron
The fact that it didn't drop as much as it did when Patch 330 came out makes me think the compiler is doing SOMETHING, but that they are still using reduced precision for a lot of shaders.
How many times will I have to say it?

NVIDIA's compiler technology is excellent. It does a great job at what it's supposed to do: reducing register usage through swizzles and other tricks, as well as reordering instructions.

You don't really know how good the compiler is until you've tried to get some performance numbers for register usage and suddenly realized you've been way too naive: the compiler has already optimized your program, and it runs at full speed.

I've had it happen to me, and it certainly was an astonishing sight.
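
For what it's worth, the kind of win being described can be shown with a toy example (mine, nothing to do with NVIDIA's actual compiler): reordering independent instructions so each intermediate value is consumed as soon as possible lowers the peak number of live temporaries, and on the NV3x fewer live registers means more threads in flight.

[code]
def max_live_temporaries(instructions):
    """Peak number of simultaneously live temporaries in a straight-line program,
    found with a simple backward liveness scan. Inputs that no instruction defines
    are ignored, as they'd live in a separate constant/interpolator register file."""
    defined = {dest for dest, _ in instructions}
    live = {instructions[-1][0]}            # the final result stays live at the end
    peak = len(live)
    for dest, sources in reversed(instructions):
        live.discard(dest)                  # dest is born here, so not live before this point
        live.update(s for s in sources if s in defined)
        peak = max(peak, len(live))
    return peak

# The same computation, two instruction orders. Each instruction is (dest, sources).
naive = [
    ("t0", ["a"]), ("t1", ["b"]), ("t2", ["c"]), ("t3", ["d"]),
    ("s0", ["t0", "t1"]), ("s1", ["t2", "t3"]),
    ("out", ["s0", "s1"]),
]
reordered = [
    ("t0", ["a"]), ("t1", ["b"]), ("s0", ["t0", "t1"]),
    ("t2", ["c"]), ("t3", ["d"]), ("s1", ["t2", "t3"]),
    ("out", ["s0", "s1"]),
]

print("naive order, peak live temps:    ", max_live_temporaries(naive))      # 4
print("reordered order, peak live temps:", max_live_temporaries(reordered))  # 3
[/code]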

---

The Det50s also expose certain hardware FPUs more efficiently, particularly on the NV35 (and the NV36, although I'm not aware of any non-Det50 driver that supports the NV36, though it's certainly possible one exists).

There's ONE thing I'd like to know about the compiler, but it's way too technical and extremely microarchitectural. I'll never have the time nor the will to investigate it myself, but I'll ask some people to check it out if they have the time, because I think it could be interesting if it turned out to be true. Of course, it would only be interesting if you understand it; it's more of a "finesse" thing.


Uttar

P.S.: Of course, even regarding register usage, hand-tuning can be beneficial. It's not something any serious developer would do, though, since the gain could even turn into a loss in the next driver release. The same goes for other tweaks.
Besides choosing an algorithm that minimizes register usage, knowing what can and can't be done in parallel, and using partial precision where possible, there's not much a developer can do anymore to exploit the NV3x better than the standard code paths already do.

Also, why am I saying the compiler is so great even though I admitted earlier that the performance gains are much lower than I had expected? Well, I had personally expected a much dumber compiler but more new units being exposed. It sounds like those new units most likely don't even exist, or, if they do, they won't bring more than a minor performance gain.
Uttar is offline   Reply With Quote

Old 11-12-03, 10:28 AM   #177
Voudoun
Registered User
 
Voudoun's Avatar
 
Join Date: May 2003
Posts: 106
Default

Quote:
Originally posted by digitalwanderer
Has anyone done a before/after on a GF4 Ti? I'd be interested to hear if the earlier-generation cards lost performance too. (Heck, has anyone run a GF3 on it?)
It's my lot to be ignored, obviously. I did it on page 6, halfway down.

Voudoun
Voudoun is offline   Reply With Quote
Old 11-12-03, 10:30 AM   #178
adlai7
Registered User
 
Join Date: Oct 2003
Location: Madison, WI
Posts: 3
Default

Quote:
Originally posted by Joe DeFuria
No, I'm just someone that prefers

1) The truth be known

so that

2) Consumers make appropriate buying decisions.


But how can we as consumers make appropriate buying decisions when we don't have the truth?

All we have is speculation by people who are, at best, hobbyists and enthusiasts.

What I find interesting about the Beyond3D article is that the test that lost the most performance, GT4, showed no image differences between builds 330 and 340. So what did change between the builds?

It would be nice to have a reference render to look at.
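
In the meantime, anyone with both builds can do a crude check themselves. Here's a tiny sketch of my own with Pillow and numpy (the filenames are hypothetical; it assumes you've saved the same frame from each build as a PNG at identical settings) that counts and amplifies per-pixel differences:

[code]
import numpy as np
from PIL import Image

# Hypothetical filenames; capture the same frame at the same settings in each build.
a = np.asarray(Image.open("gt4_build330.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("gt4_build340.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b)
print("pixels that differ:", int(np.count_nonzero(diff.max(axis=2))))
print("largest per-channel difference:", int(diff.max()))

# Save an amplified difference image so subtle changes become visible.
Image.fromarray(np.clip(diff * 8, 0, 255).astype(np.uint8)).save("gt4_diff_x8.png")
[/code]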
adlai7 is offline   Reply With Quote
Old 11-12-03, 10:32 AM   #179
John Reynolds
Registered User
 
Join Date: Aug 2002
Posts: 365
Default

Quote:
Originally posted by adlai7
What I find interesting about the Beyond3D article is that the test that lost the most performance, GT4, showed no image differences between builds 330 and 340. So what did change between the builds?
There were noticeable differences, just nothing horribly glaring.
John Reynolds is offline   Reply With Quote
Old 11-12-03, 10:33 AM   #180
DMA
Zzzleepy
 
DMA's Avatar
 
Join Date: Apr 2003
Location: In bed
Posts: 997
Default

Quote:
Originally posted by scott123


If you notice, 3DMark03 is hardly used anymore on most review sites.
Well, the ATI-biased sites are gonna use it again in their next reviews, for sure.

The perfect benchmark suite for the ATI-biased reviewer:

Mafia
Tomb Raider
BF 1942
3DMark-03 Build 340
Max Payne 2
UT2003

Of course, you run all the tests at high res with AA/AF cranked up as high as possible. (IMPORTANT: Not one single test, except 3DM-03 of course, can be run without at least 4xAA/8xAF! NV might do better otherwise.)

And throw in a good old Q3 bench just to give NV one win, 'cause we don't wanna seem biased, right?
__________________
Gigabyte EP35-DS4 | E6850@3.8GHz |
8800GT@800/2000MHz | 4 GB Corsair 6400|
DMA is offline   Reply With Quote