Old 11-12-03, 11:25 AM   #193
Hanners
Elite Bastard
 
Join Date: Jan 2003
Posts: 984

Quote:
Originally posted by Razor04
Shouldn't any compiler run on a CPU? My understanding is that it takes the HLSL code it is given, transforms it into code that will run on the GPU, and sends it off to the GPU to be processed. Either my understanding is wrong, or somehow a compiler can now be done in hardware on an NV3X GPU.

Now let's have another little hypothetical situation where the compiler runs on the GPU, not the CPU. Shouldn't they be outputting identical code? I would hope so... but then again, is it even possible? (I am a Mechanical Engineer, not a Computer Engineer or Comp Sci major, so please excuse me if I am wrong.)
We aren't talking about where the compiler runs, but about where the actual shaders themselves run. With so much CPU time left to spare in 3DMark03, it would be pretty simple to offload certain shaders (especially vertex shaders) to the CPU, lightening the workload on the GPU and allowing it to concentrate on other things.
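
To illustrate the mechanism (a minimal sketch of standard Direct3D 9, not anything specific to NVIDIA's drivers; the window handle hWnd is assumed to exist already): an application can push its whole vertex pipeline onto the CPU just by creating the device with software vertex processing. A driver doing the equivalent silently, per shader, is the scenario described above.

Code:
// Sketch: a D3D9 device whose vertex shaders run on the CPU.
#include <d3d9.h>

IDirect3DDevice9* CreateSoftwareVPDevice(HWND hWnd)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (d3d == NULL) return NULL;

    D3DPRESENT_PARAMETERS pp = {0};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;

    IDirect3DDevice9* device = NULL;
    // D3DCREATE_SOFTWARE_VERTEXPROCESSING: the D3D runtime executes vertex
    // shaders on the CPU instead of on the GPU's vertex units.
    // (D3DCREATE_MIXED_VERTEXPROCESSING allows switching at runtime.)
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING, &pp, &device);
    return device;
}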
__________________
Owner / Editor-in-Chief - Elite Bastards
Old 11-12-03, 11:33 AM   #194
DMA
Zzzleepy
 
Join Date: Apr 2003
Location: In bed
Posts: 997

Quote:
Originally posted by Hanners
I think all of them except for 3DMark03 and Mafia, off the top of my head. Even Mafia might be one, I don't remember.
Yup.
That's why they're perfect.
- "Even though this title is a so-called 'TWIMTBP' game, ATI kicks NV's ass!!111"

__________________
Gigabyte EP35-DS4 | E6850@3.8GHz |
8800GT@800/2000MHz | 4 GB Corsair 6400|
Old 11-12-03, 11:38 AM   #195
jbirney
Registered User
 
Join Date: Jul 2002
Posts: 1,430

Quote:
Originally posted by The Baron
The question I ask you--so what? You recognize this, I recognize this, so WHY DO WE CARE?! It's a SINGLE BENCHMARK. It is not the Omega of 3D Performance Measurements.....If you don't like it, just don't use 3DMark.
Sad fact is, for all intents and purposes it is the benchmark that many OEMs look at. Granted, they look at others, but a LOT of stock is placed on 3DMark. Now, we both know that's not good for the OEMs to do, but it's what happens out there. If you followed JR's post or the thread on B3D, you can see that this patch was asked for by one large OEM. If 3DMark really were that insignificant a benchmark, then why would nV spend so much time and effort to keep optimizing for it? I mean, they have had to change their optimization methods at least twice on it now, so someone over at NV HQ thinks it's important enough to warrant their time.

I think 3DMark is still a useful tool. It gives some useful info that, when taken with other benchmarks (real games and other synthetics), paints a pretty accurate picture of what a card can do. It's never useful as the only benchmark, nor should you place a large value on any one score...
Old 11-12-03, 11:40 AM   #196
euan
Fully Qualified FanATIc
 
Join Date: Oct 2002
Location: Glasgow, Scotland.
Posts: 387

Quote:
Originally posted by adlai7
But how can we as consumers make appropriate buying decisions when we don't have the truth?

All we have is speculation by people who are, at best, hobbyists and enthusiasts.

What I find interesting about the Beyond3D article is that the test that lost the most performance, GT4, had no image differences between builds 330 and 340. So what did change between the builds?

It would be nice to see a reference render to look at.
This all comes down to the design of the benchmark. The major reason why cheating in benchmarks is a no-no (especially in 3DMark03) is that 3DMark03 artificially creates workload for the graphics device to perform. Think of it as the benchmark telling the GPU to draw a character four times in a row on top of the previous image. In this simplified example, you can see that the easy "optimisation" is to simply draw the character once. Hence there is little to no difference in the final image, but the performance is way up.
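
A toy sketch of that example (my own illustration, nothing to do with 3DMark's real code): the pass count is the assigned workload, and a driver that recognises the scene can quietly collapse it.

Code:
#include <cstdio>

// The benchmark assigns four passes; a "clever" driver draws once.
void drawCharacter(int pass) {
    std::printf("drawing character, pass %d\n", pass);
}

int main() {
    const int  assignedPasses = 4;     // the workload the benchmark dictates
    const bool driverCheats   = false; // flip to model the "optimisation"

    const int passes = driverCheats ? 1 : assignedPasses;
    for (int i = 0; i < passes; ++i)
        drawCharacter(i);
    // The final image is (nearly) identical either way;
    // the measured frame time is not.
    return 0;
}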

Now, many people don't understand why the benchmark deliberately uses artificial workloads and less-than-optimal algorithms for calculating shadows and so on. The simple reason was to develop a benchmark that would be hard work for a GPU without requiring hundreds of MBs of textures and highly complex geometry.

If the GPU companies just did the work as it is given, then the benchmark would scale in line with future games that use efficient, highly optimised algorithms and practices. Sadly, NVIDIA have decided that the 3DMark03 tests can be done with a lot less work and that massive shortcuts can be taken (which they can). But how in the world will that translate to real games in the future? You have seen what they had to do to make Unreal 2003 perform well. Are they going to be expected to do this for every single game in the future?

People will argue that future games aren't here yet (although they are just now appearing), and that the next-gen NV40 or whatever will be around to solve all the problems. Do people expect NVIDIA to continue hand-crafting their optimisations?
__________________
Sys.txt
Old 11-12-03, 11:44 AM   #197
digitalwanderer
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944

Quote:
Originally posted by euan
Do people expect Nvidia to continue hand crafting their optimisations?
Well, at least until the nV40 is commercially available... (Notice that I did not say "released".)
__________________
[SIZE=1][I]"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy[/I] [/SIZE]
Old 11-12-03, 11:48 AM   #198
euan
Fully Qualified FanATIc
 
Join Date: Oct 2002
Location: Glasgow, Scotland.
Posts: 387

Quote:
Originally posted by digitalwanderer
Well, at least until the nV40 is commercially available... (Notice that I did not say "released".)
Do ya like ma new sig?
__________________
Sys.txt
Old 11-12-03, 12:03 PM   #199
Voudoun
Registered User
 
Join Date: May 2003
Posts: 106

Quote:
Originally posted by Uttar

BTW, Voudoun: In what GT are you losing performance? GT2 and GT3 I assume? Or maybe just one of the two?
I had another go at it and, frustratingly, under 330 the Details button is greyed out when the final score comes up, so I can't get an exact comparison! The overall scores were almost identical to my previous test, which suggests the difference between builds is real. This time round I got 1818 and 1753, roughly a 3.6% drop. From a purely visual comparison, I appear to be losing ground in GTs 1, 2, and 3. The trough and peak frame rates in GT1 are both lower, but the frame rate varies so widely it's hard to get a consistent picture. In GTs 2 and 3, which are steadier, the frame rate is consistently lower by a few frames per second. I don't know what to make of this, because GT1 is DX7, isn't it? I could understand them messing with GTs 2, 3, and 4, but why 1? Did something that slipped through previous patches finally get picked up? Any thoughts?

Voudoun
Old 11-12-03, 12:14 PM   #200
Voudoun
Registered User
 
Join Date: May 2003
Posts: 106

Quote:
Originally posted by digitalwanderer
Sorry to hear about the illness and hope you're past the worst of it and recover quickly, bad health SUCKS!

I'm doing faboo. I had my first cardiologist appt Monday (I ditched the last one after waiting an hour and a half; I HATE waiting for doctors!) and the Doc was bloody delighted with my progress. I was given a clean bill-o-health and fully-recovered status, with no follow-up appointment for 3 months. I just have to keep exercising, eating right, taking me meds (4 pills a day, nothing major), and NOT SMOKING, and I'll be good to go.

I'm actually a lot better off than before my heart attack. I had a 95% blockage on one of my valves and my circulation sucked; now that it's opened up, I have a LOT more energy. The only weird thing is my pulse is kind of high from the meds I'm on: I'm averaging about 80bpm compared to a normal 40-50 for me. It's normal and to be expected, the Doc says, but it's just a bit weird to me. (Yeah, I'm a bit more conscientious about my pulse now.)
Great news. You'll soon be up and about again, but no more smoking! Your mind must be a lot sharper, and I imagine your creativity is on a high too. Are the drugs permanent or just for the recovery period? If they're just for recovery, it'll be interesting to see how the clear-out has affected your pulse. You mentioned eating right; have you seen a dietician? I ask because I've lost 27lbs since I started seeing one, and controlled weight loss (if you're overweight) will ease the load on your heart too.

I'll stop mummying you now. Glad to hear you're doing so well. I trust you'll take this the right way, but are you glad this has happened? You sound like you're in better shape now than you have been for years.

Voudoun

Old 11-12-03, 12:39 PM   #201
DMA
Zzzleepy
 
Join Date: Apr 2003
Location: In bed
Posts: 997

Quote:
Originally posted by AnteP
ROFL!
Hey, don't laugh. Thank me for the good tips instead. I promise you, the kids out there reading these pathetic, biased reviews won't even see how bad it is. They'll only see ATI on top and BAM!! Job done.

Well, I'm off. I gotta edit my latest review and try to add some scores from the latest build of 3DMark03.
Haven't used that bench for months, but this is too good to leave out.


Go ATI!!
__________________
Gigabyte EP35-DS4 | E6850@3.8GHz |
8800GT@800/2000MHz | 4 GB Corsair 6400|
Old 11-12-03, 01:02 PM   #202
TheTaz
Registered User
 
Join Date: Jul 2002
Posts: 621

I agree with Euan... some points that people are forgetting.

1) It's a DirectX 9 benchmark. It's meant to show STANDARD-coded DirectX, not IHV-specific paths. (This, of course, does not give an OpenGL picture... so using it as "THE ONLY" benchmark to go by is ridiculous. It also doesn't give a picture of any IHV-specific paths that devs may be using for actual games... and it was NEVER MEANT TO.)

2) Since it's a benchmark, it assigns a SIMULATED workload. It's not intended to reflect "REAL GAME" performance; it's intended to show how much a piece of hardware can do under FUTUREMARK's heavy SIMULATED workload (way heavier than games). A lot of benchmarks use simulated workloads... that's why they are benchmarks, not real applications!

3) In order for this benchmark to accurately depict the capabilities of said hardware, NO SHORTCUTS can be allowed. (As Euan pointed out, where the benchmark purposely hands out extra workload for SIMULATION purposes, taking shortcuts that don't affect the image is STILL a CHEAT in a benchmarking situation, because you are circumventing the assigned workload; see the sketch below.)
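
Here's the sketch, with completely made-up numbers, just to show why "the image looks the same" doesn't validate a score: the score is work over time, so skipping assigned work inflates it even when the output is pixel-identical.

Code:
#include <cstdio>

int main() {
    const double assignedWork = 100.0; // units of work the benchmark assigns
    const double honestTime   = 10.0;  // seconds when all of it is done
    const double cheatTime    = 6.0;   // seconds when 40% is skipped

    // Both runs can produce the same image; the reported rates differ:
    std::printf("honest: %.1f work/s\n", assignedWork / honestTime);
    std::printf("cheat:  %.1f work/s (claimed; 40%% was never done)\n",
                assignedWork / cheatTime);
    return 0;
}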

That said... why are people upset if this isn't meant to reflect real game performance? Because this app is used as a PR tool (by both IHVs and OEMs), and cheating in it (or, more nicely put, circumventing the workload) is equivalent to "consumer fraud".


Let me try to put this in perspective:

My friend has two nForce 2 boards.

One is an older ASUS A7N8X (dual-channel, 333MHz FSB) with an Athlon XP 2700+ Thoroughbred.
The other is a newer ASUS A7N8X-X (single-channel, 400MHz FSB) with an Athlon XP 3000+ Barton.

At stock clocks, SiSoft Sandra shows the older system beating or nearly equalling the newer system in MANY of the tests (the CPUs are clocked nearly the same if you look it up... actually the 2700+ is clocked a *tiny* bit higher). It doesn't reflect gaming performance; it only reflects the *general* hardware capabilities *within SiSoft's rules*. (Most games actually perform a little better on the newer motherboard, due to the extra cache and FSB speed.)

Bottom line... if you *break the rules* of a benchmark... the benchmark is INVALID. Futuremark realizes this, and is attempting to protect the validity of THEIR *ruleset*.

Whether you believe Futuremark's application is a good tool to test with or not doesn't mean ANYTHING. What matters is that it's being used as a PR tool to persuade consumers, and since IHVs and OEMs are using it that way... IT MUST BE UNTAINTED in its "ruleset".

/em steps off the soapbox

Regards,

Taz

Last edited by TheTaz; 11-12-03 at 01:36 PM.
Old 11-12-03, 01:47 PM   #203
ChrisW
"I was wrong", said Chris
 
Join Date: Jul 2002
Location: standing in the corner!
Posts: 620

The question is: which is more reflective of future (DirectX 9) games, before patching to 340 or after? Which shows a closer relative performance difference between the two cards (nVidia and ATI) in games like Tomb Raider (which is already optimized for nVidia cards)? Game after game seems to agree more with the 340-patched version.

EDIT: Has anyone checked their scores with the new 52.70 driver set?
__________________
AIW 9700 Pro | 1.3GHz CeleronT | 512MB PC133 SDRAM | ECS PCIPAT Mobo (Intel 815EP)
RadLinker/RadClocker
Old 11-12-03, 01:49 PM   #204
ChrisRay
Registered User
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101

I'm more concerned about Game Tests 2 and 3 than 4.


I've always read that the vertex skinning in Game Tests 2 and 3 is unrealistic. But how different are the R300's and NV30 series' vertex shaders?
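
For anyone wondering what vertex skinning actually involves, here's a rough CPU-side sketch of the idea (my own toy example with made-up bones and weights, not 3DMark's shader): every vertex is transformed by several bone matrices and the results are blended by weights, which is why skinning-heavy scenes hammer the vertex units.

Code:
#include <cstdio>

struct Vec3 { float x, y, z; };

// Blend one vertex position through a palette of 3x3 bone matrices.
// (Real implementations use 4x3 or 4x4 matrices to include translation.)
Vec3 skinVertex(const Vec3& v, const float bones[][3][3],
                const int* indices, const float* weights, int count) {
    Vec3 out = {0.0f, 0.0f, 0.0f};
    for (int i = 0; i < count; ++i) {
        const float (&m)[3][3] = bones[indices[i]];
        const float w = weights[i];
        out.x += w * (m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z);
        out.y += w * (m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z);
        out.z += w * (m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z);
    }
    return out;
}

int main() {
    // Two bones: identity and a 90-degree rotation about Z.
    const float bones[2][3][3] = {
        {{1,0,0},{0,1,0},{0,0,1}},
        {{0,-1,0},{1,0,0},{0,0,1}},
    };
    const int   indices[2] = {0, 1};
    const float weights[2] = {0.5f, 0.5f}; // must sum to 1
    const Vec3  v = {1.0f, 0.0f, 0.0f};
    const Vec3  s = skinVertex(v, bones, indices, weights, 2);
    std::printf("skinned position: (%g, %g, %g)\n", s.x, s.y, s.z);
    return 0;
}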
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision\/Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members