Go Back   nV News Forums > Hardware Forums > Benchmarking And Overclocking

Old 11-13-03, 12:01 PM   #253
Hanners
Elite Bastard
 
Hanners's Avatar
 
Join Date: Jan 2003
Posts: 984
Default

Just a small point, but if you defeat an application specific optimisation in nVidia's drivers, shouldn't it fall back to running through the 'real' compiler instead?

If so, is that the reason why nVidia now 'only' lose 900 points as opposed to about 1000 the last time 3DMark got patched?
__________________
Owner / Editor-in-Chief - Elite Bastards
Hanners is offline   Reply With Quote
Old 11-13-03, 12:14 PM   #254
john19055
 
john19055's Avatar
 
Join Date: Jul 2002
Location: GREENVILLE,TX
Posts: 3,857
Default

I don't see why this is a shock to anybody. Everyone knows that NVIDIA has to use a unified compiler to make certain games run at acceptable framerates. It was already known that they used their own quasi-trilinear filtering and you couldn't make the card do true trilinear on its own. It just shows that they have to use some optimizations to keep games running at acceptable framerates with image quality on par.

Everybody knows they have to make their 3DMark03 score on par with the competition so it will sell cards. The average person doesn't take the time to research which card performs best in the games they'll actually be playing; they just look at the benchmarks, and whatever scores higher must be the best. Why else would they be doing everything possible to make that benchmark score as high as possible?

This sure wasn't news to me. I knew that when FM came out with a new patch for 3DMark03, NVIDIA's score would go down. But as long as your games play alright and the image quality looks good, I'm glad they put out a new compiler which keeps games playing well with little degradation in image quality, if any. People would really be bitching if they chose to do nothing. It's just the way the FX line has to be handled, whether you think that's a good thing or a bad thing. The big question is: when the NV40 (or whatever it ends up being called) comes out, will they still use these optimizations to keep the scores up on the 5900 and the rest of the FX line? Or will we see an even bigger drop in performance in games, and not just in this one benchmark?
__________________
Intel i7-3820+Corsair H-100+Gigabyte X79-UD5+16gigs G.Skill PC1600DDR3+2-ASUS DirectCU II GTX-670 in SLI+Crucial 256g-SSD+1-3Gig Seagate+2- Samsung 1-TB+3-WesternDigtial 640g+LG-12x Blu-Ray Burner+850watt XFX+Antec-P280 case+50" Plasma PM6700+Logitech Mouse+Keyboard+Pioneer VSX-1020+Polk Audio Speakers
john19055 is offline   Reply With Quote
Old 11-13-03, 12:50 PM   #255
Uttar
Registered User
 
Uttar's Avatar
 
Join Date: Aug 2002
Posts: 1,354
Default

Please do not put words into my mouth.
I never said I encouraged NVIDIA's behaviour in 3DMark03. On the contrary. But repeating time and again that NVIDIA should stop doing shader replacements won't result in any worthwhile discussion IMO.

Also:
Quote:
Originally posted by Hanners
Just a small point, but if you defeat an application specific optimisation in nVidia's drivers, shouldn't it fall back to running through the 'real' compiler instead?

If so, is that the reason why nVidia now 'only' lose 900 points as opposed to about 1000 the last time 3DMark got patched?
Yeppers. Although the compiler story is better than a plain "900 as opposed to 1000" comparison suggests.
According to Futuremark's original 3DMark03 audit PDF, the score dropped from 5806 to 4679 on a GFFX 5900U with the 44.03 drivers.
DaveB reported the score going from 6412 to 5538 on a 5950U with the 52.16 drivers.

The first is a 19.4% drop (1127 points off 5806), while the second is a 13.6% drop (874 points off 6412). That means the detection costs roughly 30% less performance than last time. Not amazing, but not bad either!
Please note that two cheats which degraded performance a bit, but drew a great backlash from the community, have been removed since the 44.67 drivers: the sky clip planes and the back-buffer clearing removal.

But note as well that, factoring in the 480/450 clock ratio, the legitimate performance boost over the non-optimized 44.03 result is highly impressive: 5538 compared to a clock-scaled 4991!
There's better, there's worse, as they say, though.
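For the curious, the arithmetic above can be checked with a quick sketch. The scores are the ones quoted in this thread (Futuremark's audit PDF and DaveB's numbers); the 480/450 clock ratio between the 5950U and 5900U is Uttar's assumption, and both drops are computed relative to the unpatched score:

```python
# Scores quoted in the thread: (before patch, after patch).
drivers_4403 = (5806, 4679)   # GFFX 5900U, 44.03 drivers, original audit
drivers_5216 = (6412, 5538)   # GFFX 5950U, 52.16 drivers, DaveB's numbers

def drop_pct(before, after):
    """Percentage lost relative to the unpatched score."""
    return 100.0 * (before - after) / before

old_drop = drop_pct(*drivers_4403)   # drop caused by the earlier patch
new_drop = drop_pct(*drivers_5216)   # drop caused by patch 340

# Scale the non-optimized 44.03 score from the 5900U's 450 MHz up to the
# 5950U's 480 MHz to estimate a "legit" baseline for the 52.16 comparison.
scaled_baseline = drivers_4403[1] * 480 / 450

print(f"44.03 drop: {old_drop:.1f}%")                # ~19.4%
print(f"52.16 drop: {new_drop:.1f}%")                # ~13.6%
print(f"clock-scaled 44.03 baseline: {scaled_baseline:.0f} "
      f"vs patched 52.16 score: {drivers_5216[1]}")  # ~4991 vs 5538
```

The last line is the interesting one: even with the detected replacements defeated, the 52.16 compiler's score beats the clock-scaled non-optimized 44.03 baseline by around 550 points.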

BTW, looking at Wolfbar's statement:
Quote:
According to my information patch 340 disables the GPU compiler. The compiler has to run on the CPU instead resulting in code harder to digest and taking away 20% of the performance.
Replace "GPU compiler" with "hardcoded shader replacements", and "The compiler has to run on the CPU" with "NVIDIA's general-purpose Unified Compiler Technology is used on the CPU instead".
You'll realize his statement made an awful lot more sense than it originally seemed to.
"GPU compiler" would be the BS NVIDIA gave to Gainward and AIBs in general in order to save face.


Uttar
Uttar is offline   Reply With Quote
Old 11-13-03, 01:11 PM   #256
digitalwanderer
 
digitalwanderer's Avatar
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Default

Deja-FREAKING-vu!

I feel like asking if that isn't just hiding their application specific shader replacements under the guise of their new compiler and if that isn't a specific violation of Futuremark's rules.

Weird.
__________________
[SIZE=1][I]"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy[/I] [/SIZE]
digitalwanderer is offline   Reply With Quote
Old 11-13-03, 01:19 PM   #257
NickSpolec
 
NickSpolec's Avatar
 
Join Date: Aug 2002
Location: Burning Inferno, Arizona
Posts: 371
Default

So basically, Nvidia considers any app that disables their replacement shaders to be disabling their compiler?
__________________
Snake's System:

[size=1][list][*]AthlonXP Mobile 2500+ (@2.5ghz, 1.850v)[*]Albatron KX18D PRO (NForce 2 Ultra @227FSB)[*]512MB's OCZ Platinum PC3200 EL (@DDR454, CAS 2.5, 6,3,3)[*]GeForce3 (@230c, 460m)[*]Fortissimo III, Gamesurround 7.1[*]POS Intel 56k Modem (soon to get high speed, though)[*]Maxtor DiamondPlus 9 120GB, ATA133 (8mb)[*]Samsung DVD/CD-RW Combo (52x32x52x16x)[*]Lite-On LTR 16102B (16x8x40x)

[/size][/list]
NickSpolec is offline   Reply With Quote
Old 11-13-03, 01:29 PM   #258
Uttar
Registered User
 
Uttar's Avatar
 
Join Date: Aug 2002
Posts: 1,354
Default

Quote:
Originally posted by NickSpolec
So basically, Nvidia considers any app that disables their replacement shaders to be disabling their compiler?
Don't assume the level of internal communication at NVIDIA is sufficient for that to happen. NVIDIA's Marketing and PR departments love to contradict themselves.

I sincerely wish NVIDIA would stop this before it's too late. This crap ain't worth 900 points. If they want to win the enthusiast community back with the NV40, they shouldn't be pulling this type of crap just three months before launch.


Uttar
Uttar is offline   Reply With Quote
Old 11-13-03, 01:37 PM   #259
Voudoun
Registered User
 
Voudoun's Avatar
 
Join Date: May 2003
Posts: 106
Default

Quote:
Originally posted by NickSpolec
So basically, Nvidia considers any app that disables their replacement shaders to be disabling their compiler?
Well, 'considers' implies careful thinking, and I'm not seeing a whole lot of that at the moment.

If they're going to FUD their competitor, why not also a small software company that exposes what they're doing? After all, 3DMark doesn't talk to the driver directly; it goes straight to DirectX.

Voudoun
Voudoun is offline   Reply With Quote
Old 11-13-03, 01:38 PM   #260
Skuzzy
Bit Bumper
 
Join Date: Aug 2002
Location: Here
Posts: 782
Default

It's not just the enthusiasts, Uttar. Devs are getting pretty bent out of shape about it too. Piss in the cornflakes of too many devs and the NV40 will be just a number.
__________________
Stuff Happenz!
Skuzzy is offline   Reply With Quote

Old 11-13-03, 01:50 PM   #261
Voudoun
Registered User
 
Voudoun's Avatar
 
Join Date: May 2003
Posts: 106
Default

Quote:
Originally posted by Skuzzy
It's not just the enthusiasts, Uttar. Devs are getting pretty bent out of shape about it too. Piss in the cornflakes of too many devs and the NV40 will be just a number.
nVidia: The Way It's Meant To Grind To A Halt.

If true then they've really got a problem, because how many developers are going to want to do what Valve did, and spend so much time optimising for a single manufacturer's cards? How many can afford to?

Voudoun
Voudoun is offline   Reply With Quote
Old 11-13-03, 02:09 PM   #262
Uttar
Registered User
 
Uttar's Avatar
 
Join Date: Aug 2002
Posts: 1,354
Default

Quote:
Originally posted by Skuzzy
It's not just the enthuasists Uttar. Devs are getting pretty bent out of shape about it too. Piss in the cornflakes of too many devs and NV40 will be just a number.
Agreed. I said enthusiasts because NVIDIA PR doesn't deal with developers, and I've never even talked to anyone in NVIDIA's DR department.

What do you mean, "will be just a number"? That it'd never see the light of day? In that case, I'm sorry to announce to you that there's already functional NV40 silicon :P


Uttar
Uttar is offline   Reply With Quote
Old 11-13-03, 02:21 PM   #263
DMA
Zzzleepy
 
DMA's Avatar
 
Join Date: Apr 2003
Location: In bed
Posts: 997
Default

Quote:
Originally posted by Uttar
In that case, I'm sorry to announce to you that there's already functional NV40 silicon :P
See? You know you can't stop giving us hints about new GPUs/VPUs, so please, start updating NFI again!

..at least once a week?

__________________
Gigabyte EP35-DS4 | E6850@3.8GHz |
8800GT@800/2000MHz | 4 GB Corsair 6400|
DMA is offline   Reply With Quote
Old 11-13-03, 02:33 PM   #264
Skuzzy
Bit Bumper
 
Join Date: Aug 2002
Location: Here
Posts: 782
Default

Quote:
Originally posted by Uttar
Agreed. I said enthusiasts because NVIDIA PR doesn't deal with developers, and I've never even talked to anyone in NVIDIA's DR department.

What do you mean, "will be just a number"? That it'd never see the light of day? In that case, I'm sorry to announce to you that there's already functional NV40 silicon :P
Now, now, Uttar, I didn't say it would never see the light of day. I was exaggerating to make a point about the effect this continued BS is going to have on devs. How much of an exaggeration remains to be seen; I hope a lot, but it's not looking that way. This dev couldn't care less about the NV40 right now. If the current practices carry over into the NV40 line, I won't be alone in that stance either.

The DR side of things appears to have its marching orders as well. They avoid saying or admitting things like "If you use the standard coding practice for that, performance will degrade", replacing it with "If you use this fragment, it will perform super! ...Uh, yes, we know it's proprietary."
Now, that's a bit of a summary and not to be quoted as something from NVIDIA.
__________________
Stuff Happenz!
Skuzzy is offline   Reply With Quote
Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.