Old 11-13-03, 12:11 PM   #13
aapo
Registered User
Join Date: May 2003
Location: Finland
Posts: 273

Quote:
Originally posted by The Baron
okay. Kyle is insane. I'll leave it at that. he's just nuts.
Isn't that old news by now? It seems [H]e has no clue what's happening around him, or he just doesn't care. I think we gain nothing by blaming the poor old man for his ignorance...
__________________
no sig.
aapo is offline
Old 11-13-03, 12:13 PM   #14
ChrisW
"I was wrong", said Chris
Join Date: Jul 2002
Location: standing in the corner!
Posts: 620

Quote:
Originally posted by DMA
ATI's compiler stuff added in one of their driver sets (3.5/3.6?) is probably still alive and kicking too
(Dunno if that's the same thing, so don't kill me please.)
An ATI compiler? I don't know anything about it, but I want to know what it is too. If ATI is doing the same thing and calling it a "compiler", then they should be equally chastised.
__________________
AIW 9700 Pro | 1.3GHz CeleronT | 512MB PC133 SDRAM | ECS PCIPAT Mobo (Intel 815EP)
RadLinker/RadClocker
ChrisW is offline
Old 11-13-03, 12:15 PM   #15
vampireuk
**** Holster
Join Date: Mar 2001
Location: The armoury
Posts: 2,813

Quote:
Originally posted by The Baron
okay. Kyle is insane. I'll leave it at that. he's just nuts.
__________________
I put children in microwaves.
vampireuk is offline
Old 11-13-03, 12:22 PM   #16
aapo
Registered User
Join Date: May 2003
Location: Finland
Posts: 273

Quote:
Originally posted by TheTaz
It opens a whole "can of worms". Who's to say one side isn't "paying more money" than the other, swaying the results? Heck, there's already someone on the B3D forums telling Dave that "Dell asked for this patch because they're selling ATi cards now."
Uh-huh, if you've been reading B3D, you should know the relevant changes in the new patch's pixel shaders are register permutations. If nVidia's unified compiler were the real deal, they should have no effect on performance. Actually, the compiler is partly real, because performance only drops ~15%; it would drop a lot more if the compiler were still the old one. The 15% difference comes from the hand-compiled shader replacements (cheating, with slightly different output).

Furthermore, someone at B3D extracted the shaders from the old 3DMark patch 330 and ran them in his own program, then did the same with the patch 340 shaders. There was no performance difference - so the nVidia drivers are doing not only shader detection but also application detection, to make sure only 3DMark03 itself gets the replaced shaders. This is obviously not very nice.
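To make the mechanism concrete, here's a toy sketch of how such a detect-and-replace scheme could work (Python; the shader strings, hashes, and function names are all made up for illustration - nobody outside nVidia has seen the actual driver code):

[code]
import hashlib

# Toy stand-ins: a patch-330 pixel shader, and the patch-340 version of the
# same shader with registers permuted - semantically identical output.
shader_330 = b"mul r0, r1, c0\nadd r0, r0, c1\nmov oC0, r0"
shader_340 = b"mul r2, r1, c0\nadd r2, r2, c1\nmov oC0, r2"

# Hypothetical driver-side detection: fingerprint known benchmark shaders
# by hashing their bytes, and swap in a hand-tuned replacement on a match.
KNOWN_SHADERS = {hashlib.md5(shader_330).hexdigest(): "hand_tuned_replacement"}

def run_shader(shader: bytes) -> str:
    replacement = KNOWN_SHADERS.get(hashlib.md5(shader).hexdigest())
    if replacement is not None:
        return replacement          # hand-tuned cheat path fires
    return "generic_compiler_path"  # fingerprint defeated, ~15% slower

print(run_shader(shader_330))  # hand_tuned_replacement
print(run_shader(shader_340))  # generic_compiler_path
[/code]

A real compiler wouldn't care which register names it gets; only a byte-level fingerprint does.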

EDIT: Some clarifications...
__________________
no sig.
aapo is offline
Old 11-13-03, 12:27 PM   #17
jbirney
Registered User
Join Date: Jul 2002
Posts: 1,430

Quote:
Originally posted by The Baron
3DMark03's relevance is dead now. It was not dead when it came out. We have DX9 games now. We didn't have DX9 games then. So... I dunno what I'm trying to say. But it's probably something.
What? OK, so please give me some indication of how HL2 will run. What about Doom3? Any ideas on Deus Ex 2? SS2? Stalker? True, we have two or three games out now that use DX9. But are you going to say those are the be-all and end-all of DX9 performance? We know that Halo allows the FX to drop down to partial precision. Are all other DX9 games going to do that? That's the issue here. Yes, games are much better for gauging hardware. But we all know that every game is different, and performance in one game != performance in other games. 3DMark is as good as any game we have now at benching DX performance. Again, it's one of many tools you should use, not the be-all and end-all of benchmarks.
jbirney is offline
Old 11-13-03, 12:42 PM   #18
euan
Fully Qualified FanATIc
Join Date: Oct 2002
Location: Glasgow, Scotland.
Posts: 387

Quote:
Originally posted by Uttar
AAARGH!
The compiler is for real, guys. Stop inventing BS - everyone is, and it's just annoying me.

The effects of the compiler cannot be as good as hand-tuned code most of the time, but it's pretty darn good. And developers will have a hard time making anything better than what the compiler does.

Disabling the compiler seems like BS though. It's still on, or their scores would be significantly lower still.


Uttar
Uttar,

Even if the "compiler" is not as good as the hand-crafted stuff, NVIDIA should just take it like mature grown-ups and use the compiler. Then we can all stop having to fight about hand-crafted shaders being cheating, which is exactly what they are (application detection by shader).

If they had used the compiler (assuming it is real), then when FM changed their code, the results would have changed. And if FM changed the code again, the next results would be slightly different (more or less optimal). That may be acceptable to some, i.e. those who accept shader replacement as valid. However, for a code change to make all the results flatline looks like cheating.

If the compiler has only kicked in now that the hand-crafted shaders for 3DMark2003 cannot be used, then it shows us clearly that the compiler is pretty much useless for real-world applications. Hence it is a POS.

Why does changing code disable a compiler??? Because it is not a compiler.
__________________
Sys.txt
euan is offline
Old 11-13-03, 01:53 PM   #19
TheTaz
Registered User
Join Date: Jul 2002
Posts: 621

Quote:
Originally posted by aapo
Uh-huh, if you've been reading B3D, you should know the relevant changes in the new patch's pixel shaders are register permutations. If nVidia's unified compiler were the real deal, they should have no effect on performance. Actually, the compiler is partly real, because performance only drops ~15%; it would drop a lot more if the compiler were still the old one. The 15% difference comes from the hand-compiled shader replacements (cheating, with slightly different output).

Furthermore, someone at B3D extracted the shaders from the old 3DMark patch 330 and ran them in his own program, then did the same with the patch 340 shaders. There was no performance difference - so the nVidia drivers are doing not only shader detection but also application detection, to make sure only 3DMark03 itself gets the replaced shaders. This is obviously not very nice.

EDIT: Some clarifications...
Huh?

Not sure where you're coming from with this (in regard to my statements).

I wasn't claiming that Futuremark isn't legitimately trying to block cheats. I believe they are, and I believe nVidia is cheating.

I'm just saying that as long as they take money from IHVs and OEMs, there are always going to be "political accusations" like the one I mentioned. (That's the can of worms I was referring to.)

Though *I* believe Futuremark is doing its best to stop cheats, others *may* believe something stupid like "Dell threw its weight around because they're selling ATi cards now" - which, in the consumer's eye, may keep 3DMark from being taken as seriously.

That's why I said it's a bad tool; nothing to do with its technology.

Regards,

Taz
TheTaz is offline
Old 11-13-03, 02:00 PM   #20
NickSpolec
Join Date: Aug 2002
Location: Burning Inferno, Arizona
Posts: 371

I especially love this quote from Derek Perez regarding the performance drop in 3DMark03 on the GeForce FXs (with the newest patch):

Quote:
What we expect will happen is that we'll be forced to expend more engineering effort to update our compiler's fingerprinter to be more intelligent, specifically to make it intelligent in its ability to optimize code even when application developers are trying to specifically defeat compilation and optimal code generation.
This just proves that Nvidia cares far more about benchmarks than actual games. How completely, utterly stupid do you have to be to make a comment like this? It was unintentional on Perez's part --- he was trying to make Nvidia the victim once again, a la "Oh, look what FutureMark is going to force us to do... now we have to devote more resources to cheating because they keep disabling our cheats!"


Hey, Nvidia, here's an idea... STOP WORRYING ABOUT 3DMARK AND WORRY MORE ABOUT GAMES.

It is just so infuriating.

Nvidia's like an addict, and the 3DMark score is their crack.
__________________
Snake's System:
- AthlonXP Mobile 2500+ (@2.5GHz, 1.850V)
- Albatron KX18D PRO (nForce2 Ultra @ 227FSB)
- 512MB OCZ Platinum PC3200 EL (@DDR454, CAS 2.5, 6-3-3)
- GeForce3 (@230 core, 460 mem)
- Fortissimo III, Gamesurround 7.1
- POS Intel 56k modem (soon to get high speed, though)
- Maxtor DiamondPlus 9 120GB, ATA133 (8MB)
- Samsung DVD/CD-RW combo (52x32x52x16x)
- Lite-On LTR 16102B (16x8x40x)
NickSpolec is offline

Old 11-13-03, 02:09 PM   #21
digitalwanderer
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944

Quote:
What we expect will happen is that we'll be forced to expend more engineering effort to update our compiler's fingerprinter to be more intelligent, specifically to make it intelligent in its ability to optimize code even when application developers are trying to specifically defeat compilation and optimal code generation.


Please!

I can't believe Kyle didn't call nVidia's BS out for the BS it obviously is.
__________________
"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects, creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
- R. Huddy
digitalwanderer is offline
Old 11-13-03, 02:15 PM   #22
Uttar
Registered User
Join Date: Aug 2002
Posts: 1,354

Quote:
Originally posted by euan
Why does changing code disable a compiler??? Because it is not a compiler.
As I said in the other thread, please don't put words into my mouth - and even less so if you didn't understand what I meant.

The part of NVIDIA's UCT (Unified Compiler Technology) that does shader replacement should never have been called a "compiler", or even a "GPU compiler" as someone from Gainward put it. But PR and Marketing did their job, and that's what we've got now.

FM disables the "shader replacement" part of UCT. The legit parts remain in use. NVIDIA is trying to save face by claiming the patch disables UCT as a whole, which it doesn't.
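To illustrate the distinction with a minimal sketch (Python; canonical register renaming is a generic compiler technique here, not NVIDIA's actual code): a legit pass works on whatever shader it's handed, so FM's register permutations can't disable it - only a fingerprint table goes dark.

[code]
import re

# Toy "legit compiler" pass: canonical register renaming. Because it rewrites
# whatever registers it is given, its output is identical for the patch-330
# shader and the register-permuted patch-340 one. Syntax is illustrative only.
def canonicalize(shader: str) -> str:
    mapping = {}
    def rename(match):
        reg = match.group(0)
        if reg not in mapping:
            mapping[reg] = "r%d" % len(mapping)  # first-seen order: r0, r1, ...
        return mapping[reg]
    return re.sub(r"\br\d+\b", rename, shader)

shader_330 = "mul r0, r1, c0\nadd r0, r0, c1"
shader_340 = "mul r7, r3, c0\nadd r7, r7, c1"  # same math, permuted registers

# Both permutations compile to the same canonical form.
assert canonicalize(shader_330) == canonicalize(shader_340)
print(canonicalize(shader_340))
[/code]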


Uttar
Uttar is offline
Old 11-13-03, 02:39 PM   #23
euan
Fully Qualified FanATIc
Join Date: Oct 2002
Location: Glasgow, Scotland.
Posts: 387

Quote:
Originally posted by Uttar
As I said in the other thread, please don't put words into my mouth - and even less so if you didn't understand what I meant.
I don't know where I was putting words in your mouth. Usually when I say stuff it goes in people's ears.
__________________
Sys.txt
euan is offline
Old 11-13-03, 03:01 PM   #24
StealthHawk
Guest
Posts: n/a

Kyle's ignorance shines through again.

He obviously doesn't understand what a compiler is or how it should work.

Nor has he taken the time to investigate why NVIDIA's scores dropped and ATI's didn't, and what changes the 340 patch brought over the 330 patch. (I'll clue you in, Kyle: NVIDIA PR was truthful - the shaders were being detected and replaced by the "compiler." Ask yourself why scores dropped, because there is no way Futuremark can turn off a unified shading compiler in its true form!)
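If anyone wants to check the application-detection angle themselves, the classic trick (in the spirit of the B3D test and the old quake/quack renaming saga) is to run the same binary under two names and compare. A rough sketch in Python - "shader_bench.exe" is a hypothetical program that draws with the extracted 3DMark03 shaders, not a real tool:

[code]
import shutil
import subprocess
import time

# Time one run of a benchmark executable.
def timed_run(exe):
    start = time.perf_counter()
    subprocess.run([exe], check=True)
    return time.perf_counter() - start

# Same code, two process names: if the driver only fingerprints shaders,
# the times should match; if it also sniffs the executable name, the run
# disguised as 3DMark03 will come out faster.
shutil.copy("shader_bench.exe", "3DMark03.exe")
t_plain = timed_run("shader_bench.exe")
t_disguised = timed_run("3DMark03.exe")
print("plain: %.2fs  disguised: %.2fs" % (t_plain, t_disguised))
[/code]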

NVIDIA lost 15% in GT2, 10% in GT1, and 30% in GT4. Interesting how they glossed over GT4's drop, as I would think that's the most damning.


Well, at least I have to give Kyle some credit: he hasn't started using any other synthetic benchmarks, so at least his stance that all synthetics are worthless hasn't contradicted itself.