nV News Forums > Hardware Forums > Benchmarking And Overclocking

Old 05-23-03, 11:18 PM   #49
The Baron
Guest
 
Posts: n/a
Default

Not for a benchmark it doesn't.

If you replace a part of a benchmark with your own creation in order to improve your scores for the benchmark, no matter how crappily the benchmark is coded for whatever architecture you're using, you're corrupting the benchmark. Plain and simple.

For a game? Sure. I'm ALL FOR IT. But for a benchmark? No. Otherwise, you'd have to be dependent on a driver team to optimize for every game on the planet in order to get similar performance.
 
Old 05-23-03, 11:20 PM   #50
jjjayb
Registered User
 
Join Date: Jan 2003
Posts: 101
Default

Quote:
What games do you KNOW are going to use pixel shader 2.0?
I don't know how far into the "future" 3DMark2003 (2006?) is looking, but you have to think about what you are saying. If you were making a game and had a choice between pixel shader 1.x and 2.0 instructions, which ones would you use?
It's the same as when 3DMark2001 came out. What games were out that used DX8 shaders at the time? None. Eventually DX8 games did come out, and what was shown with 3DMark pretty much panned out. Cards that did well with 3DMark2001's shaders do well with DX8 shader games. It's the same situation here.
jjjayb is offline  
Old 05-23-03, 11:29 PM   #51
bloodbob
Registered User
 
Join Date: Oct 2002
Posts: 123
Default

Quote:
Originally posted by CaptNKILL
Ok, so Doom 3 is the next step in graphics, I mean, it pretty much represents the future of graphics AND it uses PS2.0. Few will disagree there.

So...


http://www.anandtech.com/video/showdoc.html?i=1821&p=22


How is 3dMark2K3 an accurate representation of "future" games if the NV35 is CLEARLY the best card for Doom III, yet a "poor" performer in 3dMark2K3?
Umm, Doom 3 doesn't use PS2.0. Actually, Doom 3 doesn't use PS at all because IT IS OPENGL, TWIT.

Next, Doom 3 doesn't use the OpenGL features similar to PS2.0 except in the ARB2 path and Nvidia's special NV extensions. If you go by the official path, not Nvidia's special PS2.0 equivalent, then Doom 3 is about 50% slower on the NV30 compared to the 9700s.
The NV35 beats the R350 in Doom 3 when it uses its special path. I don't know how it affects quality, but good on Nvidia there; other than that it's not good news for them.
__________________
I come from planet viper days so don't call me noob. I own 2 nvidia cards and one ati card so don't call me biased.
bloodbob is offline  
Old 05-23-03, 11:33 PM   #52
CaptNKILL
CUBE
 
CaptNKILL's Avatar
 
Join Date: Jan 2003
Location: PA, USA
Posts: 18,844
Default

Quote:
Originally posted by OICAspork
One last reply before I go home...

a)Doom3 does not use PS2.0... it uses OpenGL, not DX9.
Quote:
Originally posted by bloodbob
Umm doom3 doesn't use PS2.0 actually doom 3 doesn't use PS at all becuase IT IS OPENGL TWIT.
Humus is the one who mentioned Doom 3 when I was talking about PS2.0. I'm not the one who brought it up.
__________________
---- Primary Rig ---- CoolerMaster 690 II Advance - Gigabyte GA-EP45-UD3P - Intel Core 2 Quad Q9550 @ 4.0Ghz + Thermalright Ultra 120 Extreme
6GB DDR2 @ 942Mhz 5-5-5-20 1.9v (2x1Gb Wintec AMPX PC2-8500 & 2x2Gb G.Skill PC2-6400) - EVGA Geforce GTX 470 @ 750/1500/1850 (1.050v)
Sparkle Geforce GTS 250 1Gb Low-Profile (Physx) - Crucial RealSSD C300 64Gb SSD - Seagate 7200.12 500Gb SATA - Seagate 7200.10 320Gb SATA
ASUS VW266H 25.5" LCD - OCZ GameXStream 700W PSU - ASUS Xonar DX - Logitech Z-5500 5.1 Surround - Windows 7 Professional x64
---- HTPC ---- Asus M3A78-EM 780G - AMD Athlon X2 5050e 45W @ 2.6Ghz - 2x2GB Kingston PC2-6400 DDR2 - Sparkle 350W PSU
Seagate 7200.10 320Gb SATA - Seagate 7200.10 250Gb SATA - Athenatech A100BB.350 MicroATX Desktop - Creative X-Fi XtremeMusic

Last edited by CaptNKILL; 05-23-03 at 11:37 PM.
CaptNKILL is offline  
Old 05-23-03, 11:34 PM   #53
bloodbob
Registered User
 
Join Date: Oct 2002
Posts: 123
Default

Quote:
Originally posted by The Baron
Not for a benchmark it doesn't.

If you replace a part of a benchmark with your own creation in order to improve your scores for the benchmark, no matter how crappily the benchmark is coded for whatever architecture you're using, you're corrupting the benchmark. Plain and simple.

For a game? Sure. I'm ALL FOR IT. But for a benchmark? No. Otherwise, you'd have to be dependent on a driver team to optimize for every game on the planet in order to get similar performance.
First, is this cheating?

var a = 1;
var b = 2;
a = a + 1;
a = a + b;

changed by the drivers to:

var a = 1;
a = a + 1;
var b = 2;
a = a + b;

In a game the answer is 100% NO. No quality loss, nothing wrong; it looks exactly the same, just extra speed if the card can run those instructions faster in that order. In a benchmark it is questionable, which is why ATI is removing it.

Nvidia, on the other hand: their output doesn't look the same, therefore the quality can't be compared; it's not how it was meant to look. What if ATI made every pixel shader blank, would you say that's fair and fine?
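To make the distinction concrete, here's a toy sketch (hypothetical code, not anything either driver actually runs): reordering independent instructions leaves the computed result unchanged, while replacing the shader's math outright changes the output.

```python
# Toy illustration of the two kinds of "optimization".

def original():
    # the instruction stream as the benchmark wrote it
    a = 1
    b = 2
    a = a + 1
    a = a + b
    return a

def reordered():
    # ATI-style reorder: same instructions, new order,
    # valid because "b = 2" is independent of "a = a + 1"
    a = 1
    a = a + 1
    b = 2
    a = a + b
    return a

def replaced():
    # hypothetical wholesale replacement with cheaper math:
    # faster, but it no longer computes the same thing
    return 3

print(original(), reordered(), replaced())  # 4 4 3
```

The reorder is provably result-identical; the replacement is only "close enough" by the replacer's own judgment, which is exactly why it can't be compared apples to apples.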
__________________
I come from planet viper days so don't call me noob. I own 2 nvidia cards and one ati card so don't call me biased.
bloodbob is offline  
Old 05-23-03, 11:36 PM   #54
The Baron
Guest
 
Posts: n/a
Default

Quote:
In a benchmark it is questionable that is why ATI are removing it.
In a benchmark, it's a cheat. I'd love it in a game, but 3DMark ain't a game. I applaud ATI for removing it, but I don't get why a lot of people aren't seeing it for what it is--a cheat. It's a cheat surrounded by spin, but it's a cheat nonetheless.

And holy crap, you think that I believe that NVIDIA wasn't cheating? Are you kidding me? NVIDIA is guilty as sin. But so is ATI. And so is Trident. And so is SiS. So I guess the time has come for me to just not really care about 3DMark.... oh wait, that happened a long time ago.
 
Old 05-23-03, 11:43 PM   #55
jjjayb
Registered User
 
Join Date: Jan 2003
Posts: 101
Default

Quote:
How is 3dMark2K3 an accurate representation of "future" games if the NV35 is CLEARLY the best card for Doom III, yet a "poor" performer in 3dMark2K3?
Heh, best at "medium quality" settings in Doom 3. The only site that tested at "high quality" settings actually shows the cards pretty much on par, with the 9800 winning 2 of the 3 tests. And that is with the 9800 only using 128MB of its 256MB of memory. Are you going to pay $500 to play a game at "medium quality" settings?



Quote:
Originally posted by Grrrpoop
Yes, R3x0 is so bad they developed the entire STALKER engine on it before nVidia jumped in to steal the credit. No doubt it will run terribly. Btw that whooshing noise going past your left ear was a clue. I suggest you give chase


Simply because the R300 was faster than Nvidia's fastest card at the time, which was the GeForce4, and because the R300 was a DirectX9 card, the game will use PS2.0/VS2.0 and maybe VS/PS+.
But when they got the NV30, they switched again, because:
1) it was the faster card.
2) they wanted to play with the EXTRAS that the NV30 offers.
Nv40, did you ever stop to think that they are using the NV30 now because they HAVE to? They have to so they can jerry-rig the game to run decently on the NV30. If they leave standard DX9 calls in there, the NV30 will tank.

If you design a game adhering to DX9 specs without tailoring it to any one card, the R300 will run it better. You will need to tweak the engine to get it to run at the same level on an NV30 board. Those "EXTRAS" that the NV30 offers are nothing but EXTRA programming headaches to get it running acceptably.

Same thing with Doom 3. The R300 runs the standard ARB path great, and it will actually use the standard ARB path. When the NV30 runs on the STANDARD ARB path, it is HALF as fast as the R300. It NEEDS a specialized path to run at an acceptable framerate, and the specialized NV30 path IS of lower quality than the standard path. Period. Who has the better engineering again?

So how is the R300 the better engineered card? Because game developers don't have to spend extra time writing specialized paths to get it running at a decent level.

Just think: if Carmack didn't have to take the extra time to develop specialized paths for the NV30, Doom 3 might have been out by now. Of course, the NV30 would run it like hell, so I'm sure you should be glad he's spending the extra time.

Last edited by jjjayb; 05-23-03 at 11:49 PM.
jjjayb is offline  
Old 05-23-03, 11:45 PM   #56
reever2
Registered User
 
reever2's Avatar
 
Join Date: Apr 2003
Posts: 489
Default

Quote:
Originally posted by The Baron
In a benchmark, it's a cheat. I'd love it in a game, but 3DMark ain't a game. I applaud ATI for removing it, but I don't get why a lot of people aren't seeing it for what it is--a cheat. It's a cheat surrounded by spin, but it's a cheat nonetheless.
It's not a cheat... ATI never said it was, and Futuremark never said it was. ATI said it was an optimization, BUT they understand and respect 3DMark and what its purpose is, and they decided to remove the optimization because, in some people's eyes, comparing the non-cheating Nvidia version to the slightly optimized ATI version would not be fair. So they removed it to give an apples-to-apples comparison.
reever2 is offline  

Old 05-23-03, 11:46 PM   #57
jAkUp
eat. sleep. overclock.
 
jAkUp's Avatar
 
Join Date: Dec 2002
Location: Chino, California
Posts: 17,744
Default

I don't know what all this talk is about the NV30 running it like hell... I have an NV30, and I have the D3 alpha demo. The alpha is from a year ago; there was no NV30 around then. The alpha runs great on my system, and runs identically on my friend's system with a Raddy 9700. They are running the same config file... the same everything.
__________________
965xe || evga x58 classified || 3x evga gtx 480 || 6gb g.skill || win7 x64
jAkUp is offline  
Old 05-23-03, 11:47 PM   #58
reever2
Registered User
 
reever2's Avatar
 
Join Date: Apr 2003
Posts: 489
Default

Quote:
Originally posted by The Baron
Not for a benchmark it doesn't.

If you replace a part of a benchmark with your own creation in order to improve your scores for the benchmark, no matter how crappily the benchmark is coded for whatever architecture you're using, you're corrupting the benchmark. Plain and simple.

For a game? Sure. I'm ALL FOR IT. But for a benchmark? No. Otherwise, you'd have to be dependent on a driver team to optimize for every game on the planet in order to get similar performance.
ATI didn't replace anything; they just reordered the instructions to take advantage of their architecture. Nvidia, on the other hand, took the code, threw it out, and put their own code in.

And I agree that it is not good for a benchmark, which is why they are removing it.
reever2 is offline  
Old 05-23-03, 11:48 PM   #59
John Reynolds
Registered User
 
Join Date: Aug 2002
Posts: 365
Default

Quote:
Originally posted by jjjayb
Nv40, you ever stop to think. . . .
You should've stopped right there. You're using logic and common sense on a fanboy who's just going to re-interpret everything he can until it spins positive for his IHV of choice.
John Reynolds is offline  
Old 05-23-03, 11:51 PM   #60
jjjayb
Registered User
 
Join Date: Jan 2003
Posts: 101
Default

Quote:
I don't know what all this talk is about the NV30 running it like hell... I have an NV30, and I have the D3 alpha demo. The alpha is from a year ago; there was no NV30 around then. The alpha runs great on my system, and runs identically on my friend's system with a Raddy 9700. They are running the same config file... the same everything.
Taken from Carmack's own mouth: the NV30 runs the standard ARB path at half the performance of the R300.
jjjayb is offline  
Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.