nV News Forums > Hardware Forums > Benchmarking And Overclocking
Old 02-12-03, 04:58 PM   #49
Skuzzy
Bit Bumper
 
Join Date: Aug 2002
Location: Here
Posts: 782
Default

Well Stealth... if games use PS 1.4 shaders, then yes, all current NVIDIA cards except the NV30 ones will either perform pretty poorly or look pretty poor, depending on how the developer chooses to handle the problem.

There is a compelling reason to use 1.4 pixel shaders: the model is completely forward compatible with 2.0 and later, which keeps us dev guys from reinventing the wheel all the time.

I have about 40 or so pixel shader routines in 1.4 that will find their way into my software. What I am doing is disabling them when I detect a card that does not support them, rather than trying to do the work in software.
Doing it on the CPU side means a heck of a lot more code, and that creates a major performance bottleneck.

While 1.4 shaders are not a huge part of the market today, we dev guys are working on stuff for a year, or more, from now. ATI's 8500 and all later cards support 1.4. The NV3x and later lines appear to support 2.0+, so 1.4 will work for them.
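The detect-and-disable approach described above can be sketched roughly like this. This is a hypothetical illustration, not Skuzzy's actual code; the caps dictionary and effect table are invented stand-ins for whatever the real engine queries from the driver. The key point it shows is that gating effects on "reported PS version >= 1.4" automatically picks up 2.0+ hardware, since those parts run 1.4 shaders too.

```python
# Hypothetical sketch: enable each effect only if the reported pixel
# shader version meets that effect's minimum. Caps values are invented.

def ps_version(caps):
    """Extract a (major, minor) pixel shader version from a caps dict."""
    return caps.get("pixel_shader_version", (0, 0))

def select_effects(caps, effects):
    """Keep only the effects the reported hardware can run."""
    return [e for e in effects if ps_version(caps) >= e["min_ps"]]

# Invented example hardware caps (version numbers match the thread).
radeon_8500 = {"pixel_shader_version": (1, 4)}
geforce4 = {"pixel_shader_version": (1, 3)}

effects = [
    {"name": "per_pixel_specular", "min_ps": (1, 4)},  # a 1.4 routine
    {"name": "base_texturing", "min_ps": (0, 0)},      # always available
]

print([e["name"] for e in select_effects(radeon_8500, effects)])
print([e["name"] for e in select_effects(geforce4, effects)])
```

Because the comparison is ">=", a hypothetical 2.0 part passes the same gate with no extra code paths, which is the forward-compatibility argument made above.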
Skuzzy is offline   Reply With Quote
Old 02-12-03, 05:11 PM   #50
StealthHawk
Guest
 
Posts: n/a
Default

i have heard that the r8500 performs poorly on 3dmark03. it supports PS1.4. if this is true, how do you explain the bad performance? surely it is not purely attributable to the use of PS1.4 vs PS1.1?
  Reply With Quote
Old 02-12-03, 05:50 PM   #51
Nutty
Sittin in the Sun
 
Nutty's Avatar
 
Join Date: Jul 2002
Location: United Kingdom
Posts: 1,835
Send a message via MSN to Nutty
Default

Quote:
even when games finally start using Shaders will the game you play with a gf3 or gf4 match the abysmal performance seen in 3dmark03?
It wasn't coded to be fast; it was coded to put as much pressure on the graphics system as possible. Hence the utterly atrocious performance of the sections that used so-called "Doom3 shadowing", i.e. the space station and the one with the two ogres.

The shadowing system used by these two demos was grossly inefficient, but it did the job of stressing the graphics card.

But I have to agree with NVIDIA. It _isn't_ representative of games now, _NOR_ of games in the future, as no 3D engine coder worth his salt would code something as inefficient as that.
Nutty is offline   Reply With Quote
Old 02-12-03, 05:59 PM   #52
Skuzzy
Bit Bumper
 
Join Date: Aug 2002
Location: Here
Posts: 782
Default

The 8500 architecture is not up to the task. In my shader tests, I see it losing about 15-20% performance even with pretty small shaders.
Combine that with the bazillion polys 3dmark pushes in most of those tests and the 8500 is going to be murdered. Its vertex engine is not up to the task.

However, the 9xxx series seems to be much better at it.

For me as a developer, 1.4 pixel shaders are the baseline to work with. I simply will not attempt to port my shaders to the 1.3, 1.2, and 1.1 PS levels. That is too much work and code, and the performance of the previous PS versions was pretty poor anyway.

I don't know of any other developers using older shader versions for anything serious right now. I know a couple of games using them today, but they were designed before 1.4 became a known quantity.

In house, I had to write a synthetic test for shaders to find out the ramifications of using them versus using the CPU to accomplish the same task. Overall, the 1.4 shaders are a win on the 8500, if you are careful about vertex/prim counts.
But in a heavy vertex/poly count scene, the 8500 will die when using any PS shader.
Skuzzy is offline   Reply With Quote
Old 02-12-03, 10:04 PM   #53
scott123
Registered User
 
scott123's Avatar
 
Join Date: Sep 2002
Location: USA
Posts: 473
Default

Here's the problem with 3dmark 2003.

Futuremark calls it "The gamers benchmark".

Well here is the flaw.

2000+/9700 Pro = 4550

2800+/Ti4600 = 2200

Yet in actual gaming the 2800+/Ti4600 combo will overall outperform the 2000+/9700 combo. So why is the 2000+/9700 score more than double that of the 2800+/Ti4600?

Well, Futuremark says their benchmark is heavily GPU dependent, and is essentially a graphics card benchmark.

That's where the benchmark loses validity as an actual "gaming" benchmark.

Nvidia recognized this deficiency back in December and pulled out of Futuremark's beta program. I am not brand loyal, but in this case I agree with Nvidia on this issue.

Games that run D3D/OpenGL and come with their own benchmarks (i.e. Unreal, Unreal Tournament, Quake 3, etc.) are the best measure of performance.

Futuremark's latest approach looks at a narrow slice of the hardware. PCs are not Gamecubes, and last time I checked, there was a lot more to making a PC fast than just the VGA card.

3dmark is no longer a valid benchmark, and they did it to themselves.
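The disagreement above comes down to weighting. A toy model (with made-up relative performance numbers, not real benchmark data) shows how a composite score that weights graphics tests almost exclusively can rank two systems in the opposite order from a CPU-sensitive game benchmark:

```python
# Illustrative only: composite score as a weighted blend of CPU and GPU
# performance. The perf numbers below are invented for the example.

def composite_score(cpu_perf, gpu_perf, gpu_weight):
    """Blend CPU and GPU performance; gpu_weight in [0, 1]."""
    return cpu_perf * (1.0 - gpu_weight) + gpu_perf * gpu_weight

slow_cpu_fast_gpu = {"cpu": 60, "gpu": 100}   # e.g. a 2000+/9700 Pro box
fast_cpu_slow_gpu = {"cpu": 100, "gpu": 45}   # e.g. a 2800+/Ti4600 box

for weight in (0.2, 0.95):  # game-like vs. 3dmark03-like weighting
    a = composite_score(slow_cpu_fast_gpu["cpu"], slow_cpu_fast_gpu["gpu"], weight)
    b = composite_score(fast_cpu_slow_gpu["cpu"], fast_cpu_slow_gpu["gpu"], weight)
    print(f"gpu_weight={weight}: fast-GPU box={a:.1f}, fast-CPU box={b:.1f}")
```

With the low GPU weight the fast-CPU box wins; with the high GPU weight the ranking flips, which is exactly the 9700-vs-Ti4600 complaint.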

Scott
__________________
ASUS P5B Deluxe
Intel Core Duo 6600
eVGA 8800GTX
Creative Xfi
Cooler Master CM Stacker STC-T01
scott123 is offline   Reply With Quote
Old 02-12-03, 10:45 PM   #54
LiquidX
xxxxxxx
 
LiquidX's Avatar
 
Join Date: Jan 2003
Location: xxxxxxxx
Posts: 2,113
Default

Quote:
3dmark is no longer a valid benchmark, and they did it to themselves.
Scott-I agree with every single word in your post.
LiquidX is offline   Reply With Quote
Old 02-12-03, 10:47 PM   #55
creedamd
 
creedamd's Avatar
 
Join Date: Oct 2002
Posts: 597
Default

Quote:
Originally posted by scott123
Here's the problem with 3dmark 2003.

Futuremark calls it "The gamers benchmark".

Well here is the flaw.

2000+/9700 Pro = 4550

2800+/Ti4600 = 2200

Yet in actual gaming the 2800+/Ti4600 combo will overall outperform the 2000+/9700 combo. Then why is the 2000+/9700 combo over double the score of the 2800+/Ti4600 combo?
/snip

It may outperform the 9700, but not in DX9. Sorry, 3dmark01 did the same thing with the GeForce3. Nobody cried then, nor should they now. If you don't like the BM, don't use it.
creedamd is offline   Reply With Quote
Old 02-12-03, 10:55 PM   #56
creedamd
 
creedamd's Avatar
 
Join Date: Oct 2002
Posts: 597
Default

Quote:
Originally posted by StealthHawk
the question on everyone's mind is how far into the future are we looking?

will real games play like that in 1 year's time? 2 years? 3 years?

even when games finally start using Shaders will the game you play with a gf3 or gf4 match the abysmal performance seen in 3dmark03?
Will 3dmark drive the developers to make better graphics in games after seeing that it is possible?

3dmark should be used as a FREE tool to tweak your own system. I think people start worrying about other people's scores and get out of control. I use 3dmark03 on my kid's computer with the Ti4200, and I have found a few tweaks to up the score. It helped that computer run better. It gets a third of the score my 9700 Pro does, but who cares? The cards are in two different leagues.

My point is, I enjoy the eye candy from 3dmark03, I like to show it off to friends (even though it's a slide show), and it's something different and free. If I had paid for it thinking it would be something it is not, then I'd have a gripe. But what's wrong with another cool benchmark/demo?
creedamd is offline   Reply With Quote

Old 02-12-03, 11:27 PM   #57
Bigus Dickus
GF7 FX Ti 12800 SE Ultra
 
Join Date: Jul 2002
Posts: 651
Default

Quote:
Originally posted by scott123
Here's the problem with 3dmark 2003.

Futuremark calls it "The gamers benchmark".

Well here is the flaw.

2000+/9700 Pro = 4550

2800+/Ti4600 = 2200

Yet in actual gaming the 2800+/Ti4600 combo will overall outperform the 2000+/9700 combo. Then why is the 2000+/9700 combo over double the score of the 2800+/Ti4600 combo?
Say what? Turn up the res or enable some AA/AF and the 2800+/Ti4600 combo won't be winning anything but sympathy. So what if the combo gets 258 fps vs. 234 fps at 800x600 with all effects and IQ settings turned to minimum? You use that as the basis for declaring that it "will overall outperform the 2000+/9700 combo"?

__________________
IMO, Mr. Derek Smart is a hypocrite: Only someone who is either (a) lying (b) ashamed of their products (c) just plain ashamed, would hesitate to give out some simple and straightforward information. - Derek Smart, Ph.D.
Bigus Dickus is offline   Reply With Quote
Old 02-12-03, 11:39 PM   #58
StealthHawk
Guest
 
Posts: n/a
Default

Quote:
Originally posted by Nutty
It wasn't coded to be fast, it was coded to put much as much pressure on the graphics system as possible. Hence the utter attrocious performance of the sections that used so-called "doom3 shadowing". i.e. the space station and the one with the 2 ogre's.

The shadowing system done by these two demos was grossly in-efficient, but it did the job of stressing the graphics card.

But I have to agree with nvidia. It _isn't_ representative of games now, _NOR_ games in the future. As no 3d engine coder worth his salt would code something as inefficient as that.
well, i think games can be coded to stress the graphics sub-system as well as perform fast. Doom3's lighting system is a LOT faster than 3dmark03's. hell, this is based on Doom3 alpha version 0.02 or whatever it was
  Reply With Quote
Old 02-12-03, 11:45 PM   #59
StealthHawk
Guest
 
Posts: n/a
Default

Quote:
Originally posted by scott123
Here's the problem with 3dmark 2003.

Futuremark calls it "The gamers benchmark".

Well here is the flaw.

2000+/9700 Pro = 4550

2800+/Ti4600 = 2200

Yet in actual gaming the 2800+/Ti4600 combo will overall outperform the 2000+/9700 combo. Then why is the 2000+/9700 combo over double the score of the 2800+/Ti4600 combo?
/snip
yeah, and people complained that 3dmark2001 was too CPU dependent. now that the new version is video card dependent, people cry foul. i guess you can't make everyone happy.

GT1 in 3dmark03 is a game test that is pretty representative of performance today IMO. you see high framerates, and it's basically a DX7 game except it uses VS instead of fixed function T&L.

GT2-3 are shader tests. they should test shaders. they shouldn't be CPU dependent.

GT4 is a DX9 shader test i guess, i can't run it.

a lot of people rightfully were mad that the Nature test in 3dmark2001, called a shader test, was more CPU limited than card limited. so basically the premise of that test told you absolutely zero about the functions the test was supposedly stressing.

then again, it seems that FutureMark went from one extreme (heavily CPU limited) to the other (heavily video card limited). neither is very good IMO.
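The CPU-limited vs. card-limited distinction being argued here has a common sanity check (not something from Futuremark, just a standard reviewer's trick): rerun the test at a higher resolution. A GPU-limited test slows down roughly with pixel count, while a CPU-limited test barely moves. A minimal sketch, with an arbitrarily chosen 10% threshold:

```python
# Classify a test as CPU- or GPU-limited from two framerate samples.
# The 10% tolerance is an arbitrary illustrative threshold.

def classify(fps_low_res, fps_high_res, tolerance=0.10):
    """Label a test by how much framerate drops when resolution rises."""
    drop = 1.0 - fps_high_res / fps_low_res
    return "cpu-limited" if drop < tolerance else "gpu-limited"

print(classify(120.0, 115.0))  # framerate barely moves with resolution
print(classify(120.0, 60.0))   # framerate halves with resolution
```

By this measure, the old Nature test criticized above would come out "cpu-limited" even though it was billed as a shader test.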
  Reply With Quote
Old 02-13-03, 12:33 AM   #60
Chalnoth
Registered User
 
Join Date: Jul 2002
Posts: 1,293
Default

Quote:
Originally posted by StealthHawk
yeah, and people complained about 3dmark2001 saying that it was too CPU dependent. now that the new version is video card dependent people cry foul. i guess you can't make everyone happy.
The only problem is that games today seem to be much more CPU dependent than they were in 3DMark2001's time.

Quote:
GT1 in 3dmark03 is a game test that is pretty representative of performance today IMO. you see high framerates, and it's basically a DX7 game except it uses VS instead of fixed function T&L.
Not so sure about that. It's a flight sim test. There's no way it translates to a test of "today's games"; at best, it's a test of "today's high-end flight sims."

Quote:
GT2-3 are shader tests. they should test shaders. they shouldn't be CPU dependent.
But they're still more GPU-dependent than a similar real modern game test, because other calculations are not being done (namely AI).
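A toy frame-time model (illustrative numbers only, not measured data) makes this concrete: per-frame CPU work (AI, game logic) and GPU work largely overlap, so the slower side sets the frame time. Dropping the CPU-side work, as a synthetic benchmark does, shifts the bottleneck entirely onto the GPU:

```python
# Toy model: assume ideal CPU/GPU overlap, so frame time is set by
# whichever side takes longer. All millisecond figures are invented.

def frame_ms(cpu_ms, gpu_ms):
    """Frame time under ideal overlap of CPU and GPU work."""
    return max(cpu_ms, gpu_ms)

# A real game: 12 ms of AI/logic per frame, 10 ms of rendering.
game = frame_ms(cpu_ms=12.0, gpu_ms=10.0)      # CPU-bound frame

# The same rendering load with almost no AI/logic, as in a synthetic test.
synthetic = frame_ms(cpu_ms=1.0, gpu_ms=10.0)  # GPU-bound frame

print(game, synthetic)
```

In the game case a faster card changes nothing until the GPU side drops below the CPU side; in the synthetic case the card is the whole story, which is why the two disagree about the same hardware.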

Regardless, my objection still stands, first and foremost: this is a synthetic benchmark, no matter how much Futuremark tries to make it seem like a real game benchmark.
__________________
"Physics is like sex. Sure, it may give some practical results, but that's not why we do it." - Richard P. Feynman
Chalnoth is offline   Reply With Quote
Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.