
View Poll Results: Choose One
NVIDIA's new products (NV31 and NV34) are not DX9 parts and their current line lacks the technology to do well: 62 votes (65.26%)
3DMark03 is a poor benchmark: 18 votes (18.95%)
Both, but more of option number 1: 9 votes (9.47%)
Both, but more of option number 2: 6 votes (6.32%)
Voters: 95

Old 02-17-03, 12:00 AM   #13
kyleb
Registered User
 
Join Date: Jan 2003
Posts: 364
Default

I voted #1 like most so far. I'm not sure about the first part of the statement (not that any of us really knows one way or the other), but the second part is most definitely an issue. After all, not only are the cheaper parts lagging far behind the standards, but we have yet to see confirmation of the NV30 running the bench well with anything but questionable drivers.


as for "the gamer's benchmark" i agree that it can be missleading, but no more missleading than ati saying the "r300 core supports ssaa" or any vast number of examples i could think of. but in a situation like that no one is realy lieing by anymeans and the only way anyone gets mislead in situations like that is when they read too much into it, which is bound to happen. for instance, check out the_matrix's comments starting at near the botom of this is page for an example of an nvidia fan gone way too far:


case in point

I almost felt like signing up as someone brainwashed by ATI and starting an argument about how TruForm was going to rule the world.
Old 02-17-03, 02:12 AM   #14
digitalwanderer
 
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Default

Quote:
Originally posted by Smokey
It also DOES NOT reflect game performance. The 8500 was faster in 3DMark than my GF3, yet in games it wasn't
And my 8500 is faster in both 3dm2k3 & 3dm2k1se than my GF3 (but gets edged out in 3dm2k & futuremark99), and I also prefer my GF3 to game on. Why? Because it's better, smoother, faster, cleaner looking and it plays all the games I like a lot better.

Trouble is, my GF3 is starting to show its age...especially with U2. It's weird, but the newer games (with the exception of GTA3, which is R*'s fault) are starting to run better on my 8500 than my GF3. In NOLF2, SoF2, & UT2k3 I'd really have to say the 8500 might be nudging ahead of my GF3 in the actual gaming department...if you add in AA & AF it'd be a "gimme" to my 8500.

This test is supposed to be reflective of what games are going to be, and in that I don't think ANYONE can tell yet if it is or isn't. It's just a benchmark for now; I'm planning on not taking it too seriously yet.

Well, at least until some games come out that actually justify upgrading my hardware to a level that will make this benchmark less of a slideshow on my system. 'Til then I'm planning on sticking with 3dm2k1se for a while longer as me DX bench of choice. It's a hell of a lot more reflective right now of the games I like to play.
__________________
[SIZE=1][I]"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy[/I] [/SIZE]
Old 02-17-03, 07:46 AM   #15
Smokey
Team Rainbow
 
 
Join Date: Jul 2002
Location: FRANCE
Posts: 2,273
Default

Quote:
Originally posted by Myrmecophagavir
Bingo. I wish they wouldn't advertise it as "the gamer's benchmark", because maybe it's true that games don't use the same techniques (ie. PS 1.4). But if you view 3DMark as a test of what a card is capable of rather than trying to emulate gaming performance, it works better. 8500 can utilise PS 1.4 to get things done more efficiently than GF3, so why shouldn't it "win"?

It's like saying "The 8500 wins in app A, but the GF3 wins in app B, therefore app A is a bad benchmark for the card's performance". It's only that app A's programmers made an effort to take advantage of the 8500's extra features. If game developers would take advantage of it more widely then it would win in more tests!
Sorry, I should have been clearer: I said 3DMark, meaning both 2001 and 2003. My point was that, fine, in 3DMark the 8500 is faster than my GF3, but in games it's not.

Now, digitalwanderer added some comments on this too. I don't have both cards myself, so I'm just going from benchmarks on websites. But from what digitalwanderer said, the 8500 seems to be a bit faster now in newer games; this may have something to do with its faster core/memory speeds? Without looking it up, wasn't the 8500 core clocked at 275? My overclocked GF3 (non-Ti) is clocked at 240, which is good for that core.

Back on topic, I don't think NVIDIA would slam a DX9 benchmark, 3DMark or any other, just because they have new cards coming that don't support DX9.

I haven't been keeping up with the NV31 and NV34, but are they not going to support any new DX9 features?
__________________
HTPC/Gaming
| Hiper Type R 580W PSU
| Intel Q9550 @ 4GHz | Gigabyte EP45 UD3R | 4x 1024MB OCZ Reaper PC9200 DDR2 | Seagate 320GB/ Maxtor 320GB/ Maxtor 500GB HDD | Sapphire HD5850 | Creative SB X-Fi Titanium Pro | Harmon Kardon AVR135 receiver | Jamo S718 speakers | 42" Plasma 720p (lounge room)-Samsung P2450H (bedroom)
Old 02-17-03, 07:56 AM   #16
silence
 
 
Join Date: Jan 2003
Location: Zagreb, Croatia
Posts: 425
Default Re: You've completely missed the point

Quote:
Originally posted by ZoinKs!
The new 3dmark is not made to measure performance in current games. It is made to predict performance in games which will be released 1-2 years from now.

A graphics card producing a high 3dmark03 score will be better in future games than a card with a low 3dmark03 score.

If you want to know performance in current games, then look at framerates in those games. If you're shopping for a new card and want one that will last, then look at 3dmark03 results.
1-2 years from now???... with only ONE PARTIAL DX9 test?? Do I have to remind you that the first test is DX7??

3DMark03 is more like for the stuff we currently have, plus a little peek into the future...
Old 02-17-03, 07:56 AM   #17
jbirney
Registered User
 
 
Join Date: Jul 2002
Posts: 1,430
Default

Quote:
Back on topic, I don't think NVIDIA would slam a DX9 benchmark, 3DMark or any other, just because they have new cards coming that don't support DX9.
Well, if the rumors are true about the NV31/34 supporting only a subset of DX9 features, then with the weighting that 3DMark2k3 applies in its scoring, we can almost bet these nV cards will score pretty low.
Old 02-17-03, 08:24 AM   #18
Nutty
Sittin in the Sun
 
 
Join Date: Jul 2002
Location: United Kingdom
Posts: 1,835
Default

Quote:
what are you on about?

the FX does what it is supposed to... its PS 2.0 does not seem as efficient as the 9700 Pro's for some reason @ the moment... either because it is not, or because of driver problems... dunno...

what problem exists is the DX8 GF4 Ti cards running the DX8 games with PS 1.3 instead of 1.4, hence using extra rendering passes and therefore garnering a lower score...

concerning comments about 3dmark03... READ THE WHITE PAPER BEFORE MAKING RETARDED COMMENTS
I was under the impression the PS 2.0 shader results were gained from the DX9 scene. If I'm wrong, tell me.

This "DX9" shader scene consists mainly of DX8 shaders, presumably PS 1.4. I just thought that maybe NVIDIA still doesn't expose 1.4 explicitly, but instead promotes 1.4 shaders to 2.0, and therefore incurs a performance penalty.

It was just a thought, chill out!

Quote:
Uh huh. So I guess we don't really need DX9 at all; just code everything in a 64k boundary and that's all it takes.
Eh? There's nothing stopping you from writing a DX9 demo in 64k. I'm just saying I wasn't very impressed with the demo itself. The ogre scene looked awful. That woman's hair was crap.

Quote:
You know, why doesn't everyone read FutureMark's white paper on 3DMark03? It explains a lot and answers many of the questions and concerns people are having.
I have.

Quote:
AND GET IT THROUGH YOUR HEADS that it is for FUTURE GAMES, not what is being played now.
That's the problem, Genius!! Future games WILL NOT use the crap algorithms used in this badly written demo/benchmark.
Old 02-17-03, 09:36 AM   #19
John Reynolds
Registered User
 
Join Date: Aug 2002
Posts: 365
Default

Quote:
Originally posted by Nutty
That's the problem, Genius!! Future games WILL NOT use the crap algorithms used in this badly written demo/benchmark.
And exactly how do you know it's badly written? You've dissected the source code already? Or are you simply parroting a certain IHV's opinion?
Old 02-17-03, 10:48 AM   #20
Sazar
Sayonara !!!
 
 
Join Date: Jan 2003
Location: Austin, Texas
Posts: 9,297
Default

Quote:
Originally posted by Nutty

That's the problem, Genius!! Future games WILL NOT use the crap algorithms used in this badly written demo/benchmark.
There are many REAL games out there that unfortunately are also badly written.

Case in point... UT2k3... even though I play it flawlessly @ 1600x1200, Epic really made me mad with the memory leaks and whatnot in the game.

However, it is still used as one of the premier benchmarking games out there... even with the splash screen... even with the memory leak issues... hence all's fair, innit?

BTW, I still don't understand why 3DMark03 is considered badly written...

Old 02-17-03, 11:34 AM   #21
tamattack
Registered User
 
Join Date: Nov 2002
Posts: 159
Default

Quote:
Originally posted by Nutty
I'm just saying I wasn't very impressed with the demo itself. The ogre scene looked awful. That woman's hair was crap.
Do you realize that each strand of the woman's hair is being rendered individually in that scene? Let me guess, you would rather have a big blurred texture there instead?
Old 02-17-03, 11:37 AM   #22
Sazar
Sayonara !!!
 
 
Join Date: Jan 2003
Location: Austin, Texas
Posts: 9,297
Default

Quote:
Originally posted by tamattack
Do you realize that each strand of the woman's hair is being rendered individually in that scene? Let me guess, you would rather have a big blurred texture there instead?
Yes, it was quite impressive for a DX8 demo... I was surprised @ the lighting/rendering/bump mapping...
Old 02-17-03, 12:49 PM   #23
noko
noko
 
 
Join Date: Sep 2002
Location: Orlando Florida
Posts: 735
Default

Part of DX9 PS 2.0 is also PS 1.4 and 1.3 - 1.1. Meaning, if parts of the benchmark use PS 1.4, or even the 1.1 version, it's because that is all that's required for rendering that stage; it's all part of the DX9 spec. That doesn't make it any less of a DX9 benchmark. 3DMark03 supports most of the pixel shader versions inherent in DX9 and uses the best one available for your card. Why penalize cards with PS 1.4 support just because your card doesn't have it? Now, some effects did require VS 2.0 and PS 2.0 to work reasonably, where PS 1.4 and 1.1 with multiple passes wouldn't work. As for why FutureMark didn't use PS 1.3? The bottom line is it just doesn't give you much more of anything.
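
To illustrate the "uses the best one available" part: in D3D9 an app can read the card's maximum pixel shader version from the device caps and pick its rendering path from that, falling back to more passes on older hardware. This is only a rough sketch of the idea, not FutureMark's actual code (the ShaderPath struct and the pass counts are made up for illustration):

[code]
// Rough sketch only (not FutureMark's actual code): pick a pixel shader path
// from the D3D9 device caps, the way the benchmark "uses the best one
// available". The ShaderPath struct and the pass counts are made up.
#include <cstdio>
#include <d3d9.h>

struct ShaderPath {
    const char* profile;  // shader version the path is written for
    int passes;           // hypothetical pass count for one and the same effect
};

ShaderPath ChoosePath(const D3DCAPS9& caps)
{
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return { "ps_2_0", 1 };      // DX9 path: whole effect in one pass
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        return { "ps_1_4", 2 };      // PS 1.4: more work per pass than 1.1-1.3
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return { "ps_1_1", 4 };      // PS 1.1-1.3: same look needs extra passes
    return { "fixed function", 6 };  // no pixel shaders at all
}

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    ShaderPath path = ChoosePath(caps);
    std::printf("Using %s path, %d pass(es)\n", path.profile, path.passes);

    d3d->Release();
    return 0;
}
[/code]

The point being the same effect takes fewer passes the higher the shader version, which is exactly why a PS 1.4 card can come out ahead of a PS 1.1/1.3 card in the same test.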
__________________
Gigabit DSL, Q6600 @ 3.4ghz, Mushkin 4gb DDR 800 OCZ 2x2gb , PowerColor HD 5870. 24" Acer LCD, 1TB hd with Vista 64 Home Premium.

Foxconn 780G, Athlon64 X2 5600 @ 3ghz, Mushkin 4gb DDR 800 OCZ 2x2gb , eVga 260 GTX. 17" MagAcer LCD, 120gb Vista 32
Old 02-17-03, 01:41 PM   #24
Uttar
Registered User
 
 
Join Date: Aug 2002
Posts: 1,354
Default

I voted option 1, but please note that it *isn't* exactly what I meant.

Option 1 suggests the NV31 & NV34 couldn't run Game Test 4. I believe both can (I'd probably not be very impressed by Game 4 performance, however...).

My opinion, however, is that NV34 & NV31 performance has been cut in the areas 3DMark 2003 finds the most important.
I'd guess, for example, that if nVidia insists so much that 3DMark 2003 does too much skinning, it's because NV34 vertex shader (VS) performance is simply crap.
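
Since I brought up skinning: it just means every vertex gets transformed by several bone matrices and the results get blended by per-vertex weights, all inside the vertex shader, so a skinning-heavy test leans almost entirely on VS throughput. A rough sketch of the math, assuming a typical four-bones-per-vertex setup (plain CPU C++ for clarity; not a vertex shader and not 3DMark's code):

[code]
// Illustrative sketch of what vertex "skinning" computes (plain CPU C++ for
// clarity; not a vertex shader and not 3DMark's code): each vertex is
// transformed by several bone matrices and the results are blended by
// per-vertex weights. A skinning-heavy scene does this for every vertex,
// every frame, which is why it lives or dies on vertex shader speed.
struct Vec3   { float x, y, z; };
struct Mat3x4 { float m[3][4]; };  // rotation + translation, row-major

static Vec3 Transform(const Mat3x4& b, const Vec3& v)
{
    return {
        b.m[0][0] * v.x + b.m[0][1] * v.y + b.m[0][2] * v.z + b.m[0][3],
        b.m[1][0] * v.x + b.m[1][1] * v.y + b.m[1][2] * v.z + b.m[1][3],
        b.m[2][0] * v.x + b.m[2][1] * v.y + b.m[2][2] * v.z + b.m[2][3],
    };
}

// Blend across (up to) four bones: out = sum_i weight[i] * (bone[i] * pos).
// Weights are assumed to add up to 1; the four-bone limit is just an example.
Vec3 SkinVertex(const Vec3& pos, const Mat3x4 bones[],
                const int boneIndex[4], const float weight[4])
{
    Vec3 out = { 0.0f, 0.0f, 0.0f };
    for (int i = 0; i < 4; ++i) {
        Vec3 t = Transform(bones[boneIndex[i]], pos);
        out.x += weight[i] * t.x;
        out.y += weight[i] * t.y;
        out.z += weight[i] * t.z;
    }
    return out;
}
[/code]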


Uttar