Old 02-13-03, 07:16 AM   #73
Nemesis77
Registered User
 
Join Date: Aug 2002
Posts: 114
Default

Quote:
Originally posted by StealthHawk
What is a recent game with good graphics that wasn't CPU-limited? All the ones in my recent memory have been.
At higher settings, the demands on the video card grow. If these games were CPU-limited, different video cards would produce similar results, and that is not the case (especially if you use FSAA and/or AF).
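To put that in concrete terms, here's a rough sketch of the logic with completely made-up cards, settings, and frame rates (purely hypothetical numbers, just to illustrate the reasoning):

[CODE]
# Hypothetical illustration: if a game were CPU-limited, different video cards
# would post nearly identical frame rates, and heavier settings (higher res,
# FSAA, AF) would barely change them. GPU-limited games show a clear drop.

# frames per second by card and setting -- made-up figures for illustration only
results = {
    "card_a": {"1024x768 noAA": 120, "1600x1200 4xAA": 45},
    "card_b": {"1024x768 noAA": 118, "1600x1200 4xAA": 95},
}

def looks_gpu_limited(fps, tolerance=0.10):
    """If the heavy setting costs more than `tolerance` of the light setting's
    frame rate, the video card is the bottleneck, not the CPU."""
    light, heavy = fps["1024x768 noAA"], fps["1600x1200 4xAA"]
    return (light - heavy) / light > tolerance

for card, fps in results.items():
    print(card, "looks GPU-limited" if looks_gpu_limited(fps) else "looks CPU-limited")
[/CODE]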
Nemesis77 is offline   Reply With Quote
Old 02-13-03, 07:26 AM   #74
Nemesis77
Registered User
 
Join Date: Aug 2002
Posts: 114
Default

Quote:
Originally posted by StealthHawk
And I don't think that reflects real-world performance AT ALL. Show me ONE GAME where a DX9-capable card scores 4-5 times higher than a DX8-generation card, regardless of the CPU used.
I haven't seen benchmarks where they use vastly different CPUs, but here is one showing huge differences.

What I see here is just a faithful repeat of the party line. NV says something, and certain people automatically parrot what NV said. Why didn't NV (and those people) whine when previous 3DMarks were used? Back then it was OK to use 3DMark, but not anymore, because NV says so. Suddenly 3DMark is bad. I mean, we can't have a video-card benchmark that actually (shock and horror!) is demanding on the video card.
Nemesis77 is offline   Reply With Quote
Old 02-13-03, 08:11 AM   #75
Evildeus
Registered User
 
Join Date: Nov 2002
Posts: 309
Default

Quote:
Originally posted by Hellbinder
You have absolutely NOTHING to back that up except Nvidia's own statement against 3DMark03... Funny that Nvidia is the ONLY company of several IHVs making such statements.

Why don't you back that statement of yours up with some detailed technical info.
Well, take a card from two years ago and the best CPU available today, run the most recent game at 1024x768, and if it goes faster than 20 FPS, then we have something to back this up.

So WTF with this bench?
__________________
But if they think they're going to hold onto it, they're smoking something hallucinogenic - Jen-Hsun Huang
Evildeus is offline   Reply With Quote
Old 02-13-03, 08:29 AM   #76
Nutty
Sittin in the Sun
 
Nutty's Avatar
 
Join Date: Jul 2002
Location: United Kingdom
Posts: 1,835
Send a message via MSN to Nutty
Default

Quote:
You have absolutely NOTHING to back that up except Nvidia's own statement against 3DMark03... Funny that Nvidia is the ONLY company of several IHVs making such statements.

Why don't you back that statement of yours up with some detailed technical info.
Go read the whitepaper at;
http://www.futuremark.com/products/3dmark03/

The fact of the matter is, its shadowing is crap. It may stress the graphics card, but that doesn't necessarily represent true gaming performance.

It's easy to write graphics code that will chug like crap on a graphics card. Does that show off the true performance and capability of the card? No, it doesn't.
Nutty is offline   Reply With Quote
Old 02-13-03, 08:38 AM   #77
Kruno
TypeDef's assistant
 
Join Date: Jul 2002
Location: Australia
Posts: 1,641
Send a message via ICQ to Kruno Send a message via AIM to Kruno
Default

Quote:
It's easy to write graphics code that will chug like crap on a graphics card. Does that show off the true performance and capability of the card? No, it doesn't.
Yeah, but with devs like myself you are more likely to get crappy/slow code.
__________________
"Never before has any voice dared to utter the words of that tongue in Imladris, Mr. Anderson" - Elrond LOTR
Kruno is offline   Reply With Quote
Old 02-13-03, 08:52 AM   #78
Nutty
Sittin in the Sun
 
Nutty's Avatar
 
Join Date: Jul 2002
Location: United Kingdom
Posts: 1,835
Send a message via MSN to Nutty
Default

hehe

Aside from speeds and stuff, I have to say I was seriously not impressed with 3DMark03.

When I first saw 3DMark2001 at an NVIDIA conference, the nature scene running on a huge projector for the first time blew me away... it was stunning.
Nutty is offline   Reply With Quote
Old 02-13-03, 10:02 AM   #79
Skynet
Registered User
 
Skynet's Avatar
 
Join Date: Sep 2002
Location: Canada
Posts: 273
Send a message via ICQ to Skynet Send a message via AIM to Skynet
Thumbs down

I think a lot of you are missing the point. A large part of why 3DMark is so widely used is that it is a vast database used for COMPARISON. The test is designed to showcase the rendering ability of the card as well as to let you compare it against other systems and GPUs.

If the benchmark is poorly coded, so be it. But it still allows you to compare your performance against other systems, cards, and manufacturers.

Sure, one could argue that it is slanted toward particular hardware, but I don't believe this to be the case. When Nvidia was pummeling ATI in 3DMark, we all KNEW it was because their cards were faster. It was obvious. That did not make all ATI cards pieces of crap, though; they still had merits, most notably better image quality, less power consumption, and lower heat production. But at one time, if you wanted the fastest card it was Nvidia, and 3DMark CONFIRMED this. Now the tables have turned, and Nvidia cards are gagging on the rendering load.

It is absurd and reeks of sour milk when Nvidia and Nvidia fans are screaming bloody murder over 3DMark03. Nvidia, STOP COMPLAINING and give us a NEXT-GEN CARD.

If a card scores 800 on 3DMark03, you can be DAMN SURE it will suck at any game using DX8/9 features. If your card scores 7000 on the same test, you can be DAMN SURE it will scream at the same next-gen games. Isn't that the whole point of a benchmark?

Last edited by Skynet; 02-13-03 at 10:06 AM.
Skynet is offline   Reply With Quote
Old 02-13-03, 10:49 AM   #80
saturnotaku
Apple user. Deal with it.
 
Join Date: Jul 2001
Location: The 'burbs, IL USA
Posts: 12,502
Default

Quote:
Originally posted by Skynet
If a card scores 800 on 3DMark03, you can be DAMN SURE it will suck at any game using DX8/9 features. If your card scores 7000 on the same test, you can be DAMN SURE it will scream at the same next-gen games. Isn't that the whole point of a benchmark?
But the numbers can be misleading. I had a Radeon 8500 and a GeForce3. The 8500 would consistently score 700-1000 points higher in 3DMark than my GF3. But in games, the ATI card would choke with low framerates and/or graphical errors, whereas the NVIDIA card would deliver smooth performance and no visual errors to speak of.

Mere benchmarks do not a great video card make. It's all about trial and error to determine which card works best for you and the reasons you need it.
saturnotaku is offline   Reply With Quote

Old 02-13-03, 12:39 PM   #81
jbirney
Registered User
 
jbirney's Avatar
 
Join Date: Jul 2002
Posts: 1,430
Default

Quote:
Originally posted by saturnotaku
But the numbers can be misleading. I had a Radeon 8500 and a GeForce3. The 8500 would consistently score 700-1000 points higher in 3DMark than my GF3. But in games, the ATI card would choke with low framerates and/or graphical errors, whereas the NVIDIA card would deliver smooth performance and no visual errors to speak of.
I agree with your findings and your conclusion, to a point. The issue I have with your reply is that 3DMark was built and advertised as a forward-looking benchmark. Thus the scores you saw and the games you played back then did not really have any correlation. It's the new games that really use those DX8 features that will tell us whether that benchmark was accurate. For example, we know that Doom 3 will use most of the DX8 feature set, just in an OpenGL version. We know that the 8500 has its own path (dubbed the R200 path). We know that JC has also built other paths for different cards. In JC's .plan late last year he said that there are cases where the GF4 beats the 8500 and cases where the 8500 beats the GF4. We all know that the GF4 > GF3, so can we conclude that the 8500 will in most cases be faster than the GF3, just like 3DMark2001 "said" it would? We won't know until the game ships and we see benchmarks.

The point is that it's very easy to misunderstand what the numbers are telling us. They are not telling us how current games will perform. I agree that 3DMark is/was/always will be poor at predicting current games. Its track record for future games was not all that great... only time will tell how well or badly 3DMark2001 did...

Again, I am not saying your point is not valid for current games. It is, and I agree with it. It's just too early to tell on the future stuff...
jbirney is offline   Reply With Quote
Old 02-13-03, 12:53 PM   #82
deckard
Registered User
 
deckard's Avatar
 
Join Date: Jul 2002
Location: Tampa, FL
Posts: 22
Default

I think the whole problem is that they (Futuremark, MadOnion, whatever) are calling it a gaming benchmark. This is misleading a TON of people. I think the results so far clearly show that 3DMark03 is a GPU benchmark. You get basically the same score with a GF4 Ti on a P3 800 MHz system as you do on a brand-new P4 3 GHz system. How does that represent gaming? Nothing makes even a dent in 3DMark03 except the video card.
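Here's a rough sketch of that argument with made-up scores (the numbers below are purely hypothetical, just to show why the score tracks the GPU and not the CPU):

[CODE]
# Hypothetical illustration: if swapping the CPU barely moves the score while
# swapping the video card moves it a lot, the benchmark measures the GPU,
# not overall gaming performance.

scores = {  # made-up 3DMark-style scores for illustration only
    ("GF4 Ti", "P3 800 MHz"): 1450,
    ("GF4 Ti", "P4 3 GHz"):   1520,
    ("9700 Pro", "P4 3 GHz"): 4900,
}

def relative_change(a, b):
    """Fractional change between two scores."""
    return abs(a - b) / a

cpu_swap = relative_change(scores[("GF4 Ti", "P3 800 MHz")], scores[("GF4 Ti", "P4 3 GHz")])
gpu_swap = relative_change(scores[("GF4 Ti", "P4 3 GHz")], scores[("9700 Pro", "P4 3 GHz")])

print(f"CPU swap changed the score by {cpu_swap:.0%}")  # tiny change -> not a gaming benchmark
print(f"GPU swap changed the score by {gpu_swap:.0%}")  # huge change -> it's a GPU benchmark
[/CODE]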

Since DX9 games aren't anywhere near release, perhaps they just released this product a little too early? With only one DX9 part on the shelves, the GFFX's status at retail questionable, and no other IHV even close, it seems the only conclusion you can draw right now from 3DMark03 is that the 9700 Pro is really, really fast. There's a news flash. I've wanted one of those LONG before 3DMark03 came along.
deckard is offline   Reply With Quote
Old 02-13-03, 01:20 PM   #83
deckard
Registered User
 
deckard's Avatar
 
Join Date: Jul 2002
Location: Tampa, FL
Posts: 22
Default

I'm reading more about this in other threads around the net and beginning to wonder: has Futuremark stated in some way that 3DMark03 is meant for DX9 cards (basically the 9700 Pro and GFFX) and that anything else should still be tested with 3DMark2001? It would make good sense and clear up some confusion if they stated as much, but I was wondering if something like that is in print somewhere within the product or on their website.
deckard is offline   Reply With Quote
Old 02-13-03, 02:39 PM   #84
digitalwanderer
 
digitalwanderer's Avatar
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Default

Quote:
Originally posted by saturnotaku
But the numbers can be misleading. I had a Radeon 8500 and a GeForce3. The 8500 would consistently score 700-1000 points higher in 3DMark than my GF3. But in games, the ATI card would choke with low framerates and/or graphical errors, whereas the NVIDIA card would deliver smooth performance and no visual errors to speak of.

Mere benchmarks do not a great video card make. It's all about trial and error to determine which card works best for you and the reasons you need it.
ROFLMAO~~~~~~


I still have a GF3 & 8500 and have found the exact same thing to be true... I've just never seen anyone else who had before.

The GF3 is my gaming card, but it's getting long in the tooth and I'm really needing to upgrade. U2 is showing its age off too much.
__________________
[SIZE=1][I]"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy[/I] [/SIZE]
digitalwanderer is offline   Reply With Quote