02-17-03, 01:20 AM   #97
Skynet (Registered User, Canada, 273 posts)

Quote:
Originally posted by StealthHawk
No, you are missing the point. 3DMark2001 was great because of the database, so you could compare systems.

Why is this true? Because 3DMark2001 was a system benchmark that stressed the memory subsystem, the video subsystem, and the CPU.

Now, this is NOT true of 3DMark03. It is a video card benchmark, nothing more, nothing less. The CPU makes such an insignificant contribution to the final score that comparing systems leads to incorrect and erroneous conclusions. Other people have already posted this; I don't want to repeat them.

I mean, we have already had people say that some guy with a 1GHz system and a DX9 card scored 2-3x higher than someone with a DX8 card and a 2.4GHz system, or something. And obviously the latter system is faster in reality, right?

Edit: and if you still don't believe me, have a look at this thread: http://www.nvnews.net/vbulletin/sho...=&threadid=7456 Even the CPU test of 3DMark03 shows ridiculous results.
No, I am afraid you don't understand the point at all. Future DX8/DX9 games are going to be much more video card dependent and less CPU intensive. You state that 3DMark03 is a video card benchmark and not a system benchmark. That is exactly correct, and exactly how it should be. If the Nature test, which uses DX8/DX9 features, can run equally smoothly on an Athlon T-Bird 1.4 and a Pentium 4 2.8, that should tell you something: it means that finally all those 110+ million transistors in the latest GPUs are doing what they were designed to do. I understand that the CPU still has to do some physics, AI, and so on, but that computational load is much less intense than shifting around billions of pixels. 3DMark03 is designed properly for the future of gaming and gaming cards.

If you want to benchmark and get an idea of how a system will play games from the last 3 or 4 years, then use 3DMark2001. If all you want to do is play DX7 games all day, then stick with a GeForce3/4 or Radeon 8500.

You are using today's games as a reference when talking about 3DMark03, which is just plain wrong. Read the 3DMark03 white paper; it explains all of this and a lot more. I believe they have researched and studied the situation very well, and what they have come up with makes sense.

EDIT: I just wanted to add that it is unreasonable, actually crazy, to expect 3DMark03 to use the same methodology as 3DMark2001. Why? Because 2001 is based on a set of criteria for a much, much slower generation of cards with much, much less custom image processing. So what do you expect your new GeForce FX to do: push the pixels around in exactly the same way as your GeForce 4, only at a higher clock rate? Why bother packing in all those shaders if they still rely on the CPU to do all the work anyway?

Last edited by Skynet; 02-17-03 at 01:26 AM.
02-17-03, 02:00 AM   #98
Chalnoth (Registered User, 1,293 posts)

Quote:
Originally posted by Skynet
No, I am afraid you don't understand the point at all. Future DX8/DX9 games are going to be much more video card dependent and less CPU intensive. You state that 3DMark03 is a video card benchmark and not a system benchmark. That is exactly correct, and exactly how it should be.
That's just a guess, something you cannot be sure about.

I find it more likely that game developers will use the additional CPU power made available to them in new ways, rather than simply not use it. We've already seen parts of this come into play, such as the Karma physics engine in Unreal 2 and Unreal Tournament 2003. I expect other things, like more realistic sound and better AI, will also become more prevalent.
__________________
"Physics is like sex. Sure, it may give some practical results, but that's not why we do it." - Richard P. Feynman
02-17-03, 02:51 AM   #99
Kruno (TypeDef's assistant, Australia, 1,641 posts)

Quote:
and better AI, will also become more prevalent.
As soon as we have AI that thinks.

/me waits 5*10^500 years

__________________
"Never before has any voice dared to utter the words of that tongue in Imladris, Mr. Anderson" - Elrond LOTR
02-17-03, 03:15 AM   #100
kyleb (Registered User, 364 posts)

I posted this in another thread, but I think it is just as relevant here:

As for "the gamer's benchmark": I agree that it can be misleading, but no more misleading than ATI saying the "R300 core supports SSAA", or any vast number of examples I could think of. In a situation like that no one is really lying by any means, and the only way anyone gets misled is by reading too much into it, which is bound to happen. For instance, check out the_matrix's comments starting near the bottom of this page for an example of an NVIDIA fan gone way too far:

case in point

I almost felt like signing up as someone brainwashed by ATI and starting an argument about how TruForm was going to rule the world.
02-17-03, 04:17 AM   #101
StealthHawk (Guest)

Quote:
Originally posted by Skynet
No, I am afraid you don't understand the point at all. Future DX8/DX9 games are going to be much more video card dependent and less CPU intensive.
To quote Gandalf in the Fellowship of the Ring movie: "You know this? How?" Methinks you are trying to look into the palantir when you shouldn't.

Quote:
I understand that the CPU still has to do some physics, AI, and so on, but that computational load is much less intense than shifting around billions of pixels. 3DMark03 is designed properly for the future of gaming and gaming cards.
Again, what makes you think this? Have video card companies been saying that the CPU is no longer as necessary for the DX9 generation of games? Have game developers been saying that? Those aren't rhetorical questions; please give me the evidence behind your statements. You seem awfully sure of yourself.

Look back a few years to the late '90s. The GeForce256 had just been announced with this amazing thing called a hardware Transform & Lighting engine. It was supposed to usher in a new era of high-polygon gaming and reduce the need for CPU upgrades. Suddenly, slower CPUs would be just as good as fast CPUs, because the great number of graphical calculations would be offloaded from the CPU and done on the GPU. Games utilizing T&L (i.e., DX7 games) were supposed to be right around the corner, and everyone was supposed to be delighted that this video card would extend the life of a computer's CPU.

Flash forward to the present day, some 3-4 years later. Obviously it took a lot longer for real DX7 games to show up. UT2003 makes good use of T&L, and Doom3 treats the featureset of the GF1 as the minimum hardware needed to render.

Now, look at the many DX7 and DX8 games that have been released. Almost all of them are very CPU dependent.

Physics and AI will keep improving. Coders will take advantage of available CPU power wherever they can. If that means the CPU gets offloaded this time so it can do AI and physics (a promise of DX7 and T&L that never really happened; if anything, games became more CPU dependent), then we'll be just as CPU limited down the road as we've always been.

You severely underestimate the computation required for good physics and good AI. Remember Trespasser? Remember how slow it was? That was the cost of its remarkable physics engine at the time, and the physics were still far, far from perfect (if I remember right, there was little to no friction when moving objects).
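
To put a rough number on it, here is a toy sketch (made-up types, no broad-phase culling, not from any real engine) of why even simple rigid-body physics eats CPU cycles. Without culling, collision detection alone is n*(n-1)/2 pair tests every frame, before any contact response is even attempted:

Code:
#include <cstddef>

// Toy rigid-body step with a hypothetical Body type. The pair loop is the
// point: it is O(n^2), so doubling the object count quadruples the CPU work,
// and that is before impulses, friction, or resting contacts are solved.
struct Body { float pos[3], vel[3]; };

static bool spheresOverlap(const Body& a, const Body& b, float radius)
{
    float d2 = 0.0f;
    for (int k = 0; k < 3; ++k) {
        float d = a.pos[k] - b.pos[k];
        d2 += d * d;
    }
    return d2 < (2 * radius) * (2 * radius);
}

void naivePhysicsStep(Body* bodies, std::size_t n, float dt)
{
    // Integrate motion on the CPU.
    for (std::size_t i = 0; i < n; ++i)
        for (int k = 0; k < 3; ++k)
            bodies[i].pos[k] += bodies[i].vel[k] * dt;

    // Narrow-phase collision detection: n*(n-1)/2 tests.
    for (std::size_t i = 0; i < n; ++i)
        for (std::size_t j = i + 1; j < n; ++j)
            if (spheresOverlap(bodies[i], bodies[j], 1.0f)) {
                // A real engine now solves impulses, friction, and stacking,
                // which costs far more than the test itself.
            }
}

At 200 movable objects that is roughly 20,000 sphere tests, 60 times a second, and Trespasser was doing far more per pair than a cheap sphere check.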
02-17-03, 05:08 AM   #102
Kruno (TypeDef's assistant, Australia, 1,641 posts)

Quote:
Now, look at the many DX7 and DX8 games that have been released. Almost all of them are very CPU dependent.
Depends on the graphics options, mate.
__________________
"Never before has any voice dared to utter the words of that tongue in Imladris, Mr. Anderson" - Elrond LOTR
02-17-03, 05:21 AM   #103
kyleb (Registered User, 364 posts)

Quote:
Originally posted by StealthHawk
To quote Gandalf in the Fellowship of the Ring movie: "You know this? How?" Methinks you are trying to look into the palantir when you shouldn't.
ROFL. Well, Gandalf, where have you been? Maybe you should lay off the pipe a little. The offloading of the CPU is not some unjustifiable gaze into the future; it is the direction the industry has been pushing for years, and we all have NVIDIA to thank for being the major motivator in that direction. Up until recently quite a few foolish fanboys claimed it was all a conspiracy to overthrow 3dfx and would never be of any use to gamers. Actually, if you go check out the forums at tdhq.net, I'll bet some are still holding to that story now. Last I knew they also had some guys planning to resurrect 3dfx, pushing the slogan "revenge is coming." ROFL.
02-17-03, 06:17 AM   #104
StealthHawk (Guest)

Quote:
Originally posted by kyleb
ROFL. Well, Gandalf, where have you been? Maybe you should lay off the pipe a little. The offloading of the CPU is not some unjustifiable gaze into the future; it is the direction the industry has been pushing for years, and we all have NVIDIA to thank for being the major motivator in that direction.
What in the world are you talking about? Look at some recently released games that push the technological edge: Jedi Knight 2, Morrowind, Comanche 4, Unreal Tournament 2003. I know for a fact that these are all very CPU dependent engines. Whether or not more of the transform work is actually being offloaded from the CPU in these games is irrelevant; in fact, if it is, that helps prove my point.

That point being: games in the future will stay CPU limited.

So take your pick. Either video cards have not been offloading work from the CPU in newer games, or they have and the CPU cycles are being spent elsewhere. Either way, the prettier, tech-pushing games are still CPU limited.

Quote:
Up until recently quite a few foolish fanboys claimed it was all a conspiracy to overthrow 3dfx and would never be of any use to gamers. Actually, if you go check out the forums at tdhq.net, I'll bet some are still holding to that story now. Last I knew they also had some guys planning to resurrect 3dfx, pushing the slogan "revenge is coming." ROFL.
I'm sure they do. And I never said offloading the CPU wasn't a good thing; I want good AI, and I want better physics.

Frankly, it doesn't matter WHAT the CPU is doing: past, present, or future, it will still be used to capacity on whatever task it is given.

The claim you guys keep making, that future games will magically not be CPU limited in any major capacity, a la 3DMark03, has not been justified at all yet.

You made the statements; you (not necessarily you, kyleb, although you seem to be supporting Skynet) need to back them up. You can do this a few ways.

1) Show me a trend of recent technology-leading games that are not CPU limited, or, more specifically, that are only video limited.

2) Show me quotes from ATI or another video card company saying that games will follow this trend.

3) Show me game developer quotes saying the CPU will be idling because everything will be done on the video card, and that AI and physics won't be enough to stress the CPU fully.

I mean, really. Enough is enough; time to show the hand you're holding. All I see and hear is a bunch of rhetoric with no actual substance to it.

You can ignore all of the above if your statement is just based on personal opinion and nothing else; but if it is, then please just say so.

02-17-03, 06:27 AM   #105
Smokey (Team Rainbow, France, 2,273 posts)

I might be wrong here, but I'm sure I read some time ago that there will always be some things the CPU can do faster than any GPU when it comes to games?
__________________
HTPC/Gaming: Hiper Type R 580W PSU | Intel Q9550 @ 4GHz | Gigabyte EP45 UD3R | 4x 1024MB OCZ Reaper PC9200 DDR2 | Seagate 320GB / Maxtor 320GB / Maxtor 500GB HDD | Sapphire HD5850 | Creative SB X-Fi Titanium Pro | Harman Kardon AVR135 receiver | Jamo S718 speakers | 42" Plasma 720p (lounge room) / Samsung P2450H (bedroom)
02-17-03, 06:30 AM   #106
Kruno (TypeDef's assistant, Australia, 1,641 posts)

Quote:
Originally posted by Smokey
I might be wrong here, but I'm sure I read some time ago that there will always be some things the CPU can do faster than any GPU when it comes to games?
Physics?
__________________
"Never before has any voice dared to utter the words of that tongue in Imladris, Mr. Anderson" - Elrond LOTR
02-17-03, 07:23 AM   #107
Skuzzy (Bit Bumper, 782 posts)

As a developer, and I think most would concur, the baseline video card we program to today is the GF2 Ultra running in an 800MHz P3.

The more powerful CPUs certainly allow a lot more to be done today than yesterday. But we cannot ignore that GPUs have also gotten more powerful.

To that end, programmers today are looking at better physics and better scene/object interaction. The physics is virtually all done on the CPU, as is collision detection. In order for developers to step up to the next level in these areas, we need to push some graphics chores off to the GPU. Why? Even if the GPU is slower than the CPU, we can take advantage of the parallel processing of both. It is a balancing act: how much can we push off to the GPU? That is a bit of an unknown quantity.
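
To sketch what I mean by that balancing act, here is a rough frame loop (made-up World/Renderer names, not a real API; it assumes a D3D/OpenGL-style driver that queues draw calls and returns immediately). The win comes from letting the CPU simulate the next frame while the GPU is still drawing the current one:

Code:
// Sketch only: hypothetical World/Renderer types, not real API calls.
struct World {
    void stepPhysicsAndAI(float dt) { /* CPU-side physics, AI, game logic */ }
};
struct Renderer {
    void submitScene(const World&) { /* queue draw calls; returns at once */ }
    void present()                 { /* flip; blocks only if the GPU lags */ }
};

void frameLoop(World& world, Renderer& renderer, float dt)
{
    for (;;) {
        renderer.submitScene(world);  // GPU starts rasterizing frame N
        world.stepPhysicsAndAI(dt);   // CPU simulates frame N+1 in parallel
        renderer.present();           // sync point: the slower chip sets the pace
    }
}

If the CPU side finishes early it waits on the GPU, and vice versa; tuning an engine is largely about keeping that wait near zero on both sides.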
With today's pixel shaders (1.4 and later), there are things that can be done far more efficiently on the video card than on the CPU. As time goes forward and the baseline video card is raised, more and more programmers will use those video card features.
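
As a concrete example (a toy sketch, not production code): doing dot3 per-pixel lighting in software means the CPU runs a loop like the one below over every pixel of every frame, while a ps.1.4-class part performs the same multiply-adds as it rasterizes:

Code:
#include <algorithm>
#include <cstdint>

// Software fallback for dot3 per-pixel lighting: one dot product per pixel.
// At 1024x768 and 60 fps that is roughly 47 million dot products per second
// on the CPU alone; a pixel shader absorbs this work during rasterization.
void dot3LightingCPU(const float* normals,    // 3 floats per pixel
                     const float lightDir[3], // normalized light direction
                     std::uint8_t* out, int pixelCount)
{
    for (int i = 0; i < pixelCount; ++i) {
        const float* n = normals + 3 * i;
        float d = n[0] * lightDir[0] + n[1] * lightDir[1] + n[2] * lightDir[2];
        out[i] = static_cast<std::uint8_t>(std::max(0.0f, d) * 255.0f);
    }
}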

Programmers face a problem today. In the past, video card technology did not actually move that quickly. Yes, clock speeds increased and the amount of video RAM increased, which helped by allowing more scene detail. But today we are seeing video card technology jump forward by leaps and bounds in the feature sets on offer. Unfortunately, these new video cards will not be the baseline cards until 3 to 5 years down the road. And even then, they may not be the true baseline; many factors come into play here. Both ATI and NVIDIA ship video cards today that are pretty brain-dead. One of the more popular lines of NVIDIA cards has no shaders worth mentioning (the MX line, I believe).

Searching for the holy grail of balancing graphics speed with CPU speed is what it is all about. In an ideal world, the scene would finish rendering just as the next frame is ready to go. To get there, programmers cannot ignore what the GPU can do to help. If we do, then we are doing a disservice to the consumer and generally writing poor code.

My two nickels...
02-17-03, 11:33 AM   #108
Skynet (Registered User, Canada, 273 posts)

Quote:
One of the more popular lines of NVIDIA cards has no shaders worth mentioning (the MX line, I believe).
That's an awfully vague statement from a developer. The MX line has ZERO shaders; it is a cut-down DX8 card with all the good shader hardware disabled or removed entirely.