Old 02-13-03, 03:48 PM   #85
deckard
Registered User
Join Date: Jul 2002
Location: Tampa, FL
Posts: 22

The official press release does specifically state, or at least strongly imply, that 3DMark03 is intended for DX9 hardware and therefore the games of tomorrow. So I guess I really can't complain. I wish I could at least enjoy the program for gee-whiz purposes, but it's so slow on my GF4 that nobody would be impressed. So why didn't they just make all four game tests DX9-specific?
Old 02-13-03, 05:57 PM   #86
StealthHawk
Guest
 
Posts: n/a

Quote:
Originally posted by Nemesis77
At higher settings, demands on the video card grow. If the games were CPU-limited, different video cards would get similar results, and that is not the case (especially if you use FSAA and/or AF).
Raw benchmarks! That has been my argument from the beginning.

This is my original post:
Quote:
Most games released today are CPU limited in all situations unless you add FSAA or AF into the mix, and even then the great majority are still CPU limited.

Regardless, 3DMark03 is a LOT more video dependent even with FSAA and AF both turned off.

That is, unless the future actually trends toward games becoming less CPU dependent, which I doubt. If anything, games will stay just as CPU limited as they have always been, albeit with the CPU used for things like AI and physics.
Do you see my point now? My point IS that when you add FSAA and AF, vast differences can be seen in many games; without them, there aren't. The vast majority of games are CPU limited in this way, and future games most likely will be too.

I am NOT arguing that FSAA and AF won't change that. That was never my point.
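
To make the bottleneck argument concrete, here is a minimal sketch with invented per-frame costs (none of these numbers are measurements): a frame can only finish as fast as the slower of the CPU and the GPU.

Code:
# Toy model: frame time is roughly the slower of the CPU's and the GPU's
# per-frame work. All costs below are invented for illustration.

def fps(cpu_ms, gpu_ms):
    """Frames per second when each frame waits on the slower unit."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 10.0  # hypothetical CPU cost per frame (game logic, AI, physics)
cards = {"slow card": 6.0, "fast card": 3.0}  # hypothetical GPU ms per frame

for name, gpu_ms in cards.items():
    plain = fps(cpu_ms, gpu_ms)        # CPU-limited: both cards hit ~100 fps
    heavy = fps(cpu_ms, gpu_ms * 4.0)  # FSAA/AF multiplies GPU work (illustrative)
    print(f"{name}: {plain:.0f} fps plain, {heavy:.0f} fps with FSAA/AF")

Without FSAA/AF, both cards sit at the CPU's 100 fps ceiling; add FSAA/AF and the GPU becomes the bottleneck, so the cards finally separate.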

Edit: typo
Old 02-13-03, 06:01 PM   #87
StealthHawk
Guest
 
Posts: n/a

Quote:
Originally posted by Skynet
I think a lot of you are missing the point. A large part of why 3DMark is so widely used is that it has a vast database used for COMPARISON. The test is designed to showcase the rendering ability of the card as well as to allow you to compare it to other systems and GPUs.
No, you are missing the point. 3DMark2001 was great because of the database, which let you compare systems.

Why is this true? Because 3DMark2001 was a system benchmark that stressed the memory subsystem, the video subsystem, and the CPU.

Now, this is NOT true of 3DMark03. It is a video card benchmark, nothing more, nothing less. The CPU makes such an insignificant contribution to the final score that comparing systems leads to erroneous conclusions. Other people have already posted this; I don't want to repeat them.

I mean, we have already had people say that some guy with a 1GHz system and a DX9 card scored 2-3x higher than someone with a DX8 card and a 2.4GHz system. And obviously the latter system is faster in reality, right?
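
As a rough sketch of why the overall number behaves that way, assume the score is weighted almost entirely toward the GPU subscore (the weights and subscores below are invented for illustration; they are not Futuremark's actual formula):

Code:
# Invented weighting to illustrate a GPU-dominated composite score.
# None of these values come from 3DMark03 itself.

def overall(gpu_subscore, cpu_subscore, gpu_weight=0.95):
    """Composite score in which the GPU subscore dominates."""
    return gpu_weight * gpu_subscore + (1.0 - gpu_weight) * cpu_subscore

dx9_slow_cpu = overall(gpu_subscore=4000, cpu_subscore=300)  # 1GHz CPU, DX9 card
dx8_fast_cpu = overall(gpu_subscore=1200, cpu_subscore=900)  # 2.4GHz CPU, DX8 card

print(dx9_slow_cpu)  # 3815.0
print(dx8_fast_cpu)  # 1185.0 -- far lower despite the much faster CPU

With the GPU dominating the weighting, ranking whole systems by that one number says almost nothing about the rest of the machine.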

Edit: And if you still don't believe me, even the CPU test of 3DMark03 shows ridiculous results; have a look at this thread: http://www.nvnews.net/vbulletin/show...&threadid=7456
Old 02-13-03, 06:13 PM   #88
jAkUp
eat. sleep. overclock.
Join Date: Dec 2002
Location: Chino, California
Posts: 17,744

Well, I have a 2.4GHz PC, and it sure is ****ing slow in every game... I know it's just because of the video card, though.
__________________
965xe || evga x58 classified || 3x evga gtx 480 || 6gb g.skill || win7 x64
Old 02-13-03, 06:34 PM   #89
mreman4k
Registered User
 
Join Date: Dec 2002
Location: North Kakilaki, USA
Posts: 74

Yeah, it's tragic, but nVidia failed to push the envelope up until the not-so-spectacular GFFX. They just sat on their GF3/GF4 product line and focused on speed, while ATi tried to advance the video card market. I can't help but say that nVidia brought it all on themselves.
Old 02-13-03, 08:09 PM   #90
ElMoIsEviL
Registered User
Join Date: Feb 2003
Location: Canada
Posts: 6
OMG...

Hehehe,

Okay, time for some major typing.

In reality, comments I often see, such as "The GeForceFX is more technologically advanced" and "Its performance will get better in AF and AA with better drivers", are kind of... well, false.

Here's the reason why.

What APIs are you using to play games? Probably Direct3D and OpenGL. Both have updated versions, OpenGL 1.4 and DirectX 9.0, and both have the same limitations.

Cg is only a programming language that is easier to understand and program in than raw Direct3D or OpenGL; what it does is facilitate the programming of complex pixel and vertex shaders for Direct3D and OpenGL.
Seeing as you can only program up to what Direct3D or OpenGL support, you're still limited in what you can do.

This is where all the GeForceFX's advanced features go out the crapper. DX10 is not due out until around the end of next year (Q4 2004). By then there will be many new cards, and the GeForceFX will be in the same position the GeForce3 is in right now... barely able to play a game that uses advanced features, such as Doom III.

So its technological advancements amount only to its extra support for shader versions 2.0+, both pixel and vertex.

Now to counter the other claim: there is no way they can gain much more performance in AA or AF using their current 128-bit memory controller. They may squeeze out 5% more at lower resolutions (1024x768) and 1-2% at 1600x1200 if they're lucky.

Its image quality with AF and AA is not up to par with the R300. That's a fact, and HardOCP has the final say on that too, having tested the newer drivers given to them personally by nVidia to fix the issues (or so they said).

There is nothing, and I repeat nothing, the GeForceFX can do that the R300 can't.
On the other hand, the GeForceFX only emulates displacement mapping, whereas the R300 can do it in hardware.

So technology-wise, in DirectX 9.0 and OpenGL 1.4 games, the R300 is more advanced.

The GeForceFX is coming out at the end of February or the beginning of March 2003, while the R350 launches in March and should be shipping by early April.

Is there any reason to buy a GeForceFX? Yes, there is: if you are a fanboy like most of you here, then go ahead and buy this poor attempt at keeping the performance crown. But most of us enthusiasts will pass and look toward the R350/R400 and NV35.

The GeForceFX is in the same position the Voodoo5 5500 was in back in 2000. If you remember, the GeForce2 GTS was released and beat the Voodoo5 5500 in all respects except Unreal Tournament... hmm, déjà vu, no?

And wasn't it nVidia that was backing 3DMark 2000 and 2001 when their cards were at the top of those benchmarks? Now that they aren't, they turn around and release this hypocritical statement.

nVidia will not go out of business, but they've lost more customers than they will ever know... the bigger they are, the harder they fall.
__________________
DFI Lanparty UT NF4 SLI-DR | AMD Athlon64 X2 4800+ @ 3.2GHz (291x11)(Swiftech MCW5002-64T 226W Pelt) | 2GB OCZ Platinum EL XTC PC4000 DDR 1:1 @ 3-3-3-8 3.0v! | ATi Radeon X1900XTX 512MB (Alphacool NexXxoS NVXP-3 Waterblock) | Sound Blaster X-Fi FaT@l1Ty FPS Sound Card | 2x74GB WD Raptor 10,000 RPM RAID0 | 2x250GB Maxtor Maxline III 7,200 RPM RAID0 | 2x74GB Quantum Atlas III SCSI 10,000 RPM | SuperCooled: Enterprise Custom Cooling

Asus A8R-MVP /vmod | AMD Athlon64 X2 3800+ @ 2.8GHz (10x280) | 2x1024MB PQI PC4200 Performance DDR (1:1) | Asus EN7900GTX 512MB | Sound Blaster X-Fi Xtreme Music Sound Card | 1x250GB Seagate 7200.9 8MB Cache HD
Old 02-13-03, 08:29 PM   #91
legion88
WhatIfSports.com Junkie
 
Join Date: Jul 2002
Posts: 135

Quote:
Originally posted by StealthHawk
No, you are missing the point. 3DMark2001 was great because of the database, which let you compare systems.

Why is this true? Because 3DMark2001 was a system benchmark that stressed the memory subsystem, the video subsystem, and the CPU.

Now, this is NOT true of 3DMark03. It is a video card benchmark, nothing more, nothing less. The CPU makes such an insignificant contribution to the final score that comparing systems leads to erroneous conclusions. Other people have already posted this; I don't want to repeat them.

I mean, we have already had people say that some guy with a 1GHz system and a DX9 card scored 2-3x higher than someone with a DX8 card and a 2.4GHz system. And obviously the latter system is faster in reality, right?

Edit: And if you still don't believe me, even the CPU test of 3DMark03 shows ridiculous results; have a look at this thread: http://www.nvnews.net/vbulletin/show...&threadid=7456
The purpose of any benchmark is to compare the competition against a standard. In the absence of a standard (which is often the case), the purpose is to compare the competitors against each other. It is always better to have a standard: all the cards could stink, for instance, and you wouldn't know it unless you had something else (the standard) to compare them to.

The presence of the database that Futuremark maintains simply makes comparing easier. It does not make the benchmark itself poor or great.

The purpose of a video card (performance) benchmark is to compare the performance of various video cards. Therefore, we do not want the CPU to be a dominating influence on the final result, as it was in 3DMark2001.

The purpose of a CPU (performance) benchmark is to compare the performance of various CPUs. Therefore, we do not want the video card to be a dominating influence on the final result. Judging from the comments so far, it appears that 3DMark03's CPU test is having some problems.
Old 02-14-03, 09:19 AM   #92
Hanners
Elite Bastard
Join Date: Jan 2003
Posts: 984

Quote:
Originally posted by legion88
The purpose of a video card (performance) benchmark is to compare the performance of various video cards. Therefore, we do not want the CPU to be a dominating influence on the final result, as it was in 3DMark2001.
Absolutely right. I think this is where a distinction needs to be made: is 3DMark a system benchmark or a video card benchmark?

I think Futuremark's biggest mistake is calling 3DMark03 a 'gamer's benchmark', suggesting it benchmarks your whole system, when in reality it only really comes into its own as a video card benchmark.

The other criticism you can level at 3DMark03 is that it is more a feature test of your video card than a performance test. However, I have to say that in this day and age, where the buzzwords are 'cinematic rendering' and image quality, taking that kind of line in a benchmark seems perfectly fair and useful to me.
__________________
Owner / Editor-in-Chief - Elite Bastards

Old 02-14-03, 09:46 AM   #93
PreservedSwine
I'm always hungry
Join Date: Aug 2002
Posts: 548

I remember people complaining that 3DMark was *too* CPU dependent. Now that it doesn't rely on the CPU, people are finding fault with that.
Funny... I guess the guys at Futuremark are thinking that no good deed goes unpunished.
Computer junkies sure are a hard group to satisfy.
Old 02-14-03, 10:27 AM   #94
ElMoIsEviL
Registered User
Join Date: Feb 2003
Location: Canada
Posts: 6

I agree; from what I see, 3DMark03 is very video card dependent.

I like this, seeing as we're essentially comparing our video cards. And in all honesty, a Pentium 4 2.4GHz with a TNT2 M64 video card will not be faster than an Athlon 900MHz with a Radeon 9500 Pro.

It's a synthetic benchmark that actually uses some of the future options that will be available. The thing is, games will never be fully DX9.0; they will always keep some elements of DX8.1 and DX7.0a.

Unreal 2 is mostly a DX7.0a game with some features, like cube mapping, that are DX8.0; they only raised the polygon counts.
Which is why it's good to see Futuremark keep some DX7.0-level tests in their benchmark suite.

Are nVidia cards at a disadvantage? Yes. Because they're technologically inferior on the DirectX feature side (not the GeForceFX, but all the ones that came before), they have issues running games that are built on Pixel Shader 1.4.

It might sound funny, but the GeForce4 Ti is technologically inferior even to the R200 (Radeon 8500) when it comes to pixel shaders, anisotropic filtering, and overall image quality. The GeForce4 Ti does, however, possess a stronger vertex shading engine, seeing as it uses two of them in parallel, as well as having higher clock speeds and a better memory controller (a crossbar memory controller).

So before we judge the performance of the GeForce4 Ti in these tests, we have to agree that it does not have the technological support to achieve higher scores.

The difference between the GeForceFX and all the older GeForce cards is staggering. It's a completely redesigned architecture that was built to be very ATi-like with respect to futureproofing.
The Radeon 9700 Pro, on the other hand, is very nVidia-like in its brute force, using a better memory controller (the best in the industry) and having a better performance-per-MHz ratio (astounding; can you imagine an R300 clocked at 500MHz?).

Take it or leave it, 3DMark03 is here to stay. It's now the dawn of cinematic rendering, thanks to the R300.
__________________
DFI Lanparty UT NF4 SLI-DR | AMD Athlon64 X2 4800+ @ 3.2GHz (291x11)(Swiftech MCW5002-64T 226W Pelt) | 2GB OCZ Platinum EL XTC PC4000 DDR 1:1 @ 3-3-3-8 3.0v! | ATi Radeon X1900XTX 512MB (Alphacool NexXxoS NVXP-3 Waterblock) | Sound Blaster X-Fi FaT@l1Ty FPS Sound Card | 2x74GB WD Raptor 10,000 RPM RAID0 | 2x250GB Maxtor Maxline III 7,200 RPM RAID0 | 2x74GB Quantum Atlas III SCSI 10,000 RPM | SuperCooled: Enterprise Custom Cooling

Asus A8R-MVP /vmod | AMD Athlon64 X2 3800+ @ 2.8GHz (10x280) | 2x1024MB PQI PC4200 Performance DDR (1:1) | Asus EN7900GTX 512MB | Sound Blaster X-Fi Xtreme Music Sound Card | 1x250GB Seagate 7200.9 8MB Cache HD
Old 02-14-03, 09:27 PM   #95
PreservedSwine
I'm always hungry
Join Date: Aug 2002
Posts: 548

Also found this funny...
Quote:
Finally, the choice of pixel shaders in game tests 2 and 3 is also odd. These tests use ps1.4 for all the pixel shaders in the scenes. Fallback versions of the pixel shaders are provided in ps1.1 for hardware that doesn't support ps1.4. Conspicuously absent from these scenes, however, are any ps1.3 pixel shaders. Current DirectX 8.0 (DX8) games, such as Tiger Woods and Unreal Tournament 2003, all use ps1.1 and ps1.3 pixel shaders. Few, if any, are using ps1.4.
Think someone should tell him that Tiger Woods and UT2K3 support PS1.4?
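
For what it's worth, the fallback the quote describes is just a capability check: a renderer picks the highest pixel shader version the card reports and eats extra passes when it has to. Here's a hedged sketch; the version table and pass counts are illustrative, not 3DMark03's actual renderer:

Code:
# Illustrative ps1.4 -> ps1.1 fallback. Real code would query the
# Direct3D device caps for the maximum supported pixel shader version.

SHADER_PATHS = {
    1.4: 1,  # ps1.4 can collapse the per-light work into fewer passes
    1.1: 3,  # the ps1.1 fallback needs extra passes for the same effect
}

def pick_shader_path(max_ps_version):
    """Choose the highest supported pixel shader path and its pass count."""
    for version in sorted(SHADER_PATHS, reverse=True):
        if max_ps_version >= version:
            return version, SHADER_PATHS[version]
    raise RuntimeError("no supported pixel shader path")

print(pick_shader_path(1.4))  # Radeon 8500/9700 class: (1.4, 1)
print(pick_shader_path(1.3))  # GeForce4 Ti class falls back to: (1.1, 3)

That's why a ps1.3 card gets no ps1.3 path: with none authored, it drops all the way to ps1.1 and pays for the extra passes.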
Old 02-15-03, 09:33 AM   #96
MikeC
Administrator
Join Date: Jan 1997
Location: Virginia
Posts: 7,165

Futuremark's Response to 3DMark03 Criticism

http://www.nvnews.net/vbulletin/show...&threadid=7594