nV News Forums > Website Related > Feedback Forum

Old 02-13-03, 01:25 AM   #73
Kruno
TypeDef's assistant
 
Join Date: Jul 2002
Location: Australia
Posts: 1,641

Aim: To maintain system stability while improving performance.

I have drivers installed.
I tweaked some options in my registry.
I ran 3DMark 2003 (for the flanging last time, it is not "2k3") and I scored 500 points higher.
I then decided to run a game.
My in-game framerates were 50fps higher than previously.

Conclusion: I have found no system instabilities, and my framerates went through the roof.

__________________
"Never before has any voice dared to utter the words of that tongue in Imladris, Mr. Anderson" - Elrond LOTR
Old 02-13-03, 03:37 AM   #74
FrgMstr
Registered User
 
Join Date: Dec 2002
Location: The Great State of Texas
Posts: 47

EDIT: Nevermind...
__________________
Kyle Bennett
Editor-in-Chief @ HardOCP.com

Last edited by FrgMstr; 02-13-03 at 03:40 AM.
Old 02-13-03, 04:33 AM   #75
hordaktheman
Hans... boobie...
 
 
Join Date: Aug 2002
Location: Iceland
Posts: 273

Goddemet! It really is quite simple:

If you were going to upgrade EITHER your cpu OR your vid card, which should you upgrade in order to play future games?

The Doom3 alpha suggests you should upgrade your cpu, while 3dmark03 suggests you should upgrade your vid card.

Doom3 is a future game, while 3dmark03 is not. Conclusion: Doom3 is a better benchmark to represent future games' performance.

Another example: When UT2003 was released I was running on an Athlon 1133/Geforce DDR rig. 3dmark2001 suggested I should upgrade my vid card in order to play it more effectively. Which I did.

Did it improve my performance? Yes, but only a little. Did it live up to the twofold performance gain that 3dmark2001 suggested? Not by a long shot.

HOWEVER, last week I upgraded my cpu to an AthlonXP 2000. My 3DMark score only went up by about 1000 points, BUT my UT2003 performance skyrocketed. Meaning that 3dmark2001 is/was a crappy benchmark for representing future (at the time) games.

Just for kicks, I even put my old GeForce 1 card in my computer. UT2003 runs WAY better on an AthlonXP 2000/GeForce 1 DDR setup than on an Athlon 1133/Ti4200 setup.

Which leads us back to the 3dmark03/Doom3 dilemma: Having checked the alpha, I know for a fact that Doom3 will run better on a top-of-the-line cpu with a midrange vid card than on a midrange cpu with a top-of-the-line vid card. 3dmark03 suggests the exact opposite, which is, quite simply, wrong.

It really is as simple as that.
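The upgrade logic above boils down to a bottleneck model: each frame has to finish both its CPU work and its GPU work, and the slower of the two sets the framerate. A toy sketch (all timings here are invented for illustration, not measured from Doom3 or UT2003):

```python
# Toy bottleneck model: the slower of the CPU and GPU stages sets the
# frame time, so upgrading the component that is NOT the bottleneck
# barely moves the framerate. All numbers are invented for illustration.

def fps(cpu_ms, gpu_ms):
    """Approximate framerate when the slower stage dominates each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# A CPU-bound game, like Doom3 in the argument above:
print(fps(cpu_ms=25.0, gpu_ms=10.0))  # 40.0 fps, limited by the CPU
print(fps(cpu_ms=25.0, gpu_ms=5.0))   # 40.0 fps: faster vid card, no gain
print(fps(cpu_ms=12.5, gpu_ms=10.0))  # 80.0 fps: faster CPU doubles it
```

Which is why the GeForce 1/AthlonXP combo can beat the Ti4200/Athlon 1133 one: swapping the vid card only changed the stage that wasn't the bottleneck.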
__________________
It is the mark of an educated mind to be able to entertain a thought without accepting it.

-Aristotle
Old 02-13-03, 04:40 AM   #76
Kruno
TypeDef's assistant
 
Join Date: Jul 2002
Location: Australia
Posts: 1,641

When I upgraded my CPU from 800MHz to 1.8GHz, I saw no performance benefit. I was using a GeForce 3, and FSAA really dived on that card with newer games.

When I upgraded from my NV20 to an R300, my performance jumped more than sixfold.
__________________
"Never before has any voice dared to utter the words of that tongue in Imladris, Mr. Anderson" - Elrond LOTR
Old 02-13-03, 04:51 AM   #77
hordaktheman
Hans... boobie...
 
 
Join Date: Aug 2002
Location: Iceland
Posts: 273

Yes, I know what you mean, but the point is that at least you have the possibility to turn off the FSAA.

On a high-end vid card/midrange cpu, you can turn on high detail settings without an additional performance drop, but your performance is crappy anyway, whether it's at low detail or high.

On a midrange vid card/high-end cpu, you at least have the option of lowering your resolution/detail, or turning off FSAA.

My point is that if you had upgraded your NV20 to an R300 on your 800MHz cpu, your performance would have sucked just as badly, while in your case you had the OPTION to improve your framerates by turning off FSAA. If you had turned off the FSAA on your 1.8GHz cpu, your performance WOULD HAVE improved.

That's why I'm saying that 3dmark2001/2003 are deceiving.
__________________
It is the mark of an educated mind to be able to entertain a thought without accepting it.

-Aristotle
Old 02-13-03, 04:55 AM   #78
Kruno
TypeDef's assistant
 
Join Date: Jul 2002
Location: Australia
Posts: 1,641

Quote:
My point is that if you had upgraded your NV20 to an R300 on your 800MHz cpu, your performance would have sucked just as badly, while in your case you had the OPTION to improve your framerates by turning off FSAA. If you had turned off the FSAA on your 1.8GHz cpu, your performance WOULD HAVE improved.
Sorry, I wasn't talking about 3dmark in that post.

Anyway, I only play with FSAA. I don't consider turning off AA an option, despite the option being there in the driver control panel.

My brother's P3 800MHz takes just as large a dip with FSAA as my P4 1.8GHz did.
He also doesn't consider turning off FSAA an option.
Though at least his games are still playable.

I only care about FSAA+AF performance. Nothing less. I ignore all benchmarks that don't show FSAA+AF scores; they are as useless to me as toilet paper.
__________________
"Never before has any voice dared to utter the words of that tongue in Imladris, Mr. Anderson" - Elrond LOTR

Last edited by Kruno; 02-13-03 at 05:01 AM.
Old 02-13-03, 04:59 AM   #79
hordaktheman
Hans... boobie...
 
 
Join Date: Aug 2002
Location: Iceland
Posts: 273

No, I know you weren't, but the point is that you were already straining the vid card by running it with FSAA turned on. Had you turned FSAA off, your cpu upgrade would have been far more noticeable.

At the same time, while your vid card upgrade absolutely helped your FSAA framerates, it wouldn't have if you hadn't already upgraded your cpu.
__________________
It is the mark of an educated mind to be able to entertain a thought without accepting it.

-Aristotle
Old 02-13-03, 06:07 AM   #80
Smokey
Team Rainbow
 
 
Join Date: Jul 2002
Location: FRANCE
Posts: 2,273

Quote:
Originally posted by SnakeEyes
Smokey:

It goes back to what jbirney posted earlier (which I agree 100% with). Basically his point was that the final weighted score is worthless, but the tests themselves as well as their results are extremely useful.

Ignore the final '3DMark' score, and instead look at the details to see what the average framerates for the game scenes are. Those will tell you more about your card's performance than the score ever could. (For instance, getting 40+ fps on the 9700Pro vs. the FX's ~15 at the moment would tell me that for whatever that particular scene is testing, the FX is especially weak. The weighting of the scores to determine the final 3DMark combined score can hide things like that.)
Yes, but when I'm getting 2.6fps on GT2+3, I'm screwed whichever way you look at it, and getting my new cpu soon isn't really going to help.
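The point about a weighted composite hiding a weak subtest can be sketched in a few lines. The weights and framerates below are invented for illustration; they are not 3DMark03's actual scoring formula:

```python
# Invented weights and framerates -- NOT 3DMark03's real scoring formula.
# The point: one weighted number can rank a card higher even when one of
# its individual tests tanks.

def composite(fps_by_test, weights):
    """Collapse per-test framerates into a single weighted score."""
    return sum(fps_by_test[t] * weights[t] for t in weights)

weights = {"GT1": 0.4, "GT2": 0.3, "GT3": 0.3}

card_a = {"GT1": 100.0, "GT2": 40.0, "GT3": 40.0}  # solid everywhere
card_b = {"GT1": 160.0, "GT2": 15.0, "GT3": 15.0}  # fast GT1, weak elsewhere

print(composite(card_a, weights))  # about 64.0
print(composite(card_b, weights))  # about 73.0 -- higher score, worse worst case
```

The single number ranks card B above card A even though card B's worst scene, the one you actually notice in a game, is far slower, which is exactly why the per-test framerates matter more than the composite.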
__________________
HTPC/Gaming
| Hiper Type R 580W PSU
| Intel Q9550 @ 4GHz | Gigabyte EP45 UD3R |4x 1024MB OCZ Reaper PC9200 DDR2 | Seagate 320GB/ Maxtor 320GB/ Maxtor 500GB HDD|Sapphire HD5850 | Creative SB X-Fi Titanium Pro | Harmon Kardon AVR135 reciever | Jamo S718 speakers | 42" Plasma 720p (lounge room)-Samsung P2450H (bedroom)

Old 02-13-03, 07:30 AM   #81
jbirney
Registered User
 
 
Join Date: Jul 2002
Posts: 1,430

sebazve,

I think Brent and Kyle (don't know, so just my worthless 2 cents) have been wanting to get away from the 3DMark thing for some time now. Brent has posted in the B3D forums that he knows 3dmark2001 was not very useful for measuring card A vs. card B. Now, why they did not do this a while ago (stop using 3dmark2001's overall results in reviews) I don't know. But I don't think they are playing favorites. Sometimes when a new rev of a tool comes out, it's the best time to pitch it...
Old 02-13-03, 09:27 AM   #82
sebazve
 
Join Date: Jul 2002
Location: Montevideo, Uruguay
Posts: 421

Quote:
Originally posted by jbirney
sebazve,

I think Brent and Kyle (don't know, so just my worthless 2 cents) have been wanting to get away from the 3DMark thing for some time now. Brent has posted in the B3D forums that he knows 3dmark2001 was not very useful for measuring card A vs. card B. Now, why they did not do this a while ago (stop using 3dmark2001's overall results in reviews) I don't know. But I don't think they are playing favorites. Sometimes when a new rev of a tool comes out, it's the best time to pitch it...
OK then, end of story. I wasn't really attacking HardOCP or anything, since I like their reviews; I just wanted to know why the sudden decision just after NVIDIA's...
__________________
Signatures are a waste of bandwidth!
thanks rwolf!!!!! :-P
Old 02-13-03, 10:04 AM   #83
pelly
Registered User
 
 
Join Date: Jul 2002
Posts: 681

Quote:
I think Brent and Kyle (don't know, so just my worthless 2 cents) have been wanting to get away from the 3DMark thing for some time now.
If I may be so bold as to give my take on HardOCP's position... I can assure you that you are correct. We (especially Kyle) have been unsatisfied with 3DMark's illustration of real gaming environments for quite some time. Long before 3DMark2003 was in any appreciable form, Kyle was already contemplating an article to outline the points he has recently made. If you read the article, you will find that his points cover all benchmarks and do not solely attack 3DMark2003. Rather, the arrival of 3DMark2003 was merely a convenient means of emphasizing the points being made.

I honestly can't see why anyone would try to attack Kyle or HardOCP. At the end of the day, these articles and editorials are intended to benefit the consumer... For those grasping at straws and claiming that [H] is an NVIDIA fanboy... I must remind you of the dozens of times [H] has been on NVIDIA's case over an issue. In addition, you might remember that a Radeon 9700 Pro is in each review testbed and that we were less than "blown away" (ironic) by the GeForce FX.

In my opinion, many of us need to take a few steps back and look at the big picture. We need to lose the delusions of conspiracies and shady partnerships and realize that the situation at hand is very black and white. You have a large website with a big name in your corner...trying to keep the big vendors honest and ensure you get the best quality for your dollar.

I trust that this will help clarify the situation a bit and put things in the proper perspective...

Old 02-13-03, 11:55 AM   #84
batterbrain101
"TAZ LIKE!"
 
 
Join Date: Sep 2002
Location: Spokane, Washington, USA
Posts: 191

Quote:
Originally posted by sebazve
What I don't understand from you guys is that for years you have been using 3DMark as a benchmark, but now, since NVIDIA says it sucks, you think that too.
NVIDIA and ATI do spend time optimizing their drivers for 3DMark, so what???
It's not that 3DMark sucks; it's that drivers get optimized for increased performance in the benchmark while performance in games doesn't improve (not with the last dozen or so Det releases, anyway). I personally have no problem with 3DMark and the like. I paid for a video card so I can play games at full tilt with no issues. Who wants to spend serious cash on a card only to get home and find that while the benchmark ran just fine, that awesome game (or games) isn't running because the driver/game/hardware are having compatibility issues? That's always a nice feeling, right? Now you're thinking: great, X amount of money for this. Nice.
__________________
My village called, their Idiot is missing!

My rig:
Asus Maximus VI Hero
Intel i7 4770K@ 4.2 GHz
16GB G-Skill Trident X DDR3 2400 Ram
Corsair 750HX "Silver Certified" PSU
Corsair H70 Hydro CPU Cooler
2x PNY GTX 770 OC2 4GB in SLI
180 GB Intel 520 SSD
2 TB Barracuda HD, 2 TB WD Caviar Green
Logitech G-510 Keyboard
Windows 7 Professional 64 bit
3 Samsung 24" LED SyncMaster Monitors


All times are GMT -5.


Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.