nV News Forums > Hardware Forums > Benchmarking And Overclocking
Old 02-12-03, 01:09 AM   #13
Nv40
Agent-Fx
 
Join Date: Aug 2002
Location: everywhere
Posts: 2,216

Quote:
Originally posted by jAkUp
Supposedly this is the GeForce FX performance in 3DMark 2003.
Check out the Pixel Shader 2.0 score....


Does anyone know if ATI was running the pixel shader test at 96-bit precision (fp24) and Nvidia at 128-bit (fp32)?

It looks to me like the same issue as Doom 3, where Nvidia runs very slowly at 128-bit and very fast at 64-bit. Wouldn't Nvidia's Cg help in the PS test?
Old 02-12-03, 01:27 AM   #14
Bigus Dickus
GF7 FX Ti 12800 SE Ultra
 
Join Date: Jul 2002
Posts: 651

I'm not sure what precision the NV30 is running at, but I'm fairly sure the 9700 was using 96-bit. As far as I know, the 9700 always runs at 96-bit floating-point precision internally (fp24 per component, across four components), regardless of input format.

I think the NV30 has a choice of int12, plus both fp16 and fp32 (64-bit and 128-bit across four components). My guess is that the NV30 is running at 128-bit here in the 2.0 shader test, and I'd also guess that nVidia optimized the latest set of drivers to run the first four "game tests" in either int12 or fp16.
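
For reference, those three precisions are exactly what Cg exposes as data types on the NV30 profile: fixed is 12-bit fixed point (int12), half is fp16, and float is fp32. Here's a minimal sketch of a fragment program just to illustrate the choice; the sampler name and constants are made up, not from 3DMark:

[code]
// Sketch of a Cg fragment program showing NV30's three precision types
// (fp30 profile). fixed = int12, half = fp16, float = fp32.
float4 main(float2 uv : TEXCOORD0,
            uniform sampler2D baseMap) : COLOR
{
    fixed4 base   = tex2D(baseMap, uv);                  // int12: fastest, lowest precision
    half4  lit    = base * 0.75h;                        // fp16: the "64-bit" half-precision path
    float4 result = lit + float4(0.1, 0.1, 0.1, 0.0);    // fp32: the full "128-bit" path
    return result;
}
[/code]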
__________________
IMO, Mr. Derek Smart is a hypocrite: Only someone who is either (a) lying, (b) ashamed of their products, or (c) just plain ashamed, would hesitate to give out some simple and straightforward information. - Derek Smart, Ph.D.
Old 02-12-03, 01:43 AM   #15
ChrisW
"I was wrong", said Chris
 
Join Date: Jul 2002
Location: standing in the corner!
Posts: 620

Quote:
"That means Nvidia has to expend effort to make sure it runs well on our hardware. All that energy that we spend doesn't benefit the user. None. Zero. All that effort doesn't go to benefit any game, either. That's kind of depressing."
I can't believe I just read that! Isn't that exactly what ATI has had to do all this time? Hasn't ATI had to spend all their time making games that were optimized for nVidia cards work correctly on theirs? How can he complain that it wasn't optimized for their cards?
Old 02-12-03, 03:12 AM   #16
PsychoSy
 
Join Date: Jul 2002
Location: Monroe, MI
Posts: 489

There's a saying in Silicon Valley...

There are lies, damned lies, and benchmarks.

__________________
[b][i]A man's ambition must be small,
To write his name on a s**t-house wall.[/i][/b]
Old 02-12-03, 03:49 AM   #17
Smokey
Team Rainbow
 
Join Date: Jul 2002
Location: FRANCE
Posts: 2,273

Quote:
Originally posted by jAkUp
Supposedly this is the GeForce FX performance in 3DMark 2003.
Check out the Pixel Shader 2.0 score....

Yeah, and look at game test 4, "heavy use of Pixel Shader 2.0 and Vertex Shader 2.0". So why is the GF FX faster even though it's on a slower CPU? Oh, and don't look at the final score either.
__________________
HTPC/Gaming
| Hiper Type R 580W PSU
| Intel Q9550 @ 4GHz | Gigabyte EP45 UD3R |4x 1024MB OCZ Reaper PC9200 DDR2 | Seagate 320GB/ Maxtor 320GB/ Maxtor 500GB HDD|Sapphire HD5850 | Creative SB X-Fi Titanium Pro | Harmon Kardon AVR135 reciever | Jamo S718 speakers | 42" Plasma 720p (lounge room)-Samsung P2450H (bedroom)
Old 02-12-03, 05:03 AM   #18
Hanners
Elite Bastard
 
Join Date: Jan 2003
Posts: 984

Quote:
Originally posted by Smokey
Yeah, and look at game test 4, "heavy use of Pixel Shader 2.0 and Vertex Shader 2.0". So why is the GF FX faster even though it's on a slower CPU? Oh, and don't look at the final score either.
I guess game test 4 is still doing a fair few other things apart from using pixel/vertex shaders, whereas the feature tests are almost exclusively using the features they are meant to test.
__________________
Owner / Editor-in-Chief - Elite Bastards
Old 02-12-03, 07:34 AM   #19
scott123
Registered User
 
Join Date: Sep 2002
Location: USA
Posts: 473

Obviously Nvidia (once again) chose to make their comments at a rather awkward time, but on the whole I agree with what they are saying. I use 3DMark to verify that nothing has changed with my computer since the last test, as a way to ensure everything is OK. Beyond that, I have never seen a correlation between better scores from new drivers and actual games that play faster. In every case where the 3DMark score goes up, the games I play stay the same.

I can see where they are coming from, because being a slave to 3DMark when making new drivers is simply an ugly rat race to get into. When new drivers are developed, their purpose should be broad-based, and if they only improve the score of a benchmark, then why spend the resources in the first place?

3DMark has become somewhat perverted, in the sense that it is now an overclocker's heaven, where folks ramp their computers up to the limit just to achieve the highest score. The reality is that these systems, in that state, wouldn't be able to give you more than 10 minutes of gameplay.

Scott
__________________
ASUS P5B Deluxe
Intel Core Duo 6600
eVGA 8800GTX
Creative Xfi
Cooler Master CM Stacker STC-T01
Old 02-12-03, 08:02 AM   #20
PsychoSy
 
Join Date: Jul 2002
Location: Monroe, MI
Posts: 489

I agree.

Since saturnotaku's not around at the moment, I'll go ahead and recite "The Script" in his absence: You can't FRAG with 3DMark!

3DMark is something that I use once in a blue moon just to get a general ballpark figure. I sure as hell don't accept its scores as gospel. It's a benchmark utility, not a game. Too many people, however, have a fetish for the darned thing, and the moment they lose 200+ points, they're ready to pop a few Prozacs and check themselves into a damned 12-step program.

That's it: 3DMark buffs need their own official brick-and-mortar support group.

Let's call it "L33t|\|355 4|\|0|\|y|\|\0u5" (that's "Leetness Anonymous" for all you folks who require a translation; I'm not a leetspeak wiz by any sense of the word, so I probably butchered the hell out of that attempt anyway).

I can see it now...

""Hi...name's Dave...I used to be supaleet..."

HI, DAAAAAAAVE!!!

"Thanks...uh, I lost 748 points in 3DMark today and I'm depressed. Hey Carmack, pass me one of those Ativans and a Zoloft, too, bro!"

Good grief!
__________________
[b][i]A man's ambition must be small,
To write his name on a s**t-house wall.[/i][/b]

Old 02-12-03, 08:19 AM   #21
jbirney
Registered User
 
Join Date: Jul 2002
Posts: 1,430

The problem I have with nV about this stems from the past. Ask yourself: has 3DMark, in any of its old forms, ever been a decent benchmark for how cards perform in other games? No, it never has. So why is nV all of a sudden so upset with it? I mean, last year nV thought it was OK to use their performance analyzer to recommend video cards based on 3DMark scores, but now it's not accurate? Why did nV say it was OK then, even though they/we knew that 3DMark was not representative of real-world performance?

To me, nV is sounding like a spoiled brat that does something like this whenever they can't get everything their way. Good grief, Charlie Brown.....
Old 02-12-03, 08:24 AM   #22
Nutty
Sittin in the Sun
 
Join Date: Jul 2002
Location: United Kingdom
Posts: 1,835

Quote:
Everyone knows that the move in future games is AWAY from Multi-texture games, and towards Single texture games with lots of shaders.

NVidia themselves have been pushing this idea right along with everyone else. I have been in countless discussions about this at B3D and other websites...
ROFLMAO!!!!

Suuuuuure, it is. That's why these new cards can access up to 16 textures in one pass, huh?

The future is using _MORE_ textures, not less. Pray tell, how do you get gloss maps, specular maps, bump maps, detail maps, base maps, anisotropic lighting maps, etc., in single-texture games?!
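
To make the point concrete, here's a rough sketch of what a single-pass multi-texture fragment shader looks like in Cg; the map names and the combine math are purely illustrative, not taken from any real game:

[code]
// Illustrative single-pass Cg fragment shader combining four maps.
float4 main(float2 uv       : TEXCOORD0,
            float3 lightVec : TEXCOORD1,      // tangent-space light vector
            uniform sampler2D baseMap,
            uniform sampler2D bumpMap,
            uniform sampler2D glossMap,
            uniform sampler2D detailMap) : COLOR
{
    float3 N    = normalize(tex2D(bumpMap, uv).rgb * 2.0 - 1.0);   // unpack bump-map normal
    float  diff = saturate(dot(N, normalize(lightVec)));           // per-pixel diffuse term
    float4 base = tex2D(baseMap, uv) * tex2D(detailMap, uv * 8.0); // base tinted by detail map
    float4 spec = tex2D(glossMap, uv) * pow(diff, 16.0);           // gloss-masked highlight
    return base * diff + spec;                                     // four texture fetches, one pass
}
[/code]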
Old 02-12-03, 08:28 AM   #23
scott123
Registered User
 
Join Date: Sep 2002
Location: USA
Posts: 473

Nvidia may not have the market share this time next year, but when the leading graphics card maker bows out and doesn't sanction your benchmark (along with some leading websites), you have problems. I think Futuremark has problems right now.

Scott
__________________
ASUS P5B Deluxe
Intel Core Duo 6600
eVGA 8800GTX
Creative Xfi
Cooler Master CM Stacker STC-T01
Old 02-12-03, 08:48 AM   #24
jbirney
Registered User
 
Join Date: Jul 2002
Posts: 1,430

Quote:
Originally posted by Nutty
ROFLMAO!!!!

Suuuuuure, it is. That's why these new cards can access up to 16 textures in one pass, huh?

The future is using _MORE_ textures, not less. Pray tell, how do you get gloss maps, specular maps, bump maps, detail maps, base maps, anisotropic lighting maps, etc., in single-texture games?!
Nutty, when/if developers start using more pixel shaders, the need for the TMUs that help with multi-texturing will soften. The trend is moving away from multi-textured games.
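
To illustrate the trade-off with a hedged sketch (neither snippet is from a real engine; the table name and exponent are made up): where DX7-class hardware often baked a specular falloff into a lookup texture and burned a TMU fetching it, a PS 2.0-class shader can simply compute it in math.

[code]
// Old multi-texture style: falloff baked into a 1D lookup texture (costs a TMU fetch).
float4 specLookup(float ndoth : TEXCOORD0,
                  uniform sampler1D powerTable) : COLOR
{
    return tex1D(powerTable, ndoth);        // table holds precomputed pow(x, 16)
}

// PS 2.0 style: the same falloff computed in the shader ALU, no texture needed.
float4 specCompute(float ndoth : TEXCOORD0) : COLOR
{
    float s = pow(saturate(ndoth), 16.0);   // math replaces the lookup
    return float4(s, s, s, 1.0);
}
[/code]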