nV News Forums > Hardware Forums > Benchmarking And Overclocking
Old 02-11-03, 07:56 PM   #61
Chalnoth
Registered User
 
Join Date: Jul 2002
Posts: 1,293
Default Re: No damage control

Quote:
Originally posted by T-Spoon
But check the PS2.0 test... amazing.. what a difference.
This coincides well with other PS 2.0 tests we've seen with the FX. I say these tests should be ignored until availability (as an aside, I also don't recommend preordering in general... you're more likely to get the first batch of hardware, which is much more likely to have significant problems). Since the PS 2.0 score is so low, it only makes sense that this is what's holding back the FX in current benchmarks. Again, wait until the FX is available before judging from these benches. If the PS 2.0 performance hasn't changed significantly by then, then worry. Right now, I wouldn't.
__________________
"Physics is like sex. Sure, it may give some practical results, but that's not why we do it." - Richard P. Feynman
Chalnoth is offline
Old 02-12-03, 06:20 AM   #62
T-Spoon
Registered User
 
Join Date: Jan 2003
Location: The Netherlands
Posts: 180
Default Re: Re: No damage control

Quote:
Originally posted by Chalnoth
This coincides well with other PS 2.0 tests we've seen with the FX. I say these tests should be ignored until availability (as an aside, I also don't recommend preordering in general... you're more likely to get the first batch of hardware, which is much more likely to have significant problems). Since the PS 2.0 score is so low, it only makes sense that this is what's holding back the FX in current benchmarks. Again, wait until the FX is available before judging from these benches. If the PS 2.0 performance hasn't changed significantly by then, then worry. Right now, I wouldn't.
If the PS 2.0 test is holding the FX back, then why is GT4 better on the FX than on the 9700Pro? GT4 also uses PS2.0...

Taken from Rage3D 3DMark Analysis:

Quote:
Technical Summary
This DX9 test makes use of 2.0 pixel and vertex shaders. Each leaf and blade of grass is a separate object that moves independently. The leaf movements are calculated using 2.0 vertex shaders, using the new DX9 sincos instruction. The grass movement is modelled using 1.1 vertex shaders.

The lake is rendered with 2.0 pixel shaders. Multiple texture stages are used. It uses a ripple map, two reads of a normal map, a reflection map, a refraction map, and a reflection cube map for reflection from more distant objects. It also uses a transparency map and calculates per-pixel Fresnel.
Both these tests use PS 2.0, and yet in one of them the FX is way behind the 9700Pro... To me it seems that the latest Dets are optimized for the game tests in 3DMark2k3, since those are the ones used for the final score.
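(As an aside, for anyone curious about the "per-pixel Fresnel" in that technical summary: water shaders typically don't evaluate the exact Fresnel equations per pixel but use Schlick's approximation instead. A minimal illustrative sketch in Python — whether 3DMark03 itself uses Schlick is an assumption, and the F0 value for water is likewise assumed from its refractive index:)

```python
# Schlick's approximation of the Fresnel reflectance term, the usual
# cheap stand-in for the exact Fresnel equations in water shaders.
# F0 is the reflectance at normal incidence; ~0.02 is the commonly
# assumed value for water (refractive index ~1.33).

def fresnel_schlick(cos_theta, f0=0.02):
    """Fraction of light reflected for a given view angle.

    cos_theta: cosine of the angle between the view vector and the
               surface normal (1.0 = looking straight down at the lake).
    """
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

# Looking straight down: almost everything refracts (reflectance = F0).
print(fresnel_schlick(1.0))
# Grazing angle: the surface acts like a mirror (reflectance -> 1.0).
print(fresnel_schlick(0.0))
```

This is why the lake blends its refraction map when viewed from above but shows mostly the reflection map toward the horizon: the shader evaluates this term per pixel and uses it as the blend factor.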
T-Spoon is offline
Old 02-12-03, 07:24 AM   #63
jbirney
Registered User
 
Join Date: Jul 2002
Posts: 1,430
Default

Quote:
Originally posted by Nemesis77
Because the scenes in MP had more overdraw than the scenes in 3DMark? Because the Kyro wasn't as good with polygons? There could be numerous reasons. People demanded that 3DMark use a real-life game engine. It does. But now people are demanding that it must also use scenes identical to those in some games?

The point is that the game tests in 3DMark (Dragothic, Lobby, Race) could very well be from a "real" game. They are not "ungame"-like; they are like real games. And they are done using a real game engine. Of course there is variation

No, that's not the point. The point is that even though they are game-like and use a real game engine, they still don't come anywhere close to real-world performance, and that's the whole issue here. I would really like it if they dropped all four of the game tests and just posted the feature test scores with no weighting applied. That way we would get some detailed info instead of crying about xxx 3DMarks lost with yyy driver update.
jbirney is offline
Old 02-12-03, 08:04 AM   #64
Nemesis77
Registered User
 
Join Date: Aug 2002
Posts: 114
Default

Quote:
Originally posted by Chalnoth
Not or. And.

These game engines are used as the basis for a number of different games. I think that the optimal one-program benchmark would use both a variety of different engines, and a variety of game scenarios for each engine.

Yes, it would take quite a long time to run this benchmark.
What you are asking for is impossible. Futuremark would have to license all those engines, and that costs a lot of money. So they chose to use MAX-FX instead. What you are suggesting is probably ideal, but unrealistic.
Nemesis77 is offline
Old 02-12-03, 08:06 AM   #65
Nemesis77
Registered User
 
Join Date: Aug 2002
Posts: 114
Default

Quote:
Originally posted by jbirney
No, that's not the point. The point is that even though they are game-like and use a real game engine, they still don't come anywhere close to real-world performance, and that's the whole issue here. I would really like it if they dropped all four of the game tests and just posted the feature test scores with no weighting applied. That way we would get some detailed info instead of crying about xxx 3DMarks lost with yyy driver update.
What is "game-like" performance? I mean, the game tests in 3DMark are games that you just can't control. They could very well be "real" games.

In some games card X gets good performance; in other games the same card gets poor performance. That's just the way it is. Varying FPS is normal, so how can you say that the differences between MP and 3DMark are "ungamelike"?
Nemesis77 is offline
Old 02-12-03, 08:59 AM   #66
Chalnoth
Registered User
 
Join Date: Jul 2002
Posts: 1,293
Default

Quote:
Originally posted by Nemesis77
What you are asking for is impossible. Futuremark would have to license all those engines, and that costs a lot of money. So they chose to use MAX-FX instead. What you are suggesting is probably ideal, but unrealistic.
I'd say it's closer to unlikely than impossible. It would probably be easy to do if they stopped selling a version of 3DMark. And since 3DMark is obviously not a high-grossing product, it may be possible to work out special licensing arrangements and still sell a version of the product.
__________________
"Physics is like sex. Sure, it may give some practical results, but that's not why we do it." - Richard P. Feynman
Chalnoth is offline
Old 02-12-03, 04:17 PM   #67
StealthHawk
Guest
 
Posts: n/a
Default

Quote:
Originally posted by Nemesis77
What is "game-like" performance? I mean, the game tests in 3DMark are games that you just can't control. They could very well be "real" games.

In some games card X gets good performance; in other games the same card gets poor performance. That's just the way it is. Varying FPS is normal, so how can you say that the differences between MP and 3DMark are "ungamelike"?
The point is that while they could be games, they aren't games. As has already been pointed out, even the Max Payne game test in 3DMark2001 didn't reflect real-world Max Payne performance. So it's obvious that either the engines being used are not indicative of games, or the scenes being portrayed are not examples of real-world games. Either way, something should be done to rectify this.
Old 02-12-03, 05:59 PM   #68
abb
Registered User
 
Join Date: Feb 2003
Posts: 27
Default

Well, I just ran 3DMark 2003 on both my Radeon 9700 Pro and my Ti4600. I scored an OK 5145 with my 9700 and a disgusting 1689 with my Ti4600. Remember that right now the Ti4600 is Nvidia's flagship, with the GFFX canned. No wonder Nvidia does not approve of 3DMark03. It makes Nvidia look pitiful, and it will for some time, because ATI is advancing to their R350 next month while Nvidia can't even get off the ground. I hate to say it, but Nvidia now has the same future as 3DFX had when it was trying to launch the 6000: none. Oh well, let's hope that the success of their Nforce2 chipset will keep their heads above water, but I think their graphics division is being reduced to lower-end cards and is no longer a top performer like in the past. ATI will hold the top spot for quite a while, and actually that is not a bad thing, since ATI has improved 110% with their drivers to produce a stable, fast card with excellent image quality.
__________________

Athlon XP 2400+ (11.5x188fsb)
ThermalRight SLK-800
A7N8X Deluxe
2x256mb Corsair XMS PC3500
ATI Radeon 9700 Pro (Cat's 3.1)
Soundblaster Audigy2 Platinum
Promise TX2000 Raid Controller Card
Raid 0: 2x Maxtor 740DX 80GB ATA133
Pioneer 16x DVD Slot Load
Pioneer A04 DVD-RW
LiteOn 52x24x52 CDRW
Iomega Zip100
Enermax EG651 530W power Supply
OS: Windows XP Pro SP1
Thermaltake A6000A Xaser II Case
abb is offline

Old 02-12-03, 06:02 PM   #69
Chalnoth
Registered User
 
Join Date: Jul 2002
Posts: 1,293
Default

Quote:
Originally posted by StealthHawk
The point is that while they could be games, they aren't games. As has already been pointed out, even the Max Payne game test in 3DMark2001 didn't reflect real-world Max Payne performance. So it's obvious that either the engines being used are not indicative of games, or the scenes being portrayed are not examples of real-world games. Either way, something should be done to rectify this.
It's not just about the scenes being portrayed, but about how the engine itself is built. There are many different ways to get to a certain output, or, more often, a similar output.

As a quick example, the very early DOOM3 alpha builds supposedly run many times faster than the Game 2 and Game 3 tests.

Oh, and abb, it's only the Ultra that is going to be in limited availability. The non-Ultra should be widely available. The entire GeForce4 line should be on its way out by around April or May (once the NV31/34 reach the market).
__________________
"Physics is like sex. Sure, it may give some practical results, but that's not why we do it." - Richard P. Feynman
Chalnoth is offline
Old 02-12-03, 06:56 PM   #70
SnakeEyes
Registered User
 
Join Date: Jul 2002
Location: Lake Zurich, IL
Posts: 502
Default

Quote:
Originally posted by Chalnoth
Oh, and abb, it's only the Ultra that is going to be in limited availability. The non-Ultra should be widely available. The entire GeForce4 line should be on its way out by around April or May (once the NV31/34 reach the market).
It'd be extra nice if nVidia finally moved a DX8+ class card down to the low-end mainstream level (one that still has decent performance, not the GF2 / GF4MX cards); something like a Ti4200-8x. Unless, of course, either the NV31 or NV34 IS that card. I highly doubt they're going down that far with this core so soon, unless they've castrated one or both of those chips far more than the rumors are saying. The MX cards have always lost quite a bit of feature support for whatever version of DX was out when they were launched, and I'd hate to see nV go the same way now. They really need cards all across the price spectrum if they want to compete with ATI, who is busily moving more fully featured cards into slots throughout the range.
__________________
Snake-Eyes
SnakeEyes is offline
Old 02-12-03, 09:35 PM   #71
Chalnoth
Registered User
 
Join Date: Jul 2002
Posts: 1,293
Default

No, Snakeeyes, only the GeForce4 MX used a lesser programming-side featureset than its higher-end siblings.

The GeForce2 MX had the exact same featureset, and so did the Vanta and M64 (not that there was much to support in the latter...).
__________________
"Physics is like sex. Sure, it may give some practical results, but that's not why we do it." - Richard P. Feynman
Chalnoth is offline
Old 02-12-03, 10:56 PM   #72
SnakeEyes
Registered User
 
Join Date: Jul 2002
Location: Lake Zurich, IL
Posts: 502
Default

Sorry, my mistake. But my point had more to do with the fact that the GF2MX was the entry / mid-level chip even after the GF2 GTS/Ti went away and the GF3 was here. Once the GF4 came out, the MX version of THAT became the entry / mid-level card, and it too was basically crippled feature-wise compared to its higher-end siblings (just like the 2MX was throughout the GF3's lifetime).

I had hoped that nVidia would release a GF3MX type of card when the GF3 Ti came out, so that the full feature set would be exposed to the entry-level crowd (even if it might not generally have been very fast, programmers could code for the extra features in their games). I thought when the GF4MX first arrived that it would finally signal that change, but then we found out that it was basically an enhanced GF2 core with the AA engine added (not that AA was bad, but where were the shaders?).

I really hope that this round of cards will actually include low- and mid-range products supporting the full feature set of the same API versions that the NV30 itself supports. ATI has been killing nVidia in this area, since they have cards in the low and mid range supporting the full spectrum of capabilities, even if they might be slow about it.

I have a feeling that nVidia's griping about 3DMark2003 has more to do with the probability that their low- and mid-range lineups are going to look very poor indeed when they can't run at least one of the games the score is based on. In other words, while their cards looked great in the benchmark, they were happy. Now that they're being challenged, they are resorting to badmouthing one of the companies that contributed to their past successes.
__________________
Snake-Eyes
SnakeEyes is offline