nV News Forums > Hardware Forums > Benchmarking And Overclocking

Old 02-12-03, 11:04 PM   #73
Chalnoth
Registered User
 
Join Date: Jul 2002
Posts: 1,293
Default

The key here is that the GeForce2 MX, which had the full featureset of the GeForce2, carried the NV11 name. The GeForce4 MX, also of GeForce2 technology, carried the NV17 name.

The upcoming low-cost part (NV34) is an NV3x part. nVidia has not yet watered down the programmability of any part within a family.

This doesn't absolutely guarantee that the NV34 will have the full programmability of the NV30, but it comes close to it (I think it may have a few "lesser" functions removed...such as high-precision log/exp/sin/cos).
__________________
"Physics is like sex. Sure, it may give some practical results, but that's not why we do it." - Richard P. Feynman
Chalnoth is offline   Reply With Quote
Old 02-13-03, 12:25 AM   #74
Sazar
Sayonara !!!
 
Sazar's Avatar
 
Join Date: Jan 2003
Location: Austin, Texas
Posts: 9,297
Default

Quote:
Originally posted by Chalnoth
No, Snakeeyes, only the GeForce4 MX used a lesser programming-side featureset than its higher-end siblings.

The GeForce2 MX had the exact same featureset, and so did the Vanta and M64 (not that there was much to support in the latter...).
Dunno if you can really call the GF4 MX part of the same family as the GF4 Ti cards in anything but name, m8.

Though you're absolutely right about the lesser feature-set (for obvious reasons... look no further than the core)...

Sazar is offline   Reply With Quote
Old 02-13-03, 02:13 AM   #75
Hellbinder
 
Hellbinder's Avatar
 
Join Date: Nov 2002
Location: CDA
Posts: 1,510
Default

Nvidia's new drivers *cheat* in 3DMark03

http://discuss.futuremark.com/forum/...=5&o=0&fpart=1

Looks like the GFFX is winning because it isn't rendering any explosions and is missing other effects. Guess that's how they got so fast so quickly, yet made no change to the non-scoring benchmarks...

Don't you find it a little odd that somehow Kyle didn't notice that, or make any mention of it? I find it pretty odd.
Hellbinder is offline   Reply With Quote
Old 02-13-03, 03:17 AM   #76
Kruno
TypeDef's assistant
 
Join Date: Jul 2002
Location: Australia
Posts: 1,641
Default

Quote:
Originally posted by Hellbinder
Nvidia's new drivers *cheat* in 3DMark03

http://discuss.futuremark.com/forum/...=5&o=0&fpart=1

Looks like the GFFX is winning because it isn't rendering any explosions and is missing other effects. Guess that's how they got so fast so quickly, yet made no change to the non-scoring benchmarks...

Don't you find it a little odd that somehow Kyle didn't notice that, or make any mention of it? I find it pretty odd.
I love it when video cards don't render fx.
__________________
"Never before has any voice dared to utter the words of that tongue in Imladris, Mr. Anderson" - Elrond LOTR
Kruno is offline   Reply With Quote
Old 02-13-03, 06:01 AM   #77
Smokey
Team Rainbow
 
Smokey's Avatar
 
Join Date: Jul 2002
Location: FRANCE
Posts: 2,273
Default

Quote:
Originally posted by Hellbinder
Nvidia's new drivers *cheat* in 3DMark03

http://discuss.futuremark.com/forum/...=5&o=0&fpart=1

Looks like the GFFX is winning because it isn't rendering any explosions and is missing other effects. Guess that's how they got so fast so quickly, yet made no change to the non-scoring benchmarks...

Don't you find it a little odd that somehow Kyle didn't notice that, or make any mention of it? I find it pretty odd.
Read the whole thread; I didn't see anyone posting who was using a GeForce FX, did you? Also, have you ever heard of BETA? Come back when you have something better to post. I mean, you're talking about beta drivers, and the most anyone gained from those drivers was 200 points, whoopie!
__________________
HTPC/Gaming
| Hiper Type R 580W PSU
| Intel Q9550 @ 4GHz | Gigabyte EP45 UD3R |4x 1024MB OCZ Reaper PC9200 DDR2 | Seagate 320GB/ Maxtor 320GB/ Maxtor 500GB HDD|Sapphire HD5850 | Creative SB X-Fi Titanium Pro | Harmon Kardon AVR135 reciever | Jamo S718 speakers | 42" Plasma 720p (lounge room)-Samsung P2450H (bedroom)
Smokey is offline   Reply With Quote
Old 02-13-03, 07:14 AM   #78
Skuzzy
Bit Bumper
 
Join Date: Aug 2002
Location: Here
Posts: 782
Default

From a developer's point of view, I have considered all the current NVidia cards (sans NV30) to be DX7 cards.

They do not support PS1.4, which is a DX8 feature.

NV30 supports a subset of DX9. I don't understand why NVidia does this; it's not as if they did not know the DX9 specification. Yet they add things that DX does not support.

Supporting only a subset of the DX spec really creates problems for developers using DX to move on to the next level of graphics programmability.
Skuzzy is offline   Reply With Quote
Old 02-13-03, 04:37 PM   #79
StealthHawk
Guest
 
Posts: n/a
Default

Quote:
Originally posted by Skuzzy
From a developer's point of view, I have considered all the current NVidia cards (sans NV30) to be DX7 cards.

They do not support PS1.4, which is a DX8 feature.
PS1.4 is part of the DX8.1 spec; PS1.0-1.1 existed in the DX8.0 spec. This is one of the most ridiculous arguments I've ever heard, calling obvious DX8 hardware DX7. Maybe if you wanted to call the GF4 a DX8 part instead of a DX8.1 part, your argument could hold some water.

What about all the other DX8.1 cards? The SiS Xabre doesn't support PS1.4, and neither does the Matrox Parhelia. I don't remember if Trident's does either, but the point is kinda moot since they haven't released the card.

But seriously, why does nvidia take so much flak when no one other than ATI supports PS1.4?
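The version breakdown above amounts to a simple lookup: which DirectX revision first covered each pixel-shader version. A minimal illustrative sketch (the function name is made up; the cutoffs follow the DX8.0/DX8.1/DX9 spec history described in the post):

```python
def dx_generation(max_ps_version):
    """Classify a card by the highest pixel shader version it supports,
    given as a (major, minor) tuple."""
    if max_ps_version >= (2, 0):
        return "DX9"     # PS2.0 arrived with DirectX 9
    if max_ps_version >= (1, 2):
        return "DX8.1"   # PS1.2-1.4 were added in DirectX 8.1
    if max_ps_version >= (1, 0):
        return "DX8"     # PS1.0-1.1 date from DirectX 8.0
    return "DX7"         # no pixel shaders: fixed-function hardware

print(dx_generation((1, 4)))  # a Radeon 8500-class part
print(dx_generation((1, 1)))  # a GeForce3-class part
```

By this scheme, calling PS1.1-only hardware "DX7" is exactly the mislabeling being objected to here.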
  Reply With Quote
Old 02-14-03, 11:00 PM   #80
Shinri Hikari
Lantern in the dark
 
Shinri Hikari's Avatar
 
Join Date: Jan 2003
Location: nomadic
Posts: 175
Default

Quote:
Originally posted by Smokey
Read the whole thread; I didn't see anyone posting who was using a GeForce FX, did you? Also, have you ever heard of BETA? Come back when you have something better to post. I mean, you're talking about beta drivers, and the most anyone gained from those drivers was 200 points, whoopie!
You said what I was going to say.
Good post
__________________
Insanity by definition is the repeated attempts to get different results from doing the same thing repeatedly...
Shinri Hikari is offline   Reply With Quote

Old 02-14-03, 11:02 PM   #81
Shinri Hikari
Lantern in the dark
 
Shinri Hikari's Avatar
 
Join Date: Jan 2003
Location: nomadic
Posts: 175
Default

Since nVidia has pulled out of 3DMark, is this damage-control thread moot?
__________________
Insanity by definition is the repeated attempts to get different results from doing the same thing repeatedly...
Shinri Hikari is offline   Reply With Quote
Old 02-14-03, 11:47 PM   #82
gokickrocks
Registered User
 
Join Date: Nov 2002
Posts: 409
Default

Quote:
Originally posted by Smokey
the most anyone gained from those drivers was 200 points, whoopie!
You make it seem as though 200 points is easy to come by through tweaking (w/o oc'ing the core or mem) in 3DMark03.
__________________
"never argue with an idiot, they will bring you down to their level, and beat you with experience"
gokickrocks is offline   Reply With Quote
Old 02-15-03, 06:17 AM   #83
Nemesis77
Registered User
 
Join Date: Aug 2002
Posts: 114
Default

Quote:
Originally posted by StealthHawk
The point is that while they could be games, they aren't games. As has already been pointed out, even the Max Payne game test in 3DMark2001 didn't reflect real-world Max Payne performance. So it's obvious that either the engines being used are not indicative of games, or the scenes being portrayed are not examples of real-world games. Either way, something should be done to rectify this.
I still don't see a problem. All I see is NV whining about nothing. Here are the facts:

NV was a big supporter of 3DMark01. They had exactly ZERO problems with it, even though it was not a "real-life" benchmark.

Now, all of a sudden, they have a big problem with 3DMark03. Why? Because it uses PS1.4, which is in every way superior to PS1.1, 1.2 and 1.3 (1.2 and 1.3 didn't bring any big improvements over PS1.1; PS1.4 did). And that really is NV's problem. They were the ones who chose not to support PS1.4. You could say that NV is holding the industry back by pushing inferior shaders to the mainstream.

ATI is one generation ahead of NV in the low-end and mainstream: their products all support DX9 and PS1.4, and NV's products do not. That is the reason why NV whines. They made a wrong design decision in the past, and that decision has come around and bitten them in the ass. Seriously, that is NV's problem, and no one else's. Why should Futuremark cripple their software just so NV's crippled hardware would look better on it? The point of 3DMark is to show how the cards perform using the latest and upcoming technology. The PS1.1 that NV supports is old and inferior. PS1.4 and 2.0 are considerably better, and there are titles on the way that take advantage of them.

I repeat: the point of 3DMark is to test performance with new and upcoming technologies. PS1.1 is neither of those; it's only used as an emergency fallback if the card doesn't support PS1.4. PS1.2 and 1.3 are not used, since the differences between those and 1.1 aren't that great; they brought mostly trivial changes.
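The fallback described here (prefer PS2.0, then PS1.4, with PS1.1 as the emergency backup and 1.2/1.3 skipped) is just a priority search over what the card reports. A hypothetical sketch; the function name and version strings are illustrative, not 3DMark's actual code:

```python
def pick_shader_path(supported):
    """Pick the most capable shader path the hardware offers,
    skipping PS1.2/1.3 since their gains over 1.1 were minor."""
    for version in ("2.0", "1.4", "1.1"):
        if version in supported:
            return version
    return None  # no pixel shader support at all (DX7-class card)

print(pick_shader_path({"1.1", "1.4"}))  # a Radeon 8500 takes the 1.4 path
print(pick_shader_path({"1.1", "1.3"}))  # a GF4 Ti drops all the way to 1.1
```

This is why a GF4 Ti, despite supporting PS1.3, runs the same PS1.1 path as a GeForce3 in such a scheme.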

PS1.4 and 2.0 are the things that will be used in future games (Doom3, anyone?), and that's what 3DMark tests. If NV wants to keep on pushing yesterday's tech, they can do so. But they can't then whine when they don't look so good in software that takes advantage of new tech!

If NV wants someone to blame over this thing, I suggest they look in the mirror. The fact is that ATI's entire product lineup supports PS1.4 and DX9 (well, the 9100 doesn't support DX9, but it has PS1.4). A large part of NV's lineup is still DX7 (GF4 MX)! NV has been holding the industry back; it's about time they caught up!
Nemesis77 is offline   Reply With Quote
Old 02-15-03, 06:21 AM   #84
Nemesis77
Registered User
 
Join Date: Aug 2002
Posts: 114
Default

Quote:
Originally posted by abb
Well, I just ran the 3DMark 2003 on both, my Radeon 9700pro & my Ti4600. I scored an OK 5145 with my 9700 and a disgusting 1689 with my Ti4600.
And that's exactly how it should be. 3DMark03 is meant to test new tech (PS1.4 and 2.0, among others). The GF4 does not support them, so it has to use the (slower) PS1.1 path instead. And that REALLY hurts! But it's not a problem with 3DMark; it's NV's problem for not supporting better tech.

Again: 3DMark is supposed to test new tech. The GF4 doesn't support that new tech, so it doesn't do so well. It's NV's problem, not 3DMark's. Maybe NV should have decided to support new tech, instead of holding the industry back?
Nemesis77 is offline   Reply With Quote


Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright ©1998 - 2014, nV News.