NVIDIA rant


SurfMonkey
02-14-03, 08:48 AM
Some strange things have happened in the GFX world over the past few months, and nVidia seems to have been the most affected.

1. The NV3x series is held up due to some unspecified complications with its production.

2. When it arrives it's slow, hot, and power hungry.

3. They pull out of the FutureMark beta program because they refuse to pay any more cash towards its development. They must have known that it would be used to showcase new hardware, and basically gave ATi six months to optimise their drivers for 3DMark03. Then they complain when the NV30 gets slaughtered and dismiss 3DMark03 as a serious benching tool. Previously they were the biggest supporters, as it promoted their hardware. Now the hoo-ha seems to be about PS1.4 support, which is the weak point of the NV3x: it can do 1.1, 1.2, 1.3, 2.0 and 2.0+, but not 1.4. They could have put that support in the NV25 and the current generation of cards would perform wonders in 3DMark. But they didn't bother.

That basically means that the release of 3DMark03 has shown up their entire range of consumer cards to be completely underwhelming compared to ATI's range, which can run the whole gamut of tests. nVidia claim it would be fairer of 3DMark03 to fall back to PS1.3 shaders rather than PS1.1, though it would make no difference to the render speed: the GF will still need two passes to complete the scene. Maybe nVidia should have included PS1.4 support from the NV25 onwards; that would have been fairer to its customers. And it couldn't have been too hard.

It seems FutureMark has turned around and bitten them on the ass for being too pushy.

4. The FP16 format, which would have given the FX much of its speed boost, seems to be completely useless. The minimum FP format supported in DX9 is FP24, which is exactly what the R300 uses. So the NV3x has to use costly FP32 for all its PS operations, which is going to slow it down lots. nVidia interpreted the specifications wrong, or there was a typo. ATi got it right though; maybe M$ gave them a little hint? A slap for nVidia over the X-Box (missed milestones and an underperforming solution).
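
For a rough sense of why that matters, here's a back-of-the-envelope sketch (my own illustration, assuming the usual bit layouts: FP16 = s10e5, FP24 = s16e7 as in the R300, FP32 = s23e8; the real NV3x/R300 internals are more involved than this):

#include <stdio.h>
#include <math.h>

int main(void)
{
    /* mantissa bits for the common shader float formats:
       FP16 (s10e5), FP24 (s16e7, what the R300 runs), FP32 (s23e8) */
    const char *name[] = { "FP16", "FP24", "FP32" };
    int mantissa[]     = { 10, 16, 23 };

    for (int i = 0; i < 3; i++) {
        /* smallest relative step each format can represent, a crude
           measure of how much shader precision you get */
        printf("%s: %d mantissa bits, relative precision ~%g\n",
               name[i], mantissa[i], pow(2.0, -mantissa[i]));
    }
    return 0;
}

The point being that FP32 buys a lot more precision than the DX9 minimum requires, and the NV3x pays for it in speed.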

It seems like nVidia were so sure that everyone would follow in their footsteps that they never turned around to check. And when they finally got to the party they found out everyone else had eaten the food and someone had p*ssed in the punch. And everyone else was dressed normally; nVidia, on the other hand, turned up in fancy dress. Maybe a Dodo suit?

It just looks like they've had the rug well and truly pulled from under their feet. I just hope that their NV3x derivatives have got what it takes to win back the market. They still have good market share and lots of cash in the bank; let's hope they get it right next time.

PreservedSwine
02-14-03, 09:05 AM
PS1.4 support, which is the weak point of the NV3x: it can do 1.1, 1.2, 1.3, 2.0 and 2.0+, but not 1.4
PS1.4 is a sub-set of PS2.0... it is my understanding that anything that supports PS2.0 must be backwards compatible and support all PS versions up to that one as well, including PS1.4.

The hoo-ha from Nvidia seems to be about their previous generation cards not supporting PS1.4... the GF3 and GF4 lineup...
As I understand it, PS1.4 is a substantial leap in technology from PS1.1, 1.2, and 1.3...

However, I've heard some weird rumours of the NV31 and NV34 *not* supporting many DX9 features... this may add fuel to Nvidia's fire for the assault on 3DMark, as it ensures continued sub-par performance with the latest benchmarking tools... Again Nvidia is releasing new cards w/ old technology......

Sazar
02-14-03, 09:40 AM
Originally posted by PreservedSwine
PS1.4 is a sub-set of PS2.0... it is my understanding that anything that supports PS2.0 must be backwards compatible and support all PS versions up to that one as well, including PS1.4.

The hoo-ha from Nvidia seems to be about their previous generation cards not supporting PS1.4... the GF3 and GF4 lineup...
As I understand it, PS1.4 is a substantial leap in technology from PS1.1, 1.2, and 1.3...

However, I've heard some weird rumours of the NV31 and NV34 *not* supporting many DX9 features... this may add fuel to Nvidia's fire for the assault on 3DMark, as it ensures continued sub-par performance with the latest benchmarking tools... Again Nvidia is releasing new cards w/ old technology......

considering that a lot of the rumours may be unsubstantiated... and that the overall design of the nv3x lineup is roughly an iteration of the same core... I think it would be highly logical to assume that they ARE dx9 compliant... perhaps doing some work in software (shaders?)

it would be highly implausible for nvidia to claim something once again and not deliver...

perhaps the nv31 may not be quite the speedy little devil :) especially with (perhaps?) a 4x1 architecture and 128-bit memory... but it is a low/mid range solution...

PreservedSwine
02-14-03, 09:50 AM
Originally posted by Sazar
considering that a lot of the rumours may be unsubstantiated... and that the overall design of the nv3x lineup is roughly an iteration of the same core... I think it would be highly logical to assume that they ARE dx9 compliant... perhaps doing some work in software (shaders?)



You're right, of course, it's just that they came from reasonably accurate sources..:)

digitalwanderer
02-14-03, 10:05 AM
Originally posted by SurfMonkey
3. They pull out of the FutureMark beta program because they refuse to pay any more cash towards its development. They must have known that it would be used to showcase new hardware, and basically gave ATi six months to optimise their drivers for 3DMark03. Then they complain when the NV30 gets slaughtered and dismiss 3DMark03 as a serious benching tool. Previously they were the biggest supporters, as it promoted their hardware. Now the hoo-ha seems to be about PS1.4 support, which is the weak point of the NV3x: it can do 1.1, 1.2, 1.3, 2.0 and 2.0+, but not 1.4. They could have put that support in the NV25 and the current generation of cards would perform wonders in 3DMark. But they didn't bother.

That basically means that the release of 3DMark03 has shown up their entire range of consumer cards to be completely underwhelming compared to ATI's range, which can run the whole gamut of tests. nVidia claim it would be fairer of 3DMark03 to fall back to PS1.3 shaders rather than PS1.1, though it would make no difference to the render speed: the GF will still need two passes to complete the scene. Maybe nVidia should have included PS1.4 support from the NV25 onwards; that would have been fairer to its customers. And it couldn't have been too hard.

It seems FutureMark has turned around and bitten them on the ass for being too pushy.

Great thumbnail analysis of the last couple of weeks in the world of viddy cards. I really liked the above quote, think it's gonna be a lot more of an issue in a month than it is right now, and am trying to understand it better.

Is it just me, or does nVidia seem to be handling the 3dm2k3 situation in a horribly hypocritical and sort-of bullsh!tty-ish way? I keep thinking that every time they're faced with some bad news they just seem to pick the absolute worst approach to dealing with it, and I just don't get WHY. :confused:

Captain Beige
02-14-03, 10:08 AM
Originally posted by PreservedSwine
Again Nvidia is releasing new cards w/ old technology......

of course, they've spent years writing drivers for old tech and have a decent reputation for it. new-fangled tech scares their driver team.

Skuzzy
02-14-03, 01:08 PM
It is quite correct that if a video card supports PS2.0, it automatically supports PS1.4 as 2.0 contains all the 1.4 instructions.

Now, if code is written to explicitly look for 1.4, that is wrong.

if(pixel_shader_version != 1.4) {   /* wrong: a PS2.0 card fails this check too */
    printf("No PS 1.4 support\n");
}

The above is wrong, and if 3DMark did it this way, then they are making an error. The code should read:

if(pixel_shader_version >= 1.4) {   /* right: any version at or above 1.4 will do */
    printf("HOUSTON, WE HAVE LIFT-OFF\n");
}

I hope 3DMark did not hard-code a test explicitly for 1.4. That would be a major error.

Bigus Dickus
02-14-03, 01:31 PM
They did "hard code" some tests for PS1.4, and it is not an error. 3DMark03 contains game tests for DX7, DX8, and DX9.

In the DX8 tests, Futuremark decided to allow hardware capable of using PS1.4 to use it (DX8.1), and if it wasn't available then allow cards to fall back to PS1.1 (same speed as 1.2 and 1.3). Falling up to PS2.0 would make no sense for a DX8 test, since PS2.0 is part of DX9 and not DX8. It is also not representative of the way a DX8.1 game would be coded (the game would fall back to PS1.3 or earlier... if it were PS2.0 capable, it would be a DX9 game, not a DX8.1 game).
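
A rough sketch of that selection logic, in the same spirit as Skuzzy's snippets above (the render functions here are mine and purely illustrative, not anything from Futuremark's actual code):

/* shader path selection for a DX8.1-style game test */
if (pixel_shader_version >= 1.4) {
    render_scene_ps14();   /* PS1.4 path: fewer passes on R200/R300 class hardware */
} else {
    render_scene_ps11();   /* PS1.1 fallback: same speed as 1.2/1.3, extra passes */
}
/* deliberately no PS2.0 branch: that would turn a DX8 test into a DX9 test */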

Since the aim of Futuremark is to emulate future gaming performance to a reasonable degree, it is only logical to imitate this behavior. What you are suggesting would essentially make all of the shader-enabled game tests DX9 tests, and that wasn't the goal of Futuremark.

Skuzzy
02-14-03, 01:57 PM
Bigus... I am saying that if you want to use PS1.4, then you have to check as I did above.

Now, here is the rub. If you init using DX8 on a PS2.0-capable card, the card should report PS1.4, as 2.0 was not available for DX8. However, if you init the card using DX9, then the card should report PS2.0. A PS2.0 card, by default, also supports PS1.4.

If you want to run PS1.4 shaders on a DX9-initialized card which supports PS2.0, then you have to check as I show above.
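
In DX9 terms, a minimal C++-style sketch of that check might look like this (pD3D is assumed to come from Direct3DCreate9(); error handling is trimmed, it's just an illustration):

#include <d3d9.h>

/* returns true if the DX9 runtime reports PS1.4 (or better) support
   on the default adapter */
bool supports_ps14(IDirect3D9 *pD3D)
{
    D3DCAPS9 caps;
    if (FAILED(pD3D->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;

    /* a PS2.0 part reports 2.0 here, but it still passes this check */
    return caps.PixelShaderVersion >= D3DPS_VERSION(1, 4);
}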

That make more sense?

silence
02-14-03, 05:36 PM
Originally posted by Bigus Dickus
Since the aim of Futuremark is to emulate future gaming performance to a reasonable degree, it is only logical to imitate this behavior. What you are suggesting would essentially make all of the shader-enabled game tests DX9 tests, and that wasn't the goal of Futuremark.

hmmmmm... isn't DX9 the future???... correct me if I am wrong, but they have one partial DX9 test, two DX8 (current games) tests and one DX7 (past tense) test.

future gaming with DX7???... please... what kind of future games can you benchmark using DX7??... future games ARE developed with DX9 in mind, and most of the tests should be DX9.

PreservedSwine
02-14-03, 06:17 PM
Originally posted by silence
hmmmmm... isn't DX9 the future???... correct me if I am wrong, but they have one partial DX9 test, two DX8 (current games) tests and one DX7 (past tense) test.

future gaming with DX7???... please... what kind of future games can you benchmark using DX7??... future games ARE developed with DX9 in mind, and most of the tests should be DX9.

Good question, and here's an answer: Although the test is billed as a DX7 test, it's not exclusively so as, curiously, Futuremark have opted to use Vertex Shaders for the geometry processing rather than fixed-function T&L, which is what was introduced with DX7. 3DMark03 is designed as a DX8 and DX9 test, but studies conducted by Futuremark indicated that plenty of DX7 games would still become available over the coming year, so some representation in 3DMark03 was required; however, they still signal 3DMark2001SE as a more definitive DX7 test.

You can learn more here (http://www.beyond3d.com/articles/3dmark03/), at a non-biased overview written by people who know...

silence
02-15-03, 02:04 AM
okie.....

The multiplication numbers seem a little arbitrary at first look, but these have been calculated by running the tests on a number of high-end systems and weighting the scores such that Game Test 4 equates to 20% of the final score, while the remaining 80% is split evenly between the other three tests.

this means that the partial DX9 test takes one fifth of the score, and I don't buy that non-bias thinggy either... here is why >> if they wanted a non-biased DX9 benchmark they wouldn't listen to studios ;)
it simply isn't a true DX9 benchmark, it's more of a "current situation on the market" benchmark... which makes me agree with nvidia to some point...
3DMark03 isn't a game, will never be a game, and no game uses its engine... so making drivers for the sole purpose of a higher score in 3DMark is a waste of time and energy... and IMO Futuremark misused their popularity among gamers and produced something that isn't what it should be... a true test of future v-card capabilities... hehehe >> when u put it like this it sounds like the GFFX :D :D :D :D
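
Just to make that weighting concrete, here's a quick sketch of how a score like that combines (the fps numbers and multipliers below are made up by me to hit roughly the 20% / 80% split described in the quote; they are not Futuremark's real constants):

#include <stdio.h>

int main(void)
{
    /* made-up average fps for the four game tests on a hypothetical
       reference system, not real results */
    double gt1 = 150.0, gt2 = 30.0, gt3 = 25.0, gt4 = 25.0;

    /* placeholder multipliers chosen so that, on those reference numbers,
       GT4 lands near 20% of the score and GT1-GT3 split the rest evenly */
    double w1 = 8.89, w2 = 44.4, w3 = 53.3, w4 = 40.0;

    double score = gt1 * w1 + gt2 * w2 + gt3 * w3 + gt4 * w4;

    printf("final score: %.0f\n", score);                    /* ~5000 */
    printf("GT4 share: %.1f%%\n", 100.0 * gt4 * w4 / score); /* ~20.0 */
    return 0;
}

On a card that can't run the DX9 test at all, that last term simply drops out, which is why the weighting matters so much to the argument.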

Myrmecophagavir
02-16-03, 06:20 PM
Originally posted by Skuzzy
If you init using DX8 on a PS2.0-capable card, the card should report PS1.4, as 2.0 was not available for DX8.

Does it? Just because 2.0 wasn't released with the DX8 spec doesn't mean its existence can't be reported... 1.4 is less than 2.0, so even if they reported 2.0 when a DX8 app queries, it would still pass the test as outlined in the DX8 SDK. I can't check this though... can anyone with an R300/NV30 run the DX8 SDK caps viewer and see what it returns?
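
For anyone who wants to poke at this without the SDK caps viewer, a minimal sketch of the DX8-side query would be something along these lines (C++-style, untested by me on an R300/NV30, just showing where the reported version lives; pD3D is assumed to come from Direct3DCreate8(D3D_SDK_VERSION)):

#include <stdio.h>
#include <d3d8.h>

/* print the pixel shader version the DX8 runtime reports for the default adapter */
void print_ps_version(IDirect3D8 *pD3D)
{
    D3DCAPS8 caps;
    if (FAILED(pD3D->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return;

    printf("PS version reported: %lu.%lu\n",
           (caps.PixelShaderVersion >> 8) & 0xFF,
           caps.PixelShaderVersion & 0xFF);
}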

Chalnoth
02-16-03, 07:15 PM
The GeForce FX does PS1.4 just fine, apparently. It's just that nVidia hasn't yet had enough time to properly optimize for that version, meaning PS1.4 performance on the FX will be slow.

Regardless, I think that nVidia's main gripes are not with the FX's performance, but with the GeForce4's performance.

StealthHawk
02-16-03, 09:00 PM
Originally posted by Chalnoth
The GeForce FX does PS1.4 just fine, apparently. It's just that nVidia hasn't yet had enough time to properly optimize for that version, meaning PS1.4 performance on the FX will be slow.

apparently PS2.0 performance is slow too

Regardless, I think that nVidia's main gripes are not with the FX's performance, but with the GeForce4's performance.

i agree.

Chalnoth
02-17-03, 01:46 AM
Originally posted by StealthHawk
apparently PS2.0 performance is slow too
Most definitely, according to some other shader benches that have been displayed.