nVidia confirm Pixel Shader 2.0 driver problems?

Hanners
07-22-03, 06:32 AM
This is only according to The Inquirer (http://www.theinquirer.net/?article=10602), so make of it what you will...

Nvidia confirms PS 2.0 driver problems

DEVELOPERS AT the Meltdown Microsoft DirectX conference heard Nvidia make an interesting comment about its drivers.
One of the developers using FX 5900 hardware for an upcoming title tested the new 2.0 shaders and commented that he'd had problems with them.

Apparently, some 2.0 shaders simply do not work, making Nvidia's claim that this card is DirectX 9 hardware puzzling, as Pixel Shader 2.0 is one of the essences of DX9.

Subsequently, Nvidia staff at the conference agreed the shaders are not feeling very well, and that the recent drivers are quite buggy. But hey! They run Quake, UT and 3Dmark just fine.

We are sure that this carelessly honest Nvidia boy will be decapitated for this statement, but we are also sure that it will bring much more attention to making things work in future revisions.

As you might expect, there are not many ways to test these shaders as DirectX 9 games are still far away in the future. Ironically, the same shaders run just fine in ATI Radeon 9800 class hardware.

combat_wombat
07-22-03, 08:22 AM
sounds like more inquirer baloney to me. Don't those so-called "developers" have names?

saturnotaku
07-22-03, 08:27 AM
We are sure that this carelessly honest Nvidia boy will be decapitated for this statement

Ahh, the wonders of the Inq's "journalism."

MuFu
07-22-03, 09:38 AM
I can believe it. There are many shader hacks that run partially in software (fastex) that they had no end of problems with during R&D.

MuFu.

Spotch
07-22-03, 10:02 AM
It seems that if this is true, NVIDIA has surely lost just about everything that made them a real competitor in the GFX industry. Now they cannot even make do with the most important DX specs. I guess PS 2.0+ means something less than PS 2.0?

If I were a GFFX owner I would get a refund as the product does not live up to its advertised capabilities. This is just getting out of hand. :nono:

combat_wombat
07-22-03, 10:27 AM
Originally posted by Spotch
It seems that if this is true, NVIDIA has surely lost just about everything that made them a real competitor in the GFX industry. Now they cannot even make do with the most important DX specs. I guess PS 2.0+ means something less than PS 2.0?

If I were a GFFX owner I would get a refund as the product does not live up to its advertised capabilities. This is just getting out of hand. :nono:

no way in hell would I return my card based on an unsubstantiated rumor, from The Inquirer of all places. Sorry bub

GlowStick
07-22-03, 10:29 AM
Also, the fact of the matter is, if there are certain PS functions that don't work right on nV cards, we will never see them used.

Hanners
07-22-03, 10:47 AM
Originally posted by GlowStick
Also, the fact of the matter is, if there are certain PS functions that don't work right on nV cards, we will never see them used.

Like Centroid sampling?

(Sorry, couldn't resist) :p

The Baron
07-22-03, 10:53 AM
Originally posted by combat_wombat
no way in hell would I return my card based on an unsubstantiated rumor, from The Inquirer of all places. Sorry bub
I remember the days in which when someone mentioned the Inquirer, everyone else would go, "JEEBUS, MAN! WHY ARE YOU LISTENING TO THAT SACK OF LIES?!"

The times, they are a-changin'.

reever2
07-22-03, 10:56 AM
Originally posted by combat_wombat
no way in hell would I return my card based on an unsubstantiated rumor, from The Inquirer of all places. Sorry bub

Like PS 2.0 benchmarks and demos running at <10fps on NV35 and 40+ fps on R350 aren't enough to prove Nvidia sucks at PS 2.0.....

Uttar
07-22-03, 10:57 AM
Exclusive:
nVidia announces their marketing department has been fired after figuring out they just cost them $800 million in potential profit due to multiple reasons, including lower efficiency due to legacy.
"They ****ed. Now they are ****ed." said Jen Hsun Huang, CEO and President of nVidia, in a press conference.

"I can't believe it!" cried an anonymous member of nVidia marketing team. "We always gave Jen good coffee! David got his daily massage from our female personnel! Damn, what else did you want?"

No, this not-funny-at-all joke doesn't *seem* related.


Uttar

Hanners
07-22-03, 10:59 AM
Originally posted by Uttar
Exclusive:
nVidia announces their marketing department has been fired after figuring out they just cost them $800 million in potential profit due to multiple reasons, including lower efficiency due to legacy.
"They ****ed. Now they are ****ed." said Jen Hsun Huang, CEO and President of nVidia, in a press conference.

"I can't believe it!" cried an anonymous member of nVidia marketing team. "We always gave Jen good coffee! David got his daily massage from our female personnel! Damn, what else did you want?"

No, this joke doesn't *seem* related.


Uttar

You realise this joke will turn up on The Inquirer as a serious story in a few days now, don't you? ;)

Uttar
07-22-03, 11:42 AM
Originally posted by Hanners
You realise this joke will turn up on The Inquirer as a serious story in a few days now, don't you? ;)

Hehe! :) Nah, Fudo ain't stupid.

What I meant by the "seem" is really quite simple - but did anyone get it yet? :)


Uttar

GlowStick
07-22-03, 11:45 AM
Originally posted by Uttar
Hehe! :) Nah, Fudo ain't stupid.

What I meant by the "seem" is really quite simple - but did anyone get it yet? :)


Uttar

Hm, could you be playing on the fact that the Inq story is:

Our anonymous informant overheard part of a conversation with another guy?

Zeno
07-22-03, 12:27 PM
I don't know if the above story is true or not, but I wanted to put in my 2 cents as a developer.

I've been programming using ARB_fragment_program and ARB_vertex_program on both NV30 and R300/FireGL for a couple of months now. I've run into one glitch on NV30 and three or four on ATI's hardware. On the flip side, the ATI cards are faster.

reever2
07-22-03, 12:34 PM
Ahh, I got banned from HardOCP for discussing this article, better watch out...

GlowStick
07-22-03, 12:38 PM
Originally posted by reever2
Ahh, I got banned from HardOCP for discussing this article, better watch out...

heh

Hanners
07-22-03, 12:39 PM
Originally posted by Zeno
I don't know if the above story is true or not, but I wanted to put in my 2 cents as a developer.

I've been programming using ARB_fragment_program and ARB_vertex_program on both NV30 and R300/FireGL for a couple of months now. I've run into one glitch on NV30 and three or four on ATI's hardware. On the flip side, the ATI cards are faster.

I get the feeling it's the DirectX side of things developers are struggling with, particularly those who haven't gone the Cg (read NV3x optimised) route.

rokzy
07-22-03, 12:59 PM
Originally posted by Hanners
As you might expect, there are not many ways to test these shaders as DirectX 9 games are still far away in the future. Ironically, the same shaders run just fine in ATI Radeon 9800 class hardware.

er, why is it ironic ?

3DMark03 and nVidia's cheating have shown us the FXs are sh*te at PS 2.0, and nVidia know it and are worried about it.

reever2
07-22-03, 01:03 PM
There are also quite a few DX9 demos to test DX9 shading with...

MuFu
07-22-03, 01:44 PM
Originally posted by reever2
Ahh i got banned from hardocp for discussing this article, better watch out...

You got banned because you probably didn't discuss it in the right way. Please don't use posts with words or pictures in them next time.

MuFu.

reever2
07-22-03, 02:01 PM
Originally posted by MuFu
You got banned because you probably didn't discuss it in the right way. Please don't use posts with words or pictures in them next time.

MuFu.

Lol, I'll try conveying my thoughts with sign language over a messageboard then

creedamd
07-22-03, 02:05 PM
Originally posted by reever2
Lol, I'll try conveying my thoughts with sign language over a messageboard then

telepathy is the best way...

Hanners
07-22-03, 03:32 PM
Originally posted by creedamd
telepathy is the best way...

I'm sure Kyle is already wearing his tinfoil hat to block our mind rays. :(

aapo
07-22-03, 06:07 PM
Originally posted by Hanners
I'm sure Kyle is already wearing his tinfoil hat to block our mind rays. :(

So it's teleapathy then, I guess. :rolleyes: