Shader Model 3.0


nForceMan
03-23-04, 01:40 PM
NVIDIA's next generation hardware platform (NV4x) natively supports (http://biz.yahoo.com/prnews/040323/sftu074_1.html) shader model 3.0 :thumbsup:
Exactly as I said (http://www.nvnews.net/vbulletin/showpost.php?p=283083&postcount=104) before. :cool2:

MUYA
03-23-04, 01:46 PM
Well, if NVIDIA is demonstrating Shader Model 3.0 tools at GDC... then will they demonstrate it on NV40s? I hope someone breaks an NDA :D

DSC
03-23-04, 01:57 PM
Wonder what VS/PS 3.0 demos will be released with NV40.... :box: :thumbsup:

theultimo
03-23-04, 02:08 PM
And this is how NV40 will have a 16x0 implementation! :jumping:

PoorGuy
03-23-04, 02:20 PM
Wow, has nVidia got a winner this time or what? :birthday:

If this 3.0 technology doesn't win the 5% of developers using ATI FireGL over to nVidia, nothing will.

MUYA
03-23-04, 02:28 PM
Mufu is saying the "certain conditionals" that were seen on NV30 are being used again for NV40... hmmm
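For anyone wondering what those "conditionals" would buy you: per-pixel dynamic branching is one of the headline Shader Model 3.0 features. A minimal sketch, in ps_3_0-level HLSL/Cg-style code, of the sort of thing it allows (purely a hypothetical illustration, not taken from any NVIDIA material):

float4 main(float3 N : TEXCOORD0, float3 L : TEXCOORD1, float3 V : TEXCOORD2,
            uniform float4 diffuseColor, uniform float4 specColor) : COLOR
{
    // Hypothetical lit-surface shader: skip the expensive specular term
    // per-pixel when the surface faces away from the light.
    float3 n = normalize(N);
    float3 l = normalize(L);
    float NdotL = dot(n, l);
    float4 result = diffuseColor * saturate(NdotL);
    if (NdotL > 0.0)   // dynamic branch: what SM 3.0-class hardware can do natively
    {
        float3 h = normalize(l + normalize(V));
        result += specColor * pow(saturate(dot(n, h)), 32.0);
    }
    return result;
}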

ChrisW
03-23-04, 02:38 PM
LOL! It's nothing more than their old tricks again. New "software tools" made by nVidia to support PS 3.0! I smell Cg 2.0 that will convert shaders down to Int 12/PS 1.1 and slow down other graphics cards. Can you really trust nVidia's software "tools" after what they did with their last generation of software?

Evildeus
03-23-04, 02:47 PM
The last couple of days people have started shouting basically anything to get a bit more attention and to spread, let's say it a bit more bluntly, BS. Look in the news from last week, that's all I'm gonna say about it ::end hint:: Ah okay, one more... also look up a rumor about encoding certain stuff.
http://www.guru3d.com/article/article/125/3/

Nutty
03-23-04, 03:15 PM
I smell Cg 2.0 that will convert shaders down to Int 12/PS 1.1 and slow down other graphics cards.
Cg never slowed down other graphics cards. Had ATI bothered to write their own backend compiler, given it was completely open to all IHVs, then there would've been no problem. But they didn't, so anything compiled with Cg ended up using the compiler that output NV-hardware-optimized instructions.

There's nothing wrong with Cg; it's just a language. It's what you do with it that counts. NV made it work for their hardware, while ATI did nothing and prayed that it would die.
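To illustrate the "just a language" point: the Cg source itself is hardware-neutral, and the backend profile chosen at compile time decides what instructions come out. A rough, hypothetical sketch of a Cg-style fragment program (the names are mine, not from the thread):

// Minimal fragment program: modulate a texture sample by the interpolated colour.
float4 main(float4 color : COLOR0,
            float2 uv    : TEXCOORD0,
            uniform sampler2D baseTex) : COLOR
{
    return tex2D(baseTex, uv) * color;
}

The same source can then be run through different backend profiles (as I recall, things like ps_1_1, ps_2_0, arbfp1 or NVIDIA's fp30), and whoever writes the backend for a profile controls how well its output maps to a given chip, which is exactly where the argument about ATI writing (or not writing) its own backend comes from.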

theultimo
03-23-04, 03:26 PM
Cg never slowed down other graphics cards. Had ATI bothered to write their own backend compiler, given it was completely open to all IHVs, then there would've been no problem. But they didn't, so anything compiled with Cg ended up using the compiler that output NV-hardware-optimized instructions.

There's nothing wrong with Cg; it's just a language. It's what you do with it that counts. NV made it work for their hardware, while ATI did nothing and prayed that it would die.


Also, wasn't Cg open source to use?

Demirug
03-23-04, 03:30 PM
Cg never slowed down other graphics cards. Had ATI bothered to write their own backend compiler, given it was completely open to all IHVs, then there would've been no problem. But they didn't, so anything compiled with Cg ended up using the compiler that output NV-hardware-optimized instructions.

There's nothing wrong with Cg; it's just a language. It's what you do with it that counts. NV made it work for their hardware, while ATI did nothing and prayed that it would die.

Yes, but the Cg compiler works very badly if you build PS 2.0 or PS 2.X shaders with it. Its PS 1.1 output is sometimes better than the output of the MS compiler if you run the shader on nVidia hardware.

Dazz
03-23-04, 03:50 PM
I think it will go the same way as Cg, for the simple reason that developers hate extra work.

Toaster
03-23-04, 04:26 PM
nVidia has released some GDC 2004 documents.

They aren't up on the developer site yet, but you can find them here:

ftp://download.nvidia.com/developer/presentations/GDC_2004/

Haven't fully read them yet, but the last one contains a Shader 3.0 presentation.
ftp://download.nvidia.com/developer/presentations/GDC_2004/RenderingTechniquesNVIDIA.pdf

Ninja Prime
03-23-04, 04:34 PM
Welcome to yesterday's news, nForceMan.

I can't wait to play these PS3.0 games on NV40... in 2005 at less than 30 fps. :rolleyes2

Toaster
03-23-04, 04:52 PM
LOL! It's nothing more than their old tricks again. New "software tools" made by nVidia to support PS 3.0! I smell Cg 2.0 that will convert shaders down to Int 12/PS 1.1 and slow down other graphics cards. Can you really trust nVidia's software "tools" after what they did with their last generation of software?

If you want to bash something, then you should at least understand what you are talking about...

TheTaz
03-23-04, 06:14 PM
Well... I don't want to speculate. I'll just wait and see the R420 vs. NV40 benchmarks / reviews. :D

As for PS 3.0... I really don't see how it will be a visual advantage. Sort of like the difference between PS 1.0 and 1.4. I couldn't care less that my GeForce 3 Ti 500 was only doing PS 1.0 while Radeon 8500s were doing PS 1.4. Why the hell would I care about a PS 2.0 card vs. a PS 3.0 card?

It's cool if nVidia can squeeze some performance out, with "said trick". More power to them. I just don't see how PS 3.0 is going to make much of a difference in the eye candy.

/shrug

Taz

Toaster
03-23-04, 06:51 PM
Official link to GDC 2004 presentations:

http://developer.nvidia.com/object/gdc_2004_presentations.html

photophreak314
03-23-04, 07:52 PM
LOL! It's nothing more than their old tricks again. New "software tools" made by nVidia to support PS 3.0! I smell Cg 2.0 that will convert shaders down to Int 12/PS 1.1 and slow down other graphics cards. Can you really trust nVidia's software "tools" after what they did with their last generation of software?

You actually think they would be that stupid? I don't think so. You should head on over to guru3d.com and read their CeBIT article; they were under NDA with Nvidia and said they are confident in Nvidia this time around. And their opinions are usually right.

mikechai
03-23-04, 08:31 PM
Official link to GDC 2004 presentations:

http://developer.nvidia.com/object/gdc_2004_presentations.html

Link is dead.

Oops. Link is up again.

ChrisW
03-23-04, 11:09 PM
If you want to bash something, then you should at least understand what you are talking about...
Is taking cheap shots at other people all you do? Please, if you are not going to take the time to discuss something, don't say anything at all. :rolleyes:

ChrisW
03-23-04, 11:11 PM
You actually think they would be that stupid? I don't think so. You should head on over to guru3d.com and read their CeBIT article; they were under NDA with Nvidia and said they are confident in Nvidia this time around. And their opinions are usually right.
The same thing was said to me the last time, and I was proven correct. Remember the giant list of "experts" with "inside information" on this site before the NV30 was released? Remember how all those people were wrong? I really hope this is a great card this time, but it seems people forget the past really easily.

ChrisW
03-23-04, 11:17 PM
Cg never slowed down other graphics cards. Had ATI bothered to write their own backend compiler, given it was completely open to all IHVs, then there would've been no problem. But they didn't, so anything compiled with Cg ended up using the compiler that output NV-hardware-optimized instructions.

There's nothing wrong with Cg; it's just a language. It's what you do with it that counts. NV made it work for their hardware, while ATI did nothing and prayed that it would die.
It was proven on this very site that ATI cards took a 30-50% performance hit when the exact same shaders were compiled with Cg. It does not matter whether ATI took the time to write a special compiler for it or not; the fact is the one written by nVidia slows down ATI cards. And we all know they are strongly encouraging developers to use Cg by throwing lots of money at them. The argument about ATI not writing a special compiler for it is moot, as everyone knows they would not be stupid enough to write something for use with a tool nVidia has proprietary control over and can change and/or drop support for anything they want without a moment's notice. I don't see any other companies providing compilers for it either.

ChrisRay
03-23-04, 11:30 PM
It was proven on this very site that ATI cards took a 30-50% performance hit when the exact same shaders were compiled with Cg. It does not matter whether ATI took the time to write a special compiler for it or not; the fact is the one written by nVidia slows down ATI cards. And we all know they are strongly encouraging developers to use Cg by throwing lots of money at them. The argument about ATI not writing a special compiler for it is moot, as everyone knows they would not be stupid enough to write something for use with a tool nVidia has proprietary control over and can change and/or drop support for anything they want without a moment's notice. I don't see any other companies providing compilers for it either.


So, because Nvidia wrote it, ATI automatically shouldn't do anything to support it? Being prideful doesn't make you smart or stupid, it just makes you uncooperative.

Both ATI and Nvidia are guilty of it. They should have optimised for it; trying not to support it just hurt ATI users. While you're blaming Nvidia, ATI could have nodded their head and supported it. Personally, I think ATI not writing a compiler for it was a bad idea.

ChrisW
03-24-04, 12:38 AM
So, because Nvidia wrote it, ATI automatically shouldn't do anything to support it? Being prideful doesn't make you smart or stupid, it just makes you uncooperative.

Both ATI and Nvidia are guilty of it. They should have optimised for it; trying not to support it just hurt ATI users. While you're blaming Nvidia, ATI could have nodded their head and supported it. Personally, I think ATI not writing a compiler for it was a bad idea.
If they wrote a compiler for it, they would be helping to support it. This would encourage developers to use it. Then, after all the developers had adopted it, nVidia could change it any way they wanted. They could even purposely slow down other cards or remove the special compilers for other cards. They could do anything they want, and history has shown they always do whatever benefits only themselves. NVidia is a ruthless company that will do anything they have to do, even try to destroy a benchmark just because their card does not do well in it, and will cook the books/cheat in benchmarks just to make their cards appear to come out on top. So, yes, I believe they would purposely slow down other cards using their proprietary software if they found it necessary to make their cards look faster/make more money. It's a business... and only a fool would put their future in the hands of a competing business.

How come everyone always blames ATI for not adding a compiler to Cg, but forgets to mention that all the other graphics card makers have also chosen not to include compilers for it, and have stated how dangerous such a tool would be in the hands/control of one graphics card company? None of the other graphics card companies can place their trust/future in their competitor's hands.

ChrisRay
03-24-04, 12:55 AM
If they wrote a compiler for it, they would be helping to support it. This would encourage developers to use it. Then, after all the developers had adopted it, nVidia could change it any way they wanted. They could even purposely slow down other cards or remove the special compilers for other cards. They could do anything they want, and history has shown they always do whatever benefits only themselves. NVidia is a ruthless company that will do anything they have to do, even try to destroy a benchmark just because their card does not do well in it, and will cook the books/cheat in benchmarks just to make their cards appear to come out on top. So, yes, I believe they would purposely slow down other cards using their proprietary software if they found it necessary to make their cards look faster/make more money. It's a business... and only a fool would put their future in the hands of a competing business.

How come everyone always blames ATI for not adding a compiler to Cg, but forgets to mention that all the other graphics card makers have also chosen not to include compilers for it, and have stated how dangerous such a tool would be in the hands/control of one graphics card company? None of the other graphics card companies can place their trust/future in their competitor's hands.



Dang... I can't believe you'd think Nvidia would do that. Why would Nvidia purposely sabotage its compiler for the people it licensed it to? Naw, I don't see it happening.

Nvidia would have loved people adopting Cg because it would have allowed them to better optimise for their cards. Nvidia trying to get the best performance out of their cards does not necessarily equate to them trying to sabotage ATI.

"What if" has gotta be the most mundane argument there is. It comes down to this: if ATI had compiled for it, they technically would have supported it. And we can't have that. It's pride.

From my understanding anyway, Nvidia has made a ton of its fragment shaders non-licensed, so ATI could easily implement some of Nvidia's fragment shaders into its drivers like SiS has.