nVIDIA FX 5900 Ultra and future

sandeep
10-10-03, 12:37 AM
Let's talk some sense here...

I'm now hearing that Nvidia is working on their compiler in an endeavour to provide a competitive gaming experience, i.e. the forthcoming Det. 52.XX vastly improves image quality and performance, as I gather from AnandTech...

What I'm curious about is this: the FX 5900 Ultra does things in a manner distinct from the DX9 standard, but is that really detrimental? It seems that NV35 uses 16 and 32 bit floating point, and Nvidia has said that in cases where 24 bit precision is required, NV35 will resort to the higher 32 bit precision. What impact will this really have? Before anyone answers me with Half-Life 2: if I am correct, when Gabe Newell initially posted the Half-Life 2 results on Shader Day, nVIDIA was trashed. However, AnandTech recently posted results with the unofficial Detonator 52.XX drivers where nVIDIA had vastly improved its numbers in Half-Life 2.

By the time the game is out, I doubt NV35 will have any problems playing it competitively... I am drawing this conclusion tentatively from the results the Det. 52.XX drivers are producing.

Looks like some people in this forum are bashing Nvidia like there's no tomorrow.

But I would be very delighted if someone could tell me where exactly Nvidia has gone wrong. I am not an expert, so if I say DirectX 9 uses 24 bit floating point, will DirectX 10 use 32 bit floating point? ATI has 4 stacks of 24 bit registers and NV35 has 4 stacks of 32 bit floating point registers.

What exactly is Pixel Shader 2.0? Does NV35 not support Pixel Shader 2.0, and if so, why? Is it because of those floating point issues?


Fill me in...
Cheers

SuLinUX
10-10-03, 01:58 AM
Here is my take on things: the HL2 and Doom3 benchmarks on the 50-series Dets are all invalid.

I would rather see for myself, as the benchmarks I have seen are strange to say the least and do not tally with what I have experienced on my card.

StealthHawk
10-10-03, 04:10 AM
There are a lot of problems with Anand's conclusions, some of which have been brought up and addressed here, and some of which have been discussed in other forums.

Discussion (http://www.nvnews.net/vbulletin/showthread.php?s=&threadid=19182)

Greg
10-10-03, 05:32 AM
I'll make a few brief comments on your post.

Originally posted by sandeep
Let's talk some sense here...
I'm now hearing that Nvidia is working on their compiler in an endeavour to provide a competitive gaming experience, i.e. the forthcoming Det. 52.XX vastly improves image quality and performance, as I gather from AnandTech...


Yes, nVidia is still learning how to use their own hardware efficiently. They stated even before the 30-series hardware was released that it was complex and internally hard to optimise for. Hopefully the 50-series drivers and future ones will significantly improve game performance.

Originally posted by sandeep
What I'm curious about is this: the FX 5900 Ultra does things in a manner distinct from the DX9 standard, but is that really detrimental? It seems that NV35 uses 16 and 32 bit floating point, and Nvidia has said that in cases where 24 bit precision is required, NV35 will resort to the higher 32 bit precision. What impact will this really have? Before anyone answers me with Half-Life 2: if I am correct, when Gabe Newell initially posted the Half-Life 2 results on Shader Day, nVIDIA was trashed. However, AnandTech recently posted results with the unofficial Detonator 52.XX drivers where nVIDIA had vastly improved its numbers in Half-Life 2.

By the time the game is out, I doubt NV35 will have any problems playing it competitively... I am drawing this conclusion tentatively from the results the Det. 52.XX drivers are producing.

The DirectX9 'standard' was written by Microsoft, so it applies only as a paper standard for the Windows DirectX API. It specifies things like floating point precision of 'at least 24 bits' for full precision. (Please excuse the paraphrasing when I describe things like this.) nVidia's thinking was: why not offer higher full precision (32bit float) for when it's needed, and double the performance with lower precision (16bit float) for most things? And that is where they got caught out. Their 16bit mode, while fast, was lower quality than the competition, and their 32bit mode, while barely noticeably better than the competition, was ridiculously slow. Not only that, but DirectX9 steered their hardware into the slow 32bit mode most of the time, so nVidia had to replace shader programs, or do things in the driver to optimise on a case by case basis.

It's not that MS, nVidia or ATI did anything particularly wrong, just that they were going in different directions. ATI clearly has the best performing hardware when combined with the full feature set of the DX9 standard. MS has recently modified DX9 and the standard to allow for the lesser and greater precisions used by nVidia, who defend their hardware design decisions.
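
To put rough numbers on those precisions, here is a minimal back-of-the-envelope sketch in C++. The mantissa widths for 16bit float (10 bits) and 32bit float (23 bits) are standard; the 16-bit mantissa for ATI's 24bit format is an assumption based on how the R300 format is usually described, so treat that line as illustrative:

#include <cmath>
#include <cstdio>

// Relative rounding error for a binary float format is roughly 2^-(stored mantissa bits + 1),
// which translates to about (stored mantissa bits + 1) * log10(2) significant decimal digits.
static void report(const char* name, int mantissaBits)
{
    double relError = std::ldexp(1.0, -(mantissaBits + 1)); // 2^-(m+1)
    double decimalDigits = (mantissaBits + 1) * std::log10(2.0);
    std::printf("%-30s ~%.1e relative error, ~%.1f decimal digits\n",
                name, relError, decimalDigits);
}

int main()
{
    report("FP16 (NV3x partial precision)", 10); // 1 sign, 5 exponent, 10 mantissa bits
    report("FP24 (R3xx full precision)", 16);    // assumed: 1 sign, 7 exponent, 16 mantissa bits
    report("FP32 (NV3x full precision)", 23);    // 1 sign, 8 exponent, 23 mantissa bits
    return 0;
}

Roughly three significant decimal digits for 16bit float versus about five for 24bit and seven for 32bit, which is why the 16bit path can show artefacts in long shader calculations while still being 'good enough' for ordinary colour math.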

Forget about PR exercises like the Half-Life 2 stunt. I could waste your time pointing out stuff and bore you to death with it. Talk about that game when it comes out.

Originally posted by sandeep
Looks like some people in this forum are bashing Nvidia like there's no tomorrow.

Yes, people have gone completely mental, even blaming nVidia for things like the Half-Life 2 code theft! I'm surprised they weren't blamed for the recent bad weather. The bottom line is this: a) they made benchmark-specific optimisations which are effectively cheating, and didn't own up to it; b) their hardware doesn't perform quite as well as the competition at various price points, including the top. My view is that ALL hardware makers optimise for benchmarks whether they admit it or not. It's just ugly when they get caught at it.

nVidia and ATI are the two top performing brands, and for the first time in some years nVidia is in the #2 position. If you just spent $500 on a new video card and then found out the competition's $480 card was a lot faster in some games and a tiny bit faster in others, you might be angry, but really, as a consumer, you can 'vote with your wallet' and tell 'em what you think. The truth is, most video cards sold are budget cards, and most systems ship with integrated video that never gets upgraded. Another truth is that if you own an FX5900 or R9800, you will enjoy a fantastic gaming experience despite the other issues.

Originally posted by sandeep
But I would be very delighted if someone could tell me where exactly Nvidia has gone wrong. I am not an expert, so if I say DirectX 9 uses 24 bit floating point, will DirectX 10 use 32 bit floating point? ATI has 4 stacks of 24 bit registers and NV35 has 4 stacks of 32 bit floating point registers.

I think I already touched on that in the DirectX9 and float precision bit earlier. Probably best to check a few web sites for more info.

Originally posted by sandeep
What exactly is Pixel Shader 2.0? Does NV35 not support Pixel Shader 2.0, and if so, why? Is it because of those floating point issues?

Pixel Shader 2.0 is just a feature set and instruction set that a hardware maker must meet or exceed in order to run 'shader programs' (the little programs that decide how the dots are drawn on screen) of that standard. The NV30 series all support Pixel Shader 2.0.
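
If you want to check this yourself, the support is advertised through the Direct3D 9 device caps. A minimal sketch using the standard D3D9 API (Windows only, link against d3d9.lib; error handling kept to a bare minimum):

// Build on Windows with the DirectX 9 SDK; link against d3d9.lib.
#include <d3d9.h>
#include <cstdio>

int main()
{
    // Create the Direct3D 9 object and query the default adapter's hardware capabilities.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("Direct3D 9 is not available\n"); return 1; }

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        // PixelShaderVersion packs the major/minor version; D3DPS_VERSION builds the value to compare against.
        if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
            std::printf("Pixel Shader 2.0 (or better) is supported\n");
        else
            std::printf("Pixel Shader support is below 2.0\n");
    }

    d3d->Release();
    return 0;
}

This is the same kind of check games themselves use to decide whether to enable their PS 2.0 code paths.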

OWA
10-10-03, 08:15 AM
Heh, next BF1942 will be brought up, and of course the resolution that didn't perform well on the 5900U was 1600x1200x32 with 4xAA and 8xAF. Which is all fine and dandy, but then a lot of the same people bringing this up will complain about Anand using those kinds of settings in his benchmarks (because they didn't show ATI in a favorable light -- that is, they made ATI pretty much even with nvidia).

Hellbinder
10-10-03, 09:58 AM
I know that this just does not matter to some of you but...

This "Automatic shader Optomizer" is complete Nonsense. It's Pr lingo for *We replace Game Code with whatever we want wether you like it or not*. All they are doing is Detecting the Shader Code on a per application basis (Specifically Bencmarks and well known benchmark levels) and replacing the code with PS1.4 or whatever they can get away with or get the closest approximation of the real output.

That's the real *Shader Optimizer*. Have they improved their compiler? Yes. But that's perhaps a 10-15% honest increase at MOST.

Some of you seem to be happy with that, obviously because you shelled out all that cash. Would I recommend these cards and these practices to anyone for future gaming? HELL NO. And neither do any of the major web sites.

If you like the shader "Optimizer", fine. But don't go posting stuff about how it's suddenly somehow equal to or better than the Radeon products, because that's complete rubbish.

(One more thing: the information I have seen shows that BF1942 is slower on FX cards across the board by an average of 20FPS, including minimums. That's at normal resolutions. Just ask some of the posters here who have both cards.)

Behemoth
10-10-03, 10:00 AM
again (http://www20.tomshardware.com/graphic/20030930/radeon_9800-31.html)
It is always fun to see how some people dodge the FX problems, and how they claim they have both cards yet don't notice how the 5900U gets its ass handed to it by the 9800 Pro. They keep spreading false info over and over again; it's starting to become funny. :lol:

Behemoth
10-10-03, 10:11 AM
Originally posted by Hellbinder
I know that this just does not matter to some of you but...

This "Automatic Shader Optimizer" is complete nonsense. It's PR lingo for *we replace game code with whatever we want, whether you like it or not*. [...]
Judging from nvidia's recent acts, I would not be surprised if this Automatic Shader Optimizer turned out to be yet another PR stunt, and the actual performance gains actually came from the driver team, who have been working hard to optimize every benchmarkable game on the market. lol

Rabbitfood
10-10-03, 10:13 AM
Originally posted by Hellbinder
I know that this just does not matter to some of you but...
That's right, it doesn't matter what you post, nobody cares...

digitalwanderer
10-10-03, 10:25 AM
Originally posted by Behemoth
again (http://www20.tomshardware.com/graphic/20030930/radeon_9800-31.html)
It is always fun to see how some people dodge the FX problems, and how they claim they have both cards yet don't notice how the 5900U gets its ass handed to it by the 9800 Pro. They keep spreading false info over and over again; it's starting to become funny. :lol:
Yeah, it IS funny...in a pathetic-kind-o-way. No amount of reasoning or proof will change their opinions. :lol:

Originally posted by Rabbitfood
That's right, it doesn't matter what you post, nobody cares...
Mmmmmm......bunny for lunch!

I care, why do you say nobody does?

goofer456
10-10-03, 10:27 AM
Originally posted by Rabbitfood
That's right, it doesn't matter what you post, nobody cares...

Browsing through the nVnews forums, I think HB's posts attract much, much more attention than yours.

Not everybody might agree with him, but they do care. :rolleyes:

Hanners
10-10-03, 10:34 AM
Originally posted by Hellbinder
This "Automatic shader Optomizer" is complete Nonsense. It's Pr lingo for *We replace Game Code with whatever we want wether you like it or not*. All they are doing is Detecting the Shader Code on a per application basis (Specifically Bencmarks and well known benchmark levels) and replacing the code with PS1.4 or whatever they can get away with or get the closest approximation of the real output.

That's the real *Shader Optimizer*. Have they improved their compiler? Yes. But that's perhaps a 10-15% honest increase at MOST.

I think it's kind of hard to say that outright without knowing exactly what nVidia are doing to shaders in the Detonator 50 drivers. You can't say it's wrong just like that (unless you want to tag ATi in the same manner, seeing as they are starting to do similar things in their drivers since the Catalyst 3.7s).

I think it's more than fair that people are skeptical about any new nVidia driver release these days (we all know deep down they can't be trusted at present), but I'm going to hold my breath and wait and see what these drivers have to offer from a shader perspective before I pass judgement. Obviously, there are other image quality reductions being made that I'm far from happy to see (read: no true trilinear filtering in Direct3D), but I don't think it's going to be all bad this time around.

digitalwanderer
10-10-03, 10:45 AM
Originally posted by Hanners
I think it's kind of hard to say that outright without knowing exactly what nVidia are doing to shaders in the Detonator 50 drivers. You can't say it's wrong just like that (unless you want to tag ATi in the same manner, seeing as they are starting to do similar things in their drivers since the Catalyst 3.7s). [...]
Fanboi. :rolleyes:

;)

SuLinUX
10-10-03, 10:50 AM
I'll laugh my head off if the performance is double what you say with top IQ.

*runs off* :D

digitalwanderer
10-10-03, 10:57 AM
Originally posted by SuLinUX
I'll laugh my head off if the performance is double what you say with top IQ.

*runs off* :D
What? You don't wanna stay and play? :(

OWA
10-10-03, 10:58 AM
(One more thing: the information I have seen shows that BF1942 is slower on FX cards across the board by an average of 20FPS, including minimums. That's at normal resolutions. Just ask some of the posters here who have both cards.)
I have both cards and I've seen about a 20fps difference most of the time, but that's with different-speed CPUs (2800+ vs 2400+) and AGP 8x vs AGP 4x. While I doubt that having the same CPU speed and conditions would bring them equal (unless maybe AA and AF were off), I would expect the gap to be closer than 20fps.

The only point I was trying to make is that many here go out of their way to find conditions where the 5900U performs badly, and once such a condition is found, it becomes the holy grail for testing the 5900U. But on any test that plays more to the strengths of the 5900U and makes the 5900U and 9800P more evenly matched, everyone cries bloody murder.

It also gets a little old with people saying things like the 5900U sucks because it only gets 80-100fps in a game versus 120+fps or so on a 9800P. I think most people would find 80+ fps more than acceptable in almost any game. I also don't think most nvidia users are trying to argue that the 5900U is better hardware than the 9800P (at least for DirectX9). But that doesn't mean we can't be excited or hopeful that they can squeeze a little more performance out of the drivers or whatever. Why should you even care? You're not even using one, so why be all hot and bothered over something you don't even have? If someone here is excited over some driver improvement and some speed increase, why would you feel the need to crush that? It just makes no sense to me.

When I tried the AIW 9700 and didn't have a good experience, I didn't stay on at Rage3D making comment after comment about how much the AIW 9700 Pro sucks. I came back here and then basically just lurked. I didn't even really start posting all that much until I got tired of seeing ATI fan after ATI fan trying to dominate every single thread. I don't know... it just seems a little ridiculous to me. I've brought up the AIW 9700P experience here a lot, but that has almost always been in response to, or to counter, the "ATI is perfect and can do no wrong" type of posting. I could be perfectly happy not mentioning ATI at all in most of the nvidia threads and just discussing ATI in the ATI forums here, but then that would make a little too much sense, right? Since, after all, you've got to make sure that all the nvidia users see the light so they're not happy with their purchase and so they know how superior and wonderful it is to play a game at 120fps versus 100fps. Pity the poor soul that doesn't know that.

I'm not sure if you realize this or not, but I'm not trying to say the 5900U is better than the 9800P. I don't think it is. I have both and can test them, and I think the 9800P is the better hardware and the better deal (especially since it's even cheaper than the 5900U). But that doesn't mean the 5900U sucks, can't play games acceptably, etc. It gets old seeing thread after thread dragged into the gutter by ATI fans that are basically here just to bag on nvidia. Fine, go to nvidia's website and rant all you want, but leave the users that are happy with their cards and their purchase out of it.

I guess I don't understand the insecurity. It's obvious many are insecure about their purchase and feel the need to criticize other people's choices to make themselves feel more secure about their own, but please, it is getting a little ridiculous. The only thing I can think of that people could be afraid of is that the best hardware doesn't always win out, so I guess I can see users wanting to keep criticizing the competition hoping to help their brand's market share (so they don't go under like Aureal did, for example, since most thought Aureal had the better hardware). I think that is extremely unfair to the users of nvidia cards, though, who just want to discuss and chat about their purchases without a ton of ATI evangelists taking the thread in a certain direction... almost like clockwork. Do they have auto-responding bots set up or what?

It is always fun to see how some people dodge the FX problems, and how they claim they have both cards yet don't notice how the 5900U gets its ass handed to it by the 9800 Pro. They keep spreading false info over and over again; it's starting to become funny.
True, it could be funny, but I think a lot of people would agree that it is not really funny but strange that so many people who are supposedly happy with their ATI purchase continue to try to knock the 5900U in just about every thread. Why the need to dominate all the nvidia threads? Why the insecurity over your purchase? Does it really bother you that much that someone may have made a different decision than you did and may be happy with their purchase? Wouldn't you have more fun posting at a site with other like-minded individuals? I mean, if you had a Mac, would you get more out of hanging out and posting at a Mac site or a PC site? If you're a Mac user frequenting a Mac forum, do you really want all your threads dominated by people with PCs saying Macs suck?

SuLinUX
10-10-03, 11:08 AM
That's saved me a lot of time. Good, I'm glad at least someone sees my point of view as well.

Well said, man. ;)

nForceMan
10-10-03, 11:11 AM
Originally posted by Rabbitfood
That's right, it doesn't matter what you post, nobody cares...

You are absolutely right! He has already knocked himself out with the tons of hoaxes, backbiting, lies and false rumors he has posted here. :rant:
Flamers never change. :screwy:

digitalwanderer
10-10-03, 11:15 AM
Originally posted by OWA
[...] If you're a Mac user frequenting a Mac forum, do you really want all your threads dominated by people with PCs saying Macs suck?
If Macs sucked compared to PCs, sure.

No one is "defending" their ATi purchase here, we're just all acting as a self-elected counter-balance to all of nVidia's PR lies as of late....someone has got to put the truth out there.

I'm glad you're happy with your card and all, but whenever I see people posting about how the 5900U is the best video card you can get, I will be posting rebuttals. :)

digitalwanderer
10-10-03, 11:16 AM
Originally posted by nForceMan
You are absolutely right! He has already knocked himself out with the tons of hoaxes, backbiting, lies and false rumors he has posted here. :rant:
Flamers never change. :screwy:
And some nVidia employees never change either, eh nForceMan? :afro:

Behemoth
10-10-03, 11:18 AM
Originally posted by OWA

True, it could be funny, but I think a lot of people would agree that it is not really funny but strange that so many people who are supposedly happy with their ATI purchase continue to try to knock the 5900U in just about every thread. [...]
The way I see it, people are not really knocking the 5900U itself; people are mainly knocking the marketing BS, lies and false information.
It's those blind supporters who back everything nvidia does that encourage nvidia's continuous cheats and hacks. What a shame.

nForceMan
10-10-03, 11:25 AM
Excellent post OWA. I agree with you. Thank you.
ATI flamers should go home and do something useful (hoeing, mowing the lawn, etc. ;) ), instead of terrorizing the nV forums.

digitalwanderer
10-10-03, 11:29 AM
Originally posted by nForceMan
Excellent post OWA. I agree with you. Thank you.
ATI flamers should go home and do something useful (hoeing, mowing the lawn, etc. ;) ), instead of terrorizing the nV forums.
And nVidia employees should go and work on some bloody decent IDE drivers for the nForce2 mobo rather than flaming people in forums. :p

PreservedSwine
10-10-03, 11:31 AM
Originally posted by nForceMan

ATI flamers should go home and do something useful (hoeing, mowing the lawn, etc. ;) ), instead of terrorizing the nV forums.

ATI Flamers?

Don't you mean nVidia flamers?


How about simple, honest criticism? Is there any room for that on a "fan" site, or is reality something nVidia wishes to avoid right now?

digitalwanderer
10-10-03, 11:34 AM
Originally posted by PreservedSwine
How about simple, honest criticism? Is there any room for that on a "fan" site, or is reality something nVidia wishes to avoid right now?
I think nVidia's new internal corporate motto is, "Reality is for those who can't do PR." ;)