Old 05-23-03, 11:45 PM   #13
jjjayb
Registered User
 
Join Date: Jan 2003
Posts: 101
Default

Quote:
Would you pay for something, say $1 or $1 million, if you felt you didn't have to because you don't believe in it?
Not if I knew I had millions of gullible people out there who believed every line I fed them. It's not about the money, Nv40. It's about them looking bad because their card runs DX9 shaders poorly. They were in the beta program for most of the development process. They didn't have a problem with it until they got their NV30s and saw how poorly it ran. They chose the only option they thought they had: denounce the benchmark that shows this. They can't exactly denounce it while they're still a beta member, can they?
jjjayb is offline   Reply With Quote
Old 05-23-03, 11:48 PM   #14
digitalwanderer
 
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Default

Quote:
Originally posted by jjjayb
It's not our fault. We're a product of our environment and bad engineering decisions.

You raised a lot of excellent points and argued most of them well, but the above still has me giggling!
__________________
[SIZE=1][I]"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy[/I] [/SIZE]
digitalwanderer is offline   Reply With Quote
Old 05-23-03, 11:54 PM   #15
Nv40
Agent-Fx
 
 
Join Date: Aug 2002
Location: everywhere
Posts: 2,216
Default

Quote:
Originally posted by jjjayb
It's not about the money, Nv40. It's about them looking bad because their card runs DX9 shaders poorly. [rest of post snipped - see above]

And who said it's about money?

It's that NVIDIA believes they DON'T HAVE TO PAY for, and support, something they don't agree with or believe in. Microsoft has billions, but that doesn't mean they have to support everyone who asks for $$$ to test their products with a test they don't believe in, for whatever reasons.

Don't you think it's possible that NVIDIA knows something about 3DMark03 that you don't? Yes, that's impossible, I know. But let's say you are the president of NVIDIA, and let's say NVIDIA is right in its claims that the benchmark is inefficient and biased toward other cards (I know we already have the guarantee of journalists about 3DMark's reliability and integrity - journalists who are beta members of Futuremark themselves, BTW).

But let's suppose this is true, and that by accident or coincidence NVIDIA is right about a 3DMark+ATI conspiracy. (Again, I know multibillion-dollar companies in busine$$ would never take an unfair opportunity to cut another company's sales... just ask Microsoft.) So the question is: would you still pay hundreds of thousands of dollars for something you don't believe in?

The answer is NO. Instead, you warn the public about the benchmark and discredit it as a reliable test for your cards.

Last edited by Nv40; 05-24-03 at 12:29 AM.
Nv40 is offline   Reply With Quote
Old 05-24-03, 12:02 AM   #16
ChrisRay
Registered User
 
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

I think the thing is this.

ATI and NVIDIA have very different visions of the future of shaders and how they will come to pass.

Currently Microsoft seems to favor ATI's vision, since ATI's vision is essentially what became Microsoft's standard.

NVIDIA favors its own ideas about how shader applications will be run.

We need to diagnose the real problem here, not argue about who is right and who is wrong.

NVIDIA cards perform very well in their own compiled environments; ATI cards run very well under Microsoft's standard.

NVIDIA's hardware has been optimized heavily for its own "vision" of how APIs will handle shader instructions.

In this case, NVIDIA calls on partial, fragmented, and integer precision at specific times when rendering a given scene.

Under many circumstances this is probably acceptable, and it's why the NV30 hardware was designed the way it was. NVIDIA seems to be trying to hold onto its old architecture as it moves toward future architectures (note that NVIDIA was very close with Microsoft when the DX 8.0 standard was created).

This would ensure top performance in old and modern products, but not exactly in future ones - assuming all future products conform to the DirectX standard.

However, if future applications do go the NV30 path and use its partial, fragmented, and integer precision, then it would perform fine in them. It's obvious that NVIDIA's pixel shader is meant to handle a range of precisions that are not listed in the DirectX 9.0 specification.
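To put the precision trade-off in more concrete terms, here's a rough, purely illustrative C++ sketch that simulates accumulating a small value at roughly FP16-like (10 mantissa bits) versus FP32-like (23 mantissa bits) precision. It's a toy model - it ignores exponent range and real shader arithmetic entirely - but it shows why lower precision can be perfectly acceptable for short computations yet start to drift in longer ones.

[code]
#include <cmath>
#include <cstdio>

// Round x to a float-like value with `bits` explicit mantissa bits.
// Exponent range, denormals and overflow are ignored - this is only a
// crude model of reduced-precision arithmetic, not a real FP16/FP24 unit.
double roundToMantissa(double x, int bits) {
    if (x == 0.0) return 0.0;
    int exp;
    double m = std::frexp(x, &exp);                 // x = m * 2^exp, 0.5 <= |m| < 1
    const double scale = std::ldexp(1.0, bits + 1); // 1 implicit + `bits` fraction bits
    return std::ldexp(std::round(m * scale) / scale, exp);
}

int main() {
    const int    steps = 1000;
    const double increment = 0.001;   // e.g. a small lighting term added each step

    double lowPrec = 0.0, highPrec = 0.0, exact = 0.0;
    for (int i = 0; i < steps; ++i) {
        lowPrec  = roundToMantissa(lowPrec  + roundToMantissa(increment, 10), 10); // ~FP16
        highPrec = roundToMantissa(highPrec + roundToMantissa(increment, 23), 23); // ~FP32
        exact   += increment;
    }

    std::printf("exact sum:  %.6f\n", exact);
    std::printf("~FP32 sum:  %.6f (error %.2e)\n", highPrec, std::fabs(highPrec - exact));
    std::printf("~FP16 sum:  %.6f (error %.2e)\n", lowPrec,  std::fabs(lowPrec  - exact));
    return 0;
}
[/code]

Over the thousand additions the low-precision sum visibly drifts away from the exact result, while the 23-bit version stays within rounding noise - the same kind of gap that can show up as banding or color shift in long shader computations.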

It also seems NVIDIA doesn't want to give up multitexturing quite yet, which you can see from its architectures, which are quite strong in this area. (Multitexturing is quite relevant, performance-wise, for today's and yesterday's games.)

However, I'm not sure how relevant multitexturing will stay once shaders take over the way textures are mapped.




Now we have ATI's stance, which is to conform purely to the DirectX 9.0 specification given to them by Microsoft. That specification is the very heart of ATI's design, and ATI has developed a completely new architecture around it.

You can gather this from the way it handles some of its features, such as multisample anti-aliasing, etc. - purely conformant with the DX 9.0 specification.

I actually see nothing wrong with this. However, it has left something to be desired in older kinds of applications - older DirectX 7.0/6.0 titles that depend more on the T&L engine and on multitexturing and less on shaders - where ATI cards have not been quite as powerful as their NVIDIA counterparts.

For the most part, though, those cases are functioning correctly, just not always optimally.

In this respect I would say ATI is looking "forward" with its technology and trying harder not to hold onto its legacy support (I use "legacy" loosely).

So here we have it. ATI is definitely following Microsoft's DirectX to the letter (I am certain Microsoft loves this, as they really do like to control the standards for everything PC-related).

And then we have NVIDIA, which is trying to develop its own standards for shader applications.

Who's right in this I cannot say; unfortunately it's going to be a wait-and-see scenario. Will NVIDIA's developer relations be its saving grace?

Will NVIDIA's standard be the one shader games are written for, or will the standard Microsoft and ATI use be the future of tomorrow's applications? It should be noted that NVIDIA does "conform" to the standard when it is forced to - which doesn't exactly show NVIDIA's products in the best light.



Now I know some of you must be thinking: where does this leave OpenGL, which doesn't have a single standard forced on everyone, being an open specification with vendor extensions? NVIDIA is pretty much free to do whatever it wants there.

In that case you'll see a great number of OpenGL titles (obviously anything based on the Doom 3 engine) optimized for NVIDIA's hardware and shader pathways.
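As a purely illustrative aside, here's a minimal C++ sketch of how an engine might pick a vendor-specific fragment pathway over the generic ARB one at runtime by checking the extension list. The hasExtension helper and the hard-coded extension string are made up for the example; a real renderer would get the string from glGetString(GL_EXTENSIONS) on a current context.

[code]
#include <cstring>
#include <cstdio>

// Returns true if `name` appears as a full token in the space-separated
// extension list (plain substring matching is not enough, e.g.
// "GL_ARB_fragment_program_shadow" must not match "GL_ARB_fragment_program").
bool hasExtension(const char* extensions, const char* name) {
    const size_t len = std::strlen(name);
    const char* p = extensions;
    while ((p = std::strstr(p, name)) != nullptr) {
        const bool startOk = (p == extensions) || (p[-1] == ' ');
        const bool endOk   = (p[len] == ' ') || (p[len] == '\0');
        if (startOk && endOk) return true;
        p += len;
    }
    return false;
}

int main() {
    // Hypothetical extension string; a real engine would query
    // glGetString(GL_EXTENSIONS) after creating a GL context.
    const char* extensions =
        "GL_ARB_multitexture GL_ARB_fragment_program GL_NV_fragment_program";

    if (hasExtension(extensions, "GL_NV_fragment_program")) {
        std::printf("Using the NVIDIA-specific fragment pathway.\n");
    } else if (hasExtension(extensions, "GL_ARB_fragment_program")) {
        std::printf("Using the generic ARB fragment pathway.\n");
    } else {
        std::printf("Falling back to fixed-function multitexturing.\n");
    }
    return 0;
}
[/code]

That kind of per-vendor branching is exactly why an OpenGL engine can lean heavily toward one vendor's hardware without breaking on the other.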

Could we see a scenario like the 3dfx days, when NVIDIA really did dominate in OpenGL applications because its hardware was used to the fullest, while getting only mediocre results in Direct3D compared to its competitor (3dfx)?

I guess only time will tell at this point. So who knows. By the way, I apologize in advance for any typos, and it should be noted that everything I have written here is purely speculative.
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision / Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members
ChrisRay is offline   Reply With Quote
Old 05-24-03, 12:19 AM   #17
bkswaney
Mr. Extreme!
 
 
Join Date: Aug 2002
Location: SC
Posts: 3,421
Thumbs up

Quote:
Originally posted by ChrisRay
ATI and NVIDIA have very different visions of the future of shaders and how they will come to pass. [full post snipped - see above]

You saved me a lot of work.

That's about what I've been thinking. Wrong or right.

This is a big reason I'm going to hold off on buying any new card for a bit.

Yes the NV30 and up will rock in OGL.

DOWN WITH MS! OPEN SOURCE IS THE WAY!!!!!!!!!!!!!!

Last edited by bkswaney; 05-24-03 at 12:23 AM.
bkswaney is offline   Reply With Quote
Old 05-24-03, 12:47 AM   #18
Nv40
Agent-Fx
 
 
Join Date: Aug 2002
Location: everywhere
Posts: 2,216
Default

Quote:
Originally posted by bkswaney
You saved me a lot of work.

That's about what I've been thinking. Wrong or right.

Yes, ChrisRay makes a lot of really good points.

But I'd also like to add that in games, TWO video cards from different companies that are both 100% DirectX 9 compliant - to the letter, as you say - can still show huge differences in performance, even if both follow Microsoft. Why?

Because there are also other hardware differences - pipeline organization, fillrate, bandwidth, and so on - that can make an application or a game favor company X's design over company Y's.

In other words, the same game or benchmark (3DMark) can be written in many ways, with exactly the same image quality or even better, but with the performance advantage switching from card #1 to card #2 or the other way around.

It all depends on the game developer's choices, their efficiency and, at worst, their integrity. Good game developers will program in a balanced way: the best IQ/performance for both cards. Design 3DMark03 with heavy use of multitexturing, heavy use of NVIDIA's PS 1.x or 2.0+ (the MS standard), forcing ATI to multipass at every turn what NVIDIA can do in just ONE PASS, and you will see ATI claiming the benchmark is not reliable.
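Just as a back-of-the-envelope illustration of why one pass versus two matters, here's a tiny C++ sketch estimating how many pixels get filled per frame for the same effect done in one pass or two. The resolution and overdraw figures are made-up round numbers, not measurements of any card or benchmark.

[code]
#include <cstdio>

// Estimated pixels written per frame: every extra pass re-rasterizes
// the scene, so fill cost scales roughly linearly with the pass count.
double pixelsPerFrame(int width, int height, double overdraw, int passes) {
    return static_cast<double>(width) * height * overdraw * passes;
}

int main() {
    const int    width = 1024, height = 768;  // illustrative resolution
    const double overdraw = 3.0;              // assumed average depth complexity

    const double onePass = pixelsPerFrame(width, height, overdraw, 1);
    const double twoPass = pixelsPerFrame(width, height, overdraw, 2);

    std::printf("One-pass effect: %.0f pixels filled per frame\n", onePass);
    std::printf("Two-pass effect: %.0f pixels filled per frame (%.1fx the fill)\n",
                twoPass, twoPass / onePass);
    return 0;
}
[/code]

So whichever card is forced to take the extra pass pays roughly double the fill and bandwidth cost for the same final image, which is why pass counts are such a sore point in benchmark design.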

Last edited by Nv40; 05-24-03 at 01:02 AM.
Nv40 is offline   Reply With Quote
Old 05-24-03, 12:55 AM   #19
jjjayb
Registered User
 
Join Date: Jan 2003
Posts: 101
Default

Very well-thought-out and well-spoken post, Chris. You pretty much explained the situation perfectly.

I think Nvidia should have focused less on the older and modern products and more on the future products. They are selling the cards for $500, after all; it's not like you're just going to throw the card away after 6 months. Even if Nvidia wants to call upon partial, fragmented, and integer precision at specific times when rendering a given scene, they should make sure the full precision is up to snuff first. Basically, they shouldn't use partial precision just to be "as good as, or slightly better than" everyone else running full precision. They should have targeted full precision to be as good as or better than their competitors', and partial precision to completely blow the competition away.

I think it's great that Nvidia went above and below the standard. But it's bad that they don't have the performance when going above the standard, or even when meeting it. Going above the standard doesn't do much good if it runs too slowly to use.

I don't think it's so much that they made a mistake with this architecture; it's more that the R300 makes it seem like a failure. Because they weren't expecting ATI to execute as well as they did, when they did, Nvidia didn't feel they needed to push the bar so high. Who really expected ATI to pull off what they did with the R300? I certainly didn't, and I'm sure Nvidia wasn't expecting it either - not with ATI's past record on releasing cards, and especially once they heard ATI was using a .15-micron process. Most of the "experts" didn't think they'd be clocked much over 225 MHz on .15 micron with all the DX9 features packed in there.

If ATI hadn't released the R300, we would all agree the NV30 is a kick-ass card. But after the 9700, the NV30 was a disappointment, especially for anyone used to Nvidia being on top. I was expecting a lot more from them after seeing the 9700, but I guess that wasn't realistic. If the R300 had never been released, it would be the Parhelia against the NV30, and the NV30 would have looked like a whopper.

I think that if the R300 had never been released, or had only performed on the same level as the Parhelia, Nvidia never would have had a problem with 3DMark03. They would have used its marketing value for all it was worth, just like they did with 3DMark2001. It's not that 3DMark03 makes Nvidia look bad; it's that the R300 makes the NV30 look bad.

What bothers me, though, is the way Nvidia has reacted to all of this. Rather than sucking it up and moving on, they have gone into slimeball-tactics mode.

Look at the NV30. They shipped reduced image quality in the drivers at release, and some of that has only now been cleared up. This is Nvidia, the company with great drivers. I don't believe for one second that the reduced quality was an accident; it served its purpose and got them better FPS in games when the card was first reviewed. The misleading control panel sliders. Suggesting that review sites use control panel settings they know darn well are not comparable when benching against the competition. Flaming Futuremark rather than just dealing with not being first for once. Then, after flaming Futuremark, cheating in the very benchmark they said they didn't like in the first place. Then, after being caught, blaming Futuremark again. It all just sits really badly with me.

These tactics are going to lose Nvidia more customers than just having a slower card would have. After they saw the R300, they should have accepted that they weren't going to have the fastest card for a few months. They should have just dealt with being slower in 3DMark. They should have kept overall image quality perfect, even if it meant running slower (in games too), and said, "Hey, we'll work harder on the next card." If they'd done that, I would actually have considered buying their next card. As it is now, no way; they have a lot of confidence to regain before I'll do that. ATI is no saint either, but I'll take the lesser of two evils. I do feel ATI is actually getting better in the public-image department, though, while Nvidia is getting worse.

Nvidia really needs to clean up its image. Die-hard Nvidia fans will ignore the recent tactics, but the average Joe won't. A lot of former Nvidia fans have already jumped ship to get the R300, and these kinds of things will only make them less likely to come back. I've seen plenty of people on the web today who have Nvidia cards say they won't buy another one after this fiasco. Sure, it's only a benchmark. But it's also your image.

Just my 1 a.m. rambling thoughts.
jjjayb is offline   Reply With Quote
Old 05-24-03, 01:14 AM   #20
Nv40
Agent-Fx
 
 
Join Date: Aug 2002
Location: everywhere
Posts: 2,216
Default

Quote:
I think Nvidia should have focused less on the older and modern products and more on the future products.

Would you explain that more?

If ATI is more focused on the future, then why is John Carmack using the NV3x for his next game, the one after Doom 3 and maybe his final one?

Have you ever read where JC said he has "already reached the limits of the R300" - not of the NV30? Did you know that NVIDIA's hardware PS/VS is a hair away from Microsoft's PS/VS 9.1?
Nv40 is offline   Reply With Quote

Old 05-24-03, 01:27 AM   #21
mbvgp
Registered User
 
Join Date: Nov 2002
Location: Behind my desk. Staring at the monitor
Posts: 22
Default

On a different (unrelated) subject: something should be done about the APIs. The problems with the two current ones are:
1) DirectX - under MS control and not cross-platform, but it has a fairly fixed standard.
2) OpenGL - not under any one company's control, but it has the proprietary-extensions issue, which makes it hell for game devs to optimize for a given platform.

Basically we need a new standard (maybe OpenGL 2.0 can address this) that incorporates the good points of both, so game devs have a fairly fixed, cross-platform, free standard to target.
Then game devs can say to hell with MS.
mbvgp is offline   Reply With Quote
Old 05-24-03, 01:29 AM   #22
indio
Registered User
 
Join Date: May 2003
Posts: 116
Default

My take is this: Nvidia tried to deliver a knockout blow to ATI with Cg and proprietary extensions. They wanted to be in the same position relative to their competition that Microsoft or Intel is to theirs. Basically, they tried to leverage their market share a little too early, in my opinion, and tried to push the industry in a direction that was favorable to them. Alas, the R300 proved to be a brick wall, and Nvidia broke itself apart on it.
I think some people don't realise how much trouble Nvidia is really in. They are loaded with debt and liabilities. Their product line is a flop. As for optimizing DX9 games for the FX because of market share - keep dreaming. Nvidia's market-share lead can be attributed to the GF4 MX, which is DX7, and I don't think anyone will be coding for that. Developers will code for the most prevalent DX9-compatible hardware, which at this point is the R300 series.
indio is offline   Reply With Quote
Old 05-24-03, 01:32 AM   #23
jjjayb
Registered User
 
Join Date: Jan 2003
Posts: 101
Default

Quote:
Have you ever read where JC said he has "already reached the limits of the R300" - not of the NV30? Did you know that NVIDIA's hardware PS/VS is a hair away from Microsoft's PS/VS 9.1?
The question is, have you read it? Actually, I'm sure you have. But do you understand it?

Quote from John Carmack:

Quote:
For developers doing forward looking work, there is a different tradeoff --
the NV30 runs fragment programs much slower, but it has a huge maximum
instruction count. I have bumped into program limits on the R300 already.

This is not code that is actually going into the game. With the instruction lengths he was using, it would run like a slideshow on the NV30.
By the way, have you looked at the 9800? It actually allows MORE instructions than the NV30 or the NV35. It is a hair and a half closer to PS/VS 9.1 than the NV30 and the NV35. ;-)
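For anyone curious where those instruction limits actually show up, here's a small illustrative C++ sketch (assuming the DirectX 9 SDK headers and d3d9.lib on Windows) that just reads back the PS 2.0 instruction-slot and temp-register counts the driver reports. It only queries the caps structure; what numbers come back for any given card is entirely up to that card's driver.

[code]
// Build on Windows with the DirectX 9 SDK; link against d3d9.lib.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) {
        std::printf("Failed to create the Direct3D 9 object.\n");
        return 1;
    }

    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        std::printf("GetDeviceCaps failed.\n");
        d3d->Release();
        return 1;
    }

    // Highest pixel shader version the driver exposes (e.g. 2.0).
    std::printf("Pixel shader version: %u.%u\n",
                static_cast<unsigned>(D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion)),
                static_cast<unsigned>(D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion)));

    // PS 2.0 extended caps: how many instruction slots and temporary
    // registers a single pixel shader may use on this hardware/driver combo.
    std::printf("PS 2.0 instruction slots: %lu\n",
                static_cast<unsigned long>(caps.PS20Caps.NumInstructionSlots));
    std::printf("PS 2.0 temp registers:    %lu\n",
                static_cast<unsigned long>(caps.PS20Caps.NumTemps));

    d3d->Release();
    return 0;
}
[/code]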
jjjayb is offline   Reply With Quote
Old 05-24-03, 01:44 AM   #24
bkswaney
Mr. Extreme!
 
 
Join Date: Aug 2002
Location: SC
Posts: 3,421
Default

I think a lot of it boils down to Nvidia supporting only 16-bit and 32-bit precision and not adding 24-bit like ATI did.
It's killing Nvidia to run full 32-bit precision against ATI's 24-bit.
They just cannot get their 32-bit performance up to the level of ATI's 24-bit.
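To show roughly what those precision figures mean, here's a small C++ sketch that prints the relative step size at 1.0 for the mantissa widths usually quoted for these formats - 10 bits for FP16, 16 for FP24, 23 for FP32. The mantissa widths are the commonly cited ones; the snippet is only a back-of-the-envelope comparison, not a model of any particular GPU.

[code]
#include <cmath>
#include <cstdio>

// Relative precision of a floating-point format with `mantissaBits`
// explicit fraction bits: the gap between 1.0 and the next value up.
double relativeStep(int mantissaBits) {
    return std::ldexp(1.0, -mantissaBits);  // 2^(-mantissaBits)
}

int main() {
    struct Format { const char* name; int mantissaBits; };
    const Format formats[] = {
        {"FP16 (s10e5)", 10},  // NV3x partial precision
        {"FP24 (s16e7)", 16},  // R3xx pixel shader precision
        {"FP32 (s23e8)", 23},  // full single precision
    };

    for (const Format& f : formats) {
        std::printf("%-14s step at 1.0 = %.3e  (~%.1f decimal digits)\n",
                    f.name, relativeStep(f.mantissaBits),
                    f.mantissaBits * std::log10(2.0));
    }
    return 0;
}
[/code]

The gap between roughly 3 decimal digits at FP16 and 5-7 at FP24/FP32 is why running everything at 32-bit costs Nvidia so much, while dropping to 16-bit risks visible quality loss.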


One thing's for sure: it's no wonder Mr. Nvidia didn't get any bonus. NV has made some bad choices over the past year, starting with not putting a 256-bit memory controller on the NV30.

They need to get their head out of the rain, for sure.

Last edited by bkswaney; 05-24-03 at 01:48 AM.
bkswaney is offline   Reply With Quote