Old 05-24-03, 01:57 AM   #25
StealthHawk
Guest
 
Posts: n/a

Quote:
Originally posted by ChrisRay
In this case, Nvidia calls upon partial, fragmented, and integer precision at specific times for rendering a given scene.
A very nice post. I do take issue with the above quoted statement though.

NV35 dropped Integer support from the shader pipeline.
Old 05-24-03, 02:04 AM   #26
StealthHawk
Guest
 
Posts: n/a
Re: So now we know

Quote:
Originally posted by bkswaney
I think these two things say a lot.

____________________________________________

Since NVIDIA is not part of the FutureMark beta program (a program which costs hundreds of thousands of dollars to participate in), we do not get a chance to work with Futuremark on writing the shaders like we would with a real applications developer.


__________________________________________________

ATI stated:

The 1.9% performance gain comes from optimization of the two DX9 shaders (water and sky) in Game Test 4. We render the scene exactly as intended by Futuremark, in full-precision floating point. Our shaders are mathematically and functionally identical to Futuremark's and there are no visual artifacts; we simply shuffle instructions to take advantage of our architecture. These are exactly the sort of optimizations that work in games to improve frame rates without reducing image quality and, as such, are a realistic approach to a benchmark intended to measure in-game performance. However, we recognize that these can be used by some people to call into question the legitimacy of benchmark results, and so we are removing them from our driver as soon as is physically possible. We expect them to be gone by the next release of CATALYST.
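
[Aside: "shuffle instructions" here just means reordering independent operations so the hardware's pipelines stall less; the results stay bit-identical. A rough sketch of the idea in C, with made-up numbers and function names -- the real optimization happens in DX9 shader code, not C:]

[code]
#include <stdio.h>

/* Naive ordering: each add has to wait on the multiply right before it. */
static float ps_naive(float a, float b, float c, float d,
                      float e, float f, float *second)
{
    float m0 = a * b;
    float r0 = m0 + e;   /* stalls until m0 is ready */
    float m1 = c * d;
    float r1 = m1 + f;   /* stalls until m1 is ready */
    *second = r1;
    return r0;
}

/* Shuffled ordering: both multiplies are issued first, so the adds
   already have their inputs. Same math, friendlier scheduling. */
static float ps_shuffled(float a, float b, float c, float d,
                         float e, float f, float *second)
{
    float m0 = a * b;
    float m1 = c * d;
    float r0 = m0 + e;
    float r1 = m1 + f;
    *second = r1;
    return r0;
}

int main(void)
{
    float s1, s2;
    float r1 = ps_naive(1.5f, 2.0f, 3.0f, 4.0f, 0.5f, 0.25f, &s1);
    float r2 = ps_shuffled(1.5f, 2.0f, 3.0f, 4.0f, 0.5f, 0.25f, &s2);
    printf("naive:    %f %f\n", r1, s1);
    printf("shuffled: %f %f\n", r2, s2);  /* identical output */
    return 0;
}
[/code]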

__________________________________________________


As I see it, Nvidia got ****ed by not staying with FM's beta program.

ATI worked hand in hand with FM to make sure their shader was at top performance for "their bench".

Nvidia did not. But "game companies" do work very closely with nvidia, even more so than with ati.
That is why you see the FX suxing on PS 2.0 in 3DM03 and kicking butt in games.

It's easy to read between the lines on this one just from what both companies have said.
You could read between the lines and interpret things that way. But that really doesn't make sense considering some other facts. Copied and pasted from another thread, I said this:
Quote:
Some people seem to believe that Futuremark is out to crucify nvidia. The fact is that the NV3x architecture is not good with shaders. Not only Futuremark's 3dmark03, but also Shadermark and Rightmark3d show the same poor shading performance from NV3x cards. That's two other programs from independent companies. Is there really some vast conspiracy against nvidia?
As you can see, you are implying that there is some massive conspiracy against nvidia: if you believe Futuremark is out to sabotage them, then other companies must be too. There's no evidence to support this; all the evidence we have points to weaknesses in the NV3x architecture.


I should also point out that right up until 3dmark03 was launched, everyone believed it would favor nvidia cards, because 3dmark2001 did. No one ever called 3dmark2001 biased toward nvidia. So, let's take a look at some 3dmark2001 benchmarks. First off, I think everyone will admit that NV3x is weaker in vertex shading than R3xx is; that's pretty much a fact. But it is also interesting to look at the Advanced Pixel Shader test, where both cards are, in theory, using PS1.4.

http://www6.tomshardware.com/graphic...x_5900-27.html

The fact is that every benchmark using advanced shaders shows nvidia at a disadvantage.
Old 05-24-03, 02:22 AM   #27
bkswaney
Mr. Extreme!
 
Join Date: Aug 2002
Location: SC
Posts: 3,421

I'm really starting to wonder if there is a way for NV to fix this on the FX cards. It's starting to look like drivers are not going to do it.

Now for me it does not matter. I'll have at least 2 more cards before any PS 2.0 games even hit the market.
But I'm sure most people will be buying a $500 card to last at least 2 years. I upgrade every 6 months or so. Most of the time I upgrade CPUs at the same time.

If they do not get the performance up on it, I'll be buying a 9800 Pro-256.
But I'll wait till all the 3rd parties have their cards out. I want one that looks as good as it runs.
I wonder if ST is ever going to release their 256?

__________________________________________________

IT'S COMING -The 'ULTIMATE' Silent Partner Is Coming!!!

__________________________________________________

I thought it was coming at E3.

Last edited by bkswaney; 05-24-03 at 02:30 AM.
Old 05-24-03, 02:34 AM   #28
StealthHawk
Guest
 
Posts: n/a

Quote:
Originally posted by bkswaney
I'm really starting to wonder if there is a way for NV to fix this on the FX cards. It's starting to look like drivers are not going to do it.

Now for me it does not matter. I'll have at least 2 more cards before any PS 2.0 games even hit the market.
But I'm sure most people will be buying a $500 card to last at least 2 years. I upgrade every 6 months or so. Most of the time I upgrade CPUs at the same time.
That's decent, man. At least you can acknowledge that it is a serious issue which may have some bearing on other users who thought the gfFX would be a great future-proof card.
Old 05-24-03, 02:53 AM   #29
bloodbob
Registered User
 
Join Date: Oct 2002
Posts: 123

Quote:
Originally posted by ChrisRay
I think the thing is this.

ATi and Nvidia have very different visions about the future of shaders and how they will come to pass.


Currently Microsoft seems to favor ATI's vision, as ATI's vision is part of Microsoft's standard.

Nvidia favors its own ideals on how shader applications will be run.

We need to diagnose the real problem here, not who is right and who is wrong.

Well, I feel like saying what's right and wrong. Nvidia has done some very good things in the past, but they also made a massive mistake, and thankfully Microsoft didn't listen to them.

Nvidia developed one of the Sega chips, and do you know it DIDN'T SUPPORT TRIANGLES? I'm serious, you could only render quads. When Microsoft was developing DirectX, nvidia tried to convince them to use quads; thankfully Microsoft ignored them and went with the strong industry standard, triangles.

Now when DirectX 9.0 came around, Microsoft fell under pressure from nvidia and allowed a second shader language, Cg.

Now, I don't know the whole story of Cg, but I do know this: in Cg, arrays are indexed with FLOATS. OMG, that is completely stupid, because a) you can have fractions. Now I believe (presume, at least) that if you tried to enter 1.5 as the index it would spit chips at you, but nevertheless it's still stupid. And b) in theory this is a problem (in practice it isn't): when you get large floats and perform operations on them you don't get exact answers, and large numbers can't be expressed accurately. Even though you aren't going to the limit of the float, you still can't use it as an array index.
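
[Aside: point b is the standard single-precision pitfall. A rough C sketch of the effect, with made-up variable names: past 2^24, a 32-bit float can no longer represent every whole number, so a float used as an array "index" silently stops being exact.]

[code]
#include <stdio.h>

int main(void)
{
    /* A 32-bit IEEE 754 float has a 24-bit significand, so every whole
       number up to 2^24 = 16777216 is exact, but 16777217 is not. */
    float big    = 16777216.0f;     /* 2^24: exactly representable */
    float bumped = big + 1.0f;      /* rounds straight back to 2^24 */

    printf("big    = %.1f\n", big);        /* 16777216.0 */
    printf("bumped = %.1f\n", bumped);     /* 16777216.0 -- the +1 is gone */
    printf("equal? %d\n", big == bumped);  /* prints 1: they compare equal */
    return 0;
}
[/code]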



On a side note, NV30/31/34 was a flop. We are still waiting for MikeC to post his review; I think he never will. NV35 looks to be a decent card, and I wish good luck to the owners of NV35s.


OH, AND SOMEONE MAKE NVIDIA RE-ENABLE MY FSAA ON MY TNT2 DRIVERS. I DON'T LIKE USING 2.XX DETS!!!!!
__________________
I come from Planet Viper days, so don't call me a noob. I own 2 nvidia cards and one ati card, so don't call me biased.
Old 05-24-03, 02:56 AM   #30
ChrisRay
Registered User
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101

Quote:
Originally posted by StealthHawk
A very nice post. I do take issue with the above quoted statement though.

NV35 dropped Integer support from the shader pipeline.

Actually, I was referring to the NV30/NV31/NV34 line, which was in development before the R300 came around.


I think the NV35 is something a little different altogether.
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo: Asus P7P55 WS Supercomputer|Memory: 8 Gigs DDR3 1333|Video: Geforce GTX 295 Quad SLI|Monitor: Samsung Syncmaster 1680x1080 3D Vision / Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members.
Old 05-24-03, 04:06 AM   #31
mbvgp
Registered User
 
Join Date: Nov 2002
Location: Behind my desk. Staring at the monitor
Posts: 22

Quote:
Originally posted by bloodbob
Well, I feel like saying what's right and wrong. Nvidia has done some very good things in the past, but they also made a massive mistake, and thankfully Microsoft didn't listen to them.

....snip....

OH, AND SOMEONE MAKE NVIDIA RE-ENABLE MY FSAA ON MY TNT2 DRIVERS. I DON'T LIKE USING 2.XX DETS!!!!!
It can't be that difficult using one of these -> "." (it's called a full stop) little buggers, can it?

And what games can you play with FSAA on a TNT2?
Old 05-24-03, 04:14 AM   #32
bloodbob
Registered User
 
Join Date: Oct 2002
Posts: 123

lol, I remember I used to have it set to 2x1 on my TNT2 in the old days. You'd better watch it or I'll end up using semicolons instead of full stops, and that's probably more annoying.
__________________
I come from Planet Viper days, so don't call me a noob. I own 2 nvidia cards and one ati card, so don't call me biased.

Old 05-24-03, 05:18 AM   #33
Moose
Cheating is for losers!!!
 
Join Date: Jul 2002
Posts: 241

Quote:
Originally posted by Nv40
but the opposite: you will warn the public about the benchmark, and will discredit it as a reliable test for their cards.
Hmmmmm. So we should thank Nvidia for cheating in 3dmark to expose how unfair it is????


ok......
Old 05-24-03, 06:33 AM   #34
zakelwe
Registered User
 
Join Date: Dec 2002
Posts: 768

I think people are getting carried away with 3dmark03, including nvidia, and that is because most of the websites now quote it even though it has little relevance to today's games.

It's the future, but it's not now.

For instance, the FX 5200 plays DX9 games; the GF4 4200 only does up to DX8 games. What are we playing now, and what should you buy now? Forget about futureproofing: there used to be 6-month cycles, and now there are actually 3-month cycles, although people keep saying 24-month cycles (not when there is a two-horse race and 0.09 micron coming along, there isn't).

Compare the 3dmark2001 (mainly DX7, with one DX8 game test) scores for the 4200 and 5900 on the ORB. OK, a lot of overclockers use the 4200, but even so it is 18,000 vs 10,000.

3dmark2001 seems to have been forgotten in all this, but it is a great benchmark currently and at least shows you some CPU capping of GPU scores.

3dmark03 at the moment is more for the league tables, IMHO. I should know; I am an active competitor.

Regards

Andy
Old 05-24-03, 08:10 AM   #35
Slappi
Fanboy to the 576mm2 Pwr
 
Join Date: Feb 2003
Posts: 259

Quote:
Originally posted by indio

I think some ppl don't realise how much trouble Nvidia is really in. They are loaded with debt and liabilities. Their product line is a flop. As far as optimising DX9 for the FX because of marketshare, keep dreaming. Nvidia's marketshare lead can be attributed to the GF4 MX, which is DX7. I don't think they will be coding for that. They will code for the most prevalent DX9-compatible hardware, which at this point is the R300 series.

You obviously cannot read a financial statement. nVidia has a small debt/equity ratio. Here, do some reading:

http://biz.yahoo.com/p/n/nvda.html
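
[Aside: the ratio being argued over is debt-to-equity, i.e. D/E = total liabilities / shareholders' equity. A small value means the company is funded mostly by shareholders' equity rather than borrowing.]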
Old 05-24-03, 08:23 AM   #36
Nazgul
Witch-King Of Angmar
 
Join Date: Oct 2002
Posts: 18
Re: So now we know

Quote:
Originally posted by bkswaney
ATI worked hand in hand with FM to make sure their shader was at top performance for "their bench".
I don't see that at all. If ATI had that much influence in how the shaders were coded, they wouldn't have had to supersede them in the Catalyst drivers. The fact that their replacement shaders were 8% faster on that test suggests that the 3DMark shaders were not -specifically- coded for ATI cards.