nV News Forums > Hardware Forums > Benchmarking And Overclocking

Old 10-02-03, 08:26 AM   #49
jimmyjames123
Registered User
 
Join Date: May 2003
Posts: 665
Default

If the ATI card were 30 fps ahead in all the tests, then people like Hellbinder would cheer and celebrate. If the numbers are closer, then they moan and complain.

Did anyone actually expect NVIDIA to run these DirectX 9 games with full FP32 precision? Of course not.

Clearly ATI still has the edge in games like HL2. At the same time, things will never be apples to apples, as the Radeons do FP24 precision and the FX cards can do FP32 or FP16. I would think that NVIDIA is doing what it can to make the game more playable on its cards with newer driver revisions. We will just have to wait and see exactly how much IQ is compromised because of this.
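For a sense of what the FP16 vs. FP32 distinction means numerically, here is a small illustration using numpy's half- and single-precision types (numpy has no FP24 type; its error would fall between the two). The constant is arbitrary, chosen only to show the rounding gap, not taken from any real shader:

```python
import numpy as np

x = 1.0 / 3.0  # an arbitrary shader-style constant

# Error introduced by storing x at each precision
err_fp32 = abs(float(np.float32(x)) - x)  # full precision on the FX
err_fp16 = abs(float(np.float16(x)) - x)  # partial precision on the FX

print(err_fp32)  # on the order of 1e-8
print(err_fp16)  # on the order of 1e-4, thousands of times coarser
```

Whether errors of that size are visible on screen depends entirely on what the shader computes with them, which is why IQ comparisons matter here.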
Old 10-02-03, 08:37 AM   #50
jbirney
Registered User
 
jbirney's Avatar
 
Join Date: Jul 2002
Posts: 1,430
Default Re: Re: Anand posts Initial HL2 results with new Dets

Quote:
Originally posted by DMA
Uhm..why is it disturbing?

Drop your attitude for one minute and think of the "not so happy" FX5900 owners out there

I'm sure they don't think it's disturbing at all that they are gonna be able to play H-L2 with their $400 cards.
That's fine. But TO PUT UP A BENCHMARK SCORE THAT YOU DID NOT RUN YOURSELF, KNOWING THAT IQ HAS BEEN COMPROMISED IN THE PAST, IS ABSOLUTELY WRONG!!! There is no point in debating that, as it's a well-known fact. FX users getting a gain = good. Benchmark scores with no IQ comparisons, forcing one card to run mixed mode versus the other running the full DX9 code, using unreleased drivers = very bad.

Folks, it's not about what the numbers of the scores show. It's about HOW YOU GOT them.


Ruined,

Wow. I hope you don't believe some of the stuff you're typing.
To see how well the ATI card does HDR, see TechReport's review of the 9800 XT.

You also don't seem to understand that the FX only has 4 shader units, and those are split duty. The R9800 has 8 fully dedicated ones. In a shader-limited benchmark such as HL2, there is no way the FX can win. It's all math: the FX can NOT push as many pixels through its shaders as the 9800 can. Thus in a shader-limited test, the FX will always lose, unless they play driver tricks.
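A back-of-the-envelope version of that math, using the unit counts above and the commonly quoted core clocks for the FX 5900 Ultra and Radeon 9800 Pro (treat all of these as illustrative assumptions, not measured throughput):

```python
# Peak shader throughput, very roughly: units * core clock.
# Unit counts and clock speeds are assumptions for illustration only.
fx_units, fx_clock = 4, 450e6        # FX 5900: shader units shared with texture ops
r9800_units, r9800_clock = 8, 380e6  # Radeon 9800: fully dedicated shader units

fx_peak = fx_units * fx_clock
r9800_peak = r9800_units * r9800_clock

print(r9800_peak / fx_peak)  # ~1.69x raw shader advantage for the 9800
```

Real throughput depends on instruction mix and co-issue rules, so this only sketches why a purely shader-bound test favors the 9800.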

Quote:
Now that Nvidia DX9 hardware is out and developers can start using it to develop their games from the ground up, Nvidia won't be in this situation as often in the future.
Uhmmm, maybe because their hardware is much slower than ATI's in DX9 games, there is not much a developer can do, unless you want them to spend five times as long to develop an NV path because their DX9 path sucks. Newsflash: developers don't like to do this. Skuzzy knows, as he is a developer. Listen to him for a change.


Quote:
At the same time, things will never be apples to apples...
Yeah, but there is a little thing called DX9. Why not force each card to run the DX9 code path, then? That's much closer when trying to get apples to apples.
Old 10-02-03, 08:42 AM   #51
rth
Registered User
 
Join Date: Sep 2003
Posts: 152
Default

Quote:
Originally posted by Ruined
Now that Nvidia DX9 hardware is out and developers can start using it to develop their games from the ground up, Nvidia won't be in this situation as often in the future.
I don't think any proper (non-BridgeIt) developers would base their games on nvidia hardware.

the 5600 and 5200 are too slow for DX9, and the 5900 has virtually no users (percentage of market share) - not worth the attention.

if developers want to use DX9, most won't have the time Valve did to spend five times longer on the FX; they'll just follow the standards.

reminder: ATI has at least 97% of $300+ card market share.

plus 3rd parties are dropping nvidia. if nvidia wants special code they'll have to pay for it, whereas ATI just has to give them a copy of the DX9 specs.
__________________
If nvidia turned off cheats for one day its world would fall apart
Old 10-02-03, 08:48 AM   #52
The Baron
Guest
 
Posts: n/a
Default

*sigh*

I wondered when this would come out. Most of you have said it's just shader replacement. From what I've heard, it's not, at all.

Say hello to our good friend clip planes. I can't confirm it; it is simply what I've heard from what I consider a reputable source.
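For anyone unfamiliar with the alleged trick: a static clip plane throws away all geometry on one side of a fixed plane before it is ever shaded, which is free speed along a canned benchmark camera path but falls apart the moment the camera moves somewhere the plane's author didn't anticipate. A minimal sketch of the plane-side test (the plane and points here are invented for illustration):

```python
def on_kept_side(point, plane):
    """True if point lies on the rendered side of plane (a, b, c, d),
    where the plane is defined by a*x + b*y + c*z + d = 0."""
    x, y, z = point
    a, b, c, d = plane
    return a * x + b * y + c * z + d >= 0

# A hypothetical driver-inserted plane that hides everything the
# benchmark's fixed camera never looks at (here, everything with x < 0).
cheat_plane = (1.0, 0.0, 0.0, 0.0)

print(on_kept_side((5.0, 0.0, 0.0), cheat_plane))   # True: drawn as normal
print(on_kept_side((-5.0, 0.0, 0.0), cheat_plane))  # False: silently culled
```

In a fresh timedemo the camera path changes, so a plane tuned this way would either stop helping or visibly cut holes in the scene, which is exactly why new demos would expose it.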
Old 10-02-03, 08:52 AM   #53
jbirney
Registered User
 
jbirney's Avatar
 
Join Date: Jul 2002
Posts: 1,430
Default

Baron,

Is that the new feature, SCP, or have they developed Dynamic Clip Planes?
Old 10-02-03, 08:55 AM   #54
The Baron
Guest
 
Posts: n/a
Default

Heh. Maybe they have. Wouldn't be a bad feature...

And this score seems particularly telling:

Quote:
e3_techdemo_5 83.5 64.5
This is the only one where ATI demolishes NVIDIA. In the others, ATI is ahead by less than 10%, and in the rest NVIDIA is equal to or ahead of ATI's performance.
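Worth noting: the quoted gap is closer to 30% than 10%, which is why that one result stands out so sharply. Quick arithmetic on the scores from the quote above:

```python
ati_fps, nv_fps = 83.5, 64.5  # e3_techdemo_5 scores from the quoted result
gap_pct = (ati_fps - nv_fps) / nv_fps * 100
print(round(gap_pct, 1))  # 29.5, i.e. ATI nearly 30% ahead on this demo
```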

I don't know. Is it clip planes? I hope not, but I wouldn't be surprised.
Old 10-02-03, 09:09 AM   #55
jimbob0i0
ATI Geek
 
jimbob0i0's Avatar
 
Join Date: Apr 2003
Posts: 268
Default

Baron, I really, really hope you are wrong about that. If they clip-plane the HL2 benchmark like they did 3DMark03, I want Valve to damn them. I want a public statement declaring effective war against them for destroying another benchmark tool, and this time for an actual game too!

If NV goes down I no longer care. XGI and S3 can provide competition for ATi and NV can just ___ ___.

As I said in another thread: when Futuremark's deadline for "proper" drivers comes to pass at the end of this month, I am looking forward to what they have to say about NV.

P.S. Gabe was ______ enough at the 51.xx quality hacks... if NV attempts to do that to *his* benchmark and game, imagine his wrath then!
Old 10-02-03, 09:13 AM   #56
The Baron
Guest
 
Posts: n/a
Default

Believe me--I hope the guy who told me is simply full of crap. I don't know. I won't know until I can make my own timedemos (or Dave Baumann can). Clip planes wouldn't show up in fresh demos (or maybe they would, and cause goofy visual artifacts); either way, it'd be easy to tell when they're being used.

That techdemo_05 score looks weird, though. It simply doesn't make sense next to the rest of the benches; if NV had a fix for it, they would either A. replace so many shaders it's not funny or B. add clip planes. Some of the shaders that were replaced for other levels should apply here, but still: 20 FPS down. Makes you wonder.

And it's sad, isn't it? Here it is, NVIDIA gets a huge performance increase in a new game, and we all assume they're cheating. Of course, they probably are. Remember when things used to be simple?

Old 10-02-03, 09:14 AM   #57
PaleGreen
Registered User
 
Join Date: Sep 2003
Posts: 25
Default

Quote:
Originally posted by rth

reminder: ATI has at least 97% of $300+ card market share.
Numbers and source, please? Or did you pull that out of your arse?

Linked from the main NVNews page:

"Overall segment (2D and 3D professional) share is:
NVIDIA 73%
ATI 13%
3Dlabs 4%
Other 9% (Matrox, Appian)"

http://www.amazoninternational.com/index.asp

Are you telling me all of those people are buying Quadro2 EX's?
Old 10-02-03, 09:17 AM   #58
PaleGreen
Registered User
 
Join Date: Sep 2003
Posts: 25
Default

Getting back to the numbers... I'm very impressed with what NVidia has done!

Putting all biases aside, what do you think these new drivers mean for the 5200/5600 cards? Is it possible budget-minded (cheap!) gamers like me might have some good sub-$100 options for HL2? (I can live with 800x600)
Old 10-02-03, 09:18 AM   #59
The Baron
Guest
 
Posts: n/a
Default

Quote:
Originally posted by PaleGreen
Getting back to the numbers... I'm very impressed with what NVidia has done!

Putting all biases aside, what do you think these new drivers mean for the 5200/5600 cards? Is it possible budget-minded (cheap!) gamers like me might have some good sub-$100 options for HL2? (I can live with 800x600)
If they're clip planes, they are less than useless. Benchmark scores will have nothing to do with game performance.
Old 10-02-03, 09:27 AM   #60
Socos
Registered User
 
Join Date: Jul 2003
Location: Michigan
Posts: 137
Default Re: Re: Re: Re: Anand posts Initial HL2 results with new Dets

Quote:
Originally posted by Richthofen
as long as i don't see a huge difference in quality i won't care what Nvidia replaces in that game.
ATI paid VALVE to code that game the way they want to have it.
If Nvidia now replaces some things for their cards, fine...
Pretty funny. Nice 8 million dollar investment
As long as there are people in the world like you, [N] will be in business.

ATI runs the "STANDARD" DX9 codepath. Valve did not code for ATI hardware; they coded to the DX9 spec. I believe you will see this when XGI's card comes out. If it runs well on both ATI and XGI hardware and not [N]'s, it must be that ATI and XGI both paid Valve???? Yeah, that's it.
__________________
AMD 64 3000 + - 1 GB Kingston HyperX - Chaintech ZNF-150 MB - Audigy 2 Gamer ZS - 200GB SATA HD -
ATI X800 Pro OC 520/540 - 21" Cybervison monitor - Thermaltake Butterfly 450 watt PS - UFO Custom Case
Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.