nV News Forums > Hardware Forums > Benchmarking And Overclocking

Old 11-13-03, 02:09 PM   #25
vampireuk
**** Holster
 
vampireuk's Avatar
 
Join Date: Mar 2001
Location: The armoury
Posts: 2,813
Send a message via AIM to vampireuk
Default

I give it about a week
__________________
I put children in microwaves.
vampireuk is offline   Reply With Quote
Old 11-13-03, 02:51 PM   #26
Miksu
Registered User
 
Join Date: Apr 2003
Posts: 76
Default

Quote:
Originally posted by ChrisW
ATI compiler? I don't know anything about it but I want to know what that is too. If ATI is doing the same thing and calling it a "compiler" then they should be equally chastised.
They have had a compiler since Cat 3.6, I think. They aren't doing the same thing.

edit: You can find some info here at least: http://www.beyond3d.com/forum/viewto...ati%20compiler

Last edited by Miksu; 11-13-03 at 02:59 PM.
Miksu is offline   Reply With Quote
Old 11-13-03, 02:51 PM   #27
aapo
Registered User
 
aapo's Avatar
 
Join Date: May 2003
Location: Finland
Posts: 273
Default

Quote:
Originally posted by TheTaz
That's why I said it's a bad tool, nothing to do with its technology.
Mmkay, sorry, I just misunderstood and thought your point was technical. I suppose you are right about these political issues, but what can you do?
__________________
no sig.
aapo is offline   Reply With Quote
Old 11-13-03, 03:14 PM   #28
Hellbinder
 
Hellbinder's Avatar
 
Join Date: Nov 2002
Location: CDA
Posts: 1,510
Default

Quote:
Originally posted by Uttar
AAARGH!
The compiler is for real, guys. Stop inventing BS - everyone is, and it's just annoying me.

The effects of the compiler cannot be as good as hand-tuned code most of the time, but it's pretty darn good. And developers will have a hard time making anything better than what the compiler does.

Disabling the compiler seems like BS though. It's still on, or their scores would be significantly lower.


Uttar
Uttar.. Everyone has a "compiler". So obviously it's real. But you know as well as I do that that is not what's happening here, nor is it what is bringing the big FPS increases in well-known benchmark games like Halo. The "optimizer" should work without application detection, and that's all there is to it.
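A toy sketch of what "should work without application detection" means: a peephole pass that improves any shader it is handed, with no idea which game submitted it. The three-address instruction format below is invented for illustration and is not a real shader ISA or driver code.

```python
# A generic optimizer pass: it never looks at an executable name, so it
# speeds up every app equally, benchmarks and unknown games alike.

def peephole_optimize(shader):
    """Drop identity operations (mul by 1.0, add of 0.0) from any shader."""
    optimized = []
    for op, dst, src, const in shader:
        if (op, const) in (("mul", 1.0), ("add", 0.0)):
            optimized.append(("mov", dst, src, None))  # identity -> plain move
        else:
            optimized.append((op, dst, src, const))
    return optimized
```

The point of the sketch is what is absent: there is no lookup keyed on the running application anywhere in the pass.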
__________________
Overam Mirage 4700
3.2ghz P4 HT
SIS 748FX Chipset 800mhz FSB
1Gig DDR-400
60Gig 7200RPM HD
Radeon 9600M Turbo 128 (400/250)
Catalyst 4.2
Latest good read. [url]http://www.hardocp.com/article.html?art=NTc4LDE=[/url]
Hellbinder is offline   Reply With Quote
Old 11-13-03, 03:26 PM   #29
Hellbinder
 
Hellbinder's Avatar
 
Join Date: Nov 2002
Location: CDA
Posts: 1,510
Default

To start with, the accusation that all 3DMark did is disable Nvidia's "optimizer" is a flat out LIE. If Kyle had the least bit of understanding of how applications, D3D, drivers, and hardware work together, he would be spending his time pointing out what a damn liar Nvidia is again, instead of how evil 3DMark03 is.

The application talks to D3D, not to the hardware or drivers. D3D then makes its requests to the driver, which in turn talks to the hardware. The "optimizer" (which is a damn lie to start with) operates at the driver level. When Nvidia is using application detection, they are simply setting up certain conditions so that when certain requests are made, they are expecting them and can replace them at the driver level with whatever they want.

This is not and never has been a "Dynamic Shader Optimizer". That is a complete load of PR crap, just a term they use to cover up what they are really doing, which is the same thing they have been doing all along: replacing game and benchmark code as they see fit, simply to make their hardware work. In fact, Futuremark responded to the accusation yesterday. Apparently Kyle either did not read it or did not understand it.
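The layered call chain described above can be sketched in miniature: the app only ever sees the D3D layer, the runtime hands shader requests to the driver, and the driver is where an app-detection swap can hide. Every class, app name, and shader name here is invented for illustration; this is not real D3D or driver code.

```python
# App -> D3D runtime -> driver. A detection-based "optimizer" lives in the
# driver layer and silently substitutes shaders for recognized executables.

KNOWN_APPS = {"3dmark03.exe": {"ps_water": "ps_water_cheap"}}  # hypothetical

class Driver:
    def __init__(self, app_name):
        self.swaps = KNOWN_APPS.get(app_name, {})

    def set_shader(self, shader_name):
        # The swap is invisible to the app and to the D3D layer above us.
        return self.swaps.get(shader_name, shader_name)

class D3DRuntime:
    """The application only ever talks to this layer."""
    def __init__(self, driver):
        self.driver = driver

    def set_pixel_shader(self, shader_name):
        return self.driver.set_shader(shader_name)
```

Run under the detected app name, `set_pixel_shader("ps_water")` quietly returns the canned replacement; any other app gets its shader back untouched, which is why such gains don't carry over to new games.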

Here is another case where Kyle does not understand the fundamentals well enough to know what's going on. I used to think Kyle was a biased Nvidiot. What I have come to realize is that it's simply a case of him not having a clue and going with whoever's PR sounds best to him. Which usually ends up being Nvidia, because let's face it: they are good. So good, in fact, that nearly everyone has bought into and believes their "shader optimizer" crap. Yet again they blame 3DMark03 as the evildoer when it is simply trying to make everyone play fair. Something Nvidia cannot do.

1. There are IQ differences, even if they are hidden fairly well. They are still there. Thus it is NOT mathematically equivalent.

2. The point is that this test is supposed to be a true measure of baseline performance, one that demonstrates what you can expect from the card without six months of carefully planned and executed "optimizations".

3. For some unknown reason Kyle does not seem to understand that 3DMark03 simply uses basic D3D. It has nothing in it that favors ATI or Nvidia. Yet he keeps insinuating otherwise based on nothing more than something Nvidia is telling him. They are telling him whatever they have to in order to cover up or make excuses for what they are doing.
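The "mathematically equivalent" claim in point 1 is checkable: render the same frame on the reference path and on the "optimized" path and diff the pixels. A minimal sketch, with framebuffers represented as plain lists of RGB tuples (a stand-in for illustration, not a real capture API):

```python
# Compare two rendered frames channel by channel. A result of 0 means the
# outputs really are identical; anything else means "equivalent" is spin.

def max_channel_diff(frame_a, frame_b):
    """Return the largest per-channel difference between two RGB frames."""
    assert len(frame_a) == len(frame_b), "frames must be the same size"
    worst = 0
    for (r1, g1, b1), (r2, g2, b2) in zip(frame_a, frame_b):
        worst = max(worst, abs(r1 - r2), abs(g1 - g2), abs(b1 - b2))
    return worst
```

This is essentially what per-pixel diff screenshots in IQ comparisons do: small differences can hide from the eye in motion, but not from the subtraction.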

People seem to forget that new games like Max Payne 2 are performing badly on Nvidia's hardware. Why? Because they were not included in the six-month "optimization" session. This is why you need 3DMark03 and honest results. Nvidia is simply not going to be able to "optimize" for every single game that comes out; it's impossible. When they do "optimize", it will not appear until 3-6 months later, whenever they release their next driver.

The only exceptions to this will be when they hand-program the game for the developer, as in the case of STALKER.

Kyle, why don't you do the right thing for once in your career? Talk to all parties involved, hear what everyone has to say, and then write an informed editorial based on the truth and not Nvidia's lying PR.
__________________
Overam Mirage 4700
3.2ghz P4 HT
SIS 748FX Chipset 800mhz FSB
1Gig DDR-400
60Gig 7200RPM HD
Radeon 9600M Turbo 128 (400/250)
Catalyst 4.2
Latest good read. [url]http://www.hardocp.com/article.html?art=NTc4LDE=[/url]

Last edited by Hellbinder; 11-13-03 at 03:31 PM.
Hellbinder is offline   Reply With Quote
Old 11-13-03, 03:30 PM   #30
Sazar
Sayonara !!!
 
Sazar's Avatar
 
Join Date: Jan 2003
Location: Austin, Texas
Posts: 9,297
Default

:blink:

did you just repeat yourself there m8 ?
Sazar is offline   Reply With Quote
Old 11-13-03, 03:32 PM   #31
Hellbinder
 
Hellbinder's Avatar
 
Join Date: Nov 2002
Location: CDA
Posts: 1,510
Default

Quote:
Originally posted by Sazar
:blink:

did you just repeat yourself there m8 ?
Yeah I did, hehehehe.. This is a combination of 2 other messages from somewhere else.. I screwed up..

Hey.. At least I am not quoting myself and arguing about it heheheh
__________________
Overam Mirage 4700
3.2ghz P4 HT
SIS 748FX Chipset 800mhz FSB
1Gig DDR-400
60Gig 7200RPM HD
Radeon 9600M Turbo 128 (400/250)
Catalyst 4.2
Latest good read. [url]http://www.hardocp.com/article.html?art=NTc4LDE=[/url]
Hellbinder is offline   Reply With Quote
Old 11-13-03, 03:35 PM   #32
vampireuk
**** Holster
 
vampireuk's Avatar
 
Join Date: Mar 2001
Location: The armoury
Posts: 2,813
Send a message via AIM to vampireuk
Default

Ahh Sunlinux, that guy was a riot
__________________
I put children in microwaves.
vampireuk is offline   Reply With Quote

Old 11-13-03, 04:00 PM   #33
sxotty
Registered User
 
Join Date: Feb 2003
Posts: 522
Default

HB, you have mislabeled what Uttar said; he was very clear.

He said it DOES NOT disable the compiler, it only gets rid of the cheats.

He said the compiler IS better than it was before, and that if someone found some old drivers and ran them (w/o cheats), the new ones would go faster (w/o cheats); thus the compiler is accomplishing something.

He also clearly stated that the cheats are not part of the compiler, although PR would have you believe they are.
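The old-driver-vs-new-driver test described above can be sketched as a simple timing harness: run the same workload on each driver revision with cheats disabled and compare mean frame times. `render_frame` below is a placeholder callable standing in for one benchmark frame, not a real driver or D3D call.

```python
# Time an identical workload under two configurations; if the newer
# driver's mean frame time is lower with cheats off, the compiler itself
# is doing real work.
import time

def mean_frame_ms(render_frame, frames=100):
    """Average wall-clock milliseconds per call over a fixed frame count."""
    start = time.perf_counter()
    for _ in range(frames):
        render_frame()
    return (time.perf_counter() - start) * 1000.0 / frames
```

Comparing `mean_frame_ms(workload)` across driver installs is the whole experiment; the only subtlety is keeping the workload and settings identical between runs.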


edit: I think ATI should just run Nvidia's demos and say gee whiz, our card is faster... then Nvidia could not say ah, but we coded them so we would be at a disadvantage.
__________________
ATI x800
AMD 3000+ , nf4,WD Raptor,Antec 430
sxotty is offline   Reply With Quote
Old 11-13-03, 04:09 PM   #34
ginfest
"Oderint dum metuant"
 
Join Date: Jan 2003
Location: USA
Posts: 142
Default

Quote:
People seem to forget that new games like Max Payne 2 are performing badly on Nvidias hardware. Why? because they did not include it in their 6 month "optimization" session
HB, I agree with some of what you are saying but have to question the above:
I have a 5900 and a 9800, and yes, the 9800 runs Max Payne and my other games very well. I run all my games at 1280x960 4xAA/8xAF. I had to put the 5900 back in to get Call of Duty running and have since continued playing Max Payne without changing any settings. I run it with the console enabled to show FPS and haven't noticed a big difference. I don't have any exact numbers, but both cards run it at the above settings at 60 FPS or better.
Anyway, if you're talking IQ, I'm not sitting here while playing this game and the other games I play saying "...s**t, that looks crappy compared to my 9800.." and so on. Yes, I know that comparing individual frames will show better AA for the 9800, but I'm talking game experience. And no, I'm not saying that it doesn't count if you can't see it, just that the difference is not enough to ruin the game for me.
I suppose you could say it's just me - I wonder if others have run both cards recently and seen a noticeable difference, i.e. something that makes you say "WTF, I can't play this game like this.."?

My $0.02

Mike G
ginfest is offline   Reply With Quote
Old 11-13-03, 04:16 PM   #35
TheTaz
Registered User
 
Join Date: Jul 2002
Posts: 621
Default

Quote:
Originally posted by aapo
Mmkay, Sorry, but I just misunderstood and thought your point was technical. I suppose you are right about these political issues, but what can you do?
No prob. Kinda figured that's what you were thinkin'.

I seriously think Futuremark should come up with some other way of funding... hell, even spamming ads in the application dialog, and ads between the tests, like a P2P program, would be more desirable than knowing they take money from IHVs. (For the free version, obviously; pay version... no ads. Also no graphics chip/card ads.)

Don't get me wrong... I hate ad spamming. It was just a quick suggestion. Any type of funding they can come up with that wouldn't be "politically" considered a "possible bribe" would be OK with me.

Regards,

Taz
TheTaz is offline   Reply With Quote
Old 11-13-03, 04:24 PM   #36
bkswaney
Mr. Extreme!
 
bkswaney's Avatar
 
Join Date: Aug 2002
Location: SC
Posts: 3,421
Send a message via Yahoo to bkswaney
Smile

Quote:
Originally posted by NickSpolec
I especially love this quote from Derek Perez, in regards to the performance drop in 3DMark03 on the GFFX's (with the newest patch)



This just proves that Nvidia cares far more about benchmarks than actual games. How completely, utterly stupid do you have to be to make a comment like this? It was unintentional on Perez's part --- he was trying to make Nvidia the victim once again, a la "Oh, look what FutureMark is going to force us to do... Now we have to devote more resources to cheating because they keep disabling our cheats!"


Hey, Nvidia, here's an idea... STOP WORRYING ABOUT 3DMARK AND WORRY MORE ABOUT GAMES.

It is just so infuriating.

Nvidia's like an addict, and the 3DMark score is their crack.


Well, I see more and more of these kinds of posts now: STOP WORRYING ABOUT 3DMARK and worry more about games.
Well, how can they, when everyone and their brother is bitching about losing 15% on a benchmark?

I for one think we should use ONLY games to benchmark.
That would stop this, and the FMs of the world would die.

I did like Kyle's writeup.
PR aside.
__________________
Notebook!
Compaq Presario CQ60-215DX
AMD 64 Athlon X2 @ 2GHz (QL62)
15.6 inch HD WideScreen
Nvidia 8200M-G 895mb
2Gig system ram
250Gig SATA 5400rpm HDrive
Vista Premium
bkswaney is offline   Reply With Quote


Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright ©1998 - 2014, nV News.