Old 06-03-03, 12:18 PM   #253
dsessom
Boobs are good.
 
dsessom's Avatar
 
Join Date: Feb 2003
Location: Lawton, OK USA
Posts: 134
Default

With all the allegations flying, it's hard to tell exactly what is truth and what is not, but here are a few thoughts...

nVidia was involved in 3DMark03's development almost right up to the finish, but they have never backed it since its public release - for whatever reason, probably because their FX 5800 was getting lower scores than the 9700 Pro. But I digress...

I think Futuremark is trying their best to put out a reliable, respectable product, but they have gotten mired in "application optimization" accusations from both ATI and nVidia, because both companies want the fastest product and they are pretty much neck and neck right now.

So, here is the real question:
Are nVidia's and ATI's "optimizations" so dramatic that you consider them cheating? Or are they darned ingenious pieces of code that have the potential to be used in the real world? I.e.:
if nVidia's drivers can "optimize" for 3DMark03.exe, then surely they could do the same for any number of specific games that would benefit.
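
For illustration, here is a minimal sketch of what that kind of per-application detection could look like inside a driver. Everything here is hypothetical - the profile names and the detection function are made up, not taken from any actual nVidia or ATI driver code - it just shows the technique being debated: keying behavior off the host executable's name.

Code:
// Hypothetical sketch of per-application detection in a display driver.
// None of these names come from real driver code; the point is only to
// illustrate keying optimizations off the process (executable) name.
#include <cctype>
#include <string>
#include <windows.h>

enum class Profile { Default, Benchmark3DMark03, GameQuake3 };

Profile DetectProfile() {
    char path[MAX_PATH] = {};
    // Full path of the process that loaded the driver DLL (game or benchmark).
    GetModuleFileNameA(nullptr, path, MAX_PATH);
    std::string exe(path);

    // Keep only the file name, lower-cased, for comparison.
    const std::string::size_type slash = exe.find_last_of("\\/");
    if (slash != std::string::npos) exe = exe.substr(slash + 1);
    for (char& c : exe)
        c = static_cast<char>(std::tolower(static_cast<unsigned char>(c)));

    if (exe == "3dmark03.exe") return Profile::Benchmark3DMark03;
    if (exe == "quake3.exe")   return Profile::GameQuake3;
    return Profile::Default;
}

The catch, of course, is that this same mechanism serves legitimate per-game profiles and benchmark-only cheats alike - which is exactly why renaming the executable (as reviewers did in the Quake3/Quack affair) makes the "optimizations" disappear.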

I don't really think Futuremark did anything to deserve all the slamming they have gotten. From my point of view, ATI and nVidia are the ones to blame, if anyone is - but that just goes back to the paragraph above.

So, am I just way out in left field? Or perhaps just too optimistic??
__________________
Main PC - | Pentium5 @ 6.4ghz (400x16) | Vapochill XC (XtraChilly) | Abit ICU812 v2.0 | 8GB (4x2GB) Corsair XMS PC12800 @ 2-3-3-5-1T | XFX Geforce 9850 Super Ultra GTX X4 | 4x1TB WD SATA3 RAID 0 | Sony DLQ32 DVD+/-RW | ThermalTake Chameleon case w/thermal sensitive paint. & 1000W PSU | Windows Valley SemiPro w/SP1.25
dsessom is offline   Reply With Quote
Old 06-03-03, 12:22 PM   #254
Dazz
"TOON ARMY!"
 
Join Date: Jul 2002
Location: Newcastle, United Kingdom
Posts: 5,138
Default

This got me.......
Quote:
Translation via HardOCP: FutureMark reneges on previous statements and confirms NVIDIA was not cheating on their benchmark, and NVIDIA will not take legal action against FutureMark that would bankrupt them. All about the $, but that is just our opinion. Still, this does not change our thoughts on FutureMark and their benchmark. Editorial coming...
Looks like a Microshaft kind of company - win at any cost.
__________________
"Never interupt your enemy when he is making a mistake."

Processor: AMD Phenom II X6 1090T Black Edition @ 4.25GHz
Motherboard: Gigabyte GA-990FXA-UD3
Graphics: ASUS ENGTX470
Memory: 4GB Kingston HyperX Blu PC12800 DDR3
Monitor: LG E2260V-PN Full HD WLED 22" & DELL 20" 2005FPW,
Power: Coolermaster Silent Pro Modular 850w PSU
Sound: Logitech Z5500 Digital.
Cooling: Thermalright Silver Arrow.
1st Storage: Kingston V100 SSDNow128GB SSD
2nd Storage: Samsung Spinpoint F1 750GB
Dazz is offline   Reply With Quote
Old 06-03-03, 12:34 PM   #255
Solomon
 
Join Date: Sep 2002
Location: In a House
Posts: 502
Default Re: Re: Re: Re: Re: Re: Want to let Futuremark know you're displeased?

Quote:
Originally posted by digitalwanderer
I wouldn't even waste your time arguing with him. Either he's a totally brainwashed fool who actually believes nVidia's version of reality or he's just looking to provoke a reaction. Whichever it is, I'm a little tired of wasting my time on the clueless and the attention seekers.

This whole thing just really sucks and makes me feel all dirty about something I love, very unhip.
Yeah, I came to that conclusion too. Maybe he's Derek Perez in disguise? LOL

Regards,
D. Solomon Jr.
*********.com
Solomon is offline   Reply With Quote
Old 06-03-03, 12:37 PM   #256
Solomon
 
Join Date: Sep 2002
Location: In a House
Posts: 502
Default

Quote:
Originally posted by dsessom
So, here is the real question:
Are nVidia's and ATI's "optimizations" so dramatic that you consider them cheating? Or are they darned ingenious pieces of code that have the potential to be used in the real world? I.e.:
if nVidia's drivers can "optimize" for 3DMark03.exe, then surely they could do the same for any number of specific games that would benefit.
I would consider it cheating if they don't follow the guidelines for optimizing their code. The clip plane was the nail in the coffin. Like I said a while ago, if they want to optimize their drivers, that's fine, but when it comes to missing graphics and all this mess, then it's cheating. The "Cheating Drivers Hidden Dragon" picture sums up everything.
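
For anyone who hasn't followed the clip-plane story: the accusation was that the driver inserted static clip planes matched to 3DMark03's fixed camera path, so geometry the camera never sees is never drawn. Here's a rough, hypothetical sketch of the idea in Direct3D 9 terms - the plane coefficients are invented; a real cheat would tune them to the known camera path.

Code:
// Hypothetical illustration of the static clip-plane trick in D3D9.
// Geometry on the negative side of the plane is culled before rasterization,
// which is only "safe" if the camera path is known in advance - exactly why
// the developer build's free-look camera exposed missing chunks of the scene.
#include <d3d9.h>

void InsertStaticClipPlane(IDirect3DDevice9* dev) {
    // Plane ax + by + cz + dw = 0; points giving a negative result are clipped.
    // These coefficients are made up for illustration only.
    const float plane[4] = { 0.0f, 0.0f, 1.0f, -50.0f };
    dev->SetClipPlane(0, plane);
    dev->SetRenderState(D3DRS_CLIPPLANEENABLE, D3DCLIPPLANE0);
}

The reported back-buffer trick is the same idea: skip the clear because the benchmark's known frames happen to overwrite every pixel anyway. Move the camera off the rails and both "optimizations" fall apart, which is what the screenshots showed.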

Regards,
D. Solomon Jr.
*********.com
Solomon is offline   Reply With Quote
Old 06-03-03, 12:44 PM   #257
Hellbinder
 
Hellbinder's Avatar
 
Join Date: Nov 2002
Location: CDA
Posts: 1,510
Default

Well then, ATI did not cheat either. Furthermore, it is now open season on doing whatever you want to 3DMark03.

Question..

Didn't B3D prove that they used lower-than-DX9-standard rendering to achieve some of their speed boost? How can Futuremark say that using FX12, etc., in a DX9 benchmark is legit?

I just don't understand this.
__________________
Overam Mirage 4700
3.2ghz P4 HT
SIS 748FX Chipset 800mhz FSB
1Gig DDR-400
60Gig 7200RPM HD
Radeon 9600M Turbo 128 (400/250)
Catalyst 4.2
Latest good read: [url]http://www.hardocp.com/article.html?art=NTc4LDE=[/url]
Hellbinder is offline   Reply With Quote
Old 06-03-03, 12:49 PM   #258
MrNasty
MrNasty
 
Join Date: Dec 2002
Location: Deli
Posts: 213
Default

Futuremark are sellouts. Nuff said.
__________________
You are not your system specs.
MrNasty is offline   Reply With Quote
Old 06-03-03, 01:03 PM   #259
Morrow
Atari STE 4-bit color
 
Morrow's Avatar
 
Join Date: May 2003
Posts: 798
Default Re: Re: Re: Re: Re: Want to let Futuremark know you're displeased?

Quote:
Originally posted by Solomon
So from this you are saying you'd rather have an nVidia-optimized benchmark than a standard benchmark? There is a reason why it's a standard benchmark. nVidia can optimize for the benchmark - hell, no one is saying they can't. What it's all about is that they can't cheat. The "Cheating Drivers Hidden Dragon" picture says it all. The clip plane says it all.

What I'm getting from you is that it's OK to have all that not show and still be legit? You've got to be kidding me.
No, you still don't get me. Let me put it this way: I don't like any benchmark from any company whatsoever. My opinion is that benchmarks are only for bragging rights, which seem quite popular these days.

Regarding decreased IQ in 3DMark03: you didn't understand my previous post. I said that nvidia probably did this cheating/optimization deliberately to show the weaknesses of benchmarks (the clip plane, or not clearing the back-buffer, for example). Those "optimizations" are hopefully not the kind nvidia will now apply to future games or the next 3DMark. The clip plane was just one way for nvidia to demonstrate how easy it is to manipulate benchmark results.

The optimizations done in games, however, are a different thing. Ever since the first Q2 nv optimizations, we have known that you get a speed increase from them WITHOUT an IQ decrease, and exactly those optimizations are what separates 3DMark from "real-world" performance. Did I finally make myself clear?

I do not endorse cheating in games where IQ is concerned, nor do I blindly believe nvidia on anything, but I believe nvidia succeeded in making it pretty clear that optimizations matter these days and that the graphics card industry has changed quite a bit. If you want a "fair" benchmark, you need to put the tested hardware in the best light possible, not in some odd position it was never designed for!

Again, I'm not talking about the "optimizations" nvidia did in 3DMark03 or ATI did in Quake 3 - people should get over those by now - but about real optimizations like the ones that have gone into every important game of the last 10 years.

Let's bury Quack, 3DMark, and benchmarks in general, and move on to the real thing that matters most, and hopefully the reason people buy their shiny new graphics cards: games and applications like 3ds max.
Morrow is offline   Reply With Quote
Old 06-03-03, 01:11 PM   #260
DivotMaker
 
Join Date: Jul 2002
Posts: 823
Default

I guess I have a problem with FM's position on this in that they will now allow IHV-specific optimizations.

Last time I checked, the whole purpose of creating a benchmark is to compare products on a level playing field. The problem is establishing that "level playing field". It is likely never going to happen, because the IHVs insist on differentiating their products; any time the benchmark does not favor the "specific features" that differentiate a product, that company is going to scream about it. The point I am making is in reference to nVidia's decision to use FP32 instead of FP24, the DX9 standard, in the high-end shaders.

If I understand correctly, the FP24 standard for DX9 was established before or during design work on the GeForce FX 5800. If so, I am not certain why nVidia would not change this for the 5900, unless it is essentially similar silicon and changing it at this stage of production (Feb 03 til now) would have meant re-taping a new chip. I think it is likely that nVidia underestimated the horsepower needed to compete at FP32.

I guess what I am trying to say is that unless companies are REQUIRED to adhere to SPECIFIC requirements for identically classed cards through DirectX and/or OpenGL, then I am afraid you will see more incidents like this one, which are not helping a "less-than-healthy" PC gaming industry. To my knowledge, ATI has complied with the DX9 standard; nVidia has not, at least from the shader perspective.

I am very pleased that ATI and nVidia are competing and trying to give us more features with every hardware release. However, unless each of them is forced to adhere to specific features with which to compare equally on a level playing field, MEANINGFUL, LEGITIMATE, and ACCURATE benchmarking has a very long, tough road ahead.
DivotMaker is offline   Reply With Quote

Old 06-03-03, 01:26 PM   #261
Hellbinder
 
Hellbinder's Avatar
 
Join Date: Nov 2002
Location: CDA
Posts: 1,510
Default

Quote:
ATI has complied to the DX9 standard. nVidia has not at least from the shader perspective.
No, Nvidia has complied. Their hardware is 100% DX9-compliant in every way. The problem is that they made some rather poor design choices in light of raw performance. They can run DX9-compliant all they want, but they did not include the hardware under the hood to compete at full DX9 precision.

I want to point out again that FP32 is not any harder to do than FP24 *if* you have the hardware in place to handle the load.

Thus Nvidia could run full DX9 all day long, but if they do it all by the book - no *cheating*, *hacking*, *reducing quality*, *shortcuts*, etc. - they will get spanked.
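
To put rough numbers on that precision gap, here is a small sketch comparing the formats everyone keeps arguing about. The FP32/FP16 layouts are the standard ones and FP24 is ATI's R300 format as reported; the FX12 fixed-point details are from memory, so treat those as an assumption.

Code:
// Approximate relative precision (one representable step) of the pixel
// shader formats in this debate. The FX12 layout is an assumption on my
// part: 12-bit fixed point, ~[-2, 2) range, i.e. about 10 fractional bits.
#include <cmath>
#include <cstdio>

int main() {
    struct Fmt { const char* name; int stepBits; const char* note; };
    const Fmt fmts[] = {
        { "FP32", 23, "1 sign / 8 exp / 23 mantissa (what NV3x pays full price for)" },
        { "FP24", 16, "1 sign / 7 exp / 16 mantissa (DX9 full precision, R300)" },
        { "FP16", 10, "1 sign / 5 exp / 10 mantissa (DX9 partial precision)" },
        { "FX12", 10, "12-bit fixed point, ~[-2, 2) range (below PS 2.0 spec)" },
    };
    for (const Fmt& f : fmts)
        std::printf("%s: smallest step ~ %.1e  %s\n",
                    f.name, std::pow(2.0, -f.stepBits), f.note);
    return 0;
}

That 2^-23 versus 2^-16 step is why "FP32 is no harder than FP24" only holds if the silicon has the units to push it; when it doesn't, the pressure is to drop to FP16 or FX12.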

This is really a temporary issue for them, though, as I am pretty confident the NV40 will have full power to the phaser banks, so to speak.
__________________
Overam Mirage 4700
3.2ghz P4 HT
SIS 748FX Chipset 800mhz FSB
1Gig DDR-400
60Gig 7200RPM HD
Radeon 9600M Turbo 128 (400/250)
Catalyst 4.2
Latest good read: [url]http://www.hardocp.com/article.html?art=NTc4LDE=[/url]
Hellbinder is offline   Reply With Quote
Old 06-03-03, 01:41 PM   #262
gordon151
Registered User
 
Join Date: Mar 2003
Posts: 264
Default

Hellbinder's comments making sense and offering unbiased, useful information. Hrm, that's not good...
__________________
||- A64 2800+ @ 2.0GHz -||- 1GB Buffalo DDR400 @ 225Mhz -||- PNY 5700 128MB -||
gordon151 is offline   Reply With Quote
Old 06-03-03, 01:44 PM   #263
Solomon
 
Join Date: Sep 2002
Location: In a House
Posts: 502
Default

Quote:
Originally posted by BigBerthaEA
I guess I have a problem with FM's position on this in that they will now allow IHV-specific optimizations.

Last time I checked, the whole purpose of creating a benchmark is to compare products on a level playing field. The problem is establishing that "level playing field". It is likely never going to happen, because the IHVs insist on differentiating their products; any time the benchmark does not favor the "specific features" that differentiate a product, that company is going to scream about it. The point I am making is in reference to nVidia's decision to use FP32 instead of FP24, the DX9 standard, in the high-end shaders.

If I understand correctly, the FP24 standard for DX9 was established before or during design work on the GeForce FX 5800. If so, I am not certain why nVidia would not change this for the 5900, unless it is essentially similar silicon and changing it at this stage of production (Feb 03 til now) would have meant re-taping a new chip. I think it is likely that nVidia underestimated the horsepower needed to compete at FP32.

I guess what I am trying to say is that unless companies are REQUIRED to adhere to SPECIFIC requirements for identically classed cards through DirectX and/or OpenGL, then I am afraid you will see more incidents like this one, which are not helping a "less-than-healthy" PC gaming industry. To my knowledge, ATI has complied with the DX9 standard; nVidia has not, at least from the shader perspective.

I am very pleased that ATI and nVidia are competing and trying to give us more features with every hardware release. However, unless each of them is forced to adhere to specific features with which to compare equally on a level playing field, MEANINGFUL, LEGITIMATE, and ACCURATE benchmarking has a very long, tough road ahead.
Well said...

Regards,
D. Solomon Jr.
*********.com
Solomon is offline   Reply With Quote
Old 06-03-03, 01:51 PM   #264
digitalwanderer
 
digitalwanderer's Avatar
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Thumbs up OOooooh!!! A Jeopardy question!

Quote:
Originally posted by gordon151
Hellbinder's comments making sense and offering unbiased, useful information.
I know!

"How can you tell when Hellbinder is REALLY pissed off?"
__________________
[SIZE=1][I]"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy[/I] [/SIZE]
digitalwanderer is offline   Reply With Quote