nV News Forums > Hardware Forums > Benchmarking And Overclocking

Old 06-03-03, 05:43 PM   #289
StealthHawk
Guest
 
Posts: n/a
Default

Ok, time to take a deep breath and take a step back.

Someone answer this question, please. nvidia and ATI's driver optimizations for 3dmark03 are still invalid, are they not? Futuremark said yesterday that nvidia's optimizations weren't cheats, but also weren't allowed for 3dmark03.

Does this mean they will recall patch 330? I don't think so. So why does everyone say that nvidia is now free to cheat in 3dmark03? As I see it, that isn't the case at all.
Old 06-03-03, 06:02 PM   #290
Miester_V
Apprentice's Master
 
Join Date: May 2003
Location: U.S.A
Posts: 140
Default

Just for fun, I'd like to see ATi make a FutureMark-'optimized' driver set and use it to run the benchmarks. And when Nvidia sees that ATi used their own strategy against them and BEAT them, they can't cry 'CHEAT!' anymore. Hahah lol. What a predicament.
__________________
OS: Win XP Pro
CPU: AMD XP 2400+
Mobo: Epox 8K7a
PSU: Mortec 300w
Memory: Crucial 2100 512MB DDR
VGA: ATI Radeon 9800 Pro Retail
Hard Drive: Maxtor 40GB 7200rpm
DVD ROM: Pioneer 16X
CD-RW: Plextor 16/10/40
Sound Card: Sound Blaster Audigy
Speakers: Boston Acoustics 7800 4.1

------------------------------------------------
It is my DUTY to not purchase any Microsoft products.
Old 06-03-03, 06:36 PM   #291
Hellbinder
 
Hellbinder's Avatar
 
Join Date: Nov 2002
Location: CDA
Posts: 1,510
Default

Stealthhawk...

Here is what you don't understand.

If the extremes that Nvidia went to are only hardware *optimizations*, then it does not really matter whether Futuremark disapproves or not. It is literally open season to do whatever you can get away with until another patch comes out, then find another way to *optimize*, then another patch comes out... etc etc etc...

If there is no cheating, then it's up to IHVs and their fan bases and partners to decide what is legit or not. Which at this point means it's completely open season: anything goes. Because, frankly, 3dmark03 just became nothing more than another run-of-the-mill application, like any other game.

Catalyst Maker already posted a public statement that they are officially suspending all work on features etc. in drivers and will concentrate *exclusively* on 3dmark *optimizations*. He speculates that this will result in a 25% increase in performance. I saw another post in a private forum that the ATi D3D driver guys were jokingly putting together a driver that replaces every single texture in 3dmark with the ATi logo.

Now these guys are jesting a little, you understand, but it makes the point: if it's not a *cheat* and all these things are specific hardware *optimizations*, then anything goes, and who cares whether 3dmark approves of it or not.

Don't you see the ramifications?
__________________
Overam Mirage 4700
3.2ghz P4 HT
SIS 748FX Chipset 800mhz FSB
1Gig DDR-400
60Gig 7200RPM HD
Radeon 9600M Turbo 128 (400/250)
Catalyst 4.2
Latest good read. [url]http://www.hardocp.com/article.html?art=NTc4LDE=[/url]
Old 06-03-03, 09:11 PM   #292
muzz
 
muzz's Avatar
 
Join Date: Feb 2003
Posts: 816
Default

Quote:
Originally posted by Mulciber
that was blatant sarcasm
No sa.........
__________________
muzz
Old 06-03-03, 10:50 PM   #293
ImaginaryFiend
Registered User
 
Join Date: Jun 2003
Location: Oregon
Posts: 9
Default

Quote:
Even though I am completely aware that you looked a couple of patents up, let me tell you that many patents owned by Nvidia, ATi, Intel, IBM, etc. were acquired because they purchased the IP from, or purchased outright, the company holding the IP.
You're right that companies license patents from each other and they also buy rights to own and prosecute patents. Nvidia bought 3dfx's patents, for example. Microsoft bought SGI's graphics patents, too. But if you would bother to look the patents up, as I did, you would discover that every one I cited had entirely Nvidia employees as inventors. This ought to fit your criteria for specifying who invented it. If it's good enough evidence for the US government and the whole legal and business communities, it ought to be good enough for us.

Quote:
Originally posted by Hellbinder
Again, when a product is released has nothing to do with who invented the technology.

The Radeon had Pixel Shader 1.0 at the time of the GF2. Or were you even aware of that? Which by your logic makes ATi the inventors of pixel shader technology, and makes the Radeon always one step AHEAD of Nvidia in pixel shader generation.


"Pixel Shader 2.0" and beyond refers to programmable pixel processing, the new DX9 stuff that Nvidia and ATI each developed separately; both then worked with Microsoft to define the DX9 interface to it. "Pixel Shader 1.0" is a DirectX 8 term; before DX8 it was just called register combiners. Either way, PS 1.0 through 1.4 are essentially register combiners, an Nvidia-patented technology. David Kirk, Nvidia's chief scientist, is the principal inventor; all the other inventors are Nvidia employees.

Register combiners were first introduced in the Geforce 256 in August 1999. The Geforce 2 GTS came out in early May 2000 and the Radeon in May or June 2000. The GF2 made nearly no changes to the register combiners. The Radeon was nine months after the Geforce 256, the first machine to use register combiners, or "Pixel Shaders".
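For what it's worth, the core fixed operation a register combiner stage performed can be sketched in a few lines of Python. This is an illustrative simplification, not the full hardware model -- real NV combiner stages also offered input mappings, scale/bias, and a mux in place of the sum:

```python
def combiner_stage(a, b, c, d):
    """One register-combiner-style stage: computes A*B + C*D per channel,
    clamped to the [0, 1] range the fixed-point hardware worked in.
    (Sketch only; input remapping, scale/bias, and mux modes omitted.)"""
    return tuple(min(1.0, max(0.0, ai * bi + ci * di))
                 for ai, bi, ci, di in zip(a, b, c, d))

# e.g. modulate a base texture by vertex color and add a second (glow) map:
tex0  = (0.5, 0.5, 0.5)   # hypothetical sampled texel
color = (1.0, 0.2, 0.2)   # interpolated vertex color
tex1  = (0.3, 0.3, 0.3)   # second texture, added at full weight
one   = (1.0, 1.0, 1.0)
out = combiner_stage(tex0, color, tex1, one)
print(out)  # (0.8, 0.4, 0.4)
```

Chaining a couple of stages like this is essentially what the PS 1.x "shaders" expressed.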

Quote:
But that is not the case. Because pixel shaders, vertex shaders, cube maps, and nearly all 3D technologies were developed by groups of people in think tanks, at colleges, or at pure technology companies. T&L was first developed for a retail product by Rendition, who now makes RAM for Micron. S3 had T&L designs going long before Nvidia but never introduced them. PowerVR offered a full hardware geometry engine over a year before Nvidia in their arcade division.

The list goes on.

Don't you people get that I am not trying to say that it was ATi over Nvidia here???
First, we need to distinguish between T&L and programmable vertex processing. Both have existed for decades. Jim Clark, founder of SGI, published a Siggraph paper on a T&L engine (his Geometry Engine) back in 1982. SGI and other graphics hardware has had it forever. Some of those engines were even programmable, but only in microcode, by the manufacturer.

In the PC graphics era, several companies have had T&L on the same card as the rasterizer. 3D Labs might have been the first. Diamond was early also, with the FireGL 5000. Rendition was also in there, as you say. However, these need to be distinguished from what is currently done - having T&L on the same chip as the rasterizer, not just the same card. This was first done for PC graphics by Nvidia with the Geforce 256, hence its name (Ge for geometry). Nvidia patented having it on the same chip and all the inventors listed are Nvidia employees. Again, this was nine months before the Radeon. The S3 Savage 2000 was introduced between the Geforce 256 and the Radeon.

The next step is programmable vertex processing, which was developed entirely by Nvidia - with Microsoft, the OpenGL ARB, and ATI coming in later in the design to work on interfaces to define vertex programs. Please see the Siggraph 2001 paper on the programmable transform engine by Erik Lindholm, an Nvidia architect, and other Nvidia employees. Programmable vertex processing was also one of the patents I listed, with Erik Lindholm being the main inventor.

You're right that cube maps were not invented by Nvidia. I didn't say they were. They were first used in the early 80s in software renderers at universities and animation companies, as you say. I said they were first put in hardware by Nvidia. Again, this was in the Geforce 256 vs. Radeon era. If Nvidia hadn't put cube maps in their hardware, Microsoft wouldn't have exposed them in DX7, and ATI wouldn't have implemented them as soon. The original point still stands.

You have Nvidia to thank for several graphics technologies and your gaming life is better because of Nvidia. That doesn't mean they can do no wrong but I think it means that they will be worthy of respect in the future as they have been in the past.
Old 06-03-03, 11:04 PM   #294
digitalwanderer
 
digitalwanderer's Avatar
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Default

Quote:
Originally posted by Hellbinder
Stealthhawk...

Here is what you don't understand.

If the extremes that Nvidia went to are only hardware *optimizations*, then it does not really matter whether Futuremark disapproves or not. It is literally open season to do whatever you can get away with until another patch comes out, then find another way to *optimize*, then another patch comes out... etc etc etc...

If there is no cheating, then it's up to IHVs and their fan bases and partners to decide what is legit or not. Which at this point means it's completely open season: anything goes. Because, frankly, 3dmark03 just became nothing more than another run-of-the-mill application, like any other game.

Catalyst Maker already posted a public statement that they are officially suspending all work on features etc. in drivers and will concentrate *exclusively* on 3dmark *optimizations*. He speculates that this will result in a 25% increase in performance. I saw another post in a private forum that the ATi D3D driver guys were jokingly putting together a driver that replaces every single texture in 3dmark with the ATi logo.

Now these guys are jesting a little, you understand, but it makes the point: if it's not a *cheat* and all these things are specific hardware *optimizations*, then anything goes, and who cares whether 3dmark approves of it or not.

Don't you see the ramifications?
Ok, I GOTTA get a copy of the 3dm2k3 with the ATi logo replacement thingy...too much screenshot fun!

Quote:
Originally posted by ImaginaryFiend
You have Nvidia to thank for several graphics technologies and your gaming life is better because of Nvidia. That doesn't mean they can do no wrong but I think it means that they will be worthy of respect in the future as they have been in the past.
They are going to have to earn that respect, the same way they've earned the community's scorn and disgust.
__________________
[SIZE=1][I]"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy[/I] [/SIZE]
Old 06-03-03, 11:07 PM   #295
Zenikase
Registered User
 
Join Date: May 2003
Posts: 99
Default

Correct me if I'm wrong, but aren't they called "pixel shaders" for their programmability? The major difference between PS1.4-type and PS2.0/3.0-type shaders is that the latter are capable of floating-point precision and branching/conditional code support. Essentially, that's it. The pixel shader found in NV2x isn't fixed-function (like register combiners), but it doesn't offer the flexibility of its successor.
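A rough Python sketch of why the floating-point part matters. The 8-bit [0, 1] quantization below is an assumed stand-in for PS1.x-era register precision, not the exact hardware format:

```python
def fx8(x):
    """Quantize to unsigned 8-bit fixed point in [0, 1] -- an assumed
    stand-in for PS1.x-era register precision (not the exact HW format)."""
    x = min(1.0, max(0.0, x))
    return int(x * 255 + 0.5) / 255

# Scale a value down, then back up -- the kind of intermediate math a
# multi-stage effect does. The low-precision path loses information:
v = 0.5
fixed = fx8(fx8(v * 0.1) * 10.0)   # quantized intermediate
fp    = (v * 0.1) * 10.0           # full float intermediate

print(fixed)  # ~0.5098 -- roughly 2% off, from the quantized intermediate
print(fp)     # 0.5
```

With float registers the intermediate survives intact, which is exactly what PS2.0-class hardware buys you.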
Old 06-04-03, 12:26 AM   #296
SmuvMoney
Registered User
 
SmuvMoney's Avatar
 
Join Date: Feb 2003
Location: Chicago, IL
Posts: 147
Default Re: Re: 06/02/2003 - The Day The Benchmark Died...

Quote:
Originally posted by frenchy2k1
Actually, this is probably not such a bad thing. Now, instead of just spending our days running useless benchmarks to compete for bragging rights, we could return to playing games...
OT, but I have never actually run or installed 3DMark03. I stopped running 3DMark01 sometime in early 2002. That being said, let me get back on topic...

I think there is a place for both syn(thetic) and game benchmarking. Syn benching can be used to show theoretical potential - PS 2.0 shaders and such. Meanwhile, game benching shows a closer correlation to actual potential/ability based on the game engine. Both are needed to assess a card's ability.

However, how can I be sure that nVidia (or, to be fair, ATi, SiS, Matrox, etc.) won't take some of these "less than noble" application optimizations and use them in game benchmarks? I welcome the valid application optimizations - it's the invalid ones (a.k.a. cheating, but nVidia didn't see me type that) that concern me. Worse, if you can strongarm your way into getting away with it on a syn benchmark, what stops it from occurring on a game benchmark? What happens when the potential shown in a benchmark proves to be a fallacy once you start the game? Heck, how do you know if the game benchmark you run is even truly valid anymore? At least FM had dev tools to make sure things were legit, which is how nVidia got into this mess in the first place. Will games be able to have this same type of functionality? Do we want to burden the game devs as such? They have enough deadlines and resource management issues to deal with trying to create the game - a game engine benchmark suite could be considered superfluous if push comes to shove.
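The kind of legitimacy check FM's dev tools made possible - rendering the same scene off the canned camera path and comparing - can be sketched roughly like this. The function, scores, and threshold are all made up for illustration; this is not FM's actual method:

```python
def looks_benchmark_specific(fps_canned, fps_perturbed, tolerance=0.10):
    """Heuristic sketch: a driver that special-cases the benchmark's fixed
    camera path (e.g. hardwired clip planes) loses its advantage the moment
    the camera is nudged off that path. Flag the result if the canned run
    is suspiciously faster than the perturbed one."""
    return fps_canned > fps_perturbed * (1.0 + tolerance)

# Hypothetical scores: canned path vs. the same scene with the camera nudged.
print(looks_benchmark_specific(52.0, 51.0))  # False: within normal variance
print(looks_benchmark_specific(52.0, 38.0))  # True: advantage vanished off-rails
```

A game would need a free-camera or perturbed-run mode to support the same kind of check, which is exactly the burden on devs being worried about above.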

Maybe I should have changed the title to "The Day The Objective Benchmark Died..."

Quote:

Seriously, as stated in the PR, all the other benches are using processor-specific optimizations. The graphics chips are getting closer and closer to general-purpose processors, with shaders used instead of fixed processing. Intel wins in most benchmarks because they are using their SSE2 extensions; do you see AMD complain? They won in FP before thanks to their architecture. Graphics is just becoming the same way.
My concern here is the potential lack of objective benchmarking for GPUs. I do agree with you that different CPUs and GPUs use their own methods and optimizations to achieve the same thing. However, that doesn't eliminate the need to bench against a standard - be it an API, a set workload, a set/minimum IQ, etc. If you optimize in a way that violates the standard you're testing against (reduced IQ/precision, hardwired clipping planes), then the optimization isn't valid, i.e., the "ch-" word. Optimization that stays within the rules and standards is valid.

Quote:

And don't talk about 3DFX. They were the first to optimize this way: does nobody remember the miniGL drivers? What do you think those were?
Well, nVidia is trying to push their own initiative/standard that benefits them the most, instead of a third-party, industry-wide standard - what is the difference between that and what 3dfx did? Not to talk about 3dfx, but you brought them up. In fact, wasn't it nVidia that helped prevent that from happening the first time? It seems that nVidia is following the road that 3dfx attempted to pave not so long ago - just using different terms...
I'd rather all IHVs optimize their cards to an external standard, not one IHV's standard.

Quote:

After all, if nvidia spends time to optimize their drivers on popular games, more power to them. Sure, they look better at UT2k3 Flyby, but the game benefits also. 3Dmark is only here for bragging rights! (and so is Quake3 now, with its 300+ fps...)
See my point above for why this has the potential to backfire... I agree with the principle, but it is the execution that determines whether or not this is beneficial to the user playing the game. If said game benchmark runs at 100 FPS with tons of stuff onscreen (enemies and such) in a given spot, but standing in the same spot on an empty map yields one quarter that framerate at the same settings, something is gravely amiss.

All in all, I see the potential for this to be a bad thing for the industry as a whole. All standard, objective benchmarks - game and syn alike - are suspect for invalid optimization wherever benchmark performance is a higher priority than in-game engine performance.

So much for trying to keep this short...my bad...
__________________
Peace & God Bless,

$muvMoney
John 14:27 & Numbers 6:24

A64 3500+ 939 | Epox 9NDA3+ | 1 GB RAM | X800XT PE | Audigy2 ZS | 120 GB HD
3com 3C905C-TX | USR 56K | Thermaltake Tsumani | Silent Purepower 410W PSU

Old 06-04-03, 01:16 AM   #297
StealthHawk
Guest
 
Posts: n/a
Default

Quote:
Originally posted by Hellbinder
Stealthhawk...

Here is what you don't understand.

If the extremes that Nvidia went to are only hardware *optimizations*, then it does not really matter whether Futuremark disapproves or not. It is literally open season to do whatever you can get away with until another patch comes out, then find another way to *optimize*, then another patch comes out... etc etc etc...

If there is no cheating, then it's up to IHVs and their fan bases and partners to decide what is legit or not. Which at this point means it's completely open season: anything goes. Because, frankly, 3dmark03 just became nothing more than another run-of-the-mill application, like any other game.

Catalyst Maker already posted a public statement that they are officially suspending all work on features etc. in drivers and will concentrate *exclusively* on 3dmark *optimizations*. He speculates that this will result in a 25% increase in performance. I saw another post in a private forum that the ATi D3D driver guys were jokingly putting together a driver that replaces every single texture in 3dmark with the ATi logo.

Now these guys are jesting a little, you understand, but it makes the point: if it's not a *cheat* and all these things are specific hardware *optimizations*, then anything goes, and who cares whether 3dmark approves of it or not.

Don't you see the ramifications?
This is only dangerous to people who don't read between the lines. Granted, there will be a lot who don't. I don't care, let the sheep believe what they will.

If 3dmark says with one hand that nvidia is not cheating, but says with the other hand that their optimizations are not allowed, then it obviously means they are cheating. I don't think that needs to be spelled out for anyone to understand.

How would it be different if Futuremark still alleged nvidia was cheating? nvidia could still find ways to cheat around 3dmark03 patches. nvidia had a surprising amount of support around here through this whole debacle. Do you honestly think that more people would care if nvidia did it repeatedly? Of course, they already have.
Old 06-04-03, 02:11 AM   #298
SlyBoots
Registered User
 
SlyBoots's Avatar
 
Join Date: Jul 2002
Location: La Grande, OR
Posts: 339
Lightbulb Good read

http://www.limerick-leader.ie/issues/20020323/box.html

I like the last paragraph>

"At the time a leading Wall Street Banker said: "We must shift America from a 'needs' to a 'desires' culture. People must be trained to desire; to want new things even before the old has been entirely consumed. Desires must overshadow needs".

They have succeeded.
Old 06-04-03, 02:28 AM   #299
solofly
Registered User
 
Join Date: Jan 2003
Posts: 213
Default

In case you guys didn't know, this thread was linked from within rage3d forums found here...

http://www.rage3d.com/board/showthre...0&pagenumber=1

PS
Had a good laugh reading that thread btw. Rage3d is the biggest joke of a forum on the Net, in my opinion; in other words, a waste of time. Sites like that give ATI a bad name...

Last edited by solofly; 06-04-03 at 04:31 AM.
Old 06-04-03, 05:06 AM   #300
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
Originally posted by solofly
In case you guys didn't know, this thread was linked from within rage3d forums found here...

http://www.rage3d.com/board/showthre...0&pagenumber=1

PS
Had a good laugh reading that thread btw. Rage3d is the biggest joke of a forum on the Net, in my opinion; in other words, a waste of time. Sites like that give ATI a bad name...

Hey now. No need to insult the rage3d community. I think some of the people there are overzealous, but as a whole they aren't all that bad.


Just because someone owns an ATI card does not make them an ATI zealot. Just because someone owns an Nvidia card doesn't make them an Nvidia zealot.

We're all people here.
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision\/Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members



Copyright 1998 - 2014, nV News.