
View Full Version : Microprocessor Report awards GeForce FX as "Best Graphics Processor" of 2002



CherryPopper
02-18-03, 11:23 PM
If I didn't know they were serious it would be almost laughable... I mean jeez... how can they possibly say something like that and keep a straight face... :p

I wonder if it's a coincidence that In-Stat/MDR, the publisher of the Microprocessor Report, is located in San Jose, just a little stone's throw from Nvidia's front doors :angel:

kyleb
02-19-03, 12:00 AM
lol that is some funny stuff, it probably would have got microprocessor of the decade if it actually came out. ;)

Sazar
02-19-03, 12:08 AM
well it's highly possible the award was 'bought' :)

or perhaps the lads who did the award giving were @ the conference call where it was stated Nvidia had the most powerful desktop GPU ON THE MARKET

Lezmaka
02-19-03, 12:10 AM
I think it was the award for most delayed graphics processor of the year. Pretty sure Clawhammer got an award too, for most delayed CPU of the year.

AngelGraves13
02-19-03, 03:48 AM
first of all guys... give me the name of a card that is more programmable than the NV30?? None... the award goes to the NV30! That simple!

kyleb
02-19-03, 04:10 AM
well now give us an NV30 and we will see if you are right about that 99 to Life ;)

ChrisW
02-19-03, 04:23 AM
I'm sorry, but how can a card that is not even released today win an award for the previous year? And where did they get their information about the card from? Tom's Hardware? What a joke. And check out the front page at nVidia.com. LOL! You can expect that award to be on the front of the GFFX box.

Myrmecophagavir
02-19-03, 06:48 AM
It wasn't even out in 2002, not to mention the R300, which I'd say was the biggest and best surprise of 2002. ATI should blatantly have won; there must be some backhander going on here.

Smokey
02-19-03, 07:47 AM
This is just the same as the Xbox winning best console before it was even released. I don't even think it was a final build either?

Megatron
02-19-03, 08:07 AM
LOL... Best Graphics Processor of 2002?

It's 2003 and I still can't find one for sale. What a joke... I'm sure that is an award coveted by the industry.
:rolleyes:

Nv40
02-19-03, 11:46 AM
Originally posted by Megatron
LOL... Best Graphics Processor of 2002?

It's 2003 and I still can't find one for sale. What a joke... I'm sure that is an award coveted by the industry.
:rolleyes:


the Athlon 64 won the best processor award.. and it's not available yet ;)

the award is for the best technology (with the most future),
something the NV30 *IS*: the first card in the world with 128-bit
precision and pixel shader instructions long enough
for any computer graphics studio movie production..

and Nvidia demonstrated the NV30 at Comdex, running cinematic-quality demos in real time..

silence
02-19-03, 12:17 PM
Originally posted by Nv40
the Athlon 64 won the best processor award.. and it's not available yet ;)

the award is for the best technology (with the most future),
something the NV30 *IS*: the first card in the world with 128-bit
precision and pixel shader instructions long enough
for any computer graphics studio movie production..

and Nvidia demonstrated the NV30 at Comdex, running cinematic-quality demos in real time..

as much as i like nvidia... god knows what we saw back then... there is not a single card out there on which you could run those demos in real time to see if they are really what they are supposed to be...

wasn't the GF3 supposed to render scenes from Final Fantasy in real time... and afterwards it was discovered that they used 1/10th of the polygons?...

tech demos are nice... and i was very impressed by them... but before we see retail cards doing something like that i'll doubt they are what nvidia claims...

way TOO MUCH BS coming lately from nv...

as much as i like nv cards i think that the R300 is THE BEST vid card of 2002... i mean >> show me ANYBODY but nvidia itself having at least one NV30 during 2002...

Sazar
02-19-03, 12:46 PM
ok guys I was kidding round in my first post... did not expect this thread to take the direction it has...

let me quote something...

San Jose, Calif., February 14, 2003 - In-Stat/MDR, publisher of the Microprocessor Report, announced, yesterday evening, the winners of its fourth annual Analysts' Choice Awards, honoring the best new microprocessor chips and the most promising new microprocessor technology unveiled in 2002. The award winners were selected by In-Stat/MDR's technology analysts, the team behind the internationally recognized industry newsletter, Microprocessor Report and the annual Microprocessor Forum and Embedded Processor Forum

therefore it is highly possible the GPU won the award as a PROMISING technology...

nv40... I agree with your comparison to the Athlon 64.. but the 64 has been shown to work with real 64-bit apps throughout the year... it has also yet to be LAUNCHED imo... :) basically just demoed... the official release date seems to be Q3 '03... AFAIK...

concerning the demos... you have to understand the demos are written exclusively for nvidia by their design team... as are ATI's and other companies' demos...

the GPUs today do not have the power to do the things (that theoretically they can do) in real time... the demos are not an example of real-world performance.. heck look @ the bear demo from ATI and the ogre demo from nvidia... they are both amazing.. dawn gets more press because she is almost nekid but I was more impressed with the ogre.. but I can't see either company's top-of-the-line GPU pushing that in a real-world game...

Megatron
02-19-03, 12:53 PM
Originally posted by Nv40

the award is for the best technology (with the most future),
something the NV30 *IS*,

The only thing the NV30 *IS*... is... absent.



"the first card in the world with 128-bit
precision and pixel shader instructions long enough
for any computer graphics studio movie production.."


And why, pray tell, would the world care which card was the first with 128-bit precision????
Like anyone can tell the difference between the Radeon's 96-bit and Nvidia's 128-bit.

Oooh that's right... a higher number means it's better. Atari tried to run that campaign when they launched their Jaguar system.. lol... didn't work for them either.

Sazar
02-19-03, 01:59 PM
Originally posted by creedamd
This is so funny I can't even come up with something to say. Even the people at nVidia have to be scratching their heads on this one. Lmao.

nope... they have their front page plastered with a massive headline and a little trophy :)

definitely not scratching their heads...

Sazar
02-19-03, 02:01 PM
Originally posted by Hellbinder
The NV30 is not even for sale yet!!!! :rolleyes:

So what????

It can't even use its 128-bit precision in any practical manner due to design flaws!!! Or didn't you read Carmack's .plan file????

This is so patently ridiculous it's not even funny. The list of advantages that the R300 has over the NV30 is a MILE LONG. The only positive thing about the NV30 is its shader length support, which won't be correctly implemented until the NV35 later this year. Or did you fail to notice that the NV30 loses every single raw PS 2.0 test by a large margin.

Further.. you still can't buy one

Damn it just pisses me off to no end that ATi releases the best graphics card overall ever designed. The best balance of features and power. The best IQ ever offered. The fastest AA with equal quality ever offered. Full-speed processing at 96-bit precision all the time. A 325MHz core that outperforms a 500MHz core when the cards are evenly matched with quality-enhancing features.. etc etc etc...

Nvidia somehow STILL has sites out there that post pathetic junk like this. The disrespect for what ATI accomplished is unforgivable.

I agree with the premise of your post... but the award is for promising tech based on their "ANALysts" :rolleyes: perception...

we may not agree with their judgement... but the r300 core has won enough awards that it's not really worth bothering about this..

Myrmecophagavir
02-19-03, 02:01 PM
Originally posted by silence
wasn't the GF3 supposed to render scenes from Final Fantasy in real time... and afterwards it was discovered that they used 1/10th of the polygons?...

tech demos are nice... and i was very impressed by them... but before we see retail cards doing something like that i'll doubt they are what nvidia claims...

That was never a secret; they took the data set for the movie and trimmed down the poly count so that it would run in real time on the GF3 but still looked very good.

silence
02-19-03, 02:32 PM
Originally posted by Myrmecophagavir
That was never a secret; they took the data set for the movie and trimmed down the poly count so that it would run in real time on the GF3 but still looked very good.

well... i wasn't into this stuff so much back then so i wrote what i knew... thanks for the update :)

kyleb
02-19-03, 02:40 PM
Originally posted by Sazar
the demos are not an example of real-world performance.. heck look @ the bear demo from ATI and the ogre demo from nvidia... they are both amazing.. dawn gets more press because she is almost nekid but I was more impressed with the ogre.. but I can't see either company's top-of-the-line GPU pushing that in a real-world game...

i have seen it run in real time on my computer, there is no reason they cannot have it in a game. sure, not with a lot of other cool stuff going on at the same time as well, but in a cutscene or even just tucked off in some small corner of the maps where it won't bring performance to its knees.


also, Hellbinder, you have been noticeably absent lately compared to usual, here and elsewhere i usually find ya. i have missed having your often insightful and always aggressive opinions in all of this; so where ya been?

Sazar
02-19-03, 03:33 PM
Originally posted by kyleb
i have seen it run in real time on my computer, there is no reason they cannot have it in a game. sure, not with a lot of other cool stuff going on at the same time as well, but in a cutscene or even just tucked off in some small corner of the maps where it won't bring performance to its knees.


also, Hellbinder, you have been noticeably absent lately compared to usual, here and elsewhere i usually find ya. i have missed having your often insightful and always aggressive opinions in all of this; so where ya been?

this has been debated to some degree on various forums... I will not get into it beyond saying that it is not possible to have the graphics seen in the demos run in a real game environment without a massive performance hit...

there is also no reason to have a cutscene rendered using the GPU to its max when a little video can accomplish the same task...

not like the games will be much smaller or whatever...

Typedef Enum
02-19-03, 03:49 PM
If ever there was a bigger load 'o poop than this, I would like to see it.

This product has been the single biggest nut-roll to come out of nVidia since their 1st failed attempt @ producing a video card, and they get rewarded for it?

Not only that, but this is a product that not only cannot be purchased, but whose high-end brother will essentially never see the light of day...

Then you add the hardware bugs....

Then you add the heat...

Then you add the noise...

And if that's not quite cutting it, you then have to factor in the 9700, which has been selling for half a year already and offers better image quality AND performance to boot!

If one were to ever consider a situation in which one outfit paid the other to give them a glowing endorsement, this would be it.

kyleb
02-19-03, 04:04 PM
ohh come on Sazar, i gave a valid argument to counter your original point that such things cannot make it into a game running on a top-of-the-line gpu:


Originally posted by kyleb
sure, not with a lot of other cool stuff going on at the same time as well, but in a cutscene or even just tucked off in some small corner of the maps where it won't bring performance to its knees.


please don't just stick your nose up at me and call me wrong; if i am wrong i would like to see valid reasoning for you saying so. i have worked with quite a few games in my time, making models, maps and even code, and i have dealt with various performance issues involved with such tasks, so i fail to see how you can claim my statement is invalid. now on the other hand i will tell you why i believe we will see nothing of the sort in a game any time soon: no one wants to put all that effort into a project when only a few people with capable hardware will be able to enjoy it and everyone else is still using lesser hardware and missing out.

ChrisW
02-19-03, 04:08 PM
I love this quote:
Gamers eager to realize the full potential of titles such as Doom III and Unreal II -- and developers creating even more advanced software -- will plunk down the big bucks for these boards right away
:confused:

I can see them winning an award for "Best new 3D workstation card" or something, but best graphics card of 2002? :confused:

PeterGlaskowsky
02-19-03, 04:15 PM
I'm the editor in chief of Microprocessor Report and a principal analyst with In-Stat/MDR. I'm also the primary analyst for graphics technology at In-Stat/MDR, and the most highly respected technology analyst in the graphics industry.

I have many years of engineering experience designing graphics cards and chips, and I've been an analyst in this area for about five years now.

I am aware of all of the facts regarding the GeForce FX and its competition -- better than any of you, because it's my job, whereas for you it's just something you think about from time to time as a hobby.

Key facts of which none of you seem to be aware:

Our eligibility criteria are that nominees must be commercially available during the calendar year preceding the announcement of the award, and that enough information be available about the nominees to permit us to reach a good decision about the award.

The GeForce FX did become commercially available during 2002. NVIDIA manufactured and sold these chips in 2002 to board makers, who then began manufacturing boards using these chips. Yes, sales volumes are very low, but the Microprocessor Report awards are not concerned with sales volumes, only the qualities inherent to the chips themselves.

The GeForce FX does in fact outperform the Radeon 9700 Pro in most ways, and in the ways that we regard as most important.

We stand by our award.

If it makes any of you feel any better, I did speak with ATI representatives several times from November through January about the availability of the R350, and ATI confirmed that it was not eligible for our awards this time around. I do expect the R350 to outperform the GeForce FX in many ways, and to be a greater commercial success; the open question is which chip will offer superior overall performance.


volt
02-19-03, 04:30 PM
We are not aware of the facts regarding the GeForce FX and its competition?

That's weird, very weird, rather ignorant. Now care to elaborate on what you mean by "the GeForce FX does in fact outperform the Radeon 9700 Pro in most ways"?