Ok what is up with the new Articles on the main page?

Hellbinder
09-11-03, 07:21 PM
I read them... but I still can't believe them...

First of all, the id comment...

The GeForce FX is currently the fastest card we've benchmarked the Doom technology on and that's largely due to NVIDIA's close cooperation with us during the development of the algorithms that were used in Doom. They knew that the shadow rendering techniques we're using were going to be very important in a wide variety of games and they made some particular optimizations in their hardware strategy to take advantage of this and that served them well.

Give me a break. Want to see which one is really faster? Run the game with the ARB2 DEFAULT HIGH QUALITY PATH and see what happens. The NV3x cards are currently the fastest for a reason: Carmack specifically coded an entire rendering path just for Nvidia, lowering the quality at every possible opportunity to make it the fastest. Further, it's a DX7/DX8 game that mainly uses just ONE shader, which it repeats over and over.
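
For anyone who hasn't followed the renderer talk: the Doom engine ships several back ends and picks one per card, which is exactly why "fastest" depends on which path you measure. A minimal sketch of the idea in C, with hypothetical names (this is not id's actual code):

#include <stdio.h>
#include <string.h>

/* Hypothetical render-path picker, loosely modeled on the Doom 3
   ARB / R200 / NV30 / ARB2 back ends. Illustration only. */
typedef enum { PATH_ARB, PATH_R200, PATH_NV30, PATH_ARB2 } render_path_t;

static render_path_t pick_render_path(const char *vendor, int has_fragment_programs)
{
    if (!has_fragment_programs)
        return PATH_ARB;              /* DX7-class fallback: no pixel shaders */
    if (strstr(vendor, "NVIDIA"))
        return PATH_NV30;             /* vendor path: lower precision, faster on NV3x */
    return PATH_ARB2;                 /* default high-quality path: full precision */
}

int main(void)
{
    printf("NVIDIA card -> path %d\n", pick_render_path("NVIDIA Corporation", 1));
    printf("ATI card    -> path %d\n", pick_render_path("ATI Technologies", 1));
    return 0;
}

The point is that benchmarking the NV30 path against the ARB2 path compares two different workloads, so the numbers aren't apples to apples.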

Secondly, from Nvidia themselves...

Over the last 24 hours, there has been quite a bit of controversy over comments made by Gabe Newell of Valve at ATIs Shader Day.

During the entire development of Half Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.

We're confused as to why Valve chose to use Release 45 (Rel. 45) - because up to two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.

Regarding the Half Life 2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half Life 2 and other new games are included in our Rel. 50 drivers - which reviewers currently have a beta version of today. Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers.

Pending detailed information from Valve, we are only aware of one bug with Rel. 50 and the version of Half Life 2 that we currently have - this is the fog issue that Gabe referred to in his presentation. It is not a cheat or an over-optimization. Our current drop of Half Life 2 is more than 2 weeks old. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers.

Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.

The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel. 50 driver release. Many other improvements have also been included in Rel. 50, and these were all created either in response to, or in anticipation of, the first wave of shipping DirectX 9 titles, such as Half Life 2.

We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware.

That is a complete load of dishonest PR gibberish. Look at that crap at the end talking about the 100+ million Nvidia users, most of whom will run the game in DX7 mode anyway... :rolleyes:

I can clearly see the path Nvidia is going to go down now. These guys have completely lost it. I don't have even a shred of respect left for them at all.

vampireuk
09-11-03, 07:27 PM
*shrugs* It's news, we reported it. :)

bkswaney
09-11-03, 07:33 PM
I hear ya, HB. Me too, and I'm an nv fan... well, was an nv fan.

Now maybe they see that not sticking with the Microsoft DX9 standard was the wrong thing to do. :banghead:

ChrisW
09-11-03, 07:37 PM
Yeah, well if they want to claim the GFFX is faster than the Radeon then why won't they give ATI a copy of the benchmark? At least nVidia has access to the HL2 benchmark and is not being controlled by ATI. I'm supposed to take the word of someone that says "If you want the benchmark, talk to nVidia"?

John Reynolds
09-11-03, 07:41 PM
Originally posted by ChrisW
Yeah, well if they want to claim the GFFX is faster than the Radeon then why won't they give ATI a copy of the benchmark? At least nVidia has access to the HL2 benchmark and is not being controlled by ATI. I'm supposed to take the word of someone that says "If you want the benchmark, talk to nVidia"?

I tend to agree. And I should add, since Nvidia is currently making heavy suggestions that HL2's performance is due to the Valve-ATI marketing deal, that Nvidia paid $5 million for their deal with id/Activision for Doom 3. I wonder if Carmack has spent 5x longer on a special ATI code path than on the ARB paths, the way Valve did for Nvidia with their game?

Solomon
09-11-03, 07:46 PM
Originally posted by John Reynolds
I tend to agree. And I should add, since Nvidia is currently making heavy suggestions that HL2's performance is due to the Valve-ATI marketing deal, that Nvidia paid $5 million for their deal with id/Activision for Doom 3. I wonder if Carmack has spent 5x longer on a special ATI code path than on the ARB paths, the way Valve did for Nvidia with their game?

Interesting... I didn't know that about nVidia and id. $5 million, eh? Too bad nVidia didn't use that chunk to invest in creating a DX9 card correctly. :p

Regards,
D. Solomon Jr.
*********.com

bkswaney
09-11-03, 07:49 PM
Originally posted by Solomon
Interesting... I didn't know that about nVidia and id. $5 million, eh? Too bad nVidia didn't use that chunk to invest in creating a DX9 card correctly. :p

Regards,
D. Solomon Jr.
*********.com


ROFLMAO hahahaha "LOL" :D :D :D Yep! :lol2: :bash:

digitalwanderer
09-11-03, 07:52 PM
Originally posted by John Reynolds
I tend to agree. And I should add, since Nvidia is currently making heavy suggestions that HL2's performance is due to the Valve-ATI marketing deal, that Nvidia paid $5 million for their deal with id/Activision for Doom 3. I wonder if Carmack has spent 5x longer on a special ATI code path than on the ARB paths, the way Valve did for Nvidia with their game?
IF YOU THINK YOU KNOW SO MUCH WHY DON'T YOU JUST GO START YOUR OWN WEBSITE AND MAKE YOUR OWN FORUMS SMARTBOY?!?!!?!?

;) (Sorry, I just couldn't pass it up! :lol: )

Hellbinder
09-11-03, 07:58 PM
Hey guys, I am sorry if I came across the wrong way. I was not addressing the fact that you guys posted the information; that, of course, is your job. ;)

You Nvnews dudes are great. I would never try to make some kind of negative comment about you or the job you're doing. My comment was about the information itself. I simply phrased it poorly.

vampireuk
09-11-03, 07:59 PM
I know, HB, I was just covering our ass in the event someone else tried to pounce on us. :)

GlowStick
09-11-03, 08:14 PM
Originally posted by ChrisW
I'm supposed to take the word of someone that says "If you want the benchmark, talk to nVidia"?

I want the Half-Life 2 benchmark, why do I not have it?

Nvidia wanted to use their Det 50s in the benchmarks, but somehow they weren't used. Wasn't that an issue, with ATi saying

"ATi didn't have a chance to 'optimize' their drivers"?

The Baron
09-11-03, 08:16 PM
You Nvnews dudes are great. I would never try to make some kind of negative comment about you or the job you're doing. My comment was about the information itself. I simply phrased it poorly.
Yay, a vote of approval from Hellbinder... what is the world coming to? ;)

And with this, maybe we're skipping the video card war and going straight into a PR war. Note that NV confirmed that the Det50 exists, something I don't think they've ever done before (confirming the existence of an unannounced product or update before its launch). Maybe ATI and NV are playing hardball with regard to PR now.

GlowStick
09-11-03, 08:38 PM
Originally posted by Hellbinder
I read them... but I still can't believe them...

First of all, the id comment...

Give me a break. Want to see which one is really faster? Run the game with the ARB2 DEFAULT HIGH QUALITY PATH and see what happens. The NV3x cards are currently the fastest for a reason: Carmack specifically coded an entire rendering path just for Nvidia, lowering the quality at every possible opportunity to make it the fastest. Further, it's a DX7/DX8 game that mainly uses just ONE shader, which it repeats over and over.

Since JC did write a lot of the code, I am sure he is aware of what he wrote, and he knows what he is saying.

Your shader comment seems odd; are you saying that JC is wrong for not specifically writing code that will perform badly on nvidia hardware?

I personally think Doom 3's graphics are better than HL2's, and if he can pull it off using OpenGL, then I find that just amazing.

StealthHawk
09-11-03, 10:04 PM
Originally posted by GlowStick
Since JC did write a lot of the code, I am sure he is aware of what he wrote, and he knows what he is saying.

Your shader comment seems odd; are you saying that JC is wrong for not specifically writing code that will perform badly on nvidia hardware?

I personally think Doom 3's graphics are better than HL2's, and if he can pull it off using OpenGL, then I find that just amazing.

Doom 3 is a DX7-class game. All the effects are possible on DX7 hardware. It is not a shader-intensive game. Doom 3 uses shaders to accelerate performance, not to add extra effects.
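
To put it concretely: the heavy lifting is one small per-pixel lighting equation (a diffuse dot product plus a specular term) evaluated for every light/surface pair. A rough CPU sketch in C of that single equation, hypothetical and much simplified (this is not id's actual shader):

#include <stdio.h>
#include <math.h>

typedef struct { float x, y, z; } vec3;

static float dot3(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* One light/surface "interaction": diffuse dot product plus a specular
   term, modulated by the material maps. Hypothetical and simplified. */
static float interaction(vec3 normal, vec3 to_light, vec3 half_vec,
                         float diffuse_map, float specular_map)
{
    float diffuse  = fmaxf(dot3(normal, to_light), 0.0f) * diffuse_map;
    float specular = powf(fmaxf(dot3(normal, half_vec), 0.0f), 16.0f) * specular_map;
    return diffuse + specular;
}

int main(void)
{
    vec3 n = {0.0f, 0.0f, 1.0f};          /* surface normal */
    vec3 l = {0.0f, 0.6f, 0.8f};          /* direction to light */
    vec3 h = {0.0f, 0.316f, 0.949f};      /* half-angle vector */
    printf("lit value: %f\n", interaction(n, l, h, 0.8f, 0.5f));
    return 0;
}

DX7-class parts can approximate roughly this with fixed-function combiners over multiple passes; pixel shaders just do it in fewer passes, which is the "accelerate performance, not add effects" point.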

GlowStick
09-11-03, 10:13 PM
Originally posted by StealthHawk
Doom 3 is a DX7-class game. All the effects are possible on DX7 hardware. It is not a shader-intensive game. Doom 3 uses shaders to accelerate performance, not to add extra effects.

Yes, but the only place the games have huge performance problems is the shaders.

And since Doom 3 doesn't really use the class of shaders that they have a problem with, but still looks so good, he is complaining that it's not making nvidia look bad.

Shadowx
09-11-03, 10:41 PM
NV is saying that Det50 will be "THEIR BEST DRIVERS EVER". I guess they will be great at running normal DX9 code without optimizations? HELL NO, they are going to do it even more! No fog, 12-bit precision, crap IQ, no AF+AA at high res. But ATI also has "THEIR BEST DRIVERS EVER", Cats 3.8, and I know they will be straight DX9 with no optimizations: 24-bit precision, awesome IQ, AF+AA at high res. Too bad, NV: the sh*t hit the fan, this generation is crap, and the 9600 is coming to the sub-$100 market in a few months. NV has lost the high and mid-range market to ATI, and DX9 in the low end will be ATI's domain too. If you own NV stock, sell it and wait for NV40!! :D

Joe DeFuria
09-11-03, 10:50 PM
Originally posted by GlowStick
I want the Half-Life 2 benchmark, why do I not have it?

Nvidia wanted to use their Det 50s in the benchmarks, but somehow they weren't used. Wasn't that an issue, with ATi saying

"ATi didn't have a chance to 'optimize' their drivers"?

The two situations are drastically different.

1) You (the general public) WILL have the Half-Life 2 benchmark...in about 2 weeks. (Not maybe a year from now.)

2) BOTH ATI and nVidia know that the Half-Life 2 release is imminent. I mean, we've known this for quite a while now, so to not have "optimized drivers" ready at this point is pretty foolish. ATI didn't have a chance to "optimize" their drivers for the Doom 3 benchmarks... and why should they have, with the Doom 3 release nowhere in sight and no word of a benchmark release?

GlowStick
09-12-03, 12:02 AM
Originally posted by Joe DeFuria
The two situations are drastically different.

1) You (the general public) WILL have the Half-Life 2 benchmark...in about 2 weeks. (Not maybe a year from now.)

2) BOTH ATI and nVidia know that the Half-Life 2 release is imminent. I mean, we've known this for quite a while now, so to not have "optimized drivers" ready at this point is pretty foolish. ATI didn't have a chance to "optimize" their drivers for the Doom 3 benchmarks... and why should they have, with the Doom 3 release nowhere in sight and no word of a benchmark release?

I don't see how the public release of the bench matters, and on a sad note, Valve hinted that the bench will be available with the game, not before. =[

And on the optimization point: the first thing ATi users said was that ATi had no time to optimize their drivers. While that's fine, Valve is saying they like ATi because they 'don't have to optimize' at all for them.

So basically what you're saying is it's 100% OK for ATi to optimize, but 100% wrong for nvidia.

ChrisW
09-12-03, 12:09 AM
Well, Valve confirmed the rumor of it being packaged with the card ATI is about to release. Therefore, the game is going to be released in the next couple of weeks. With the game comes the benchmark.

What you are forgetting is that there may have been a bug in the ATI drivers that was preventing the game from running correctly. nVidia could have taken this opportunity to release their benchmark numbers and exploit what ATI did not know. That hardly seems like a fair thing to do. With HL2, on the other hand, both companies have access to the benchmark and both know when it is being released.

Joe DeFuria
09-12-03, 12:09 AM
Originally posted by GlowStick
I don't see how the public release of the bench matters, and on a sad note, Valve hinted that the bench will be available with the game, not before. =[

??

Valve has always said the benchmark would be released before the game.

And on the optimization point: the first thing ATi users said was that ATi had no time to optimize their drivers.

Right... and like I said, ATI would have ZERO reason to do so, given Doom 3 was nowhere near release.

While that's fine, Valve is saying they like ATi because they 'don't have to optimize' at all for them.

VALVE doesn't have to optimize its code for ATI's cards. That doesn't mean ATI can't benefit from optimizing their drivers for Valve's engine. Big difference.

So basically what you're saying is it's 100% OK for ATi to optimize, but 100% wrong for nvidia.

No, I'm saying it's a LEGITIMATE EXCUSE for ATI to not have their drivers optimized for the Doom 3 benches, and much less of an excuse for nVidia.

On a related note, I have much more confidence that ATI can optimize their drivers without cheating. ATI has earned the right to have the benefit of the doubt. nVidia, just the opposite.

Shadowx
09-12-03, 12:19 AM
http://www.amdmb.com/article-display.php?ArticleID=257

25% better in the 3DMark shader bench; I hope Cat 3.8 gains even more than that.

Behemoth
09-12-03, 12:21 AM
Originally posted by GlowStick
Yes, but the only place the games have huge performance problems is the shaders.

And since Doom 3 doesn't really use the class of shaders that they have a problem with, but still looks so good, he is complaining that it's not making nvidia look bad.

Yeah, Doom 3 looks so good probably because of its clever and extensive use of vertex shaders as well; luckily the NV3x doesn't have too much of a problem with those.

Zenikase
09-12-03, 12:22 AM
The fact that there are very minimal performance gains in every benchmark but the 3DMark03 PS 2.0 test makes me feel like I've been taken back in time to May 2003.

Hank Lloyd
09-12-03, 02:06 AM
The 5200 has to have one of the most laughable performance levels I've seen in quite some time. Man, I can't recall the last game that came out that yielded such low numbers. You would really have to go way back to the days of the ole' Voodoo to see that.

I was hoping to see some visual comparisons between the various modes, but since no screenshots were shown, I have to think the NDA is still in effect.

hithere
09-12-03, 02:17 AM
I'll take Carmack at his word. From what I know, his lighting engine benefits greatly from nvidia's architecture via their shadowing support, and the shader, as has been said, is the only shader in the game and can be run in 16-bit precision and lower without a severe loss of quality.
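
For anyone wondering what running a shader at 16-bit precision actually costs in accuracy, here is a tiny self-contained C sketch of an FP32 -> FP16 -> FP32 round trip (a simplified conversion that truncates and ignores denormals, NaN, and rounding; illustration only, not how any driver actually does it):

#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Simplified FP32 -> FP16 conversion: truncates the mantissa and
   ignores denormals, NaN, and rounding. Illustration only. */
static uint16_t f32_to_f16(float f)
{
    uint32_t x;
    memcpy(&x, &f, sizeof x);
    uint32_t sign = (x >> 16) & 0x8000u;
    int32_t  exp  = (int32_t)((x >> 23) & 0xFFu) - 127 + 15; /* rebias 8-bit -> 5-bit exponent */
    uint32_t mant = (x >> 13) & 0x3FFu;                      /* keep top 10 of 23 mantissa bits */
    if (exp <= 0)  return (uint16_t)sign;                    /* too small: flush to zero */
    if (exp >= 31) return (uint16_t)(sign | 0x7C00u);        /* too large: clamp to infinity */
    return (uint16_t)(sign | ((uint32_t)exp << 10) | mant);
}

static float f16_to_f32(uint16_t h)
{
    uint32_t sign = (uint32_t)(h & 0x8000u) << 16;
    uint32_t exp  = (h >> 10) & 0x1Fu;
    uint32_t mant = h & 0x3FFu;
    uint32_t x;
    float f;
    if (exp == 0)       x = sign;                              /* zero (denormals flushed) */
    else if (exp == 31) x = sign | 0x7F800000u | (mant << 13); /* infinity / NaN */
    else                x = sign | ((exp - 15 + 127) << 23) | (mant << 13);
    memcpy(&f, &x, sizeof f);
    return f;
}

int main(void)
{
    float v = 0.123456789f;
    printf("fp32: %.9f\nfp16 round trip: %.9f\n", v, f16_to_f32(f32_to_f16(v)));
    return 0;
}

Ten mantissa bits is often plenty for a color that ends up quantized to 8 bits per channel on screen anyway, which is why a single lighting shader can frequently drop to FP16 with no visible difference.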

So, FX is faster in Doom 3. ATI fans can take comfort in that they will be playing HL2 first, and there are many more games likely to demand high-precision shaders on the way, like all the Source derivatives. We've all got stuff to look forward to. I'm just happy the games are coming. :D