
I am sad. They are doing it again.



Bopple
05-04-04, 09:52 PM
Link (http://www.techreport.com/reviews/2004q2/radeon-x800/index.x?pg=20)
It seems NVIDIA is testing how we'd react to this with a beta driver, and closing the gap.
NVIDIA, what is the point of 'disable optimization'?

hithere
05-04-04, 09:58 PM
Link (http://www.techreport.com/reviews/2004q2/radeon-x800/index.x?pg=20)
It seems NVIDIA is testing how we'd react to this with a beta driver, and closing the gap.
NVIDIA, what is the point of 'disable optimization'?

"disable optimism" :)

Ady
05-04-04, 09:59 PM
That one isn't as sad as the FartCry situation; still waiting to see how that one pans out, though.

Edge
05-04-04, 10:47 PM
Umm, why does everyone think Nvidia is "cheating" whenever changes in IQ are discovered? I mean seriously, if a review said that the 6800 wasn't properly rendering certain shadows in Pandora Tomorrow, I'm sure everyone would be quick to point out how Nvidia is cheating, but because that problem exists on ATI cards everyone seems to ignore it. And it doesn't matter how a bunch of colored surfaces look in a review like that; if it looks similar in the actual game, it doesn't really matter, now does it?

Considering the optimization enable/disable option doesn't affect that particular aspect of the filtering, and the FX5950 has the same IQ under the 61.xx drivers, I wonder if they're rolling back their anisotropic system to their old one for whatever reason. Are there any actual comparison screenshots so we can see what the difference would look like in the game? I'm interested in seeing how this change affects the actual textures when the game is running, and I'm wondering if this means they'll have two options for anisotropic filtering in future drivers (the "old" method from the FX cards, and the "new" method that's in the 60.xx drivers).
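
(For reference, those "colored surface" shots come from a standard reviewer trick: upload a texture whose mip levels are solid colors, so you can see exactly where the driver blends between levels. A minimal sketch of the idea in C/OpenGL, assuming an active GL context and a bound 2D texture; the helper name is just for illustration, not anything from the review:)

/* Sketch only: upload solid-color mip levels so any shortcuts in
   trilinear/aniso filtering show up as hard color bands in-game. */
#include <GL/gl.h>
#include <string.h>

static void upload_colored_mipmaps(void)  /* hypothetical helper name */
{
    /* One color per mip level of a 256x256 texture (9 levels). */
    static const unsigned char colors[9][3] = {
        {255,0,0}, {0,255,0}, {0,0,255}, {255,255,0}, {0,255,255},
        {255,0,255}, {255,255,255}, {128,128,128}, {64,64,64}
    };
    static unsigned char buf[256 * 256 * 3];
    int level, size, i;

    /* Mip chain: 256, 128, 64, ..., 1. */
    for (level = 0, size = 256; size >= 1; ++level, size /= 2) {
        for (i = 0; i < size * size; ++i)
            memcpy(&buf[i * 3], colors[level], 3);
        glTexImage2D(GL_TEXTURE_2D, level, GL_RGB, size, size, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, buf);
    }
    /* Request full trilinear: a driver that silently drops to a cheaper
       mode shows sharp color seams instead of smooth gradients. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_LINEAR);
}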

Snarfy
05-04-04, 10:58 PM
Umm, why does everyone think Nvidia is "cheating" whenever changes in IQ are discovered?

Nvidia cheated last generation: when the FX 5800 Ultra was released, they degraded IQ to improve performance, and lied about it.

Because of this, some are a bit leery with this card generation. And they could be right. Time will tell, won't it?

Bopple
05-04-04, 10:59 PM
Umm, why does everyone think Nvidia is "cheating" whenever changes in IQ are discovered?
Look: their new 'beta' drivers, released alongside every famous new game/benchmark or new product, have always had IQ/performance issues - it's an extremely annoying 'coincidence'.
And this time the new driver, again with a new product, does the same thing - especially when it's a bit slower than the competition.
And that when they offer an option to disable the optimization?
Well, perhaps it's another coincidence. But anyone would have doubts.

Jarred
05-04-04, 11:00 PM
Umm, why does everyone think Nvidia is "cheating" whenever changes in IQ are discovered? I mean seriously, if a review said that the 6800 wasn't properly rendering certain shadows in Pandora Tomorrow, I'm sure everyone would be quick to point out how Nvidia is cheating, but because that problem exists on ATI cards everyone seems to ignore it. And it doesn't matter how a bunch of colored surfaces look in a review like that; if it looks similar in the actual game, it doesn't really matter, now does it?

Considering the optimization enable/disable option doesn't affect that particular aspect of the filtering, and the FX5950 has the same IQ under the 61.xx drivers, I wonder if they're rolling back their anisotropic system to their old one for whatever reason. Are there any actual comparison screenshots so we can see what the difference would look like in the game? I'm interested in seeing how this change affects the actual textures when the game is running, and I'm wondering if this means they'll have two options for anisotropic filtering in future drivers (the "old" method from the FX cards, and the "new" method that's in the 60.xx drivers).

amen brotha.

eesa
05-04-04, 11:02 PM
doesn't their old driver work fine?

Shadowx
05-04-04, 11:03 PM
I agree, give NV a chance to release final drivers; these ones are beta. But on the same note, ATI's drivers for the X8xx family are also more mature right now. Do you think NV will have PS3.0 working when the card hits retail? Their drivers look like they still need some work.

flakjacket
05-04-04, 11:05 PM
I betcha some of these people get money for stuff like this.

Edge
05-04-04, 11:17 PM
Look: their new 'beta' drivers, released alongside every famous new game/benchmark or new product, have always had IQ/performance issues - it's an extremely annoying 'coincidence'.
And this time the new driver, again with a new product, does the same thing - especially when it's a bit slower than the competition.
And that when they offer an option to disable the optimization?
Well, perhaps it's another coincidence. But anyone would have doubts.

But CHANGES in IQ are not indicative of "cheating". Obvious DECREASES in IQ that cause major performance boosts could be called cheating, but I hardly think changing the anisotropic pattern back to a system they used previously is an outright cheat. Maybe a bit questionable, but if the difference isn't noticeable in-game, it doesn't really make a big difference, now does it? That's why I want to see some actual IN-GAME comparisons rather than a bunch of strangely colored surfaces; the end result is much more important than the method used to achieve it (who cares if ATI's aniso is only performed on half the surfaces on the screen, when you don't notice it on the surfaces it skips anyway!).

It also annoys me that everyone assumes ALL those problems with the 61.xx drivers and Far Cry are cheats to increase performance; I highly doubt things like screen corruption when you take a screenshot with AA are done in order to speed up the game. Perhaps these are "cheats", but it wouldn't surprise me if almost all of these issues end up like the Aquamark/50.xx driver debacle. That was actually kinda funny in a morbid way: a 10+ page thread about how Nvidia was "cheating" in Aquamark, and in the end it really was just a driver bug that didn't even seem to affect the framerate.

Also, don't the 61.xx drivers actually SLOW DOWN a number of games? Look at the benchmarks: most games show performance decreases going from the 60.xx drivers to the 61.xx ones. If they were going to trade off IQ for performance, wouldn't they do it in the more standard benchmarking games (like UT2003/4, Halo, Splinter Cell, etc.)?

Snarfy
05-04-04, 11:46 PM
117597 [TWIMTBP]NV40,WinXP: Corruption when trying to take a screenshot with AA enabled in Far Cry.

117561 [TWIMTBP]NV38/40,WinXP: Corruption of some weapons in Far Cry.

117474 [TWIMTBP]NV40,WinXP: Fog broke in Far Cry.

113476 [TWIMTBP]NV38/40-WinXP: Lighting/shadow problem in FarCry.

117575 [TWIMTBP] NV40/38: Banding visible from wall in FarCry.

Edge
05-04-04, 11:58 PM
117597 [TWIMTBP]NV40,WinXP: Corruption when trying to take a screenshot with AA enabled in Far Cry.

OMG! Pheer the dreaded "taking a screenshot with AA enabled" cheat! I bet they're gaining 10% performance from that hack alone!

Buddzha
05-05-04, 12:00 AM
http://www.chip.de/ii/21799658_5977b1e2e1.jpg

Rename FarCry.exe and this is what you get. A picture says more than a thousand words...
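
(The test works because the driver appears to key its Far Cry-specific code paths off the executable's file name, so renaming the binary makes them vanish. A rough sketch in C of how exe-name detection can be done by a driver DLL loaded into the game's process - purely illustrative, not actual NVIDIA driver code, and the helper name is made up:)

/* Sketch only: one plausible way to special-case a game by exe name. */
#include <windows.h>
#include <string.h>

static int running_as_farcry(void)  /* hypothetical helper name */
{
    char path[MAX_PATH];
    /* NULL module handle = the host process's own .exe path. */
    GetModuleFileNameA(NULL, path, sizeof(path));
    _strlwr(path);
    /* Rename FarCry.exe and this match fails, so any app-specific
       code paths never kick in - which is what the chip.de shots show. */
    return strstr(path, "farcry.exe") != NULL;
}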

Shadowx
05-05-04, 12:30 AM
damn! again

jAkUp
05-05-04, 12:41 AM
http://www.chip.de/ii/21799658_5977b1e2e1.jpg

Rename FarCry.exe and this is what you get. A picture says more than a thousand words...


You're ****ing kidding me... unbelievable, if that FarCry.exe renaming is true.

Snarfy
05-05-04, 12:50 AM
OMG! Pheer the dreaded "taking a screenshot with AA enabled" cheat! I bet they're gaining 10% performance from that hack alone!

Aww, so sad! Even with the problems with fog, shadows, lighting, banding, and weapons, nvidia still loses substantially. What does that mean for this supposed 'graphical improvement' in Far Cry we've been hearing about? Will nvidia do even worse? :rolleyes:

The only time nvidia really wins is in OpenGL games.

Unfortunately, the only person who uses OpenGL anymore is John Carmack ^_^

When someone releases an OpenGL game that actually looks as good as "fartcry" (Doom 3?), using complex shaders, textures, and high poly counts, I will be glad to care about OpenGL once again.

Until then, OpenGL is something I couldn't care less about. Any of the current generation of cards will run the Quake 3 engine at 100+ fps, so why should I consider that in my purchase?

Lfctony
05-05-04, 01:01 AM
Aww, so sad! Even with the problems with fog, shadows, lighting, banding, and weapons, nvidia still loses substantially. What does that mean for this supposed 'graphical improvement' in Far Cry we've been hearing about? Will nvidia do even worse? :rolleyes:

The only time nvidia really wins is in OpenGL games.

Unfortunately, the only person who uses OpenGL anymore is John Carmack ^_^

When someone releases an OpenGL game that actually looks as good as "fartcry" (Doom 3?), using complex shaders, textures, and high poly counts, I will be glad to care about OpenGL once again.

Until then, OpenGL is something I couldn't care less about. Any of the current generation of cards will run the Quake 3 engine at 100+ fps, so why should I consider that in my purchase?

I can tell from your response that you are definitely not in favor of ATI. :lame:

Edge
05-05-04, 01:07 AM
Umm, I'd definitely consider the Doom 3 engine a pretty substantial benchmark, especially considering how many games will probably be built on it (just look how far the Quake 3 engine has been pushed; there are even Q3-based games which stress the 6800/X800).

Also, where are those people who a few weeks ago were saying the Quack3 issue wasn't an intentional cheat? I wonder what they'd have to say about Fartcry. Although, like Aquamark, I think it's quite possible the graphic bugs are the result of certain optimizations that are still problematic. It's no secret that there are Nvidia-specific optimizations in Far Cry; what has yet to be seen is what they actually change, and where the line between a "cheat" and an "optimization" falls. Hmm, maybe they're actually trying to get SM3.0 working with it... it would be very interesting to see how that would affect performance in Far Cry.

BTW, from the rest of the D3D benchmarks, doesn't the 6800 match up pretty well with the X800XT? Most of the time the two cards are within 10-20% of each other, and in a few cases the 6800 actually manages to be faster (though for the majority of games the X800XT is still in the lead). It seems like the only game the 6800 REALLY has trouble with is Far Cry, which is most likely one of the reasons they're trying to get it running as fast as they can on the 6800. But we won't know how this issue turns out until the official drivers are released.

I'd like to see some Half-Life 2 benchmarks on these cards; that to me is more important than any other benchmark at this point (since it's one of the main PC games I'm really looking forward to this year).

Snarfy
05-05-04, 01:08 AM
I can tell from your response that you are definitely not in favor of ATI. :lame:
Err...you can? I personally think ati won this round :spongebob::canadian::naughty:

Lfctony
05-05-04, 01:28 AM
Err...you can? I personally think ati won this round :spongebob::canadian::naughty:

It's something called IRONY.

Edge
05-05-04, 01:34 AM
Actually, I think that would be sarcasm.

evilchris
05-05-04, 01:34 AM
117597 [TWIMTBP]NV40,WinXP: Corruption when trying to take a screenshot with AA enabled in Far Cry.

117561 [TWIMTBP]NV38/40,WinXP: Corruption of some weapons in Far Cry.

117474 [TWIMTBP]NV40,WinXP: Fog broke in Far Cry.

113476 [TWIMTBP]NV38/40-WinXP: Lighting/shadow problem in FarCry.

117575 [TWIMTBP] NV40/38: Banding visible from wall in FarCry.

oh noz! teh werld is end!

Snarfy
05-05-04, 01:39 AM
oh noz! teh werld is end!

:nutkick:

:smoking2:

volkskrant
05-05-04, 01:47 AM
Umm, why does everyone think Nvidia is "cheating" whenever changes in IQ are discovered?

Where have you been the last two years? Nvidia's cheating drivers have been all over the web since the FX5800 launch, and chip.de's "Fart Cry" benchies indicate NVidia hasn't changed much since!