
View Full Version : Nvidia's secret weapon against ATI?



Maverickman
06-24-08, 06:49 PM
Nvidia's acquisition of Ageia received little attention a few months ago. So did the follow-up announcement that Nvidia's GeForce 8 series and up would get support for Ageia's PhysX in an upcoming driver release. That release is supposedly imminent for the GTX 200 series, and the 8 and 9 series will get theirs shortly thereafter. Although few games support PhysX, it can make a dramatic difference in those that do, and that may tip the scales for anyone looking at a high-end Nvidia card like the GTX 260. The ability to see particles flying all over the place causing collateral damage is a sight to behold. I've turned up the Physics in Crysis to Very High, and I can see a big difference in the explosions onscreen. If more games incorporate PhysX, it could really hurt ATI, as gamers will no doubt want a card that takes advantage of its capabilities. In the past, you had to buy a dedicated Ageia card for $149 or more to get PhysX; soon, all you'll need is an Nvidia card and the correct drivers. It will make more sense for developers to use PhysX, and I'm sure Nvidia will "encourage" them to do so!

methimpikehoses
06-24-08, 06:55 PM
Hmmm... I've heard big promises about PhysX before...

Vanzagar
06-24-08, 07:23 PM
Bahh, sounds like they're losing focus... they'll probably mess things up by trying to add this into the mix. Quad-core CPUs give us enough headroom to do the physics on the CPU, so leave it off the GPU, please. Just focus on giving me a really stable SLI powerhouse card.

rh

Redeemed
06-24-08, 07:24 PM
Nvidia's acquisition of Ageia received little attention a few months ago. So did the follow-up announcement that Nvidia's GeForce 8 series and up would get support for Ageia's PhysX in an upcoming driver release. That release is supposedly imminent for the GTX 200 series, and the 8 and 9 series will get theirs shortly thereafter. Although few games support PhysX, it can make a dramatic difference in those that do, and that may tip the scales for anyone looking at a high-end Nvidia card like the GTX 260. The ability to see particles flying all over the place causing collateral damage is a sight to behold. I've turned up the Physics in Crysis to Very High, and I can see a big difference in the explosions onscreen. If more games incorporate PhysX, it could really hurt ATI, as gamers will no doubt want a card that takes advantage of its capabilities. In the past, you had to buy a dedicated Ageia card for $149 or more to get PhysX; soon, all you'll need is an Nvidia card and the correct drivers. It will make more sense for developers to use PhysX, and I'm sure Nvidia will "encourage" them to do so!

Crysis doesn't utilise PhysX.

Ninja Prime
06-24-08, 07:35 PM
So, their secret weapon is... cheating in 3DMark scores! Seems familiar... oh yeah, the FX series.

Seriously, only one game (UT3) can actually use GPU physics, and it needs special levels to do so. What's more, those levels haven't been widely adopted.

Said plainly, this is cheating so they can boast higher 3DMark scores. Which doesn't even work, since using the GPU in a CPU test is against Futuremark's rules, and therefore the scores don't even count.

AngelGraves13
06-24-08, 08:11 PM
Crysis doesn't utilise PhysX.

LOL

Dreamingawake
06-24-08, 08:20 PM
lol x2, like seriously, don't ever believe anything you read on the net... some people just don't know what they're talking about...

G-Man
06-24-08, 08:30 PM
lol x3

Vanzagar
06-24-08, 09:25 PM
... I've turned up the Physics in Crysis to Very High, and I can see a big difference in the explosions onscreen.

Hey dude, I've got this awesome bridge I'd like to sell ya, it looks realllly cool...

3DBrad
06-24-08, 09:32 PM
Pretty much all UE3 games use PhysX, though (except for Biostink and a few other titles).

Maverickman
06-24-08, 09:34 PM
I know that Crysis does not support PhysX, but turning the physics setting up to Very High results in a dramatic improvement in explosions and damage. If your card can handle it, give it a try. We've heard about the wonders of PhysX before, but those who have seen it in action say it can make a big difference in the gaming experience. Maybe it will turn out to be all hype and just an attempt by Nvidia to boost 3DMark Vantage scores, but I think it may signal a major shift in gaming. PhysX support used to be limited to those who bought the Ageia card and games that supported it; now it will be supported on Nvidia's cards just by downloading the correct drivers. Developers can include it in future games knowing that a lot of gamers will be able to use it, and I'm sure that Nvidia will do all it can to ensure that this happens.

Amuro
06-24-08, 09:51 PM
Now it's up to Nvidia to push developers to support PhysX in future games.

mailman2
06-24-08, 10:03 PM
Pretty much all UE3 games use PhysX, though (except for Biostink and a few other titles).

Wrong, they used Havok, which supports multi-core CPUs; that's why some UT3-based games tax four cores. Bioshock used another physics engine, however.

To me, PhysX isn't more important than getting 95% of the GTX 280's performance for $320. Gimme a 4870.

Amuro
06-24-08, 10:06 PM
But UT3 did show a huge FPS increase with the new 177.39 + PhysX driver. Why is that?

mailman2
06-24-08, 10:24 PM
But UT3 did show a huge FPS increase with the new 177.39 + PhysX driver. Why is that?

There are only two add-on maps, which you have to download, that support PhysX lol. Not every map supports PhysX. So the shader rewrite in the driver would be the reason UT3 shows a huge FPS increase; it has nothing to do with PhysX.

The 177.39 + PhysX combo only works with the G92 and GT200 series, and ONLY in Vantage (which is sorta cheating, because it uses the GPU to run the physics benchmark instead of the CPU like it should) and in the UT3 add-on maps ("Tornado" and "Lighthouse"; download here: http://www.gameupdates.org/details.php?id=1917).

There is no universal PhysX support for Nvidia cards yet.

XMAN52373
06-24-08, 10:33 PM
So, their secret weapon is... cheating in 3DMark scores! Seems familiar... oh yeah, the FX series.

Seriously, only one game (UT3) can actually use GPU physics, and it needs special levels to do so. What's more, those levels haven't been widely adopted.

Said plainly, this is cheating so they can boast higher 3DMark scores. Which doesn't even work, since using the GPU in a CPU test is against Futuremark's rules, and therefore the scores don't even count.

Until Futuremark takes a stand on the issue, you can't call it cheating.

mailman2
06-24-08, 10:43 PM
Until Futuremark takes a stand on the issue, you can't call it cheating.

True - but you can't treat Vantage's CPU scores as CPU scores when the GPU is actually performing the test. So regardless, there are a lot of entries in the ORB that will be null and void.


Looks like Futuremark is already fixing that - lol

http://www.xtremesystems.org/forums/showthread.php?t=192363

XMAN52373
06-24-08, 11:50 PM
True - but you can't treat Vantage's CPU scores as CPU scores when the GPU is actually performing the test. So regardless, there are a lot of entries in the ORB that will be null and void.


Looks like Futuremark is already fixing that - lol

http://www.xtremesystems.org/forums/showthread.php?t=192363

They are doing nothing more than they have done in the past: beta drivers = not allowed, the same thing they have been doing for years now. What's the big deal? All WHQL drivers are valid. Do you think Nvidia will not keep this in their WHQL drivers? If you do, you are delusional. They will have it in their WHQL drivers, which are allowed by FM, so how will it then be cheating?

Ninja Prime
06-25-08, 12:19 AM
They are doing nothing more than they have done in the past: beta drivers = not allowed, the same thing they have been doing for years now. What's the big deal? All WHQL drivers are valid. Do you think Nvidia will not keep this in their WHQL drivers? If you do, you are delusional. They will have it in their WHQL drivers, which are allowed by FM, so how will it then be cheating?

Wrong. Directly from the 3DMark Vantage Driver Approval Policy:

Based on the specification and design of the CPU tests, GPU make, type or driver version may not have a significant effect on the results of either of the CPU tests as indicated in Section 7.3 of the 3DMark Vantage specification and whitepaper.

It's cheating.

XMAN52373
06-25-08, 12:33 AM
Wrong. Directly from the 3DMark Vantage Driver Approval Policy:



It's cheating.

If they allow it and do not call Nvidia on it themselves, they are saying it is OK and not a cheat. I'm sorry, but until they comment on it specifically, it is not Nvidia's fault. Futuremark screwed the pooch when they decided to include PhysX in the testing and then didn't amend how it was tested after Nvidia bought Ageia. Anyone with a brain saw this coming once Nvidia bought them; the fact that FM didn't, or never thought it could happen, just goes to show how stupid they have become since 3DMark03 was first released.

Amuro
06-25-08, 12:47 AM
Who cares who's cheating in 3DMark LOL.

Tomato
06-25-08, 01:17 AM
All Nvidia did was allow PhysX to run on their cards. If CPUs all of a sudden started to do video card work, would the CPUs be cheating in 3DMark? Of course not; it is ADDING features to a given piece of hardware. This whole line of rationalization amounts to saying that because one piece of hardware is more capable than another, it should be normalized down.

We all realize ATI has DX10.1 support, and we don't rag on ATI for supporting it. Good for them; I think it is noble to support as many features as you can. Nvidia supports PhysX; good for them, the more they can do the better.

As for the CPU being utilized in UT3 games, that's because PhysX will use the CPU in conjunction with GPU PhysX for the best effects. Yes, PhysX basically does what Havok does on a CPU, and then extends the heavy effects onto the GPU.
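You can actually see this split right in the SDK: a game asks for hardware simulation and falls back to software if no accelerator is present, so the same code runs whether the physics lands on a PPU, a GeForce, or the CPU. Here's a minimal sketch, written from memory of the PhysX 2.x API, so treat the exact names as approximate rather than gospel:

#include <NxPhysics.h>  // PhysX SDK 2.x main header

int main()
{
    // Create the SDK object.
    NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
    if (!sdk) return 1;

    // Ask for hardware simulation first (the PPU, or a GeForce with the new drivers).
    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.simType = NX_SIMULATION_HW;
    NxScene* scene = sdk->createScene(sceneDesc);

    if (!scene) {
        // No accelerator found: fall back to software simulation on the CPU.
        sceneDesc.simType = NX_SIMULATION_SW;
        scene = sdk->createScene(sceneDesc);
    }

    // The game steps the simulation the same way in either case.
    scene->simulate(1.0f / 60.0f);
    scene->flushStream();
    scene->fetchResults(NX_RIGID_BODY_FINISHED, true);

    sdk->releaseScene(*scene);
    NxReleasePhysicsSDK(sdk);
    return 0;
}

The point being: the API doesn't care where the simulation runs, which is exactly why a driver update can move it from the CPU to the GPU without the game changing.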

Some people argue that the GPU can't do PhysX and graphics at the same time, which is false.

Others argue that PhysX can't run as fast in a real game because more of the GPU is being used for the game itself. But even two TPCs (1/5 of a GTX 280, which has ten TPCs, i.e. 48 of its 240 stream processors) is more power than the Ageia PPU ever had, making PhysX very viable even with graphics running full bore.

Another argument is that the Futuremark test uses the Nvidia card twice because of its dual functionality and so does not represent real-world results. Let's look at this: if the CPU can run both the CPU test and the PhysX test, how does the CPU not offend the same principle? The CPU test assumes the CPU has nothing to do other than CPU work, and during the PhysX test Vantage again assumes the CPU has nothing to do but PhysX. That completely invalidates the claim that a single piece of hardware can't be used in two different tests at its full potential.

The point is that Vantage is a benchmark, intended to measure the potential of the machine. Using Vantage as a stand-in for games is not that great an idea; if you want to know how games will run, why not try games? Let's not get bent out of shape over Nvidia providing us with a very valuable feature that has the potential to make games more fun.

Quite frankly I can't believe people are so upset about this.

particleman
06-25-08, 02:10 AM
Who cares if they are cheating in 3DMark? This whole thing just makes Futuremark's 3DMark look even more stupid than it already does. 3DMark has almost no credibility anymore, and this will just reinforce that. A good synthetic benchmark should try to reflect the performance you will get in games, current or future. With the PhysX test weighted into the final score, Vantage does a horrible job of reflecting real-life 3D performance.

I think there needs to be a standard physics API before a physics score should even be considered, and even then I would be skeptical about adding it. Right now, by supporting only PhysX, 3DMark has basically become an Nvidia-only PhysX benchmark. It's as if someone wrote a benchmark that used only Glide or S3's MeTaL: the cards supporting that API would be the fastest in the benchmark simply because they'd be the only ones that could run it.

3DMark has become an even bigger joke than it already was. And I say this as a GTX 280 owner whose card gets a significant boost.

mtl
06-25-08, 02:19 AM
Well said.

nekrosoft13
06-25-08, 07:33 AM
OMFG, how could they! 3DMark is such an awesome game.