
View Full Version : I have seen the future....



PeterJensen
03-23-10, 11:22 AM
Could we please stay on topic?

Sazar
03-23-10, 11:24 AM
Could we please stay on topic?

What is the topic?

Seeing the future?

Let's talk Star Trek in that case :bleh:

Razor1
03-23-10, 11:37 AM
Crap games with an implementation don't mean jack.

Havok still has a lead in tier 1 titles. Heck, per the graphs on the site you linked, it has the lead in the top 3 tiers, while PhysX dominates in the bargain-basement titles that likely see few installs.

http://physxinfo.com/articles/wp-content/uploads/2009/12/numb_released_titles.jpg

http://physxinfo.com/articles/wp-content/uploads/2009/12/titles_release_dynamics_graph_year.png

http://physxinfo.com/articles/wp-content/uploads/2009/12/titles_release_dynamics_graph.jpg

http://physxinfo.com/articles/wp-content/uploads/2009/12/platform_distribution_graph.jpg

Results are obvious enough: the PhysX SDK is dominating the PC market, and Havok the console market (the reasons were described in pt. 1). Also, only Havok has advanced support for a variety of platforms: not only the PC and modern consoles, but the original Xbox, PS2, and even PSP.

Read the article next time.

If you won't read it and just look at one graph to make a point, it's a pretty weak argument.

Havok leads on consoles, and that's it; when it comes to PC titles, they aren't even close.

Thanks to its free license and rich feature set, the PhysX SDK, preferred by small teams, is dominating the PC market. Currently the PhysX SDK is widely adopted by Russian (mostly trash games) and Korean (mostly specific MMOs) developers. Not to mention that the PhysX SDK is the default physics solution for Unreal Engine 3, used in the majority of UE3-based titles (Gears of War, Mass Effect, etc.). The year 2009 brought some popular games, like Dragon Age: Origins, Overlord 2, and Risen, into the PhysX library.

Havok is currently the best choice for AAA titles: extensive toolset, console orientation, best-in-class developer support. Well-known titles of 2009, like Uncharted 2: Among Thieves and Killzone 2, are based on Havok. Surprisingly, even the Try Havok initiative hasn't helped Havok gain popularity in the indie developer community.


And the problem is that PhysX is also available on the PS3, the PS2 era is now gone, and PhysX works on cellphones like the iPhone, so guess where the market is going ;)?
If you look at AAA titles for the PC on a year-to-year basis, it's pretty even between PhysX and Havok in recent years. But then again, for GPU physics there's only one in town. Havok loses in the end, and it will continue to lose market share as long as Intel doesn't allow it to come out. Why do you think ATi is now with Bullet? What happened with Havok? Intel screwed AMD; AMD made a stupid move promoting Havok GPU physics after Intel bought Havok. They should have known Intel would lock them out (well, not lock them out, but drop the ball, because Larrabee just wasn't up to snuff).

As of 2008 Havok wasn't being implemented as much as PhysX, and that's when PhysX took off, after nV bought them. My point being, nV is doing a good job of pushing it out there. If you don't like the effects, that's up to you, and I couldn't care less, because others like what they see, and the potential for more is there; it just takes time to get it out.

Toss3
03-23-10, 11:39 AM
http://physxinfo.com/articles/wp-content/uploads/2009/12/platform_distribution_graph.jpg

Yeah, there is a platform-uptake chart there too ;)

Without consoles in the mix, PhysX has more than a twofold advantage over Havok.
?

That graph is flat out wrong.

Here (http://www.havok.com/index.php?page=available-games) is the list of all the titles currently using Havok. If you do a simple search you'll see that there are ~124 PC titles in there. All in all, there are over 200.

Note here that I'm not saying in any way that Havok is better than PhysX, because clearly it isn't. Instead of saying "Havok" I could have said "physics run on the CPU".

If you want GPU-only physics, why even have this discussion? Is there any game that isn't using PhysX for GPU-accelerated physics?

I'm just saying that "GPU-assisted physics" is a feature Nvidia is currently touting as a major advantage of their cards, and I'm just pointing out that that argument bears no weight at all. It's currently only being used for things that could be done on a CPU 10 years ago, but now suddenly require a GPU to run. Then they limit the performance of PhysX on CPUs to make their cards look better. See where I'm going? Nvidia being the only one having PhysX is only hurting the market, not advancing it.

I would very much like to see future titles make use of the PhysX effects shown in the plethora of different demos out there, but with only one camp having access to those features I don't see it happening.

Sazar
03-23-10, 11:39 AM
http://physxinfo.com/articles/wp-content/uploads/2009/12/numb_released_titles.jpg

http://physxinfo.com/articles/wp-content/uploads/2009/12/titles_release_dynamics_graph_year.png

http://physxinfo.com/articles/wp-content/uploads/2009/12/titles_release_dynamics_graph.jpg

http://physxinfo.com/articles/wp-content/uploads/2009/12/platform_distribution_graph.jpg



Read the article next time.

If you won't read it and just look at one graph to make a point, it's a pretty weak argument.

Havok leads on consoles, and that's it; when it comes to PC titles, they aren't even close.

My bad.

http://physxinfo.com/articles/wp-content/uploads/2009/12/titles_rating_graph.jpg

So yah, Havok leads in quality games. You're absolutely right.

PhysX has the lead in overall title count, apparently, but the vast majority appear to be low-end/crap games with who knows how good an implementation.

shadow001
03-23-10, 11:49 AM
One also has to make the distinction between CPU-based and GPU-based physics, since PhysX can run on the CPU as well if the user doesn't have a video card that supports GPU-based PhysX, and the game itself needs specific support for GPU-based PhysX.

So under those circumstances, the total number of games supporting GPU-based PhysX drops to about 10 games over the last 2 years, so it's hardly impressive in that sense.

I really don't care about the differences between PhysX, Havok, and Bullet as long as all 3 run on CPUs, as they're pretty similar in overall capabilities, and it comes down to which one uses less CPU power for a given effect to decide the best one. Even then, it depends on the game itself and what types of effects the developers have planned for it.

Lots of variables come into it, basically.

Razor1
03-23-10, 12:08 PM
That graph is flat out wrong.

Here (http://www.havok.com/index.php?page=available-games) is the list of all the titles currently using Havok. If you do a simple search you'll see that there are ~124 PC titles in there. All in all, there are over 200.

Note here that I'm not saying in any way that Havok is better than PhysX, because clearly it isn't. Instead of saying "Havok" I could have said "physics run on the CPU".

I'm just saying that "GPU-assisted physics" is a feature Nvidia is currently touting as a major advantage of their cards, and I'm just pointing out that that argument bears no weight at all. It's currently only being used for things that could be done on a CPU 10 years ago, but now suddenly require a GPU to run. Then they limit the performance of PhysX on CPUs to make their cards look better. See where I'm going? Nvidia being the only one having PhysX is only hurting the market, not advancing it.

I would very much like to see future titles make use of the PhysX effects shown in the plethora of different demos out there, but with only one camp having access to those features I don't see it happening.

Do you want to list when those games came out? Xbox games stopped coming out in late 2005. Do you still think that graph is wrong? If you look at 2006 and up, which is what those graphs are showing, the number of Havok titles for the PC drops pretty fast. Before then, Havok was the only solid physics engine out there, prior to NovodeX.

What are you talking about now? First you want to talk about PC only, then the list you just showed me isn't PC-only. What exactly do you want to talk about? Because throwing a billion darts at a board, you're bound to hit something eventually.

There is only one camp because ATi doesn't get off their butt and do anything. They can talk and show PowerPoint slides all they want, but until they actually get some decent OpenCL and DirectCompute drivers and get Bullet into some games, they don't have much of a choice (yeah, they aren't in the greatest shape right now).

Funny thing is, games 10 years ago had very little outside of collision detection based on bounding boxes. Even games 5 years ago it was the same old collision detection, nominally per-poly (actually just more precise bounding boxes attached to skeletons). Now we are doing true per-poly with PhysX; see the difference in escalation?
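
As a rough sketch of that escalation (illustrative C++ only; the shapes, counts, and tests below are invented for the example, not taken from any real engine):

#include <vector>

struct AABB { float min[3], max[3]; };

// Circa-2000 check: one cheap axis-aligned box overlap test per object pair.
bool boxOverlap(const AABB& a, const AABB& b) {
    for (int i = 0; i < 3; ++i)
        if (a.max[i] < b.min[i] || b.max[i] < a.min[i])
            return false;
    return true;
}

struct Tri { float v[3][3]; };  // three vertices, three components each

// Per-poly pass: the work now scales with the triangle count of the mesh.
// (Crude point-in-box test per vertex, standing in for real triangle clipping.)
int countTrisTouchingBox(const std::vector<Tri>& mesh, const AABB& box) {
    int hits = 0;
    for (const Tri& t : mesh) {             // thousands of triangles per object
        for (int c = 0; c < 3; ++c) {
            bool inside = true;
            for (int i = 0; i < 3; ++i)
                if (t.v[c][i] < box.min[i] || t.v[c][i] > box.max[i])
                    inside = false;
            if (inside) { ++hits; break; }
        }
    }
    return hits;
}

One box test versus a loop over every triangle is the whole point: the per-object cost goes from constant to proportional to mesh detail.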

When you really want to talk about things like this, guys, read some basic game programming books. You don't need to know the real thing and make a game, just the basics and the history of how games have evolved from a tech point of view. To me, when people put out mostly pointless arguments based on crap knowledge, well, guess what...

shadow001
03-23-10, 12:29 PM
Do you want to list when those games came out? Xbox games stopped coming out in late 2005. Do you still think that graph is wrong? If you look at 2006 and up, which is what those graphs are showing, the number of Havok titles for the PC drops pretty fast. Before then, Havok was the only solid physics engine out there, prior to NovodeX.

What are you talking about now? First you want to talk about PC only, then the list you just showed me isn't PC-only. What exactly do you want to talk about? Because throwing a billion darts at a board, you're bound to hit something eventually.

There is only one camp because ATi doesn't get off their butt and do anything. They can talk and show PowerPoint slides all they want, but until they actually get some decent OpenCL and DirectCompute drivers and get Bullet into some games, they don't have much of a choice (yeah, they aren't in the greatest shape right now).

Funny thing is, games 10 years ago had very little outside of collision detection based on bounding boxes. Even games 5 years ago it was the same old collision detection, nominally per-poly (actually just more precise bounding boxes attached to skeletons). Now we are doing true per-poly with PhysX; see the difference in escalation?

When you really want to talk about things like this, guys, read some basic game programming books. You don't need to know the real thing and make a game, just the basics and the history of how games have evolved from a tech point of view. To me, when people put out mostly pointless arguments based on crap knowledge, well, guess what...


They don't really need to do anything when viewed in broader terms. ATI is owned by AMD, and AMD makes CPUs, which are used to run physics in games, among other things, of course. And obviously Intel also has a say in the matter, since they're primarily a CPU business after all, and both are interested in selling the fastest and most expensive CPUs they can make to hardware enthusiasts.

The other point is that I have yet to see a single user claim that, at least with Nvidia's current hardware (it might be different with Fermi), they can run a game and GPU physics on the same GPU and still get acceptable performance, especially if the user in question likes to play games at high quality settings. So at least for now, GPU physics is relegated to either dual-GPU cards or multi-card SLI setups, which have the extra power to pull it off with good performance.

Nvidia will have a real edge when they develop a single card with a single processor that can handle both workloads (graphics + physics) at high graphics quality while still delivering playable performance... Until then, it's a gimmick, plain and simple.

Razor1
03-23-10, 12:46 PM
They don't really need to do anything when viewed in broader terms. ATI is owned by AMD, and AMD makes CPUs, which are used to run physics in games, among other things, of course. And obviously Intel also has a say in the matter, since they're primarily a CPU business after all, and both are interested in selling the fastest and most expensive CPUs they can make to hardware enthusiasts.

And this is another area where PhysX is winning over development teams. Since PhysX isn't bound by what CPU you use, it gives additional benefits on nV cards only. It would be nice if it worked on ATi cards, but that's a business choice nV made, and it screwed over ATi; so what, that's their choice to make.

The other point is that I have yet to see a single user claim that, at least with Nvidia's current hardware (it might be different with Fermi), they can run a game and GPU physics on the same GPU and still get acceptable performance, especially if the user in question likes to play games at high quality settings. So at least for now, GPU physics is relegated to either dual-GPU cards or multi-card SLI setups, which have the extra power to pull it off with good performance.

Great way to sell more cards, right :D. But another note, as I stated, about escalation.

Let's take a simple flowing-cape effect. In the past we used skeletal animation for this; to make a nice (relatively speaking, for that time) flowing cape, you would use around 20 bones. Now a nice-looking cape would be modeled with around 200 polys or so, to get the bones to flex properly so you don't get any sharp angles, stretching, and whatnot.

Now put that into a per-poly situation. Mind you, the skeleton is still there for the overall movement of the cape, but now the calculations have just increased 200-fold, because they're based on the poly that was hit.
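
A minimal C++ sketch of the gap (the bone and vertex counts are the hypothetical ones from the cape example above; a real engine would also run constraint solving and per-triangle collision on top of this):

#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

// Old way: animate ~20 bones per frame; skinning stretches the mesh between them.
void animateCapeBones(std::vector<Vec3>& bones, float dt) {
    for (Vec3& b : bones)                    // ~20 iterations
        b.y -= 9.8f * dt * dt;               // toy gravity per bone
}

// Per-poly way: integrate every cloth vertex each frame (Verlet style).
void simulateCapeCloth(std::vector<Vec3>& pos, std::vector<Vec3>& prev, float dt) {
    for (std::size_t i = 0; i < pos.size(); ++i) {   // ~200+ iterations
        Vec3 vel{pos[i].x - prev[i].x,
                 pos[i].y - prev[i].y,
                 pos[i].z - prev[i].z};
        prev[i] = pos[i];
        pos[i].x += vel.x;
        pos[i].y += vel.y - 9.8f * dt * dt;          // gravity term
        pos[i].z += vel.z;
        // Real cloth would also relax edge constraints and collide each
        // triangle against the character, multiplying the cost further.
    }
}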


Nvidia will have a real edge when they develop a single card with a single processor in it, that can do both workloads(graphics + physics),at high graphics quality,while still having playable performance....Until then,it's a gimmick plain and simple.

Plain and simple, it's not a gimmick. If you don't know how much the load increased for such a simple effect, guess what happens when we're looking at much more intensive effects? It doesn't take a genius to understand this stuff.

CPUs in the past 5 years have increased in performance 10 times or so, not much more than that. Physics animations that were based on skeletons before are now per-poly, and that increase is much larger; the cape example is nothing compared to some of the effects I've seen in some upcoming games.

Toss3
03-23-10, 12:56 PM
Do you want to list when those games came out? Xbox games stopped coming out in late 2005. Do you still think that graph is wrong? If you look at 2006 and up, which is what those graphs are showing, the number of Havok titles for the PC drops pretty fast. Before then, Havok was the only solid physics engine out there, prior to NovodeX.

What are you talking about now? First you want to talk about PC only, then the list you just showed me isn't PC-only. What exactly do you want to talk about? Because throwing a billion darts at a board, you're bound to hit something eventually.

You were the one who brought up the whole Havok vs. PhysX stuff to begin with - I just said that GPU-accelerated PhysX is a gimmick, and you still haven't addressed that argument, which was the point to begin with. I honestly don't care what physics engine a game uses as long as it doesn't affect the game itself (which is exactly what Nvidia's PhysX does).

There is only one camp because ATi doesn't get off their butt and do anything. They can talk and show PowerPoint slides all they want, but until they actually get some decent OpenCL and DirectCompute drivers and get Bullet into some games, they don't have much of a choice (yeah, they aren't in the greatest shape right now).

This has nothing to do with ATI at all. They are not arguing that Nvidia should give them PhysX for free. They are just sticking up for the consumers by calling Nvidia out on the lies they are feeding us. We don't need a GPU to run the effects currently presented in PhysX titles, and that is the truth.

Funny thing is, games 10 years ago had very little outside of collision detection based on bounding boxes. Even games 5 years ago it was the same old collision detection, nominally per-poly (actually just more precise bounding boxes attached to skeletons). Now we are doing true per-poly with PhysX; see the difference in escalation?

Red Faction:
baWbh9062VE

Mirror's Edge:
D58DlquZjKY

Just to prove that you don't need a GPU to simulate broken glass. Funny how a game made in 2001 can look better than a game made in 2009.

When you really want to talk about things like this, guys, read some basic game programming books. You don't need to know the real thing and make a game, just the basics and the history of how games have evolved from a tech point of view. To me, when people put out mostly pointless arguments based on crap knowledge, well, guess what...

What are you talking about? No one is saying that GPU-accelerated physics would be WORSE. We all know what PhysX is capable of - what we are saying is that it isn't being utilized for anything beyond what could be done on a CPU.

shadow001
03-23-10, 12:59 PM
And this is another area where PhysX is winning over development teams. Since PhysX isn't bound by what CPU you use, it gives additional benefits on nV cards only. It would be nice if it worked on ATi cards, but that's a business choice nV made, and it screwed over ATi; so what, that's their choice to make.

Great way to sell more cards, right :D. But another note, as I stated, about escalation.

Let's take a simple flowing-cape effect. In the past we used skeletal animation for this; to make a nice (relatively speaking, for that time) flowing cape, you would use around 20 bones. Now a nice-looking cape would be modeled with around 200 polys or so, to get the bones to flex properly so you don't get any sharp angles, stretching, and whatnot.

Now put that into a per-poly situation. Mind you, the skeleton is still there for the overall movement of the cape, but now the calculations have just increased 200-fold, because they're based on the poly that was hit.

Plain and simple, it's not a gimmick. If you don't know how much the load increased for such a simple effect, guess what happens when we're looking at much more intensive effects? It doesn't take a genius to understand this stuff.

CPUs in the past 5 years have increased in performance 10 times or so, not much more than that. Physics animations that were based on skeletons before are now per-poly, and that increase is much larger; the cape example is nothing compared to some of the effects I've seen in some upcoming games.


Well, this is what I'm thinking about getting this year, being the enthusiast user that I am, and I think I'm covered for CPU power for the next couple of years at least, no matter how sophisticated physics gets:

http://www.blogcdn.com/www.engadget.com/media/2010/03/3-17-10-classifiedsr2-600.jpg

A dual-socket enthusiast board costing $600 that can support 6-core/12-thread CPUs in each socket, for a grand total of 24 threads, with a maximum of 48 GB of memory in total and support for both SLI and CrossFire. Unlike standard server boards, it has plenty of options for overclocking too.

Expensive, sure, but problem solved for a long time to come... It's an EVGA Classified SR-2 motherboard, btw, and it's out now.

Razor1
03-23-10, 01:24 PM
You were the one who brought up the whole Havok vs. PhysX stuff to begin with - I just said that GPU-accelerated PhysX is a gimmick, and you still haven't addressed that argument, which was the point to begin with. I honestly don't care what physics engine a game uses as long as it doesn't affect the game itself (which is exactly what Nvidia's PhysX does).



This has nothing to do with ATI at all. They are not arguing that Nvidia should give them PhysX for free. They are just sticking up for the consumers by calling Nvidia out on the lies they are feeding us. We don't need a GPU to run the effects currently presented in PhysX titles, and that is the truth.



Red Faction:
baWbh9062VE

Mirror's Edge:
D58DlquZjKY

Just to prove that you don't need a GPU to simulate broken glass. Funny how a game made in 2001 can look better than a game made in 2009.



What are you talking about? No one is saying that GPU-accelerated physics would be WORSE. We all know what PhysX is capable of - what we are saying is that it isn't being utilized for anything beyond what could be done on a CPU.



With all of these things, you forget that the glass in Mirror's Edge now also causes damage to the player, as in per-poly collision detection, plus there's the sheer number of object calculations for the glass itself. What am I talking about? The number of calculations it takes to do these things; they aren't just a visual crapshoot. This is what you guys are missing. Do you think game development is still like the text-based games of the '80s? I think you guys think it is. Guess what, I've been around long enough to know some of the best things in games that I like, and I don't see them in the best games today, and that's why I still play text-based games at times.

You guys want realism in physics? Let's go through some simple Newtonian fluid dynamics:

http://en.wikipedia.org/wiki/Newtonian_fluid

Now, what were we using in games 5 years ago; were we using physics for water at all? What's the difference in calculation amounts? Are we using physics in today's water? Yeah, on a per-poly basis, to show interaction. Get some real physics involved and the increase is in the tens of thousands in calculation count, more so because we have to use particles. I don't need to keep posting; a basic understanding is all I'm looking for. If you think physics is easy to implement, there are quite a lot of implications on the design and hardware sides. Without understanding how computers and games evolve on features, anyone, even a two-year-old, can say something is a gimmick. I can say sh*t, my favorite games are Wizardry 1, 2, and 3, and they still are, and everything since then has been a gimmick where game design is involved.
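
To see why particle-based fluids blow up the calculation count, here is a deliberately naive C++ sketch of a density pass; real solvers use spatial grids and proper SPH kernels, so treat this only as an order-of-magnitude illustration:

#include <cstddef>
#include <vector>

struct Particle { float x, y, z, density; };

// Naive O(n^2) neighbor pass: 10,000 particles means ~100 million pair
// checks per step, before any pressure or viscosity math is even done.
void accumulateDensity(std::vector<Particle>& ps, float radius) {
    const float r2 = radius * radius;
    for (std::size_t i = 0; i < ps.size(); ++i) {
        ps[i].density = 0.0f;
        for (std::size_t j = 0; j < ps.size(); ++j) {
            if (i == j) continue;
            const float dx = ps[i].x - ps[j].x;
            const float dy = ps[i].y - ps[j].y;
            const float dz = ps[i].z - ps[j].z;
            const float d2 = dx * dx + dy * dy + dz * dz;
            if (d2 < r2)
                ps[i].density += r2 - d2;   // stand-in for a real SPH kernel
        }
    }
}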

Razor1
03-23-10, 01:30 PM
Well, this is what I'm thinking about getting this year, being the enthusiast user that I am, and I think I'm covered for CPU power for the next couple of years at least, no matter how sophisticated physics gets:

http://www.blogcdn.com/www.engadget.com/media/2010/03/3-17-10-classifiedsr2-600.jpg

A dual-socket enthusiast board costing $600 that can support 6-core/12-thread CPUs in each socket, for a grand total of 24 threads, with a maximum of 48 GB of memory in total and support for both SLI and CrossFire. Unlike standard server boards, it has plenty of options for overclocking too.

Expensive, sure, but problem solved for a long time to come... It's an EVGA Classified SR-2 motherboard, btw, and it's out now.


Well, I'm an enthusiast. I have a Mac Pro with 2 quad-core Xeons at 2.93 GHz and 16 gigs of RAM; I spent 6 grand total on my computer (with a 30-inch monitor) a year and a half back, and I know I don't need to touch my CPUs for another 2 years. Good investment; I know CPU power won't hold me back for a while. I never really cared about SLI or CrossFire. If I did, I would probably have gotten a system from Titanus or some company like that back then.

Rollo
03-23-10, 01:34 PM
Hmmm.

My only point in posting this was to note this game kicks ass in DX11 with PhysX- very immersive.

Personally, given the choice between an ATi DX11 card and an NVIDIA DX11 card, I'd buy the NVIDIA card for this game alone. (not to mention all the other games with PhysX I have, and the ones that are still in development)

The thread isn't meant to be about marketshare of PhysX vs Havok, what can be done on cpu and what can't, whether NVIDIA should give ATi users PhysX capability.

This is a very nice game; it would really annoy me if I knew I couldn't see it at its best after buying a high-end graphics card. :(

shadow001
03-23-10, 01:37 PM
With all of these things, you forget that the glass in Mirror's Edge now also causes damage to the player, as in per-poly collision detection, plus there's the sheer number of object calculations for the glass itself. What am I talking about? The number of calculations it takes to do these things; they aren't just a visual crapshoot. This is what you guys are missing. Do you think game development is still like the text-based games of the '80s? I think you guys think it is. Guess what, I've been around long enough to know some of the best things in games that I like, and I don't see them in the best games today, and that's why I still play text-based games at times.

You guys want realism in physics? Let's go through some simple Newtonian fluid dynamics:

http://en.wikipedia.org/wiki/Newtonian_fluid

Now, what were we using in games 5 years ago; were we using physics for water at all? What's the difference in calculation amounts? Are we using physics in today's water? Yeah, on a per-poly basis, to show interaction. Get some real physics involved and the increase is in the tens of thousands in calculation count, more so because we have to use particles. I don't need to keep posting; a basic understanding is all I'm looking for. If you think physics is easy to implement, there are quite a lot of implications on the design and hardware sides. Without understanding how computers and games evolve on features, anyone, even a two-year-old, can say something is a gimmick. I can say sh*t, my favorite games are Wizardry 1, 2, and 3, and they still are, and everything since then has been a gimmick where game design is involved.



The point is that we're not even close to having enough processing power to simulate physics interactions properly, the way they happen in real life, while keeping acceptable performance anyhow... That's the point, so shortcuts have to be taken, which is what developers do in the end.

Push physics to its logical limits and watch current GPUs drag along at 1 fps in desktop systems, even enthusiast-level setups... No thanks.

Here's the link for the CPUs I'm thinking of using with the above board, btw:

http://www.ncix.com/products/index.php?sku=51357&vpn=BX80601W3520&manufacture=Intel

Basically Xeon versions of the i7 920 that can be overclocked to 4 GHz. They're only $350 each, and I'd have 16 CPU threads to play with... I'd only upgrade to 6-core/12-thread processors when they get much cheaper, of course ($1200~1300 a pop right now... ouch).

Add some nice DDR3 memory from Corsair (12 GB of it), and voila, a supercomputer as your home system, and it would cost about $2000 between the motherboard, the CPUs, and the RAM... I could fit that motherboard in my Cooler Master HAF 932 case with room to spare.

Toss3
03-23-10, 01:38 PM
With all of these things, you forget that the glass in Mirror's Edge now also causes damage to the player, as in per-poly collision detection, plus there's the sheer number of object calculations for the glass itself. What am I talking about? The number of calculations it takes to do these things; they aren't just a visual crapshoot. This is what you guys are missing. Do you think game development is still like the text-based games of the '80s? I think you guys think it is. Guess what, I've been around long enough to know some of the best things in games that I like, and I don't see them in the best games today, and that's why I still play text-based games at times.

You guys want realism in physics? Let's go through some simple Newtonian fluid dynamics:

http://en.wikipedia.org/wiki/Newtonian_fluid

Now, what were we using in games 5 years ago; were we using physics for water at all? What's the difference in calculation amounts? Are we using physics in today's water? Yeah, on a per-poly basis, to show interaction. Get some real physics involved and the increase is in the tens of thousands in calculation count, more so because we have to use particles. I don't need to keep posting; a basic understanding is all I'm looking for. If you think physics is easy to implement, there are quite a lot of implications on the design and hardware sides. Without understanding how computers and games evolve on features, anyone, even a two-year-old, can say something is a gimmick. I can say sh*t, my favorite games are Wizardry 1, 2, and 3, and they still are, and everything since then has been a gimmick where game design is involved.

Believe me, I get the difference between physics then and now, but are you honestly telling me that those simulations in Mirror's Edge couldn't be done on an i7 at 4 GHz? I should also have included the video without PhysX so you could see for yourself how awful it looks, even compared to Red Faction. The point here wasn't that the glass shattering with PhysX on looked worse than Red Faction, but that it looks worse with it turned off. That, to me, makes no sense.

If and when Nvidia chooses to properly support PhysX on multicore CPUs and allow people to use dedicated PhysX cards along with their ATI GPUs, I'll definitely support it, but until then I won't.

Razor1
03-23-10, 01:42 PM
Believe me, I get the difference between physics then and now, but are you honestly telling me that those simulations in Mirror's Edge couldn't be done on an i7 at 4 GHz? I should also have included the video without PhysX so you could see for yourself how awful it looks, even compared to Red Faction. The point here wasn't that the glass shattering with PhysX on looked worse than Red Faction, but that it looks worse with it turned off. That, to me, makes no sense.

If and when Nvidia chooses to properly support PhysX on multicore CPUs and allow people to use dedicated PhysX cards along with their ATI GPUs, I'll definitely support it, but until then I won't.

That's an art direction, my man; that's the problem. Look at it this way: do you want something that has breakable polys, or do you want something that is premade and broken down when a collision is detected? There is a major difference. There will be art limitations based on polygon arrangement and texture details; if you want more realism, then we have to go into procedural textures built on the fly, based on new UVs made for the polygons. Those aren't easy to do at all, even with the horsepower we have today.

Razor1
03-23-10, 01:44 PM
The point is that we're not even close to having enough processing power to simulate physics interactions properly, the way they happen in real life, while keeping acceptable performance anyhow... That's the point, so shortcuts have to be taken, which is what developers do in the end.

Push physics to its logical limits and watch current GPUs drag along at 1 fps in desktop systems, even enthusiast-level setups... No thanks.

Here's the link for the CPUs I'm thinking of using with the above board, btw:

http://www.ncix.com/products/index.php?sku=51357&vpn=BX80601W3520&manufacture=Intel

Basically Xeon versions of the i7 920 that can be overclocked to 4 GHz. They're only $350 each, and I'd have 16 CPU threads to play with... I'd only upgrade to 6-core/12-thread processors when they get much cheaper, of course ($1200~1300 a pop right now... ouch).

Add some nice DDR3 memory from Corsair (12 GB of it), and voila, a supercomputer as your home system, and it would cost about $2000 between the motherboard, the CPUs, and the RAM... I could fit that motherboard in my Cooler Master HAF 932 case with room to spare.

Newtonian fluid simulations are doable on today's GPUs, but they are very intensive, and that's about all the GPU will be doing, so no pretty game; pretty water, though. And they aren't doable on CPUs at the necessary frame rates.

shadow001
03-23-10, 01:48 PM
Hmmm.

My only point in posting this was to note this game kicks ass in DX11 with PhysX- very immersive.

Personally, given the choice between an ATi DX11 card and an NVIDIA DX11 card, I'd buy the NVIDIA card for this game alone. (not to mention all the other games with PhysX I have, and the ones that are still in development)

The thread isn't meant to be about marketshare of PhysX vs Havok, what can be done on cpu and what can't, whether NVIDIA should give ATi users PhysX capability.

This is a very nice game; it would really annoy me if I knew I couldn't see it at its best after buying a high-end graphics card. :(


I've played it for quite a bit now, and I've got to say that always dealing with low ammo, and with filters when you go outside or near radioactive areas, can and does piss me off some... I've died many times already coming up with different strategies.

shadow001
03-23-10, 01:53 PM
Newtonian fluid simulations are doable on today's GPUs, but they are very intensive, and that's about all the GPU will be doing, so no pretty game; pretty water, though. And they aren't doable on CPUs at the necessary frame rates.


Then you get super-realistic physics with no graphics to go along with them, which kinda defeats the whole point, doesn't it, regardless of whether they're attempted on CPUs or GPUs.

I'm trying to look at it globally here, from gameplay to graphics to interactivity and, of course, physics. The system that can pull off Crysis-caliber graphics along with full Newtonian physics simply doesn't exist yet.

Toss3
03-23-10, 02:00 PM
That's an art direction, my man; that's the problem. Look at it this way: do you want something that has breakable polys, or do you want something that is premade and broken down when a collision is detected? There is a major difference. There will be art limitations based on polygon arrangement and texture details; if you want more realism, then we have to go into procedural textures built on the fly, based on new UVs made for the polygons. Those aren't easy to do at all, even with the horsepower we have today.

I get what you are saying and definitely wouldn't mind getting an extra PhysX card if there were a game that required it (or made proper use of it). Sadly, this is not the case.

What I think is going on is that Nvidia knows a single GPU running both physics and rendering wouldn't outperform a multicore CPU + GPU doing the same task (FluidMark proves this; see the chart below). They also know that most people wouldn't shell out for an extra card unless there are games out there that make use of it (the reason I think Ageia would have failed). So they gimped the CPU support of PhysX to make having a PhysX-capable card look more attractive. It's marketing, and it makes them money they otherwise wouldn't get, but unless you own Nvidia stock, this is not behavior you should support, as it hurts the consumer.

http://physxinfo.com/news/wp-content/uploads/2010/03/fluidmark_graph.jpg

MikeC
03-23-10, 04:34 PM
I'll second Rollo's opinion.

http://www.nvnews.net/vbulletin/showthread.php?t=149249

BTW, if you dislike PhysX, just turn it off in the control panel ;)

Excellent debate between PhysX and Havok...

Toss3
03-23-10, 05:27 PM
I'll second Rollo's opinion.

http://www.nvnews.net/vbulletin/showthread.php?t=149249

BTW, if you dislike PhysX, just turn it off in the control panel ;)

Excellent debate between PhysX and Havok...

Does the future hold great overclocks for us? :) Nice to see that nvnews got a day 1 review! Can't wait till Friday!

Xion X2
03-23-10, 06:40 PM
Hmm. This thread actually has some good information in it.

Thumbs up to both razor and Toss for providing their info. I had no idea that PhysX had gained so much market share.

Some key issues, I think, are:

1) How many of those PhysX titles from razor's graphs run on the GPU exclusively? (This would be Nvidia's selling point.)
2) How does multicore physics performance on a CPU compare to single-GPU physics? (Toss answered this one with his graph.)

I still contend that most of the best physics effects I've seen were run strictly on the CPU. HardOCP comments on this while demoing Ghostbusters on a quad-core with the Terminal Reality team:

http://www.youtube.com/watch?v=H9boF-JZKcU

And I've yet to see explosions and smoke effects of the quality of Battlefield Bad Company 2, with physics on the CPU:

http://www.youtube.com/watch?v=WDJqetBhR-Y

http://www.youtube.com/watch?v=3lxjAVGcPCk&NR=1

Notice that the engine is smart enough to know how much structural damage a building can take before it collapses.

From what I've seen of Metro, I just don't see any significant gains at all from PhysX in that game over what the above titles offer. In fact, I'm generally less impressed than I am with the CPU-physics games.

CaptNKILL
03-23-10, 09:07 PM
I still don't understand why people are calling the destruction in Bad Company 2 "physics".

The only thing being calculated in real time is the dozen or so random chunks of wood/stone that actually collide with things. Beyond that, they're simply deleting "wall" models and replacing them with "hole" models. If enough walls have been deleted, the building plays the "fall over" animation, which is the same every time. They cover all of the transitions up with smoke effects, but none of it is dynamic, even if it looks good.

There is no physics calculation being done for any of that, other than the chunks of wood that bounce (rather than fall through things).
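
The contrast being described can be sketched roughly in C++; every name below is invented for illustration and is not Bad Company 2's actual code:

#include <cstddef>
#include <vector>

// "Canned" destruction: no simulation, just a model swap and a scripted
// collapse once enough walls are gone.
struct Wall { bool destroyed = false; };

void onWallHit(std::size_t hitIndex, std::vector<Wall>& building) {
    building[hitIndex].destroyed = true;    // swap "wall" model for "hole" model
    int standing = 0;
    for (const Wall& w : building)
        if (!w.destroyed) ++standing;
    if (standing == 0) {
        // play the same pre-baked "fall over" animation every time
    }
}

// Simulated debris: every chunk is a rigid body integrated each frame.
struct Chunk { float pos[3], vel[3]; };

void stepDebris(std::vector<Chunk>& chunks, float dt) {
    for (Chunk& c : chunks) {
        c.vel[1] -= 9.8f * dt;              // gravity
        for (int i = 0; i < 3; ++i)
            c.pos[i] += c.vel[i] * dt;
        if (c.pos[1] < 0.0f) {              // crude ground collision
            c.pos[1] = 0.0f;
            c.vel[1] = -c.vel[1] * 0.3f;    // damped bounce
        }
    }
}

The first path costs almost nothing, which is why it scales to every player's machine; the second path is the per-chunk, per-frame work that real physics demands.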

A far better example of the kind of physics a CPU can do would be FlatOut: Ultimate Carnage. Just look at all of the debris flying around that actually has physics interactions:

som7lmVkrX4

... or you could look at Crysis for several examples of real physics calculations as opposed to canned model swapping and preset animations:

hGrMIJirJPY

I know that isn't in real-time, but the engine can do it in real time with enough rendering and processing power. You won't see anything like this in Bad Company 2. Absolutely nothing close to it.

BC2 is a great game, but it doesn't do anything that proves we don't need GPU PhysX. In fact, with GPU PhysX they could have made the walls actually fall apart dynamically, but it wouldn't be usable for everyone, so that's obviously not realistic at this point. The game is already a multicore CPU hog. If it were doing more real physics, it wouldn't even be playable.