PDA

View Full Version : Ghostbusters Physics Video



jAkUp
04-15-09, 02:14 AM
:D

http://www.hardocp.com/news.html?news=Mzg5NzksLCxoZW50aHVzaWFzdCwsLDE=

trivium nate
04-15-09, 02:38 AM
boring

NaitoSan
04-15-09, 03:12 AM
Wow, that's amazing, and glad to see they're taking advantage of the SPUs in the PS3. Also, this should be in the console thread since it's about the PS3. :p

I can't wait for this game on PC.

PeterJensen
04-15-09, 04:12 AM
wow nice

Buio
04-15-09, 04:44 AM
Not that impressive; it's about what's already possible with a multicore CPU.

SH64
04-15-09, 05:14 AM
Nice!

walterman
04-15-09, 08:35 AM
No GPU needed. Allowed our GPU to do what it should be doing - graphics.

Agree on this :)

LordJuanlo
04-16-09, 04:57 AM
Both videos are impressive, and the last one, using just CPU power, is amazing; it clearly shows that a good CPU can do a lot for physics effects if you code your engine properly.

walterman
04-16-09, 09:50 AM
Another cool vid: http://www.viddler.com/explore/HardOCP/videos/37/

Running 1920x1200 with a GTX280 & a 3.2 GHz Quad.

Loving it (nana2)

NarcissistZero
04-16-09, 10:42 AM
Another cool vid: http://www.viddler.com/explore/HardOCP/videos/37/

Running 1920x1200 with a GTX280 & a 3.2 GHz Quad.

Loving it (nana2)

AWESOME.

People are going to complain, though. Putting so much on the processor, like GTA4 did, means that typical PC gamers, who worry about the GPU and not much else, will get slowdowns when physics-heavy stuff is happening. I can hear the "poorly optimized crap" calls already.

Buio
04-16-09, 11:24 AM
No GPU needed. Allowed our GPU to do what it should be doing - graphics. Agree on this :)

I don't agree.

Much of it is because of how the CPU vs. the GPU is evolving. While the CPU market is also moving towards more cores, it is moving extremely slowly. In 2006 Intel released its first quad-core CPU, and late this year or next we should get a 6-core CPU. Compare that to a GPU from 2006 versus today: the GPU is moving almost exponentially, while the CPU curve is fairly flat.

It is true that a single GPU has a huge graphics load, and potential GPU physics has to be balanced against that. But consider how many ALUs a GPU will have in a few years (ATI has 800 today). And maybe multi-GPU will be common.

Physics calculations scale really well with the number of cores, AFAIK. Sure, a single ALU is much, much weaker than a full CPU core, but the CPU still doesn't compare given the sheer number of ALUs and their potential for big leaps in the future.

Even Larrabee is a sign that Intel knows the CPU is locked into slow development compared to GPUs.
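
The scaling point above can be sketched in a few lines: rigid-body or particle integration is data-parallel, so the work array splits cleanly into independent slices, one per core (or, in the GPU case, one per group of ALUs). This is a minimal illustrative sketch, not code from any engine mentioned in the thread; all names are made up, and in CPython real speedups need processes or a GIL-releasing numeric kernel rather than plain threads.

```python
from concurrent.futures import ThreadPoolExecutor

DT = 0.016       # one 60 Hz frame
GRAVITY = -9.81

def step_chunk(chunk):
    """Integrate one slice of particles independently (explicit Euler)."""
    out = []
    for x, y, vx, vy in chunk:
        vy += GRAVITY * DT
        out.append((x + vx * DT, y + vy * DT, vx, vy))
    return out

def step_parallel(particles, workers=4):
    """Split the particle list into one slice per worker.

    No slice touches shared state, which is exactly why this kind of
    physics work scales with core count, and in the same spirit with
    the ALU count of a GPU.
    """
    size = max(1, len(particles) // workers)
    chunks = [particles[i:i + size] for i in range(0, len(particles), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(step_chunk, chunks)
    return [p for chunk in results for p in chunk]
```

Because each slice is independent, the parallel result is identical to a serial pass over the whole list; only the wall-clock time changes with the worker count.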

MaxFX
04-16-09, 11:35 AM
Well, there we have it. People will soon ***** again that blah blah their super-roxxors hardware can't run it, the end of PC gaming is coming. I can already hear them; as you said, Velvet, "bad optimizations" and all that crap will follow like a storm.

Well, the fuk... devs should have used nVidia's PhysX instead!!

LordJuanlo
04-16-09, 12:13 PM
Why nVidia PhysX? I'd rather have a graphics card (or two) 100% dedicated to graphics and my eight CPU threads each busy at 80-90% than have 1 CPU core at 80% and 7 idle threads while the GPU suffers doing all the work.

If they can give us such spectacular physics effects, why do you complain? Why do you want them done by the GPU? They look comparable to the Ageia demos, IMO.

Buio
04-16-09, 12:24 PM
Why nVidia PhysX? I'd rather have a graphics card (or two) 100% dedicated to graphics and my eight CPU threads each busy at 80-90% than have 1 CPU core at 80% and 7 idle threads while the GPU suffers doing all the work.

If they can give us such spectacular physics effects, why do you complain? Why do you want them done by the GPU? They look comparable to the Ageia demos, IMO.

Why? Because you didn't read my post.

The Ghostbusters engine should have been able to do this in 2006, albeit somewhat slower.

The GPU has the potential to do physics in the future that can blow away the stuff they show in that video. And I'm not talking about PhysX, which is just one API; I'm talking generally.

LordJuanlo
04-16-09, 01:11 PM
Well, with the GPU maybe the effects could blow away what we have seen in those videos, but for today's games, which must also run on the 360 and PS3 with their limited capabilities, I think those CPU effects are top notch.

I think it's good that a game makes good use of CPU power if you have plenty to spare. Why would you want 7 idle threads on a Core i7? I don't encode several 1080p videos in the background while I'm gaming, so I prefer that all that CPU power be used by the game. With the GPU freed from physics calculations, you could enable higher levels of AA and AF.

The future is the future: with OpenCL and DirectX 11, GPU-accelerated physics will be standard. But today, if a CPU-based physics engine can deliver those results, I embrace it. Both ATI and nVidia users will enjoy some terrific physics effects; they won't be limited to green GPUs like in Mirror's Edge, so it means more happy customers. BTW, the GPU-accelerated PhysX effects seen in Mirror's Edge don't look better than those in the Ghostbusters video.

MaxFX
04-16-09, 01:22 PM
'Cause everyone has a Core i7, right?

Well, I don't, and I don't plan to upgrade anytime soon, as my Q6600 is more than plenty of CPU, and GPU physics can blow away anything your precious Core i7 can even dream of ;)

walterman
04-16-09, 02:17 PM
I don't agree.

Much of it is because of how the CPU vs. the GPU is evolving. While the CPU market is also moving towards more cores, it is moving extremely slowly. In 2006 Intel released its first quad-core CPU, and late this year or next we should get a 6-core CPU. Compare that to a GPU from 2006 versus today: the GPU is moving almost exponentially, while the CPU curve is fairly flat.

It is true that a single GPU has a huge graphics load, and potential GPU physics has to be balanced against that. But consider how many ALUs a GPU will have in a few years (ATI has 800 today). And maybe multi-GPU will be common.

Physics calculations scale really well with the number of cores, AFAIK. Sure, a single ALU is much, much weaker than a full CPU core, but the CPU still doesn't compare given the sheer number of ALUs and their potential for big leaps in the future.

Even Larrabee is a sign that Intel knows the CPU is locked into slow development compared to GPUs.

A modern GPU has more GFlops than any CPU, that is for sure.

But guess what: if you spend some GPU GFlops on your calculations, your frame rate will be lower; if you combine CPU + GPU properly, you keep your GPU working at the max frame rate, and the CPU uses all those extra cores for something useful.

Also, thanks to my past experience with old versions of the Infernal Engine, I can say that it taxes the GPU hugely.

Check this quick example from my experience:

CPU - SSE3 - 3 Threads: 76 fps
http://img16.imageshack.us/img16/7572/cpusse33threads.th.jpg (http://img16.imageshack.us/my.php?image=cpusse33threads.jpg)

GPU - CUDA - 1 Thread (= 1 GPU): 46 fps
http://img18.imageshack.us/img18/8439/gpucuda1thread.th.jpg (http://img18.imageshack.us/my.php?image=gpucuda1thread.jpg)

The GPU doing the 3D rendering & the Perlin noise calculations gives you a lower frame rate than my GPU + quad-core combination.

It's all about proper coding.

The day I'm able to run my favourite games at 2560x1600 with 4x4 SSAA at 120 fps, I'll change my mind; meanwhile, I prefer to squeeze all the computing power from all the chips in my machine, in perfect harmony. ;)
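
The CPU + GPU combination described above boils down to overlapping the two kinds of work inside a frame: kick off the CPU-side physics on a worker thread, issue the graphics work, then join before swapping frames. This is a minimal sketch of that frame loop, not the Infernal Engine's actual code; the two functions are stand-ins, with a plain list update playing the role of physics and of the GPU draw call.

```python
import threading

def simulate_physics(state):
    # CPU-side work for frame N+1: runs on spare cores while the GPU draws
    state["next"] = [v + 1 for v in state["frame"]]

def render_frame(state):
    # stand-in for the GPU draw call for the current frame N
    state["rendered"] = list(state["frame"])

def run_frame(state):
    """One overlapped frame: start physics for frame N+1 on a worker
    thread, submit frame N for rendering, then join before swapping."""
    worker = threading.Thread(target=simulate_physics, args=(state,))
    worker.start()
    render_frame(state)   # the GPU stays busy with graphics only
    worker.join()         # next frame's physics is ready
    state["frame"] = state["next"]
```

The point of the structure is that, per frame, the cost is max(physics, rendering) rather than their sum, which is why the CPU + quad-core combination can beat doing everything on the GPU.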

LordJuanlo
04-17-09, 02:45 AM
'Cause everyone has a Core i7, right?

If you look at the videos, they weren't done using a Core i7, but a C2Q. I just put my precious i7 as an example ;)

But guess what: if you spend some GPU GFlops on your calculations, your frame rate will be lower; if you combine CPU + GPU properly, you keep your GPU working at the max frame rate, and the CPU uses all those extra cores for something useful.

That's absolutely right; who would prefer 30 fps with life-like physics effects over 70 fps with physics effects like the ones we have seen in the videos?

I prefer to squeeze all the computing power from all the chips in my machine, in perfect harmony

That's exactly my point: I don't want precious cycles wasted idling, because I didn't pay only for my GPU but also for my CPU, and I want it working at near 100% of its capabilities.

Loreman
04-17-09, 03:21 AM
To be honest, I didn't see anything in that physics demonstration that looks any more complex than, or different from, the stuff I already saw in Crysis ages ago.

And ragdolls have already been done very well in older games, in fact better than this: the original Hitman had awesome ragdoll effects, and of course there's Half-Life 2.

I would be more interested in seeing some actual gameplay. Gameplay is where it's at. I know the graphics and effects for this game are going to be significantly hobbled for PC by the cross platform development with consoles so I just want good gameplay.

jcrox
04-17-09, 07:41 AM
I can't wait for this game to come out :captnkill:

Buio
04-17-09, 05:19 PM
The GPU doing the 3D rendering & the perlin noise calculations gives you a lower frame rate than my GPU + Quad Core combination.

It's all about proper coding.

They day in which i'll be able to run my fav games at 2560x1600 with SSAA 4x4 at 120fps, i'll change my mind, meanwhile, i will prefer to squeeze all the computing power from all the chips of my machine, in a perfect harmony. ;)

OK, I was wrong to say the videos don't look impressive, because they are very good looking. I like what I see in them. So I'm looking forward to Ghostbusters, and even if it doesn't seem likely, I do hold my thumbs for BR3.

What I'm saying is just that, for the PC, I think CPU development is stale in comparison to the GPU's. In the video they talk about running on all 4 cores, but the thing is that CPUs have had 4 cores for years; physics of the caliber they show has already been possible with enough focus on it. The reason I changed my mind and now think moving physics to the GPU is good is that it evolves so fast.

Anyway, for games, graphics is what sells, and therefore AI and physics always take a back seat.

walterman
04-17-09, 07:46 PM
... So I'm looking forward to Ghostbusters, and even if it doesn't seem likely, I do hold my thumbs for BR3...

So do I ;)

Yaboze
04-18-09, 05:24 PM
I was skeptical, but damn, this game looks pretty good now. I can't wait!

Ancient76
04-21-09, 04:50 AM
Why nVidia PhysX? I'd rather have a graphics card (or two) 100% dedicated to graphics and my eight CPU threads each busy at 80-90% than have 1 CPU core at 80% and 7 idle threads while the GPU suffers doing all the work.

If they can give us such spectacular physics effects, why do you complain? Why do you want them done by the GPU? They look comparable to the Ageia demos, IMO.

Try simulating dynamic fluids on a CPU. The CPU wasn't designed to accelerate physics.

LordJuanlo
04-21-09, 05:10 AM
You're right, but this game won't have dynamic fluids, so the CPU will be enough for what we've seen in the videos.