
View Full Version : I have seen the future....



Toss3
03-24-10, 06:43 AM
I'm an nV fan, but a 130 minimum difference per card is a lot. If you want two cards plus an additional PhysX card (9800GT), you're talking 350 more, just for PhysX. Is it that good?? Throw in a very hot-running card (?) and the 5870 looks pretty appealing, at least to me.

Edit: Metro 2033 scaling is gonna be the decider for me.

Remember that you can always hack PhysX to work with an ati card. :)

http://physxinfo.com/news/568/ati-hd-5870-nv-gt220-physx-benchmarks/

Revs
03-24-10, 06:48 AM
Remember that you can always hack PhysX to work with an ati card. :)

http://physxinfo.com/news/568/ati-hd-5870-nv-gt220-physx-benchmarks/

If I end up going ATi, I'll need that. Cheers :)

Iruwen
03-24-10, 07:58 AM
When you shoot at bricks with a grenade launcher, they blow apart violently. They don't topple over like a deck of cards the way they do in the PhysX video.

So how many grenade launchers do you own? :p
I think they're both pretty far off when it comes to realism, but that's intentional, since realism is boring. Realistic explosions would be boring too.

nekrosoft13
03-24-10, 09:55 AM
Realism in an unreal game? Yep, sure.

Razor1
03-24-10, 11:29 AM
Read the thread on PhysX and see how important it really is. 3D Vision requires a new monitor, Surround three new monitors and a second card. CUDA is worthless unless you fold. All I see are marketing gimmicks. No one cares about Eyefinity either. Where are the real performance numbers? You know, the thing we actually buy these cards for in the first place?
Why do we even let nvidia focus members post on these forums? Corporate shills are what they are. Stick to the nvidia official forums where you belong!

CUDA's foundation isn't gaming; it's made for the programmability of different kinds of applications on GPUs. There are a lot of things in the world outside of gaming that need performance a CPU, or even multiple CPUs, can't provide without a huge cost-to-return ratio. As a side note, CUDA comes in handy for porting PC applications to GPUs because of its similarity in programming calls and language, and that's why PhysX was ported to CUDA: it gave immediate benefits to owners of nV products. That was a better way to showcase their technology than porting to OpenCL or DirectCompute, since we would have had to wait a year or so longer for those. On ATi's front it's most likely going to be more than 2 years.
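
To give a rough idea of what that similarity looks like, here's a minimal CUDA sketch (the kernel and all names are made up purely for illustration): the body is ordinary C, which is exactly why porting existing C/C++ code is comparatively painless.

__global__ void scale(float *data, float factor, int n)
{
    // plain C syntax, but executed by thousands of GPU threads at once;
    // each thread computes its own index and handles one element
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

// host side: a single call fans the work out across 16 blocks of 256 threads
// scale<<<16, 256>>>(d_data, 2.0f, 4096);

A C programmer can read that almost immediately, whereas OpenCL or DirectCompute ask them to pick up a new API surface first.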

Toss3
03-24-10, 11:37 AM
CUDA's foundation isn't gaming; it's made for the programmability of different kinds of applications on GPUs. There are a lot of things in the world outside of gaming that need performance a CPU, or even multiple CPUs, can't provide without a huge cost-to-return ratio. As a side note, CUDA comes in handy for porting PC applications to GPUs because of its similarity in programming calls and language, and that's why PhysX was ported to CUDA: it gave immediate benefits to owners of nV products. That was a better way to showcase their technology than porting to OpenCL or DirectCompute, since we would have had to wait a year or so longer for those. On ATi's front it's most likely going to be more than 2 years.

But how exactly does CUDA benefit the average customer?

Razor1
03-24-10, 11:44 AM
But how exactly does CUDA benefit the average customer?

Do you ever use Photoshop? Antivirus programs will benefit in the future; any application with heavy parallel computational needs will benefit. There aren't many out there yet, that is true, but propagating new techniques and features into consumer software always takes time. It's not something that happens overnight. CUDA lets programmers leverage their existing programming knowledge; OpenCL and DirectCompute shaders don't, because an application programmer, who most likely hasn't used any graphics API extensively (probably touched one in college or something like that), will first have to learn the API.
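
For what it's worth, here's roughly what the whole host-side flow looks like (a hypothetical toy, not from any real application): allocate, copy, call, copy back, all in C clothing, which is the point about leveraging existing knowledge.

// Toy CUDA program (hypothetical example): no graphics API concepts needed.
#include <cuda_runtime.h>
#include <stdio.h>

__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main(void)
{
    const int n = 1024;
    float host[1024];
    for (int i = 0; i < n; i++) host[i] = (float)i;

    float *dev;
    cudaMalloc(&dev, n * sizeof(float));                              // GPU buffer
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice); // upload

    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);                    // kernel launch
    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost); // download
    cudaFree(dev);

    printf("host[3] = %f\n", host[3]);                                // expect 6.0
    return 0;
}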

Rollo
03-24-10, 12:05 PM
Read the thread on PhysX and see how important it really is.
Why do we even let nvidia focus members post on these forums? Corporate shills are what they are. Stick to the nvidia official forums where you belong!

1. Read the 100s of angry posts from ATi owners who can't use PhysX and see how important it really is.

2. "We even let"? I was unaware you had any authority to speak for this forum, Toss3? I thought you were just another forum member like me; what's your association with the forum that you speak for it? Or are you just playing make-believe mod to sound tough? :thumbdwn:

shadow001
03-24-10, 12:27 PM
1. Read the 100s of angry posts from ATi owners who can't use PhysX and see how important it really is.

2. "We even let"? I was unaware you had any authority to speak for this forum, Toss3? I thought you were just another forum member like me; what's your association with the forum that you speak for it? Or are you just playing make-believe mod to sound tough? :thumbdwn:



Angry that I don't have GPU PhysX support?.....On the maybe 10 games that use it? Nvidia bought Ageia a little over 2 years ago now, so we can't say it's been hugely popular.....Hardly.


I've seen the videos of games that actually use it and compared them to the same games (Mirror's Edge, Batman: Arkham Asylum), which I actually own and ran on my system, and the differences are fairly subtle to say the least, and don't really change the overall gameplay.


Developers would need to get way more aggressive with GPU physics in terms of actually enhancing the gameplay/interactivity of the game itself, beyond what's possible with CPU physics, and pull that off with acceptable performance using a single GPU for both physics and graphics work; then they'd have something to really boast about over the competition.


The current situation is nowhere near that right now.

Vardant
03-24-10, 12:35 PM
Since when is being an NV Focus Group member a bad thing? There were and still are great people out there that helped the community and, for that, were offered the position. If ATI or NV approached anyone in here with a similar offer, who would turn it down?

And it's not like they are hiding, not anymore that is :D

There are at least a dozen people working for Intel or ATI/AMD who are posting things aimed at hurting the competition, and you don't even know it.

Unless the people that hate PhysX just because it is owned by NV stop posting or come to their senses, it's all pointless. Look at the XS forums and the thread about Havok in the News section. Doing nothing is considered better than doing anything by some, just because it involves ATI...:thumbdwn:

shadow001
03-24-10, 12:59 PM
Since when is being an NV Focus Group member a bad thing? There were and still are great people out there that helped the community and, for that, were offered the position. If ATI or NV approached anyone in here with a similar offer, who would turn it down?

And it's not like they are hiding, not anymore that is :D

There are at least a dozen people working for Intel or ATI/AMD who are posting things aimed at hurting the competition, and you don't even know it.

Unless the people that hate PhysX just because it is owned by NV stop posting or come to their senses, it's all pointless. Look at the XS forums and the thread about Havok in the News section. Doing nothing is considered better than doing anything by some, just because it involves ATI...:thumbdwn:


Given what we've seen so far from GPU physics, and knowing that in all of those games where it is being used I had Task Manager up and running and only saw 2~3 CPU threads getting used for the most part, while the other 5 CPU threads were doing jack **** basically (i7 + hyperthreading enabled here), it would be interesting to see the same physics calculations attempted on those 5 idle CPU threads, to see if they could still handle the load, and then see which is the better solution in overall performance.


What I'm talking about is making an educated decision based on observation and pushing the limits of the hardware I already own, not simply using a physics API that limits itself to looking for an Nvidia video card in the system and never allows the option of the CPU handling the workload, even if it's a high-end one with more than half its resources unused to begin with.


As it is, some users who bought those Ageia physics cards are already pretty furious that they can't use a card they paid $300 for at the time, since the latest PhysX updates don't allow it to be used anymore, and it's a better physics processor than an Nvidia GPU, since all the physics calculations are supported in actual hardware, not partially handled by the CPU.


Or how about Nvidia not allowing users to have an ATI card for the graphics portion and an Nvidia card for the physics calculations....Nope, not allowed either, even though it was in the past. So Nvidia wants to make sure you buy their hardware exclusively, for no real reason except making more money, and they shot themselves in the foot in the process, hence why there are so few games using GPU-accelerated physics, even after 2 years.

NoWayDude
03-24-10, 01:36 PM
Ok, I know this is silly season and all, but could someone in their right mind explain something to me?

Why would Nvidia/ATI/Intel/AMD, having something that is proprietary tech, let all of the other guys have it for....free?

Am I mistaken, or is it not the goal of these companies to ... make money?

In regards to Toss3's remarks about the Nvidia forum focus group, can I ask you, what makes some people so overzealous about ATI?

At least with the focus members we know why; what about the ATI ones?

I see more misinformation from ATI fans about Nvidia than from Nvidia focus members about ATI. Shall we believe that this is all for the love of one company?

Sorry, I stopped being that naive years ago.

Razor1
03-24-10, 01:36 PM
Given what we've seen so far from GPU physics, and knowing that in all of those games where it is being used I had Task Manager up and running and only saw 2~3 CPU threads getting used for the most part, while the other 5 CPU threads were doing jack **** basically (i7 + hyperthreading enabled here), it would be interesting to see the same physics calculations attempted on those 5 idle CPU threads, to see if they could still handle the load, and then see which is the better solution in overall performance.



The i7 has a max of 8 threads in parallel; if you think about it, a GPU has hundreds of threads in parallel. If those effects hurt a GPU, they're going to hurt a CPU even more. Most games are threaded this way: one thread for graphics and game needs, one thread for physics needs, and one thread for AI needs. So yeah, you could have up to 6 threads for physics, but 6 vs. hundreds is quite a big difference ;). Also, hyperthreading is nice but doesn't equate to the same amount of parallelism as extra physical cores, so you have to factor that in.
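
As a toy sketch of that gap (hypothetical names, just to make the numbers concrete): a kernel like the one below puts one GPU thread on every particle, while the same loop on an i7 gets split across at most 8 hardware threads.

// Hypothetical particle update: one GPU thread per particle.
__global__ void integrate(float *pos, float *vel, float dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        pos[i] += vel[i] * dt;   // every particle advances in parallel
}

// 65,536 particles -> 256 blocks x 256 threads in flight:
// integrate<<<256, 256>>>(d_pos, d_vel, 0.016f, 65536);
// a 4-core/8-thread i7 would run the equivalent loop 8 ways at best.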


What I'm talking about is making an educated decision based on observation and pushing the limits of the hardware I already own, not simply using a physics API that limits itself to looking for an Nvidia video card in the system and never allows the option of the CPU handling the workload, even if it's a high-end one with more than half its resources unused to begin with.


The developers have the choice here, not you or anyone else. It's a good thought, no doubt, but realistically speaking, look above; I'm sure developers have looked into it to some degree, since PhysX is multithreaded and just needs to be used that way.

As it is, some users who bought those Ageia physics cards are already pretty furious that they can't use a card they paid $300 for at the time, since the latest PhysX updates don't allow it to be used anymore, and it's a better physics processor than an Nvidia GPU, since all the physics calculations are supported in actual hardware, not partially handled by the CPU.

Unfortunately there's not much choice there; early adopters always have that to take into consideration, especially since Ageia didn't look too viable at first because of their delays in hardware and software.

Or how about Nvidia not allowing users to have an ATI card for the graphics portion and an Nvidia card for the physics calculations....Nope, not allowed either, even though it was in the past. So Nvidia wants to make sure you buy their hardware exclusively, for no real reason except making more money, and they shot themselves in the foot in the process, hence why there are so few games using GPU-accelerated physics, even after 2 years.

Sux to be an ATi user; it would be nice, I agree, but nV owns what they make, and they are in the business of making money.

Iruwen
03-24-10, 02:08 PM
Ok, I know this is silly season and all, but could someone in their right mind explain something to me?

Why would Nvidia/ATI/Intel/AMD, having something that is proprietary tech, let all of the other guys have it for....free?

Am I mistaken, or is it not the goal of these companies to ... make money?

In regards to Toss3's remarks about the Nvidia forum focus group, can I ask you, what makes some people so overzealous about ATI?

At least with the focus members we know why; what about the ATI ones?

I see more misinformation from ATI fans about Nvidia than from Nvidia focus members about ATI. Shall we believe that this is all for the love of one company?

Sorry, I stopped being that naive years ago.

+1

Toss3
03-24-10, 02:32 PM
1. Read the 100s of angry posts from ATi owners who can't use PhysX and see how important it really is.

2. "We even let"? I was unaware you had any authority to speak for this forum, Toss3? I thought you were just another forum member like me; what's your association with the forum that you speak for it? Or are you just playing make-believe mod to sound tough? :thumbdwn:

I don't mind nvidia focus members being here, but due to their obvious bias towards one company, their arguments become very lopsided. jAkUp did an amazing job of marketing nvidia products without ever talking crap about ATI. You, however, seem to have a need to talk crap about them in every post you make.
I never tried to sound like a mod, nor did I ever think my comment would be taken as anyone's but my own.


In regards to Toss3 remarks about the Nvidia forum focus group, can I ask you, what makes some people be over zealous about ATI?


I have no idea, but I guess some people have a need to justify their purchase.

I might add that nvidia hasn't been marketing themselves very well lately and it shows.

SirPauly
03-24-10, 02:49 PM
This was an important data point on the popularity of physics middleware among developers.

http://bulletphysics.org/wordpress/?p=88

If a title offers more advanced physics using the GPU? Great! If a title offers more advanced physics using the CPU and taking advantage of many cores? Great!

For me, with my system, I'll enjoy whatever the developers choose to use. I personally can't force my idealism on developers or IHVs, but I can build a flexible system that gives me the most immersion.

shadow001
03-24-10, 02:52 PM
The i7 has a max of 8 threads in parallel; if you think about it, a GPU has hundreds of threads in parallel. If those effects hurt a GPU, they're going to hurt a CPU even more. Most games are threaded this way: one thread for graphics and game needs, one thread for physics needs, and one thread for AI needs. So yeah, you could have up to 6 threads for physics, but 6 vs. hundreds is quite a big difference ;). Also, hyperthreading is nice but doesn't equate to the same amount of parallelism as extra physical cores, so you have to factor that in.



The developers have the choice here, not you or anyone else. It's a good thought, no doubt, but realistically speaking, look above; I'm sure developers have looked into it to some degree, since PhysX is multithreaded and just needs to be used that way.


Doesn't that largely depend on what the developers have settled on as the minimum configuration required to run their game, though? It's in their interest to make their game run on the maximum number of system configurations possible, in order to potentially increase sales, and I personally haven't seen any released game even list an i7 processor as the recommended system setup for running it optimally...At least not yet.


The most I've seen as recommended specifications is a quad-core processor, with no hyperthreading abilities mentioned at all, so we have yet to see exactly how well a CPU with 8 threads would actually handle it, at least publicly, and people should be informed about it before deciding whether they want a physics card or not.



Unfortunately there's not much choice there; early adopters always have that to take into consideration, especially since Ageia didn't look too viable at first because of their delays in hardware and software.


It would also be nice to know whether current Nvidia GPUs can actually outperform the Ageia physics processor at physics calculations, rather than Nvidia dropping support without even thinking twice about it, and I don't need to tell you the bad impression that left on those who did buy those physics cards, which weren't that cheap at the time. Everybody called it a gimmick then, and now you're telling me it's supposed to be considered otherwise because Nvidia bought the company?....I don't think so.


If they're so confident that their current GPUs can outperform that Ageia physics processor, then prove it in actual benchmarks to show people why they dropped support for it....Put that information out in the open, rather than playing this cloak-and-dagger crap and just saying it's better to use the GPU for it and that's the end of it.



Sux to be an ATi user; it would be nice, I agree, but nV owns what they make, and they are in the business of making money.

Like I said, it's been out for 2 years, there are maybe 10 games using GPU physics right now, and I've seen the differences both with and without GPU physics in some games; it's not a night-and-day difference anyhow, at least for now.

Toss3
03-24-10, 03:04 PM
Since when is being an NV Focus Group member a bad thing? There were and still are great people out there that helped the community and, for that, were offered the position. If ATI or NV approached anyone in here with a similar offer, who would turn it down?

I would never give up the right to express my own opinion in exchange for free products. If I wanted free hardware I'd start reviewing it, thus helping the community instead of doing the opposite.

And it's not like they are hiding, not anymore that is :D

There are at least a dozen people working for Intel or ATI/AMD who are posting things aimed at hurting the competition, and you don't even know it.

I've seen AMD employees post on XtremeSystems, and they've all gotten pretty beaten down by others.

Unless the people that hate PhysX just because it is owned by NV stop posting or come to their senses, it's all pointless. Look at the XS forums and the thread about Havok in the News section. Doing nothing is considered better than doing anything by some, just because it involves ATI...:thumbdwn:

You need to understand that people don't hate PhysX in any way; people just don't like being lied to. If nvidia were to enable proper multi-core support in their drivers and let ATI users buy an nvidia product to run PhysX, most people wouldn't have a problem with it. Right now, however, it is only being used to hurt gamers who aren't using nvidia GPUs.
PhysX is a glimpse of what the future of gaming holds in store for us, and the demos clearly show that.

SirPauly
03-24-10, 03:09 PM
Doesn't that largely depend on what the developers have settled on as the minimum configuration required to run their game, though? It's in their interest to make their game run on the maximum number of system configurations possible, in order to potentially increase sales, and I personally haven't seen any released game even list an i7 processor as the recommended system setup for running it optimally...At least not yet.


The most I've seen as recommended specifications is a quad-core processor, with no hyperthreading abilities mentioned at all, so we have yet to see exactly how well a CPU with 8 threads would actually handle it, at least publicly, and people should be informed about it before deciding whether they want a physics card or not.


It would also be nice to know whether current Nvidia GPUs can actually outperform the Ageia physics processor at physics calculations, rather than Nvidia dropping support without even thinking twice about it, and I don't need to tell you the bad impression that left on those who did buy those physics cards, which weren't that cheap at the time. Everybody called it a gimmick then, and now you're telling me it's supposed to be considered otherwise because Nvidia bought the company?....I don't think so.


If they're so confident that their current GPUs can outperform that Ageia physics processor, then prove it in actual benchmarks to show people why they dropped support for it....Put that information out in the open, rather than playing this cloak-and-dagger crap and just saying it's better to use the GPU for it and that's the end of it.


Like I said, it's been out for 2 years, there are maybe 10 games using GPU physics right now, and I've seen the differences both with and without GPU physics in some games; it's not a night-and-day difference anyhow, at least for now.


How objective is it, really, to compare a single GPU doing both rendering and physics to a PPU doing just physics?

An Ageia PPU is about as powerful as a 9600 GT, to me.

shadow001
03-24-10, 03:16 PM
How objective is it, really, to compare a single GPU doing both rendering and physics to a PPU doing just physics?

An Ageia PPU is about as powerful as a 9600 GT, to me.


I wasn't thinking of doing it that way, actually, but rather having a dedicated graphics card in each setup, where one uses an additional Nvidia GPU for physics and the other system uses the Ageia physics card to calculate that part of the workload....Make it as fair as possible and see what comes out of it in the end.


If the dual Nvidia GPU setup still beats the single Nvidia GPU + Ageia card setup in the games that do have support for GPU physics, then it lends evidence as to why they dropped support for the Ageia physics card.


It's not hard to conduct such a test.

Toss3
03-24-10, 03:18 PM
How objective is it, really, to compare a single GPU doing both rendering and physics to a PPU doing just physics?
This is how nvidia markets it, and the reason they don't let ATi users run an nvidia card alongside their GPU: they sell it to people who think that one GPU is enough. Nowhere on the box does it say "to enjoy the full benefits of PhysX, nvidia recommends running a second card alongside the $400 one you just purchased". A GPU+CPU setup is always going to be a better option than just one GPU doing everything.

An Ageia PPU is about as powerful as a 9600 GT, to me.

Sounds about right. :)

EDIT: Found a chart comparing the Ageia PPU with a 9600GT in Mirror's Edge:

http://www.nvnews.net/vbulletin/attachment.php?attachmentid=39845&stc=1&d=1269464259

Those CPU PhysX numbers look horrible.

EDIT2: They also recently released their Metro 2033 PhysX benchmarks:

http://www.nvnews.net/vbulletin/attachment.php?attachmentid=39846&stc=1&d=1269464302

This is the first title to properly utilize multi-core support for PhysX! Nvidia seems to be listening to the consumers! :thumbsup:
It would have been interesting to see how a dedicated PhysX card affected the performance.

Razor1
03-24-10, 03:34 PM
Doesn't that largely depend on what the developers have settled on as the minimum configuration required to run their game, though? It's in their interest to make their game run on the maximum number of system configurations possible, in order to potentially increase sales, and I personally haven't seen any released game even list an i7 processor as the recommended system setup for running it optimally...At least not yet.

http://ixbtlabs.com/articles3/cpu/archspeed-2009-4-p2.html

They also stated quad-core systems showed similar gains; hyperthreading doesn't give the same performance benefit as a second CPU, so those 8 threads are, well, not that great, let's say more like 5 to 6 threads total.

The most I've seen as recommended specifications is a quad-core processor, with no hyperthreading abilities mentioned at all, so we have yet to see exactly how well a CPU with 8 threads would actually handle it, at least publicly, and people should be informed about it before deciding whether they want a physics card or not.

Look above; hyperthreading is a solution that isn't ideal for the problem.

It would also be nice to know whether current Nvidia GPUs can actually outperform the Ageia physics processor at physics calculations, rather than Nvidia dropping support without even thinking twice about it, and I don't need to tell you the bad impression that left on those who did buy those physics cards, which weren't that cheap at the time. Everybody called it a gimmick then, and now you're telling me it's supposed to be considered otherwise because Nvidia bought the company?....I don't think so.

A 9600GT does well against the Ageia card, and the 9600GT at launch cost more than $100 less than Ageia's top-end card, so the two are fairly competitive.

If they're so confident that their current GPUs can outperform that Ageia physics processor, then prove it in actual benchmarks to show people why they dropped support for it....Put that information out in the open, rather than playing this cloak-and-dagger crap and just saying it's better to use the GPU for it and that's the end of it.

Look around; there are benchmarks out there.

Like I said, it's been out for 2 years, there are maybe 10 games using GPU physics right now, and I've seen the differences both with and without GPU physics in some games; it's not a night-and-day difference anyhow, at least for now.

It takes time; we will see in the future. There are only 2 DX11 games; so what, ya still buy the hardware, right?

shadow001
03-24-10, 03:55 PM
http://ixbtlabs.com/articles3/cpu/archspeed-2009-4-p2.html

They also stated quad-core systems showed similar gains; hyperthreading doesn't give the same performance benefit as a second CPU, so those 8 threads are, well, not that great, let's say more like 5 to 6 threads total.


Look above; hyperthreading is a solution that isn't ideal for the problem.


A 9600GT does well against the Ageia card, and the 9600GT at launch cost more than $100 less than Ageia's top-end card, so the two are fairly competitive.


Look around; there are benchmarks out there.


It takes time; we will see in the future. There are only 2 DX11 games; so what, ya still buy the hardware, right?


I'm talking about the physics calculations alone, not entire programs here, and I'll look to see if there are any benchmark results comparing the Ageia card versus an Nvidia GPU on physics workloads exclusively.


As for DX11 games, there's Dirt 2, Aliens vs. Predator, BattleForge, Battlefield: Bad Company 2 and a couple of others, but like I stated before, the main reason I bought the cards is the triple-display support in games, where there are already close to 30 games on the supported list; it doesn't require developer support, and the cards have only been out on the market for 6 months, compared to 2 years for GPU PhysX.


You were saying?...;):D

Toss3
03-24-10, 03:56 PM
It takes time; we will see in the future. There are only 2 DX11 games; so what, ya still buy the hardware, right?

Posted the benchmarks above, but wanted to comment on this little snippet: DX11 is as worthless as PhysX until people in both camps can run it.

SirPauly
03-24-10, 03:57 PM
This is how nvidia markets it, and the reason they don't let ATi users run an nvidia card alongside their GPU: they sell it to people who think that one GPU is enough. Nowhere on the box does it say "to enjoy the full benefits of PhysX, nvidia recommends running a second card alongside the $400 one you just purchased". A GPU+CPU setup is always going to be a better option than just one GPU doing everything.

Always thought nVidia simply offered PhysX to try to differentiate themselves from their competitors and to offer value to their customers. If one doesn't like it or can't find the right balance, they can disable it; or they can try to find the right balance with a single GPU, or invest in a discrete PhysX card. For someone like me, my dated 8800 GT, which was collecting dust as a paperweight, was reborn to offer a bit more gaming. Sure, I would like to see more compelling content and GPU PhysX ported to OpenCL, but while I wait for more maturity, I can enjoy some content now at no extra cost. I think it's neat to see a dated GPU used to improve gaming; sure wish I could do that with a dated CPU.