
View Full Version : How much better will the performance of Crysis get?



3DBrad
10-30-07, 11:41 AM
I turned all the settings to the lowest they would go, went to some not-graphically-challenging areas, and started trying to put a load on the CPU, and the highest my quad-core got was 39% or so.
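To put that 39% figure in perspective, here is a quick sketch (my own illustrative arithmetic, not from the thread): an aggregate CPU-usage percentage is averaged across all cores, so 39% on a quad-core works out to roughly a core and a half of actual work.

```python
# Convert an aggregate CPU-usage percentage (as shown by Task Manager,
# which averages across all cores) into "fully busy core" equivalents.

def busy_core_equivalent(total_usage_pct: float, cores: int) -> float:
    """39% total on a quad-core = 0.39 * 4 = ~1.56 cores of load."""
    return total_usage_pct / 100.0 * cores

print(busy_core_equivalent(39, 4))  # -> 1.56
```

In other words, the demo is keeping less than two of the four cores busy.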

Unreal Engine 3 games are doing MUCH better than this, so I surely hope Crytek has this fixed in the final release.

Also, those running around with about $1100 worth of GPUs are also getting unacceptable framerates.

Here's hoping Nvidia brings out another monster of a card like the 8800gtx was.

DirkGently
10-30-07, 12:48 PM
Not much, I imagine.

The game has gone gold, so further development work will be restricted to patches released in the future.

Driver optimisations may give some improvements, but don't expect a miracle cure for low framerates.

agentkay
10-30-07, 12:51 PM
UE3 games do have higher CPU usage and are usually really well optimized (minus R6: Vegas), but I actually think it's to Crysis's advantage to have a lower total CPU load, because it leaves headroom for more physics. Let's say Crysis used 95% of all the CPU power you have and still ran at the same framerate with the same features; don't you think that would actually be worse? CryEngine2 and its own physics engine might very well be more efficient than UE3 plus the Ageia API, but the only way to test that would be to build a similar level in both editors and test the limits of possible physics objects vs. CPU load.

I also think the framerate is acceptable for what is being rendered here. The amount of textures, polygons, shaders, and real-time soft shadows is just crazy, and it's demanding for today's hardware; too demanding at Very High, and even more so when you go beyond it with cfg tweaks.

I'm sure there is room for improvement, and we will see it from both Crytek (they want to support the game for two years; let's hope EA lets them do it) and Nvidia. Only they know how much, so we can only wait and hope. I'm definitely looking forward to the next high-end card myself, because we finally do need it. :)

Edit: Another thing I forgot to mention is that UE3 uses a deferred renderer, but I'm not sure what type of renderer CryEngine2/Crysis uses. :confused:

SH64
10-30-07, 12:56 PM
Not much, I imagine.

The game has gone gold, so further development work will be restricted to patches released in the future.

Driver optimisations may give some improvements, but don't expect a miracle cure for low framerates.
Exactly.

sniggle
10-30-07, 01:02 PM
The funny thing is that for all we know the developers of the UT3 engine could have just put some threads to work doing nothing to pump up the CPU usage and make it LOOK like it's doing work.

Placebos are fun.

bugmeplz
10-30-07, 02:43 PM
The funny thing is that for all we know the developers of the UT3 engine could have just put some threads to work doing nothing to pump up the CPU usage and make it LOOK like it's doing work.

Placebos are fun.

So are nonsensical conspiracy theories.

FastRedPonyCar
10-30-07, 02:44 PM
I see no problem with crysis at all. I benchmark an avg of 30 fps at 1680X1050 with all the ultra high graphics settings on and I'm still rocking the E6600.

Crysis has shown through numerous benchmark tests that it's almost TOTALLY GPU-dependent; DX10 performance is hit or miss, and it hardly uses any CPU processing at all when it comes to raw performance numbers.

It's not rocket science. If you want to play Crysis with the eye candy, you've got to dump more $$ into a high-dollar video card.

While they're not night-and-day different, the GTS vs. GTX/Ultra comparisons are finally showing some differences, and benefits to the higher onboard memory count, since raising the resolution jacks up video memory usage tremendously. The guy who wrote the incrysis.com tweak guide said he was EASILY able to suck down a gig of video memory at only 1280x1024 resolution.

Here's a breakdown.

* Windows XP
* 1280x1024 (All Medium)- 300MB
* 1280x1024 (All High - textures medium only)- 300MB
* 1280x1024 (High)- 380MB
* 1280x1024 4x AA (high) - 460MB
* 1680x1050 (All Medium) - 300MB
* 1680x1050 (High) - 390MB
* 1680x1050 4x AA (All High - textures medium only) - 530MB
* 1680x1050 4x AA (High) - 580MB
* 1920x1200 (All High - textures medium only) - 380MB
* 1920x1200 (High) - 490MB
* 1920x1200 4x AA (All High - textures medium only) - 570MB
* 1920x1200 4x AA (High) - 650MB
* 1920x1200 8x AA (High) - 710MB
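As a rough illustration of why resolution and AA drive those numbers up (my own back-of-the-envelope sketch, not from the tweak guide): the render targets alone scale linearly with pixel count and MSAA sample count. Textures and geometry, which dominate the totals above, are not modeled here; the byte-per-pixel figures are assumptions.

```python
# Estimate render-target memory (color + depth/stencil buffers) for a
# given resolution and MSAA level. Assumes 4 bytes of color plus
# 4 bytes of depth/stencil per sample (8 bytes total); MSAA multiplies
# the number of samples stored per pixel.

def render_target_mb(width: int, height: int, msaa: int = 1,
                     bytes_per_sample: int = 8) -> float:
    return width * height * msaa * bytes_per_sample / (1024 ** 2)

for (w, h), aa in [((1280, 1024), 1), ((1280, 1024), 4),
                   ((1920, 1200), 4), ((1920, 1200), 8)]:
    print(f"{w}x{h} {aa}xAA: ~{render_target_mb(w, h, aa):.0f} MB")
```

Going from 1280x1024 no-AA to 1920x1200 8xAA multiplies the render-target footprint roughly fourteenfold, which is consistent with the jumps in the measured totals above.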


Unless Crytek has some graphics options they haven't implemented into the game (and which are thus missing from the config file), the only differences as far as "being scalable for tomorrow's hardware" will be increasing view distance and AA. Because right now, those are the two biggest items that kill framerate.

3DBrad
10-30-07, 03:31 PM
The funny thing is that for all we know the developers of the UT3 engine could have just put some threads to work doing nothing to pump up the CPU usage and make it LOOK like it's doing work.

Placebos are fun.


LOL, I hope your post was a JOKE!

Tim Sweeney is to programming what Michael Jordan was to basketball.

[EOCF] Tim
10-30-07, 04:23 PM
Uhm, I can remember the full version of Far Cry running a lot better than the demo; this might just be the same.

JohnDio
10-30-07, 04:44 PM
Not much, I guess. This game engine is demanding as hell even at low settings. It's worth it, though ;). Furthermore, quad cores aren't fully supported in the demo. I assume there will be a patch with full multi-core and SLI support.

Eliminator
10-30-07, 05:27 PM
LOL, I hope your post was a JOKE!

Tim Sweeney is to programming what Michael Jordan was to basketball.
oh really?... then please knock on his head to remind him that he forgot to add widescreen support

3DBrad
10-30-07, 06:00 PM
oh really?... then please knock on his head to remind him that he forgot to add widescreen support

Heh?

It'll have it, like normal.

Buio
10-30-07, 06:40 PM
Here's hoping Nvidia brings out another monster of a card like the 8800gtx was.

I certainly believe that next-gen graphics cards will be tested especially on how well they run Crysis. The company that shows the best numbers in Crysis will win, in my opinion; I don't care who tops the upcoming Futuremark product.

$n][pErMan
10-30-07, 06:52 PM
I think some of us forget our monster GTXs (and the Ultra, in the sense that it's just a faster GTX) are a year old now. If everyone recalls, when the 6800 series was king, it also took about a year before they started to be pushed to their limits in games. I welcome something that pushes my hardware to its limits :) For now, just scaling a few things back to High lets me play at 1920x1200 with 2xAA just fine, and it still looks damn good. When more than one game starts to push my hardware the way Crysis does, I'll look toward better hardware; for now, as it's the only game doing it, I don't really care too much and have no plans for an upgrade :p

avirox
10-30-07, 07:24 PM
While they're not night and day different, the GTS vs GTX/Ultra are finally showing some differences and benefits to the higher onboard memory count as raising the resolution jacks up video memory usage tremendously. One guy that wrote the incrysis.com tweak guide said he was EASILY able to suck down a gig of video memory usage only at 1280X1024 resolution.


I think hardocp make a good point that the 8800 series is more power-limited than memory w/ crysis. To play it at what's considered "playable" settings, you're not gonna run into memory limitations anyways because those playable settings are probably low enough not to need more than 320 or so mb (this is the case between the 2 GTSs anyways).

loafer87gt
10-30-07, 10:07 PM
Is it true that the demo doesn't have quad-core optimizations? Someone over at the Crysis forums said that it didn't, and that the final version should give about a 10-15% boost for each extra core. That sounded really high to me for a game that doesn't seem CPU-dependent.

3DBrad
10-30-07, 10:11 PM
Is it true that the demo doesn't have quad-core optimizations? Someone over at the Crysis forums said that it didn't, and that the final version should give about a 10-15% boost for each extra core. That sounded really high to me for a game that doesn't seem CPU-dependent.

I guess that boost would only show up in scenarios where the GPU isn't the bottleneck, and right now basically every GPU is the bottleneck.

einstein_314
10-30-07, 10:13 PM
The thing is, performance doesn't have to get better. What we need are new video cards. As $n[]pErMan pointed out, the 8800 GTX is almost a year old. That's ancient in terms of technology. And Crysis wasn't designed to run at max settings on today's technology. So suck it up and be patient. Play it at High settings in DX9 mode for now; in half a year, replay it in DX10 with maxed settings.

Danik
10-30-07, 11:06 PM
The performance isn't even that bad, unless you are aiming for really high resolutions. Due to my rather mediocre monitor I run at 1280x1024, so I'm running DX10 with High settings and a little AA at very smooth framerates. I do plan on upgrading to the next high-end graphics card, and hopefully picking up an 8 core processor, which I'm sure will be available by the end of '08.

JasonPC
10-30-07, 11:17 PM
I don't think anyone really expected to run Crysis completely maxed out smoothly. After all, Crytek did say they were trying to make a game that would scale up for future graphics cards. But it is a bit misleading given what Cevat said in the past (wasn't there a time he said the 7900 GT would run the game smoothly at all High settings?). Since statements like that are a bit old, I guess it's possible former High settings are now Medium, etc.

Amuro
10-30-07, 11:22 PM
Hopefully SLI will work in the full version.

nrdstrm
10-31-07, 12:39 AM
I think some of us forget our monster GTXs (and the Ultra, in the sense that it's just a faster GTX) are a year old now. If everyone recalls, when the 6800 series was king, it also took about a year before they started to be pushed to their limits in games. I welcome something that pushes my hardware to its limits :) For now, just scaling a few things back to High lets me play at 1920x1200 with 2xAA just fine, and it still looks damn good. When more than one game starts to push my hardware the way Crysis does, I'll look toward better hardware; for now, as it's the only game doing it, I don't really care too much and have no plans for an upgrade :p

QFT on everything you said. I'm very happy with my GTX, but even happier that there is already a game pushing its limits. I'm now looking forward to a new Nvidia/ATI part that can play Crysis with ease. But then again, I'm always an early adopter of high-end GPUs (though nowhere near as bad as Jakup :P )

hugos82
10-31-07, 03:45 AM
And Crysis wasn't designed to run at max settings on today's technology.

I have seen this sentence many times before, and it always makes me laugh.
Using that argument, you can justify any new game with bad performance.
And please remember that Nvidia is working closely with Crytek. So when even the authors say the game is meant to be played (fully supported) on next-gen hardware, it only confirms that the 8xxx cards are the next FX series in Nvidia's history (if we're talking about DX10, of course, since they handle DX9 quite well).

FastRedPonyCar
10-31-07, 07:20 AM
I have seen this sentence many times before, and it always makes me laugh.
Using that argument, you can justify any new game with bad performance.
And please remember that Nvidia is working closely with Crytek. So when even the authors say the game is meant to be played (fully supported) on next-gen hardware, it only confirms that the 8xxx cards are the next FX series in Nvidia's history (if we're talking about DX10, of course, since they handle DX9 quite well).

Yep. I'm thoroughly impressed with the performance both my brother and I get with our systems. He's using a two-and-a-half-year-old Socket 939 system with a 320MB GTS, and it benchmarks right at 30 fps with just about all High graphics options and a few of the DX10 effects turned on.

Mine benches 25 fps with all Ultra High settings enabled.

I can't ask for anything more from video technology that's well over a year old. I think the 8800s are still going to be top contender cards well into late next year.

Arioch
10-31-07, 12:41 PM
If they get SLI working correctly, with the usual performance increases that come with it, I'll be happy. That would mean Very High settings should be playable, since a 30 FPS average will be good enough for the single-player game.