Originally Posted by Viral
I remember a Crysis vid from when the demo had only just come out where they said it was running fine on an 8600GT at Very High details. So everyone was saying "performance will be waaay better than the demo!". How did that turn out again?
You know, several of the Crysis showcases at games conferences, and even some trailer material, were actually Crysis at roughly High settings. But because the recordings were often off-screen, they gave the impression of looking much better than that (not that High looks bad at all).
But yeah, an 8600GT would only pull Medium, while the 8800GTX could do all Very High at 720p with minimal problems, just slight dips here and there like most other games.
One thing that made people think it ran worse than it really did: they played in DX10 mode with vsync on, but the game doesn't enable triple buffering in DX10 mode and has no cvar for it. With plain double-buffered vsync on a 60Hz monitor, the framerate can only be 60 divided by a whole number. So dropping to 29fps meant a locked 20fps until they got back above 30fps, and anything between 30 and 60fps was still locked to 30fps until they hit a full 60fps.
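The divisor behaviour is easy to sanity-check yourself. A quick sketch (hypothetical helper name, just illustrating the math for double-buffered vsync, no triple buffering):

```python
import math

def vsync_locked_fps(refresh_hz, render_fps):
    """Effective framerate under double-buffered vsync.

    Each frame has to wait for a refresh to be displayed, so a frame
    always occupies a whole number of refresh intervals. The displayed
    rate is therefore the refresh rate divided by that whole number.
    """
    intervals = math.ceil(refresh_hz / render_fps)
    return refresh_hz / intervals

# Renders at 29fps on a 60Hz monitor -> displayed at 20fps
print(vsync_locked_fps(60, 29))
# Renders anywhere from 31 to 59fps -> displayed at 30fps
print(vsync_locked_fps(60, 45))
# Only a solid 60fps actually shows 60fps
print(vsync_locked_fps(60, 60))
```

That's why turning vsync off (or forcing triple buffering through the driver where possible) made the game feel so much smoother right around the 30fps mark.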