Originally Posted by Redeemed
Icecold, you are forgetting that I'm using a socket 754 3400. Yes, a faster CPU would boost my framerates substantially. If I were running an E6600 at about 3.5GHz, I'm betting I'd be getting no less than 30fps. That is only a 6fps increase, and it is in line with what everybody else gets with an E6600 @ 3.5GHz and a single GTS.
Icecold, I'll throw this at you another way:
If today's games are capable of bringing the 8800s to their knees, then how will it be possible for them to play any DX10 game? Even allowing for a 20% gain (the most you'll see from DX10), adding 20% to, say, 20fps only gets you 24fps. Following your logic, the 8800 series will completely crawl and flop when playing any next-gen game.
I mean, if FEAR can bring the 8800GTX down to the teens in some spots, Crysis should crawl. Lower the settings? Then you run into your CPU bottleneck.
Alan Wake will probably do okay; its graphics aren't nearly as extreme as those of Crysis. But how about Interstellar Marines? Another game that will crawl, according to your logic.
Icecold, what has been pointed out to you before, but you completely blow off, is that the 8800 series is not perfect; there are still flaws with it, such as poor performance in places where there shouldn't be any.
A few posts ago I mentioned that in FEAR I'd walk into a completely empty room with nothing graphically intensive happening, yet my FPS would drop quite a bit and my character would barely move forward, as if he had run into an invisible wall.
Are you telling me that this is merely a scene too complex for the 8800? Okay, then explain to me why my 7600GTs in SLi could burn through the same scene at more than double the fps. Either FEAR needs a patch, the drivers for the 8800 need more work, or both. This is what happens when you are an early adopter: you run into the early problems of new hardware. Do you also realise that BFME (the first one) runs at about 15fps average on my SLi'd 8800GTSs (1920x1440, 16xAA, 16xAF, max in-game options) and sometimes dips to single digits, yet with my SLi'd 7600GTs it would stay at a constant 30fps (literally never dipping)? No, BFME is not taxing my GTSs; it is a software glitch where either BFME needs a patch or something in the 8800 drivers needs fixing (or both).
There is no possible way that two 7600GTs can match or exceed the performance of two 8800GTSs, yet at times they do (the odd occurrences in FEAR and BFME I mentioned). This is simply a software bug and has NOTHING to do with the hardware.
Now, you mentioned that there are places in FEAR where absolutely NOTHING is going on, yet your FPS tanks. I'd chalk that up to a software bug. Why? Because I'd bet you are experiencing the same thing I am, and again, I never experienced it with my SLi'd 7600GTs. No, the SLi'd 7600GTs aren't faster than my two GTSs or your single GTX. It is a software problem, not a hardware problem.
Now, I have provided multiple benchmarks (a couple posts back) that even go into the CPU scaling of the 8800s. They compare stock CPU speeds to overclocked speeds and show the results. Granted, there is never a gain of 20fps or greater, but in many cases there is 10fps (and occasionally more). Xion even posted the differences between his CPU at stock and then overclocked while running only one GTX to show you the difference. There IS a difference. Not one of 20+fps, but a difference nonetheless.
With my CPU (a socket 754 3400), I'd probably see an average gain of about 10-15fps in my games if I went with an E6600 running at 3.5GHz. Benchmarks show about that much over what I'm getting with my two GTSs. That is why I know there is a huge CPU bottleneck in my setup. Even common sense will tell you that a socket 754 3400 is not enough CPU for dual 8800s.
Icecold, I'm not trying to say that the 8800s are invincible. But I am saying they perform better than you seem to believe they do. You refuse to OC your processor to check for any possible improvement, yet we have shown you that there is improvement (in some cases near 10fps). So I assume that you do not know how to overclock your processor and are afraid to learn. That is understandable; with how much $$$ your setup must have cost, I'd hate to fry anything due to an OC gone wrong too.
You also refuse to try vsync with triple buffering to see if it helps your FPS, even though it is known to aid fps in certain games.
Icecold, you have, for the most part, convinced me that you simply want the 8800s to perform as you believe they do. No amount of evidence to the contrary will make you think otherwise. Going by your logic, last-generation products (even mainstream ones, not the high end) can match and outperform the 8800s. That isn't even logical, but you seem to like the idea a lot (and even though you have never come straight out and claimed it, your arguments imply it).
Icecold, this is a debate that would never end. I could come to you in person and show you that your rig is capable of better fps than you claim; you could see it with your own eyes, yet I doubt you'd believe it. And that, my friend, is sad.
You keep claiming that you don't want a console, yet you are practically having wet dreams over the 360 and spit upon the 8800s. You go against everything every other member and 90% of all reviews say. Yet you still feel you are right.
This post should be a huge indicator of how blind you are:
Bokishi even stated he has the same hardware you have but isn't experiencing any of the slowdowns that you are. I guess he is lying, right? Because it is obviously impossible for you to be wrong.
You say that the slo-mo in FEAR would cause mouse lag if it dropped your fps? I'm starting to think that you have never played FEAR. Your mouse does lag when using slo-mo. It does not lag substantially, but it definitely lags a little.
I hope not to offend you, but I'm now convinced that you really don't know much about computer gaming or the hardware behind it.