PDA

View Full Version : Wonder how well you PC will run Crysis


Pages : 1 2 3 [4] 5 6 7

icecold1983
01-16-07, 12:59 AM
also ur res of 1680x 1050 is 31% less than my res of 1920 x 1200.

if we take ur low of 45 and take 31% off it, it brings us to about 32. again i know exactly what im talking about. actually in reality it would be more of a decrease because aa modes typically have higher penalties at higher resolutions because more and more scenes are gpu limited and the memory footprint increases exponentially.

Roadhog
01-16-07, 01:05 AM
is there a console command in cod2 where you can see the number of rendered polygons in game?

There's not many.

Roadhog
01-16-07, 01:06 AM
actually in reality it would be more of a decrease because aa modes typically have higher penalties at higher resolutions because more and more scenes are gpu limited and the memory footprint increases exponentially.

that is true. That's why there is no need to run insane levels of AA at high resolutions.

Xion X2
01-16-07, 01:11 AM
Are you completely brain-dead? You originally said that yours dropped "regularly" down to 30fps. And I told you right above (and showed you) that most of the time mine sticks at 52fps and up "regularly." And I did this by putting as many as 10 soldiers on the screen at once, along with snow flurries, gunfire, background buildings and everything else.

The only time that my game dropped to 45fps was for a split-second, and it's quite clear to see that the scene it was rendering at the time was a lot less intensive than the ones where I was getting 52fps and up.

Now if you had originally said "my game drops to 30 fps on occasion" I might've let you off the hook, but you didn't say that. Again, your words:

i know cod2 doesnt run at 60 fps at 1920 x 1200 with 16x aa 16x af. and it drops to around 30 regularly.

http://www.nvnews.net/vbulletin/showthread.php?t=84146

And 1680x1050 / 1920x1200 isn't "31% less", it's 12.5 x 2 which is 25%.

And 75% of 52 fps, or what I get "regularly" is 39fps.

And if you overclock your processor / graphics card like I did mine, you will get an additional 5fps or more in performance, putting you at mid-40's.

And if you do that, then you're getting, on average, 15 more fps and about 8x the amount of AA you'd get in the 360 version of Call of Duty 2... you know, that white sh!tbox that you like to dry hump so much around this place.

And no, you don't have a clue as to what you're talking about. That much has been clear for quite a while.

icecold1983
01-16-07, 01:12 AM
that is true. That's why there is no need to run insane levels of AA at high resolutions.

i disagree, even at that high of a res the difference in aa modes is very noticeable, theres room for tons of improvement even over 16xqaa for future graphics cards. we arent even close in terms of aa to achieving photorealism. when we have 128xaa or something that gives that level of quality we'll be getting somewhere :)

icecold1983
01-16-07, 01:18 AM
And 1680x1050 / 1920x1200 isn't "31% less", it's 12.5 x 2 which is 25%.

And 75% of 52 fps, or what I get "regularly" is 39fps.

1920 x 1200 = 2304000 pixels
1680 x 1050 = 1764000 pixels

(x-y)/y = (2304000 - 1764000)/1764000
540000/1764000
.3061
31%

also aa modes cost more at higher resolutions.

http://www.extremetech.com/article2/0,1697,2053790,00.asp
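(A quick check of the math in these last two posts, since the two figures differ only in their base: 1920x1200 has about 31% more pixels than 1680x1050, while 1680x1050 has about 23% fewer pixels than 1920x1200. A minimal sketch in Python; the fps projection assumes frame rate scales inversely with pixel count, a rough fill-rate-limited assumption, not how any specific game behaves:)

# Pixel counts for the two resolutions in dispute.
hi = 1920 * 1200   # 2,304,000 pixels
lo = 1680 * 1050   # 1,764,000 pixels

print((hi - lo) / lo)   # 0.306... -> ~31% MORE pixels at 1920x1200
print((hi - lo) / hi)   # 0.234... -> ~23% FEWER pixels at 1680x1050
# ("12.5 x 2 = 25%" adds the per-axis reductions, which is only an
# approximation of the exact 1 - 0.875**2 = 23.4%.)

# Rough fps estimate under the purely fill-rate-limited assumption that
# frame time scales with pixel count, so fps scales with its inverse.
fps_at_lo = 52
print(fps_at_lo * lo / hi)  # ~39.8 fps projected at 1920x1200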

Roadhog
01-16-07, 01:21 AM
i disagree, even at that high of a res the difference in aa modes is very noticeable, theres room for tons of improvement even over 16xqaa for future graphics cards. we arent even close in terms of aa to achieving photorealism. when we have 128xaa or something that gives that level of quality we'll be getting somewhere :)

Actually we will never need 128xAA to look "photorealistic". The AA method will have to improve.

I want you to TRY and guess how much AA is in these renders I did, because you won't be able to.

http://img236.imageshack.us/img236/2601/cobra9xi.jpg

http://img50.imageshack.us/img50/8621/lambomn8.jpg

EDIT: Here is another..

http://img407.imageshack.us/img407/9055/cool2ox.jpg

icecold1983
01-16-07, 01:28 AM
hard to tell when not in motion, but ur right i would just be guessing. some of the biggest problems with current aa tho are on very small polygons which arent rly an issue in those renders. also i said 128xaa or something that gives that level of quality :)

Roadhog
01-16-07, 01:32 AM
hard to tell when not in motion, but ur right i would just be guessing. some of the biggest problems with current aa tho are on very small polygons which arent rly an issue in those renders. also i said 128xaa or something that gives that level of quality :)


Learn English please.

The poly placement or number has NOTHING to do with how the AA on the scene is applied, or looks. It will look the same on a 1-million-poly model or a 1000-poly model. Maybe you should do some reading on how AA works before you flap your trap and make a fool of yourself.

EDIT: If you wanted to know, all the renders I posted have what you would call 4x AA. The method it's applied by is just more efficient than what your video card can do. You also wouldn't be able to push 30+ fps with this AA level on anything. That is why it takes minutes to hours to render one frame at that detail level.

icecold1983
01-16-07, 01:40 AM
Learn English please. The poly placement or number has NOTHING to do with how the AA on the scene is applied, or looks.

maybe you should learn english, i never mentioned poly counts or placement.

Roadhog
01-16-07, 01:50 AM
maybe you should learn english, i never mentioned poly counts or placement.

You didn't? But what you mentioned falls into that category.

hard to tell when not in motion, but ur right i would just be guessing. some of the biggest problems with current aa tho are on very small polygons which arent rly an issue in those renders.

So what you are saying is that the issue with current AA is that it doesn't work right on very small polygons. The game you play is made out of polygons, which are placed in a certain fashion. That has nothing to do with how the AA looks.

If it doesn't work right on very small polies... then the AA on my model should look like ****. Most of the models consist of over 2 million polygons, quads, tris, whatever you want to call them.


Maybe this will help you grasp it.

http://www.tweaktown.com/guides/601/1


Antialiasing: Also called Full Screen Anti-Aliasing (FSAA) or simply Antialiasing (AA) for short. As the name implies, this is a method which counteracts the effects of aliasing. What's Aliasing? Well aliasing is the jaggedness and pixelation you see on computer images – particularly noticeable on things like the straight edges of walls, or the outline of buildings and terrain in 3D games. These jagged edges can even "sparkle" somewhat when you are moving around in a 3D environment. That effect can be overcome in two ways: by increasing the resolution at which your game displays (e.g. from 640x480 to 1024x768), and by the use of Antialiasing, or both. When AA is enabled, it uses your graphics card's hardware to blend the edges of the jagged lines and hence produce a smoother image.


EDIT: Here is a pic for you if you don't know what a polygon is. Since you apparently don't.

All the little squares are polygons.

polygon: a figure, esp. a closed plane figure, having three or more, usually straight, sides.

http://www.area51.infiniteorigin.com/poly.jpg
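(The definition quoted above says AA blends jagged edges. Below is a minimal, self-contained sketch of ordered-grid supersampling on an ideal diagonal edge; a toy model of the idea only, not how Roadhog's offline renderer or any particular GPU actually implements AA:)

# Toy model: estimate how much of each pixel lies under the edge y < 0.3*x
# by testing an NxN grid of subsamples, then use that fraction as the
# blend weight. 1 sample per pixel = hard jaggies; more samples = the
# smooth graded edge described in the quoted definition.

def pixel_coverage(px, py, samples):
    """Fraction of an NxN subsample grid inside the half-plane y < 0.3*x."""
    inside = 0
    for i in range(samples):
        for j in range(samples):
            # subsample position at the center of each sub-cell
            x = px + (i + 0.5) / samples
            y = py + (j + 0.5) / samples
            if y < 0.3 * x:
                inside += 1
    return inside / (samples * samples)

# One row of pixels crossed by the edge: 1 sample flips abruptly from
# 0.0 to 1.0 (aliasing), 4x4 samples produce intermediate coverage
# values, i.e. the blended edge.
for px in range(8):
    print(px, pixel_coverage(px, 1.0, 1), round(pixel_coverage(px, 1.0, 4), 2))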

icecold1983
01-16-07, 02:22 AM
So what you are saying is that the issue with current AA is that it doesn't work right on very small polygons. The game you play is made out of polygons, which are placed in a certain fashion. That has nothing to do with how the AA looks.

wow u just dont get it. go post on b3d and ask why current aa algorithms dont work well on small polygons.

brownstone
01-16-07, 02:23 AM
Maybe someone should change the thread title to 'Wonder how well your PC will run COD2?'.

Redeemed
01-16-07, 09:05 AM
Icecold- this is at 1920x1440, with 16xAA and 16xAF-

FEAR:

http://i6.photobucket.com/albums/y225/Sir_Farts_A_Lot/FEAR/Photobucket9.jpg

http://i6.photobucket.com/albums/y225/Sir_Farts_A_Lot/FEAR/Photobucket6.jpg

http://i6.photobucket.com/albums/y225/Sir_Farts_A_Lot/FEAR/FEAR%2012-14-06/FEAR2006-12-1403-38-13-71.jpg

(BTW, that above shot was with a single 8800GTS. 24fps at 1920x1440, with 16xAA and 16xAF plus all in-game options maxed isn't bad- and it'd be better if I had a better processor).

http://i6.photobucket.com/albums/y225/Sir_Farts_A_Lot/FEAR/FEAR%2012-15-06/FEAR2006-12-1503-20-14-59.jpg

Okay, now for some FarCry (a very dated game):

http://i6.photobucket.com/albums/y225/Sir_Farts_A_Lot/FarCry/FarCry2007-01-1402-42-49-12.jpg

http://i6.photobucket.com/albums/y225/Sir_Farts_A_Lot/FarCry/FarCry2007-01-1402-42-13-44.jpg

http://i6.photobucket.com/albums/y225/Sir_Farts_A_Lot/FarCry/FarCry2007-01-1402-41-00-32.jpg

http://i6.photobucket.com/albums/y225/Sir_Farts_A_Lot/FarCry/FarCry2007-01-1402-38-55-19.jpg

http://i6.photobucket.com/albums/y225/Sir_Farts_A_Lot/FarCry/FarCry2007-01-1402-37-33-42.jpg

Okay, why did I post all of these screenshots? I want you to show me where the AA is horrible. In all of the above screenshots there is either no visible aliasing, or extremely minor aliasing. My point- high levels of AA at higher resolutions do help out. There is visual improvement. Remember, all of those above shots were at 1920x1440, with 16xAA. No, not 16xQ AA but 16xAA.

I'd post more shots of FEAR and FarCry where there is a bunch of stuff going on, but it is 6AM and I just got in from work- I'm tired and I'm going to bed.

I'll be back to continue this interesting debate later. ;)

icecold1983
01-16-07, 03:50 PM
yes lots of parts in fear u get high fps, and in others it absolutely tanks, its been confirmed by so many people not to mention benchmarking sites that u rly cant even argue it. and no ur fps wouldnt be higher in that scene with a better cpu.

http://www.guru3d.com/article/Videocards/391/23/

at 1600 x 1200 or higher there is absolutely no cpu limitation in any recent game. far cry has some but as long as u have an e6600 and play at 1600x1200 or higher there is none.

http://www.legionhardware.com/document.php?id=603&p=3

another example of the complete lack of cpu limitation in anything other than strategy games.

i never said current aa is horrible but there is so much room for improvement. certain areas of games still appear as if they arent getting ANY aa applied to them regardless of how high u set the aa. usually its very small polygons. i could post a ton of screens with 16xq showing u exactly what i mean. this leads to 'sparkling' or 'jittering', whatever you like to call it.
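(A toy illustration of the small-polygon "sparkling" icecold describes: when a polygon is narrower than the spacing between a pixel's sample points, its estimated coverage jumps as it moves, and those jumps read as flicker in motion. The sample counts and sliver width below are made-up illustrative numbers, not any real AA implementation:)

# A sliver 0.1 px wide should always cover 10% of the pixel, but with a
# small fixed sample grid the estimate jumps between 0% and 25% as the
# sliver slides across, which shows up as sparkle in motion. More
# samples shrink the error but never fully remove it.

def estimated_coverage(left, width, samples):
    """Coverage of pixel [0,1) by strip [left, left+width), via N samples."""
    hits = sum(1 for i in range(samples)
               if left <= (i + 0.5) / samples < left + width)
    return hits / samples

for step in range(8):
    left = step * 0.1          # slide the 0.1-px-wide sliver across the pixel
    print(round(left, 1),
          estimated_coverage(left, 0.1, 4),    # 4 samples: 0.0 or 0.25
          estimated_coverage(left, 0.1, 64))   # 64 samples: ~0.09-0.11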

Redeemed
01-16-07, 05:58 PM
Icecold, you are forgetting- I'm using a socket 754 3400- yes a faster CPU would boost my framerates, substantially. If I were running an E6600 at about 3.5GHz I'm betting I'd be getting no less than 30fps. That is only a 6fps increase, and it is along the lines of what everybody else gets with an E6600 @ 3.5GHz and a single GTS.

Icecold, I'll throw this at you another way:

If today's games are capable of bringing the 8800's to their knees, then how will it be possible for them to play any DX10 game? Okay, even allowing for a 20% gain (the most you'll see from DX10), if you add 20% to, let's say, 20fps you only get 24fps. Going by what you are saying (following your logic), the 8800 series will completely crawl and flop when playing any next-gen game.

I mean, if FEAR can bring the 8800GTX to the teens in some spots- Crysis should crawl. Lower the settings? You then run into your CPU bottleneck.

Alan Wake will probably do okay, the graphics aren't nearly as extreme as those of Crysis. But how about Interstellar Marines? Another game that will crawl- according to your logic.

Icecold, what has been pointed out to you before, but you completely blow off, is that the 8800 series is not perfect, i.e. there are still flaws with it. Such flaws as offering poor performance where they shouldn't.

A few posts ago I mentioned that in FEAR I'd walk into a completely empty room with nothing extremely graphically intensive happening- yet my FPS would drop quite a bit, and my character would barely move forward, as in it was like he ran into an invisible wall.

Are you to tell me that this is merely a scene too complex for the 8800? Okay, then explain to me why my 7600GTs in SLi could burn through the same scene at more than double the fps. This is where either FEAR needs a patch or the drivers for the 8800 need some more work. Or it could be both. This happens when you are an early adopter- you run into the early problems of the new hardware. Do you also realise that BFME (the first one) runs at about 15fps average on my SLi'd 8800GTSs (1920x1440, 16xAA, 16xAF, max in-game options) and sometimes dips to single digits, yet with my SLi'd 7600GTs it would remain at 30fps constantly (literally never dipping). No, BFME is not taxing my GTSs, it is a software glitch where BFME either needs a patch or something in the 8800 drivers needs fixing (or both).

There is no possible way that two 7600GTs can match or exceed the performance of two 8800GTSs. Yet at some times they do (the odd occurrence in FEAR and BFME I mentioned). This is just because of a software bug, and has NOTHING to do with the hardware.

Now, you mentioned that there are places in FEAR where absolutely NOTHING is going on, but your FPS tanks- I'd chalk that up to a software bug. Why? Because I'd bet you are experiencing the same thing I am. And again, I never experienced it with my SLi'd 7600GTs. No, the SLi'd 7600GTs aren't faster than my two GTSs or your single GTX. It is a software problem, not a hardware problem.

Now, I have provided multiple benchmarks (a couple posts back) that even go into the CPU scaling of the 8800s. They compare stock CPU speeds to oc'd speeds and show the results. Granted, never is there a gain of 20fps or greater, but in many cases there is 10fps (and occasionally more). Xion even posted the differences between his CPU at stock and then oc'd while running only one GTX to show you the difference. There IS a difference. Not one of 20+fps, but still a difference nonetheless.

With my CPU (a socket 754 3400), I'd probably experience an average gain of about 10-15fps in my games if I went with an E6600 and ran it at 3.5GHz. Benchmarks get about that much over what I'm getting with my two GTSs. So that is why I know that for my setup there is a huge CPU bottleneck. And even common sense will tell you that a socket 754 3400 is not enough CPU for dual 8800s.

Icecold, I'm not trying to say that the 8800's are invincible. But I am saying they perform better than you seem to believe they do. You refuse to OC your processor to check for any possible improvement. Yet we have shown you that there is improvement (in some cases near 10fps). So, I assume that you do not know how to overclock your processor, and are afraid to learn. This is understandable- with how much $$$ your setup must have cost, I'd hate to fry anything due to an OC gone wrong.

You refuse to try vsync with triple buffering to see if that helps your FPS any, as it is known that in certain games it will aid fps.

Icecold, you have, for the most part, convinced me that you want the 8800s to perform as you believe that they do. No amount of evidence to the contrary will make you think otherwise. Going by your logic, last-generation products (even mainstream, not high end) can match and outperform the 8800s- that just isn't logical, but you seem to like that idea a lot (even though you have never come straight out and claimed it, your arguments imply it).

Icecold, this is a debate that would never end. I could come to you in person, and show you that your rig is capable of better fps than you claim- you could see it with your own eyes, yet I doubt you'd believe it. And that, my friend, is sad.

You keep claiming that you don't want a console, yet you are practically having wet-dreams over the 360 and spit upon the 8800s. You go against everything every other member and 90% of all reviews say. Yet you still feel you are right.

This post should be a huge indicator of how blind you are:

I've got the exact same setup as IceCold, and 16xAA runs great.

don't take anything that icecold says seriously

Bokishi even stated he has the same hardware you have, but isn't experiencing any of the slowdowns that you are. I guess he is lying, right? Because it is obviously impossible for you to be wrong.

You say that the slo-mo in FEAR would cause mouse lag if it dropped your fps? I'm starting to think that you have never played FEAR. Your mouse does lag when using slo-mo. It does not lag substantially, but definitely by a little bit- it does lag.

I hope not to offend you, but I'm now convinced that you really don't know anything about computer gaming and the hardware for computer gaming.
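(On the vsync-with-triple-buffering point above: a back-of-envelope model of why double-buffered vsync can snap frame rates down to 60/30/20 fps while triple buffering roughly preserves 1/render-time. This is a simplified model; real drivers and games vary:)

import math

REFRESH = 1.0 / 60  # 60 Hz display

def fps_double_buffered(render_time):
    # with double buffering, presenting waits for the next vblank after
    # rendering finishes, so frame time rounds UP to whole refreshes
    intervals = math.ceil(render_time / REFRESH)
    return 1.0 / (intervals * REFRESH)

def fps_triple_buffered(render_time):
    # the GPU keeps rendering into a third buffer instead of stalling
    # (output is still capped at the 60 Hz refresh)
    return min(1.0 / render_time, 60.0)

for ms in (15.0, 18.0, 25.0, 40.0):
    t = ms / 1000
    print(f"{ms} ms: double={fps_double_buffered(t):.0f} fps, "
          f"triple={fps_triple_buffered(t):.0f} fps")
# 18 ms renders: double buffering snaps to 30 fps, triple gives ~56 fps,
# which is why triple buffering can "aid fps" in vsync'd games.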

brownstone
01-16-07, 06:13 PM
I keep clicking on this thread to read about new info on Crysis and maybe some new system info and how well crysis is running on it.

Can those arguing about other games and who are posting pics of FEAR and COD2 start PMing each other or start another thread? I know you are trying to prove some points but I think this thread has drifted way off topic.

:matrix:

agentkay
01-16-07, 06:23 PM
Ok brownstone, here is a Crysis video, poor to average quality (filmed off a screen), over 2 minutes long, and you can see a guy shooting RPGs at a helicopter. I'm not sure if you saw it before, but at least I'm putting the thread back on topic. :p

http://www.crysis-online.com/video_downloader.php?dir=/video/SD/(Crysis-Online)%20CES%202007%20Crysis%20Gameplay%207.mpg

Redeemed
01-16-07, 06:26 PM
I keep clicking on this thread to read about new info on Crysis and maybe some new system info and how well crysis is running on it.

Can those arguing about other games and who are posting pics of FEAR and COD2 start PMing each other or start another thread? I know you are trying to prove some points but I think this thread has drifted way off topic.

:matrix:

Actually, we've stayed on topic. This thread is about how well Crysis will perform on your PC. It has nothing to do with news, it is purely about performance. Thus the debate is an on-topic debate.

Nice vid agentkay. ;)

brownstone
01-16-07, 07:01 PM
Actually, we've stayed on topic. This thread is about how well Crysis will perform on your PC. It has nothing to do with news, it is purely about performance. Thus the debate is an on-topic debate.

Nice vid agentkay. ;)

It just appears to me that you're talking about how DX9 games are running on your 8800's. Since Crysis is a DX10 game and the 8800 is DX10, it's gonna run quite different than FEAR or COD2.

Not to mention Crysis is a totally different engine than other games and is still being optimized, of course.

So because FEAR fps drops in some points, on your 8800, Crysis will crawl? Who knows, right?

Unfortunately we keep getting conflicting reports of what Crysis is running on in the gameplay videos we see. FRAPS in the corner would help too.

I have a 7900gtx so news on how the DX9 portion of the game is coming along in the build would be great of course.

Oh agentkay yeah I have seen that video. Thx anyways.

Redeemed
01-16-07, 07:10 PM
It just appears to me that you're talking about how DX9 games are running on your 8800's. Since Crysis is a DX10 game and the 8800 is DX10, it's gonna run quite different than FEAR or COD2.

So because FEAR fps drops in some points, on your 8800, Crysis will crawl? Who knows, right?

Well, that is what we are debating. Icecold is convinced that the 8800s will perform poorly compared to consoles with next-gen titles (DX10). We are arguing against that. Following Icecold's logic, the 8800 would crawl in Crysis. Xion and I are trying to show him that is not true and his logic is flawed.

I'm very convinced that my 8800GTSs will play Crysis with everything maxed or close to it. Icecold, on the other hand, believes that they'll barely manage decent fps at medium settings. We are trying to convince him otherwise. :p

CaptNKILL
01-16-07, 07:34 PM
Actually this just seems like the same flame war that's been going on for a couple weeks now in various other threads.

Why are people still trying to convince icecold of anything related to PC performance and graphics? If he hasn't agreed in the 5 other threads where the same screenshots were posted, he isn't going to agree this time.

I'm not trying to play forum police, but come on guys... it seems like every interesting thread in this section ends up getting dragged into this ongoing crap.

icecold1983
01-16-07, 08:16 PM
Icecold, you are forgetting- I'm using a socket 754 3400- yes a faster CPU would boost my framerates, substantially. If I were running an E6600 at about 3.5GHz I'm betting I'd be getting no less than 30fps.

http://www.legionhardware.com/Bench/CPU_Scaling_With_The_GeForce_8800_GTX/FEAR.png

they scale from a core 2 duo@3333 to a 3800. there is no cpu limitation at high res with aa/af in 95% of games.

yes todays games can bring an 8800 to its knees at high res with aa/af. crysis would crawl at those settings(how well do u expect crysis to run on an 8800 at 1920 x 1200 with 4x aa 16x af?), but it will surely run smooth at lower resolutions with some aa/af which is what ive always said.

fear issues could be a game or driver issue, but u cant claim every single game on the market that brings an 8800 to its knees is a driver or game issue. what about oblivion? what about trackmania? what about dark messiah?

btw im not experiencing the same fear issue as you, i never cant move forwards, for me its just the fps tanking. and the mouse lag isnt normal as if i activate slowmo in an empty room where i get high fps my mouse feels great, its only when the framerate tanks that mouse lag becomes an issue. that goes for every game btw.

no i want the 8800s to perform as good as possible, if i could magically snap my fingers and have them double in speed i would, but overclocking my cpu wont help me in any games except supcom.

i havent played any of the bfme games but im aware 8800 has many sli issues atm.

those 2 posters opinions dont prove anything as u have no idea what resolutions or settings they are playing at and what they define as 'great', and the review sites benches actually back everything im saying. ive already proven with math that cod 2 drops to around 30 with 16x aa 16x af at 1920 x 1200, we know fear bombs, we know trackmania drops to below 30 if u play online, oblivion drops to the teens even without any visual mods, dark messiah drops to the 20s. this is all fact. my system is fine, i know how to maintain it as i can run any game bench and its never lower than the reviews u find, unfortunately the actual game is usually slower, and therefore matters a lot more than a bench.

icecold1983
01-16-07, 08:18 PM
just another graph of another title showing cpu scaling

http://www.legionhardware.com/Bench/CPU_Scaling_With_The_GeForce_8800_GTX/Prey.png

again almost no difference from a 3800 all the way up to a core 2 duo at 3333, and this isnt even at 1920 x 1200 where i play, so i have even less of a cpu limitation
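(The scaling argument in these last two posts can be framed as: frame time is roughly max(CPU time per frame, GPU time per frame), and only the GPU term grows with resolution and AA. A sketch with made-up millisecond costs, illustrative numbers only, not measurements from any benchmark above:)

# Crude bottleneck model: whichever of CPU or GPU takes longer per frame
# sets the frame rate. GPU cost grows with resolution/AA; CPU cost does
# not. Once the GPU term dominates, a faster CPU buys nothing, matching
# the flat lines in the linked graphs. (Real pipelines overlap CPU and
# GPU work, so this is only a first approximation.)

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_fast, cpu_slow = 6.0, 12.0          # hypothetical per-frame CPU costs
for res, gpu_ms in [("1024x768", 8.0), ("1600x1200", 20.0),
                    ("1920x1200 + AA", 33.0)]:
    print(res, round(fps(cpu_fast, gpu_ms)), round(fps(cpu_slow, gpu_ms)))
# 1024x768: 125 vs 83 fps (CPU-bound, so the CPU matters);
# 1600x1200: 50 vs 50 and 1920x1200 + AA: 30 vs 30 (GPU-bound, it doesn't).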

Redeemed
01-16-07, 08:22 PM
they scale from a core 2 duo@3333 to a 3800. there is no cpu limitation at high res with aa/af in 95% of games.

Icecold, for every game where the fps tanks for you- other members here owning a setup equivalent to yours say the opposite (multiple members). And the reviews do not back up what you claim. I posted a review where they tested the CPU scaling of the 8800's, and the results were very different than what you have just posted- have a link to your source?

Actually this just seems like the same flame war that's been going on for a couple weeks now in various other threads.

You are correct CaptNKILL. This is much like beating a dead horse. It would be wise of me to discontinue this debate. This will make the third time I've said I would step out of this debate- let's see if I can actually do it this time. :p