Wonder how well your PC will run Crysis



Arioch
01-16-07, 07:34 PM
This game was the deciding factor in my getting a new system with a quad-core CPU and SLI video cards. While I wait for it to be released, I'll replay Oblivion with mods this time (I beat it on the 360), plus Dark Messiah, Gothic 3, and FEAR: EP.

icecold1983
01-16-07, 07:35 PM
http://www.legionhardware.com/document.php?id=603

That was the site with those graphs.

The reviews you linked to were the following:

http://www.legitreviews.com/article/415/3/
http://www.overclockers.com/articles1390/
http://www.tbreak.com/reviews/article.php?cat=grfx&id=474&pagenumber=5
http://www.anandtech.com/video/showdoc.aspx?i=2870&p=22

One of the links you gave doesn't work.

The first link isn't really useful, as it doesn't show anything in regard to CPU scaling on a single card.

The second link benches at 1024x768, which is absolutely pointless; however, at the end of the article they say exactly what I've been saying:

"When high res gaming, it is quite obvious here that the GPU's performance does not scale AT ALL at any CPU clockspeed"

The third link is just a regular review with no info in regard to CPU scaling, but it shows somewhat higher COD2 numbers; it could be a less strenuous benchmark than other sites run, as it's unlikely that they are just better at setting up a system than all the other review sites.

The fourth link is pretty irrelevant to this discussion, but with an average FPS of 32 you can bet the minimum FPS is extremely low; we're talking single-digit lows.

Again, anyone who owns TM United and an 8800: simply max out the graphics, set your res to 1920x1200, and go play some Stadium online. I suggest you turn motion blur off, though, or else it won't even be playable.

I'll go take some screens and post them.

BTW, captnkill, I'm being perfectly mature; I'm not flaming or insulting anyone.

icecold1983
01-16-07, 09:19 PM
Also, another game that brings an 8800 to its knees: R6 Vegas. Call of Juarez too.

|MaguS|
01-16-07, 09:21 PM
Also, another game that brings an 8800 to its knees: R6 Vegas. Call of Juarez too.

R6 Vegas brings it to its knees because the game is coded like ass... There is no reason why, while I'm playing, my framerates would be higher during high points of action than while walking down an empty hallway.

Xion X2
01-16-07, 09:21 PM
The FEAR issues could be a game or driver issue, but you can't claim every single game on the market that brings an 8800 to its knees is a driver or game issue. What about Oblivion? What about TrackMania? What about Dark Messiah?

Where does this idea of yours come from that a single graphics card should be able to run every title at 1920x1200 resolution at 40fps or more while 16xAA is activated?

Can you tell us, please? Because this has never, ever been the case as far as I can remember. If you're so pissed at the way that your setup is running, then there's an easy solution for it. Go SLI.

No, I want the 8800s to perform as well as possible. If I could magically snap my fingers and have them double in speed I would, but overclocking my CPU won't help me in any games except SupCom.

I haven't played any of the BFME games, but I'm aware the 8800 has many SLI issues ATM.

SLI works perfectly on Oblivion, Dark Messiah, Call of Duty 2, Prey, and Tomb Raider Legend for me. The only title giving me trouble so far is Rainbow Six Vegas. So, again.. you exaggerate things.

What you should've done if you wanted better performance at higher resolutions is to go SLI. That is really the only thing that can handle 1920x1200/16xAA with really good framerates 100% of the time. All of those titles I listed above, including Dark Messiah--which you say brings the 8800GTX to its knees--run no less than 60fps on my system at any time.

So how about you just quit bitching about this in every thread like you've been doing, and either go buy a second card or wait for some more optimized drivers to come out.

It just seems absurd to me that you, one, expect a single card to play every game without slowing down at the crazy resolution you're playing at, and two, expect a brand-new architecture to have no problems balancing both DX9 and DX10 perfectly right off the bat. The drivers for this card are still early, and Nvidia is having to program for two different operating systems--XP and Vista.

icecold1983
01-16-07, 10:35 PM
R6 Vegas brings it to its knees because the game is coded like ass... There is no reason why, while I'm playing, my framerates would be higher during high points of action than while walking down an empty hallway.

Agreed, but it still brings the 8800 to its knees, and it's an available and popular game.

icecold1983
01-16-07, 10:39 PM
Where does this idea of yours come from that a single graphics card should be able to run every title at 1920x1200 resolution at 40fps or more while 16xAA is activated?

Can you tell us, please? Because this has never, ever been the case as far as I can remember. If you're so pissed at the way that your setup is running, then there's an easy solution for it. Go SLI.



SLI works perfectly on Oblivion, Dark Messiah, Call of Duty 2, Prey, and Tomb Raider Legend for me. The only title giving me trouble so far is Rainbow Six Vegas. So, again.. you exaggerate things.

What you should've done if you wanted better performance at higher resolutions is to go SLI. That is really the only thing that can handle 1920x1200/16xAA with really good framerates 100% of the time. All of those titles I listed above, including Dark Messiah--which you say brings the 8800GTX to its knees--run no less than 60fps on my system at any time.

So how about you just quit bitching about this in every thread like you've been doing, and either go buy a second card or wait for some more optimized drivers to come out.

It just seems absurd to me that you, one, expect a single card to play every game without slowing down at the crazy resolution you're playing at, and two, expect a brand-new architecture to have no problems balancing both DX9 and DX10 perfectly right off the bat. The drivers for this card are still early, and Nvidia is having to program for two different operating systems--XP and Vista.

I never said I expect the 8800 to play every game at 1920x1200 with 16x AA. That was you or someone else who said NO CURRENT GAME CAN TEST THE 8800. I simply argued against that claim.

BTW, in the games I listed I use 4x AA and still hit performance walls. The only reason I'm hesitant to go SLI is that some games just don't see an increase, TrackMania being one. I also want to see the R600 before I make the decision to go SLI.

1920x1200 isn't a crazy res; even the PS3 supports it in some games (likely many games in the future) with no framerate problems.

|MaguS|
01-16-07, 10:40 PM
Agreed, but it still brings the 8800 to its knees, and it's an available and popular game.

If that's your logic, I can write a DOS game that can bring a quad-core to its knees... guess they suck as CPUs as well...

God, you're more retarded than Nv40.

nutcrackr
01-16-07, 10:43 PM
God, you're more retarded than Nv40.
Whoa, Nelly, don't you think that's going a little too far... :p

sillyeagle
01-16-07, 10:44 PM
Where does this idea of yours come from that a single graphics card should be able to run every title at 1920x1200 resolution at 40fps or more while 16xAA is activated?

Can you tell us, please? Because this has never, ever been the case as far as I can remember. If you're so pissed at the way that your setup is running, then there's an easy solution for it. Go SLI.



Yeah, I don't understand this guy. All he does is bitch and moan about every little thing that, in his perception, is not "right", and in nearly every case he is bitching about nothing, because it is the way it is. Nothing is good enough for him. I don't think I have seen a single post by him where he was not complaining about one thing or another. For somebody who knows as much as he does, I can't see why he would be like that. I wonder if he works for ATI.

He came on here asking all these questions like a noob: "why does the 8800 AF look terrible", "why does FEAR perf. suck on the 8800". I helped him out at first until I realized what he was up to.

icecold1983
01-16-07, 10:47 PM
If that's your logic, I can write a DOS game that can bring a quad-core to its knees... guess they suck as CPUs as well...

God, you're more retarded than Nv40.

When did I ever say the 8800 sucks? Reading is fundamental.

Roadhog
01-16-07, 10:53 PM
When did I ever say the 8800 sucks? Reading is fundamental.

Shoo, no one wants you.

icecold1983
01-16-07, 10:55 PM
Yeah, I don't understand this guy. All he does is bitch and moan about every little thing that, in his perception, is not "right", and in nearly every case he is bitching about nothing, because it is the way it is. Nothing is good enough for him. I don't think I have seen a single post by him where he was not complaining about one thing or another. For somebody who knows as much as he does, I can't see why he would be like that. I wonder if he works for ATI.

He came on here asking all these questions like a noob: "why does the 8800 AF look terrible", "why does FEAR perf. suck on the 8800". I helped him out at first until I realized what he was up to.

Correction: I was trying to find out why the 8800 was doing angle-dependent AF, and no one here had an answer.

I also asked if FEAR always tanks in certain areas, as I had never played the game until I got my 8800.

Xion X2
01-17-07, 12:19 AM
Agreed, but it still brings the 8800 to its knees, and it's an available and popular game.

It brings EVERY graphics card to its knees because, like Magus said, it's coded like ass. How you can somehow lay the blame on the hardware for this is beyond me. What's the point of complaining about it? It's not just an issue with the 8800 cards. In the places where the 8800GTX is getting 25-30 fps at your resolution, the 1900XT is getting half that.

Do those cards suck as well? Complain to the programmers if you want to bitch about something.. they're the ones that did a rush job porting it over.

Xion X2
01-17-07, 12:30 AM
I never said I expect the 8800 to play every game at 1920x1200 with 16x AA. That was you or someone else who said NO CURRENT GAME CAN TEST THE 8800. I simply argued against that claim.

Give me a break here icecold, eh? You've been hopping into every thread discussing the 8800GTX or PC gaming in this forum for the past few weeks, bitching about how slowly your computer runs some of these games.

BTW, in the games I listed I use 4x AA and still hit performance walls.

Outside of Rainbow Six and Oblivion, I just don't see how that's possible. On a single GTX, playing at 1680x1050 and with 16xAA, I got anywhere from 55-60 fps in FEAR most of the time. And it was the same for most other games, too. I already showed you what Call of Duty 2 played like at my resolution and 16xAA. But when I did that, you wanted to ramble on and on about what it played like on your system w/ 16xAA enabled. You never even mentioned what it ran like with only 4xAA enabled, so whatever, kid. Don't go sidestepping now, acting like 4xAA is good enough for you.

TrackMania being one. I also want to see the R600 before I make the decision to go SLI.

Why didn't you just go with a much cheaper single-card motherboard like the Asus P5B Deluxe if that was the case? Why'd you spend the extra $100 on an SLI motherboard?

1920x1200 isn't a crazy res; even the PS3 supports it in some games (likely many games in the future) with no framerate problems.

Yeah, and in order to do that you'll get sh!tty texture resolution and filtering like what's plaguing so many of the console titles right now.

Resistance: Fall of Man
http://www.nvnews.net/vbulletin/attachment.php?attachmentid=23392&stc=1&d=1168283185

Tiger Woods PGA 07
http://www.nvnews.net/vbulletin/attachment.php?attachmentid=23393&stc=1&d=1168328008

icecold1983
01-17-07, 12:51 AM
It brings EVERY graphics card to its knees because, like Magus said, it's coded like ass. How you can somehow lay the blame on the hardware for this is beyond me. What's the point of complaining about it? It's not just an issue with the 8800 cards. In the places where the 8800GTX is getting 25-30 fps at your resolution, the 1900XT is getting half that.

Do those cards suck as well? Complain to the programmers if you want to bitch about something.. they're the ones that did a rush job porting it over.

I'm not laying the blame anywhere; I'm simply looking at it as "game X performs at Y speed on the given hardware."

Again, I never said the 8800 sucked. I just argue against people who go into threads and make ridiculous comments like "it never drops below 60 fps in any game at XHD resolutions."

icecold1983
01-17-07, 01:01 AM
Give me a break here icecold, eh? You've been hopping into every thread discussing the 8800GTX or PC gaming in this forum for the past few weeks, bitching about how slowly your computer runs some of these games.



Outside of Rainbow Six and Oblivion, I just don't see how that's possible. On a single GTX, playing at 1680x1050 and with 16xAA, I got anywhere from 55-60 fps in FEAR most of the time. And it was the same for most other games, too. I already showed you what Call of Duty 2 played like at my resolution and 16xAA. But when I did that, you wanted to ramble on and on about what it played like on your system w/ 16xAA enabled. You never even mentioned what it ran like with only 4xAA enabled, so whatever, kid. Don't go sidestepping now, acting like 4xAA is good enough for you.



Why didn't you just go with a much cheaper single-card motherboard like the Asus P5B Deluxe if that was the case? Why'd you spend the extra $100 on an SLI motherboard?



Yeah, and in order to do that you'll get sh!tty texture resolution and filtering like what's plaguing so many of the console titles right now.

Resistance: Fall of Man
http://www.nvnews.net/vbulletin/attachment.php?attachmentid=23392&stc=1&d=1168283185

Tiger Woods PGA 07
http://www.nvnews.net/vbulletin/attachment.php?attachmentid=23393&stc=1&d=1168328008

Well, first of all, it's a 30% performance drop just to go to my resolution, and then you have the extra roughly 7% cost of AA at the higher res over your res. So right there that's a sizable drop in performance.

I downloaded the COD2 demo again and gave it a run-through at 1920x1200, 4x/16x. Average FPS was around 44, with drops to the 30s during certain scenes. I never mentioned what it played like on my system with 16x AA; I was just stating how it would likely perform based on how it performs with 4x AA. And I never mentioned or implied that I run my games with 16x AA. Please quote the posts where I imply or state such.

And that brings me to my original point: when the 8800 already hits walls on current titles at 1920x1200, how do you people expect it to run Crysis smoothly at an even higher res?

Yes, consoles have worse texture filtering and resolution than PCs, but they make up for it with much better lighting, much higher poly counts, better animation, and far more detail in general. Crysis will be the first game to surpass the consoles graphically.

sillyeagle
01-17-07, 01:17 AM
Correction: I was trying to find out why the 8800 was doing angle-dependent AF, and no one here had an answer.

I also asked if FEAR always tanks in certain areas, as I had never played the game until I got my 8800.

My bad.

Xion X2
01-17-07, 01:36 AM
Well, first of all, it's a 30% performance drop just to go to my resolution, and then you have the extra roughly 7% cost of AA at the higher res over your res. So right there that's a sizable drop in performance.

Your logic is flawed here. The resolutions just do not scale in the way that you're saying they do.

For instance, look here at the comparison of COD2 run at the same settings but at the two resolutions in question here (1920x1200 vs. 1600x1200):

http://www.firingsquad.com/hardware/geforce_8800_gtx_gts_amd_cpu_scaling/images/cod21600.gif
http://www.firingsquad.com/hardware/geforce_8800_gtx_gts_amd_cpu_scaling/images/cod22048.gif

62 fps for the lower res vs. 55 for the higher res. 7/62 ≈ 11% difference in performance, not "30%" as you keep claiming. And that's taking your "AA is tougher at higher res" theory into account as well. There's just not that big of a difference in performance.
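
If anyone wants to sanity-check the arithmetic, here's a quick throwaway Python snippet (purely illustrative; it assumes the "30%" figure is just the raw pixel-count difference between 1680x1050 and 1920x1200):

# Rough pixel-count math behind the resolution-scaling argument (illustrative only).
def pixels(width, height):
    return width * height

res_1680 = pixels(1680, 1050)   # 1,764,000 pixels
res_1920 = pixels(1920, 1200)   # 2,304,000 pixels
res_1600 = pixels(1600, 1200)   # 1,920,000 pixels

print(f"1920x1200 vs 1680x1050: {(res_1920 / res_1680 - 1) * 100:.0f}% more pixels")  # ~31%
print(f"1920x1200 vs 1600x1200: {(res_1920 / res_1600 - 1) * 100:.0f}% more pixels")  # ~20%
print(f"COD2 benchmark drop, 62 -> 55 fps: {(62 - 55) / 62 * 100:.0f}%")               # ~11%

Roughly 20-30% more pixels, yet only about an 11% drop in framerate in that benchmark -- the extra fill-rate cost just doesn't translate one-for-one into lost frames.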

You just don't know what in the hell you're talking about. If your computer is playing Call of Duty 2 at an average of 44 fps then, as I've said all along, you need to learn how to optimize it for better performance.

I downloaded the COD2 demo again and gave it a run-through at 1920x1200, 4x/16x. Average FPS was around 44, with drops to the 30s during certain scenes.

Then your computer's fubar, because firingsquad got 55fps on average on COD2 w/ an FX-62 paired with an 8800GTX:

http://www.firingsquad.com/hardware/geforce_8800_gtx_gts_amd_cpu_scaling/page13.asp
http://www.firingsquad.com/hardware/geforce_8800_gtx_gts_amd_cpu_scaling/images/cod22048.gif

And as for your comments about FEAR running like crap on a single 8800GTX?

At 1920x1200 res and 4xAA anandtech got an average of 59fps on a single stock-clocked 8800GTX:

http://images.anandtech.com/graphs/8800%20roundup%20nov06_111206101147/13562.png
http://www.anandtech.com/video/showdoc.aspx?i=2873&p=6

Actually, now that I look at it, they were pushing even more pixels than you are (1920x1440), which makes the point even stronger.


I never mentioned what it played like on my system with 16x AA; I was just stating how it would likely perform based on how it performs with 4x AA. And I never mentioned or implied that I run my games with 16x AA. Please quote the posts where I imply or state such.

Oh, okay. Right here:

"i know cod2 doesnt run at 60 fps at 1920 x 1200 with 16x aa 16x af. and it drops to around 30 regularly."
http://www.nvnews.net/vbulletin/showthread.php?t=84146&page=2

You were saying?

And that brings me to my original point: when the 8800 already hits walls on current titles at 1920x1200, how do you people expect it to run Crysis smoothly at an even higher res?

Do you consider an average of 59 fps--which is what the 8800GTX gets in FEAR at your res--"hitting a wall"? That's my original point; you're exaggerating. Either your computer is seriously screwed up or you just enjoy lying out of both sides of your mouth for attention.

Yes, consoles have worse texture filtering and resolution than PCs, but they make up for it with much better lighting, much higher poly counts, better animation, and far more detail in general. Crysis will be the first game to surpass the consoles graphically.

Ridiculous. Dark Messiah and Oblivion on the 8800GTX look better than any game I've seen on the consoles. I even played Lost Planet on the 360 the same day I installed Dark Messiah for the first time and couldn't believe how low-res and jaggied the textures on the 360 looked by comparison.

Dark Messiah:
http://www.nvnews.net/vbulletin/attachment.php?attachmentid=23580&stc=1&d=1169019723

Oblivion:
http://www.nvnews.net/vbulletin/attachment.php?attachmentid=23581&stc=1&d=1169021056

Lost Planet:
http://image.com.com/gamespot/images/2007/004/931114_20070106_screen003.jpg

You're nuts.

icecold1983
01-17-07, 02:04 AM
Your logic is flawed here. The resolutions just do not scale in the way that you're saying they do.

For instance, look here at the comparison of COD2 run at the same settings but at the two resolutions in question here (1920x1200 vs. 1600x1200):

62 fps for the lower res vs. 55 for the higher res. 7/62 ≈ 11% difference in performance, not "30%" as you keep claiming. And that's taking your "AA is tougher at higher res" theory into account as well. There's just not that big of a difference in performance.

You just don't know what in the hell you're talking about. If your computer is playing Call of Duty 2 at an average of 44 fps then, as I've said all along, you need to learn how to optimize it for better performance.



Then your computer's f*cked, because firingsquad got 55fps on average on COD2 w/ an FX-62 paired with an 8800GTX:

http://www.firingsquad.com/hardware/geforce_8800_gtx_gts_amd_cpu_scaling/page13.asp

And as for your comments about FEAR running like crap on a single 8800GTX?

At 1920x1200 res and 4xAA anandtech got an average of 59fps on a single stock-clocked 8800GTX:

http://www.anandtech.com/video/showdoc.aspx?i=2873&p=6


Oh, okay. Right here:

"i know cod2 doesnt run at 60 fps at 1920 x 1200 with 16x aa 16x af. and it drops to around 30 regularly."
http://www.nvnews.net/vbulletin/showthread.php?t=84146&page=2

You were saying?



Do you consider an average of 59 fps--which is what the 8800GTX gets in FEAR at your res--"hitting a wall"? That's my original point; you're exaggerating. Either your computer is seriously screwed up or you just enjoy lying out of both sides of your mouth for attention.



Ridiculous. Dark Messiah on the 8800GTX looks better than any game I've seen on the consoles. I even played Lost Planet on the 360 the same day I installed Dark Messiah for the first time and couldn't believe how low-res and jaggied the textures on the 360 looked by comparison.

Dark Messiah:

Lost Planet:


You're nuts.

You don't know what stage FiringSquad's demo is, and until you do you can't draw any conclusions that my comp is screwed. What you can conclude is that there's no way in hell the game runs at 60 fps with 16x AA at 1920x1200.

Yeah, my average FPS in FEAR is high too; it's the extreme lows I have an issue with, and apparently so does HardOCP, as they have replicated them over multiple reviews/systems.

You're right, COD2 doesn't run at 60 fps with 16x AA / 16x AF at 1920x1200; that's what I was saying and I stand by it. It does drop to 30 regularly, and it drops to the 30s even with 4x AA.

You picked a terrible screen to show off console graphics, though.

Here are some screens with FPS counters on, 1920x1200 with 4x AA / 16x AF.

http://img207.imageshack.us/img207/9366/148rw9.th.jpg (http://img207.imageshack.us/my.php?image=148rw9.jpg)
http://img207.imageshack.us/img207/7508/2cl0.th.jpg (http://img207.imageshack.us/my.php?image=2cl0.jpg)
http://img404.imageshack.us/img404/5711/3ks4.th.jpg (http://img404.imageshack.us/my.php?image=3ks4.jpg)
http://img401.imageshack.us/img401/8122/4vo5.th.jpg (http://img401.imageshack.us/my.php?image=4vo5.jpg)
http://img401.imageshack.us/img401/7042/5el7.th.jpg (http://img401.imageshack.us/my.php?image=5el7.jpg)

icecold1983
01-17-07, 03:11 AM
http://www.nvnews.net/vbulletin/showthread.php?t=83459

Jakup benched TM United with the included benchmark. My score is within 1 or 2 FPS of his on a much slower CPU (because the game is GPU-limited), and yet when you actually play the game the FPS is nowhere near 60 at these settings in Stadium mode, unless you're playing an extremely basic track.

Mr_LoL
01-17-07, 03:36 AM
Xion, stop wasting your time with the clown. You aren't going to convince him otherwise.

icecold1983
01-17-07, 03:50 AM
For everything I've claimed, I've shown solid evidence that backs it up.

Xion X2
01-17-07, 04:45 AM
For everything I've claimed, I've shown solid evidence that backs it up.

Just because you show evidence of your computer running crappy doesn't mean that it should be running that way. The results that you're showing don't match up either with review sites or what users are posting here. But it's not like it matters, because you're too hard-headed to listen to it. Just a quick recap:

I've shown you self-benchmarks that I've done. You didn't like those and wanted to claim it was low-action/detail.

So then I took some shots in the middle of some very detailed battlefields with lots of gunfire with the frame-counter sticking where it did before.

Then you wanted to claim you never said you were talking about running 16xAA.. that you were talking about 4xAA the entire time. Then I pulled a quote of yours specifically mentioning you running at 16xAA.

Then, screw that.. you wanted to then say that even at 4xAA your COD2 game only ran around 44fps. Then I pulled benchmark after benchmark showing tests averaging over 10fps more than you were getting on a weaker processor while debunking your "Resolution Theory" in the process.

Then.. screw that. You wanted to say that the review never said which stage it was, even though I've shown you 3 different maps with screenshots with the framerate staying pegged over 50 w/ 16xAA on my machine.

Well, I'm tired of playing ring-around-the-rosie with you; I'll let you have your playground all to yourself. I've tried talking some common sense into you, but I think there's a bigger gap between you and it than the Grand Canyon itself.

Enjoy your 360.

icecold1983
01-17-07, 05:32 AM
Just because you show evidence of your computer running crappy doesn't mean that it should be running that way. The results that you're showing don't match up either with review sites or what users are posting here. But it's not like it matters, because you're too hard-headed to listen to it. Just a quick recap:

I've shown you self-benchmarks that I've done. You didn't like those and wanted to claim it was low-action/detail.

So then I took some shots in the middle of some very detailed battlefields with lots of gunfire with the frame-counter sticking where it did before.

Then you wanted to claim you never said you were talking about running 16xAA.. that you were talking about 4xAA the entire time. Then I pulled a quote of yours specifically mentioning you running at 16xAA.

Then, screw that.. you wanted to then say that even at 4xAA your COD2 game only ran around 44fps. Then I pulled benchmark after benchmark showing tests averaging over 10fps more than you were getting on a weaker processor while debunking your "Resolution Theory" in the process.

Then.. screw that. You wanted to say that the review never said which stage it was, even though I've shown you 3 different maps with screenshots with the framerate staying pegged over 50 w/ 16xAA on my machine.

Well, I'm tired of playing ring-around-the-rosie with you; I'll let you have your playground all to yourself. I've tried talking some common sense into you, but I think there's a bigger gap between you and it than the Grand Canyon itself.

Enjoy your 360.

Not matching up? Sigh. Let me link AGAIN to the same links I've already linked to.

http://enthusiast.hardocp.com/article.html?art=MTIxOCwxNCwsaGVudGh1c2lhc3Q=

"There are some places in the game where we saw the framerates tank, even on the BFGTech GeForce 8800 GTX when antialiasing was enabled."

http://www.bit-tech.net/hardware/2006/11/08/nvidia_geforce_8800_gtx_g80/15.html

"we found that the SloMo mode dropped our frame rates in to the low teens."

So now that we've confirmed my FEAR results are 100% normal, as are my TrackMania results, how can you claim that my COD2 results are off?

Oh wait, fatality here confirms issues in Dark Messiah:

http://www.nvnews.net/vbulletin/showthread.php?p=1115079&highlight=messiah#post1115079

Wait, here's another user echoing what I'm saying:

http://www.nvnews.net/vbulletin/showthread.php?t=83652&page=5

shaker718 "DM:M&M runs well, looks good, but still dips into the teens with everything on high using the large fireball spell."

He must also be an idiot and have no idea what he's doing.

And again, in regard to the 16x AA thing: I was stating how COD2 runs on the 8800 with 16x AA, not on my system with 16x AA. I then said that even with only 4x AA you can hit performance walls. There is no contradiction.

So I posted screenshots of FEAR framerate drops and linked to two reputable sites coming to the EXACT same conclusion. I then posted screenshots of Dark Messiah with FPS problems, and two other 8800 GTX owners on this forum echoed my problems. I've shown you jakup's TrackMania bench matching mine, and I get the exact same scores running the built-in Company of Heroes bench as every review site that has used it. My 3DMark score is 10k with a stock E6600, perfectly normal. I get the same FPS in the Nvidia G80 demos as a bunch of 8800 owners on beyond3d.com. Do the math.