PDA

View Full Version : What is generally considered to be minimum playable fps?


jimmyjames123
10-23-03, 04:44 PM
What is generally considered to be the minimum playable fps? 30 frames per second? Possibly more for a decent gaming experience?

If so, isn't it a bit impractical to compare two graphics cards at a game setting where both run below 30fps?

For instance, say card X gets 42fps and card Y gets 34fps with noAA/noAF, while card X gets 16fps and card Y gets 21fps with 4xAA/8xAF. Wouldn't the noAA/noAF results be more meaningful to a gamer who was actually interested in playing this game?

Just a thought, because we all know how much AA/AF and resolution settings can affect results, and hardware reviewers rarely point this out, even when they show tests with one card running at something like, say, 14fps and a competing card at, say, 11fps. That might be reported as a ~27% performance delta, but percentages really are a misleading measure, especially at these fps levels.
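One way to see why percentages mislead at low fps is to convert to frame times, since the milliseconds per frame are what you actually feel. A quick sketch using the hypothetical numbers from the post above:

```python
# Compare two cards by percentage delta AND by per-frame time delta.
# All fps numbers are the made-up examples from the post, not real benchmarks.

def frame_time_ms(fps):
    """Average time spent on each frame, in milliseconds."""
    return 1000.0 / fps

for fast, slow in [(14, 11), (42, 34)]:
    delta_pct = (fast - slow) / slow * 100
    delta_ms = frame_time_ms(slow) - frame_time_ms(fast)
    print(f"{fast} vs {slow} fps: {delta_pct:.0f}% faster, "
          f"{delta_ms:.1f} ms saved per frame")
```

The 14-vs-11 gap and the 42-vs-34 gap are similar in percentage terms, but the low-fps pair differs by roughly 19 ms per frame while the high-fps pair differs by under 6 ms, which is why the same percentage reads so differently at the two ends.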

subbo
10-23-03, 05:02 PM
Gfx card reviews concentrate on testing the hardware; the numbers are supposed to tell you how the card performs under the heaviest conditions. It doesn't really matter if it's 12fps or 120fps; the range of "sampling" for the test is just far more limited the fewer frames you have, unless you include the first decimal, i.e. 12.4fps.

If you consider that movies run at 24fps, and that's near the limit where the human eye starts to see motion instead of a series of still photos, and you can still really see how jerky the picture is when the camera pans quickly from side to side, I'd say that 20fps is the bare minimum for pure mechanical playability (reaction and response).

But in fast games like 2d side-scrollers or first-person shooters, my personal preference is in excess of 40 or 50fps. Anything less starts to hurt my "skill", especially in multiplayer.

But of course the human body is highly adaptive: you can get used to pretty poor fps or image quality, and unless you have something better to compare to, you don't know any better. Much of it is also personal preference.

euan
10-23-03, 05:04 PM
Originally posted by subbo
Gfx card reviews concentrate on testing the hardware; the numbers are supposed to tell you how the card performs under the heaviest conditions. It doesn't really matter if it's 12fps or 120fps; the range of "sampling" for the test is just far more limited the fewer frames you have, unless you include the first decimal, i.e. 12.4fps.

If you consider that movies run at 24fps, and that's near the limit where the human eye starts to see motion instead of a series of still photos, and you can still really see how jerky the picture is when the camera pans quickly from side to side, I'd say that 20fps is the bare minimum for pure mechanical playability (reaction and response).

But in fast games like 2d side-scrollers or first-person shooters, my personal preference is in excess of 40 or 50fps. Anything less starts to hurt my "skill", especially in multiplayer.

But of course the human body is highly adaptive: you can get used to pretty poor fps or image quality, and unless you have something better to compare to, you don't know any better. Much of it is also personal preference.

Anything less than 30 and I get upset. 40-50 is meh. 60 is happy. Anything more, and I turn up the resolution and/or image settings.

CaptNKILL
10-23-03, 05:06 PM
It depends on the game... for fast-paced action games like Unreal or Quake, I find it hard to play at under ~50. Anything over 60fps in my book is almost perfectly smooth (although I can tell the difference when fps are higher than that, 60fps is plenty smooth enough). For games like Battlefield or Vietcong, 35fps is enough to play well (although it obviously doesn't LOOK smooth enough for my tastes), unless you turn those into a twitch fest like Quake/Unreal ;)

And you're right about the reviews comparing cards at unplayable frame rates. It doesn't really matter which is faster if both are already unplayable. Where it DOES matter is when they are showing minimum frame rates. Playing a fast-paced game at 25fps average is a joke, just as much as playing it at 5fps. But when framerates drop to 25fps here and there, it's not as much of a problem as dropping to 5fps (heck, even 20fps would be a noticeably more annoying drop in frame rate).
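The average-versus-minimum distinction above can be put in numbers. A quick sketch with made-up per-second fps samples (not real benchmark data):

```python
# Two hypothetical benchmark runs, each sampled once per second for 10 s.
steady = [40] * 10        # constant 40 fps, never drops
spiky = [55] * 9 + [5]    # faster on average, but one bad one-second stall

def summarize(samples):
    """Return (average fps, minimum fps) for a run."""
    return sum(samples) / len(samples), min(samples)

print(summarize(steady))  # (40.0, 40)
print(summarize(spiky))   # (50.0, 5)
```

The spiky run wins on average fps (50 vs 40) yet would feel far worse in play because of the stall, which is exactly why a review's minimum-frame-rate chart carries information the average hides.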

euan
10-23-03, 05:06 PM
Also, I'll point out that AF is always set to 8x minimum (it has been for several years), and with the current generation of cards AA is always on: 2xAA for the 9600 and lower, 4xAA minimum for the 9700 and up. What I then change is the resolution, to hit around 50fps.

jimmyjames123
10-23-03, 05:11 PM
It would also be very helpful if reviewers made a note of which settings (resolution and AA/AF) they thought gave the most pleasing overall gaming experience with the card being reviewed, in each game they test. However, given the time constraints and the pressure to get a review out as fast as possible, I can understand why this isn't mentioned.

Hellbinder
10-23-03, 05:15 PM
Pinky in mouth doctor Evil Style....

1 MILLION FPS

TheTaz
10-23-03, 05:30 PM
This brings us to the validity of benchmarking games that get "stupid frame rates" now.

Who needs 400 FPS in Quake 3???

Well... IMO, until recent games... Quake 3 was still a great way to tell "how long a card would last", and whether other game devs could "code worth a sh!t". While you didn't literally need 200+ FPS, you were assured that most newer games (but pre-DX9) would play above 30FPS... and if they didn't, whoever made the game did a very poor job.

Now, with DX9 stuff coming out... Quake 3 Benchmarks are fairly "invalid", IMO... other than 640x480 "CPU Testing". It did have a "good run", tho.

So I guess the question is... what will become the "new" definitive game to bench against? (To give "rough longevity" info of a card, and keep game devs "responsible" for their coding) HL2? UT2k4? Doom3? Something else?

And once a new game is "established" as a "THE good common indicator"... Will the graphics companies focus to "cheat" in that new game?

The advantage we have now is... everybody is trying to test "as much as they can" to avoid "cheats", and give a better "overall picture". Which is good... BUT we STILL need a "definitive reference game" to show who codes well vs. who doesn't (on a graphics vs. performance level). ;)

Regards,

Taz

Edge
10-23-03, 08:58 PM
It depends on the game, but personally for non-FPS games I find anything above 20 to be acceptable. Sure, 30fps would be better, but if I need to I'm willing to play a game at, say, 20fps if the image quality is worth it. For FPS games like Natural Selection and such, I prefer to keep it above 40 (though I can tell the difference between 40 and 60 pretty easily). Actually, I think at this point I could probably tell the difference in framerate above 60 if I tried, but I really don't care because 60fps is perfectly fine.

But I'm usually more of an IQ freak. I rarely play games now without AA and Aniso, even if it means sub-30 framerates. Right now I'm playing Anarchy Online at 1280x960 (to get as much GUI on the screen as possible) with 2xAA and lvl2 or lvl4 aniso, and that's on a TI4200. Sure, the framerate isn't great, but in an RPG you could probably play it at 10 FPS and you'd be just as good at the game as you would be at 60.

jAkUp
10-23-03, 10:29 PM
Yup, depends on the game... but I can usually tolerate anything as low as 25fps... anything lower than that and it looks really bad... I rarely ever experience anything lower than 25fps anymore, though.

The_KELRaTH
10-24-03, 04:11 AM
From my experience, getting the fps to stay at the monitor's refresh rate (Hz) is the most important thing. If I don't use vsync I often notice tearing when panning left/right, and if I use vsync and the fps is lower than the refresh rate, there's a vibrating effect when panning.
In most FPS games the framerates are generally high enough, but in certain games that cap the fps, like C&C Generals, scrolling around the screen is awful unless the limiter is removed or you're playing online.

What I find more frustrating is that I'm told this is normal behavior with vsync on, yet with cards that very rarely attained framerates at vsync levels (Voodoo1, Voodoo2, ATI Rage Fury, GeForce 2, 3, etc.) the vibrating effect when panning the screen was never an issue.
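There is a simple model of why fps just under the refresh rate feels like "vibrating" with vsync on. Assuming double buffering with no triple buffering (a sketch, not how any particular driver actually schedules frames): a frame that misses the refresh deadline waits for the next one, so the effective rate snaps to refresh/1, refresh/2, refresh/3, and so on, rather than degrading smoothly.

```python
import math

def vsync_fps(render_ms, refresh_hz=60):
    """Effective fps when every frame must land on a refresh boundary
    (double-buffered vsync: a late frame waits for the next refresh)."""
    interval_ms = 1000.0 / refresh_hz     # ~16.7 ms at 60 Hz
    refreshes_waited = math.ceil(render_ms / interval_ms)
    return refresh_hz / refreshes_waited

print(vsync_fps(15.0))  # 60.0 - comfortably inside one refresh
print(vsync_fps(18.0))  # 30.0 - just missed the deadline, rate halves
print(vsync_fps(35.0))  # 20.0 - missed two refreshes
```

A scene hovering around the deadline flips between 60 and 30 from frame to frame, which matches the juddery panning described above; when a card can't come near the refresh rate at all, it sits steadily on a lower step instead of flipping.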

Gaal Dornik
10-24-03, 05:30 AM
60+ is fluid. The eye can only see ca. 25 frames per second, but because it's not synchronized with the action on the screen, 60fps is needed.
So if you have 24fps on the screen and the eye misses some frames, the framerate you perceive drops to 12fps or so, and it feels jerky.
On TV this is compensated for by motion blur, caused IIRC by the nature of analog devices.
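The eye-synchronization model above is folk science, but there is a real, measurable mismatch when the content rate doesn't divide the display's refresh rate: 24fps on a 60Hz display can't map evenly, so frames are held for alternating numbers of refreshes (the classic 3:2 pulldown), which shows up as judder in pans. A small sketch of that mapping:

```python
def pulldown_pattern(content_fps=24, refresh_hz=60, n_frames=8):
    """How many display refreshes each source frame is held for,
    using a simple floor mapping of frame times onto refresh ticks."""
    boundaries = [int(i * refresh_hz / content_fps) for i in range(n_frames + 1)]
    return [b - a for a, b in zip(boundaries, boundaries[1:])]

print(pulldown_pattern())  # [2, 3, 2, 3, 2, 3, 2, 3]
```

Each source frame lands on screen for either 2 or 3 refreshes, so identical motion steps are displayed for unequal times, which is the kind of uneven presentation the posts in this thread describe as jerky panning.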

particleman
10-24-03, 03:19 PM
I consider 30fps the minimum framerate for non-choppy gameplay. 60fps or better is my desired framerate, though.

Dazz
10-24-03, 05:13 PM
27fps is the lowest for smoothness, so long as it never goes below that. 60+ fps is good, as it gives headroom for big moments on screen that would otherwise drag the frame rate way down.

The Xbox with Halo plays at 30fps on a TV, which gives smooth playback. However, locked at 30fps on a PC with a monitor, a game doesn't feel smooth until you hit the 40s range.