Originally Posted by abtomat74
Of course it factors, but here is how failing to disclose it can skew the end result:
Duke Nukem Never scores (hypothetical):
Card 1: min 15, max 75, avg 45 fps
Card 2: min 25, max 65, avg 45 fps
Which card would you choose if every game played like the above results? Would you sacrifice 10 min fps for 10 more max?
You're correct to an extent. But how do any of us know that the minimum framerate posted in a benchmark for a certain card wasn't a single instance lasting a fraction of a second, rather than a repeated occurrence?
That's why I think it's better to place more emphasis on the average: over the course of the benchmark it factors in the entire range of frames, including the stretches where the minimums took place. It also smooths out cases where a driver/card hiccup caused an abnormal hitch/minimum for a very short time, instead of letting that one blip define the card's score.
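To put rough numbers on that point, here's a quick sketch (with made-up frame data) of how a single momentary hitch drags the reported minimum way down while barely moving the average:

```python
# Hypothetical run: 99 smooth frames at 60 fps, plus one brief hitch at 15 fps.
frames = [60.0] * 99 + [15.0]

minimum = min(frames)
average = sum(frames) / len(frames)

print(minimum)  # 15.0 -- the "min fps" a review would report
print(average)  # 59.55 -- the one-frame hitch barely dents the average
```

So a benchmark chart could honestly list "min: 15 fps" for a card that felt perfectly smooth the whole time, which is exactly why the average is the more trustworthy single number.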