
TOPIC: Why aren't average frame times used in benchmarks?

Why aren't average frame times used in benchmarks? 2 months 1 week ago #15886

  • SpeedinBanana
  • Offline
  • Lurker
  • Posts: 10
  • Karma: 0
Since it's possible for two CPUs to have the same average frame rates but different 1% lows, isn't it also possible for two CPUs to have the same 1% lows and yet different amounts of stuttering? Why aren't average frame times used then? They would tell much more about smoothness than 1% lows.

Why aren't average frame times used in benchmarks? 2 months 1 week ago #15888

  • i7Baby
  • Offline
  • Moderator
  • Posts: 590
  • Thank you received: 87
  • Karma: 18
I gather you mean GPUs, not CPUs. And framerates, not frame times.

If they have the same 1% low frame rates, then their worst-case stuttering would look the same on average. But the distribution could be different, e.g. the same averages, but one GPU might have a small number of severely reduced frame rates while the other has a large number of slightly reduced frame rates. That could make the perceived severity of the stuttering different.
R7-1700 @3900/1.375V, H110iGT, ASRock x370 Taichi, GSkill 2x8GB 3200/14 @3466/14 1.38V, Samsung 250gb EVO (W10), 2TB Barracuda, 2 x R9 Nano, EVGA G2 750, Enthoo Pro, LG 34UM88. Aizo kb, M6580, HyperX Cloud

Why aren't average frame times used in benchmarks? 2 months 1 week ago #15890

  • SpeedinBanana
  • Offline
  • Lurker
  • Posts: 10
  • Karma: 0
I'm talking about CPUs. 1% lows show only the worst 1% of frames, but what about the rest? What if two CPUs have the same 1% lows and yet one stutters more often than the other? Wouldn't average frame times give a better picture of smoothness than just 1% lows and average frame rates (or the distribution, as you called it)?
Last Edit: 2 months 1 week ago by SpeedinBanana.

Why aren't average frame times used in benchmarks? 2 months 1 week ago #15897

  • i7Baby
  • Offline
  • Moderator
  • Posts: 590
  • Thank you received: 87
  • Karma: 18
It's the GPUs that determine frame times.
R7-1700 @3900/1.375V, H110iGT, ASRock x370 Taichi, GSkill 2x8GB 3200/14 @3466/14 1.38V, Samsung 250gb EVO (W10), 2TB Barracuda, 2 x R9 Nano, EVGA G2 750, Enthoo Pro, LG 34UM88. Aizo kb, M6580, HyperX Cloud

Why aren't average frame times used in benchmarks? 2 months 1 week ago #15902

  • SpeedinBanana
  • Offline
  • Lurker
  • Posts: 10
  • Karma: 0
Don't both the CPU and the GPU determine frame times? CPUs do affect frame times; compare a Pentium G4560 with a Ryzen 1600 and the Ryzen will certainly have better frame times. My question is why benchmarkers don't use average frame times (not average frame rates), since they indicate how often a CPU stutters. 1% lows only say what the worst 1% of frames look like; they don't tell you whether the CPU can deliver a smooth experience (what if two CPUs have the same 1% lows and yet one stutters more than the other?).

Why aren't average frame times used in benchmarks? 2 months 1 week ago #15905

  • Birdhunter
  • Offline
  • First Blood!
  • Gamer, Modder, Engineer... the usual stuff.
  • Posts: 76
  • Thank you received: 23
  • Karma: 6
Frame time and frame rate are two sides of the same equation, so over the same span of time they tell you essentially the same thing.

For example:
Frame time = time / number of frames, i.e. the reciprocal of the frame rate:
0.01 s = 1 s / 100 frames (100 fps)
0.0167 s = 1 s / 60 frames (60 fps)
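In code it's just a reciprocal; a minimal Python sketch (values are only illustrative):

```python
# Frame time is the reciprocal of the frame rate (per-second figures).
def frame_time_s(fps: float) -> float:
    return 1.0 / fps

print(frame_time_s(100))  # 0.01 s    (10 ms per frame)
print(frame_time_s(60))   # ~0.0167 s (~16.7 ms per frame)
```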

And a CPU can 'bottleneck' a GPU, leading to lower framerates (=higher frametimes).

I think the issue he is talking about is that if more than 1% of the frames are slow, the 1% low number doesn't tell you how many of them were slow.

As an example, take a test with 1000 frames and a 10 s runtime.

Config A:
Runs at 120 fps for 8 seconds and at 20 fps for 2 seconds.
The 1% low would be 20 fps.
The average would be 100 fps.

Config B:
Runs at 180 fps for 5 seconds and at 20 fps for 5 seconds.
The 1% low would be 20 fps.
The average would be 100 fps.

Because both configs render the same total number of frames within the same time, the average is the same.

Because the average of the worst 1% (1 second) of frame rates is 20 fps in both cases, the 1% low shows up as the same.

If we compared a 50% low instead, there would be a difference. But because only 1% is looked at and 99% is discarded, the fact that a larger share of the frames were slow, and that the delta between highest and lowest was bigger, gets discarded as well and doesn't show up in the numbers.

Obviously, the person testing would see the difference: the perceived higher average of 180 fps, the harder drops, etc., and the bigger delta would be easy to spot on a chart showing FPS over time.
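To make that concrete, here's a rough Python sketch with synthetic frame logs for the two configs (idealized, made-up frame times, not real benchmark data):

```python
# Synthetic frame-time logs (in seconds) for the two configs above.
# Config A: 120 fps for 8 s, then 20 fps for 2 s  -> 960 + 40  = 1000 frames
# Config B: 180 fps for 5 s, then 20 fps for 5 s  -> 900 + 100 = 1000 frames
config_a = [1 / 120] * 960 + [1 / 20] * 40
config_b = [1 / 180] * 900 + [1 / 20] * 100

def stats(frame_times):
    n = len(frame_times)
    avg_fps = n / sum(frame_times)                                  # total frames / total time
    slowest = sorted(frame_times, reverse=True)[: max(1, n // 100)]  # worst 1% of frames
    low_1pct_fps = len(slowest) / sum(slowest)                      # average fps over those frames
    return avg_fps, low_1pct_fps

print(stats(config_a))  # -> (~100.0, ~20.0)
print(stats(config_b))  # -> (~100.0, ~20.0)  same numbers, very different experience
```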

Having said all that, I don't think GN needs to change or add things, because:

- The 1% low seems to be a high enough percentage to cover all the "lows" in a test. More than 1% would only be necessary if more than 1% of the frames are "low" (like in my examples), but we rarely see that in reality, if at all.

- High/low frame times and frame rates are practically the same as far as the consumer is concerned, because a higher frame time, i.e. a lower frame rate at a given moment, will be perceived as a stutter either way.

- Stutters and similar issues are usually picked up and mentioned by reviewers, and if the hardware has any issues, a consumer who does their due diligence and looks up multiple reliable review sources (not just one) will pick up on that.

Edit:
I think most reviewers use fps rather than frame times because it's easier for the general user to understand. An "increase in frame times from 8 milliseconds to 16 milliseconds" sounds more complicated than "the game had stutters and dropped from 120 fps to 60 fps". And since most tests record FPS, it's easier to calculate too: you can just take the lowest X% of the numbers and average them.
i7-4790K @ 4.7 GHz | 2x8GB DDR3-2400 Avexir Blitz 1.1 | Z87 Gigabyte G1.Sniper 5 | Asus ROG Poseidon GTX 1080 Ti | HyperX 3K SSDs | Phanteks Enthoo Luxe Tempered Glass | Super Flower Leadex Platinum 1,2kW | Thermaltake Riing RGB | Long Sig! | XSPC RayStorm | Alphacool NexXxoS 420 XT45 | Alphacool Eisbecher & Laing DDC 3.25 & Alphacool Aurora RGB | Hardline PETG Tubing | EKWB CryoFuel AcidGreen | Green&Black Sleeved Cables | Acer Predator Z35 | Still reading? | Logitech G302 + G633 + G410 | etc.
Last Edit: 2 months 1 week ago by Birdhunter. Reason: Added text

Why aren't average frame times used in benchmarks? 2 months 6 days ago #15906

  • SpeedinBanana
  • Offline
  • Lurker
  • Posts: 10
  • Karma: 0
No, that's not what I meant (the part about the two configurations). Average frame rate doesn't tell you anything about frame times, because some frames might stay on screen longer than others (and cause stuttering). 1% lows show what the worst 1% of frames look like over the entire benchmark, but what if we have two configurations with the same or similar 1% lows, benchmarked for the same amount of time, and a different amount of stuttering on each of them (I suppose average frame rate doesn't matter because it doesn't say anything about fluidity)? That would mean benchmarkers should either show graphs of frame times or include a new metric called average frame times (for example, a Pentium G4560 and a Ryzen 3 perform about the same in games, but does one stutter more often than the other because of the different core counts?).
Last Edit: 2 months 6 days ago by SpeedinBanana.

Why aren't average frame times used in benchmarks? 2 months 6 days ago #15907

  • Birdhunter
  • Offline
  • First Blood!
  • Gamer, Modder, Engineer... the usual stuff.
  • Posts: 76
  • Thank you received: 23
  • Karma: 6
SpeedinBanana wrote:
Average frame rate doesn't tell you anything about frame times, because some frames might stay on screen longer than others (and cause stuttering).

Just to clarify: frame rates and frame times are directly linked. If the time between frames gets longer, fewer frames are produced within a given time. Higher FPS = lower frame times.

It's the average that's the problem, not the fact that framerates instead of frametimes are used.

SpeedinBanana wrote:
1% lows show what the worst 1% of frames look like over the entire benchmark

GN takes a section (a fixed timespan, not a fixed number of frames) and uses that as 100%.
They then take the "bottom" 1% of that time and average the frame rate over that 1% timespan.
Because the timespan is a constant, the only variable is the number of frames produced in that timespan.

To directly measure the average frame time, a fixed number of frames (not a timespan) would have to be used. That is not practical, as most benchmarks (especially games) run for a constant amount of time, not a constant number of frames.

However, since the result is "frames per time", it's easy to calculate "time per frame" from it. The frame time is the reciprocal (inverse) of the frame rate.
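If anyone wants to play with that, here's a rough Python sketch of the "bottom 1% of the timespan" idea (my interpretation; the actual tooling may differ):

```python
def low_fps_by_time(frame_times, fraction=0.01):
    """Average fps over the slowest `fraction` of the total runtime (a sketch)."""
    total_time = sum(frame_times)
    budget = total_time * fraction                  # e.g. 1% of the run
    elapsed, frames = 0.0, 0.0
    for ft in sorted(frame_times, reverse=True):    # slowest frames first
        if elapsed + ft <= budget:
            elapsed += ft
            frames += 1
        else:
            frames += (budget - elapsed) / ft       # partial frame at the boundary
            break
    return frames / budget                          # frames per second over that slice
```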

SpeedinBanana wrote:
but what if we have two configurations with the same or similar 1% lows, benchmarked for the same amount of time, and a different amount of stuttering on each of them

That's exactly the example I posted previously. I just went the extra mile and made the average frame rate/frame time the same as well.

SpeedinBanana wrote:
(I suppose average frame rate doesn't matter because it doesn't say anything about fluidity)

It kind of does. A higher frame rate is perceived as "more fluid", which is why it isn't disregarded entirely in reviews. What it doesn't tell you about is frame time/frame rate consistency.

SpeedinBanana wrote:
That would mean benchmarkers should either show graphs of frame times

If "framerate drops" or "stutters" can be adequately expressed by a single number, a graph is not needed.

SpeedinBanana wrote:
include a new metric called average frame times

Take 1 and divide it by the 1% low fps shown: there you have the average frame time of the 1% low. Since the frame time is the reciprocal of the shown frame rate, adding it to charts would be a waste of space and would only make the charts harder to read, especially charts with multiple GPUs/CPUs/etc.
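Or, in two lines of Python (with an arbitrary example value):

```python
low_1pct_fps = 20.0                 # the 1% low as shown on a chart
print(1000.0 / low_1pct_fps, "ms")  # -> 50.0 ms average frame time of that worst 1%
```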

SpeedinBanana wrote:
(for example, a Pentium G4560 and a Ryzen 3 perform about the same in games, but does one stutter more often than the other because of the different core counts?)

That's exactly the reason why GN started to include the 1% and 0.1% low in the first place:

Check their video on the subject:
youtu.be/uXepIWi4SgM
i7-4790K @ 4.7 GHz | 2x8GB DDR3-2400 Avexir Blitz 1.1 | Z87 Gigabyte G1.Sniper 5 | Asus ROG Poseidon GTX 1080 Ti | HyperX 3K SSDs | Phanteks Enthoo Luxe Tempered Glass | Super Flower Leadex Platinum 1,2kW | Thermaltake Riing RGB | Long Sig! | XSPC RayStorm | Alphacool NexXxoS 420 XT45 | Alphacool Eisbecher & Laing DDC 3.25 & Alphacool Aurora RGB | Hardline PETG Tubing | EKWB CryoFuel AcidGreen | Green&Black Sleeved Cables | Acer Predator Z35 | Still reading? | Logitech G302 + G633 + G410 | etc.
Last Edit: 2 months 6 days ago by Birdhunter. Reason: Edited for formatting

Why aren't average frame times used in benchmarks? 2 months 6 days ago #15919

  • SpeedinBanana
  • Offline
  • Lurker
  • Posts: 10
  • Karma: 0
1) Isn't it possible that we get 60 fps but some frames stay on screen longer before being replaced (while the others change faster, so we still count 60 fps)? Those few slow frames would cause stuttering, and yet the software would report 60 fps even though some frames were on screen for more than 16.66 ms.
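To put some made-up numbers on that idea:

```python
# 59 quick frames plus one long hitch still add up to 60 frames in ~1 second,
# so an fps counter reports "60 fps" even though one frame sat on screen ~66 ms.
hitch_ms = 66.0
fast_ms = (1000.0 - hitch_ms) / 59          # ~15.8 ms for each of the other frames
frame_times_ms = [fast_ms] * 59 + [hitch_ms]

print(len(frame_times_ms), "frames in ~", round(sum(frame_times_ms)), "ms")
print("worst frame:", max(frame_times_ms), "ms vs", round(1000 / 60, 2), "ms even pacing")
```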

"If framerate drops or stutters can be adequately expressed by a single number, a graph is not needed"

But 1% lows don't say how often the game stutters; they only say what the worst 1% of frames look like, right? That doesn't tell you much about overall smoothness (sure, there's average fps, but it has the problem from point 1).

Why aren't average frame times used in benchmarks? 2 months 5 days ago #15924

  • i7Baby
  • Offline
  • Moderator
  • Posts: 590
  • Thank you received: 87
  • Karma: 18
The figures for 1% lows and 0.1% lows can also be averages, e.g. for a 1% low of 63 fps there might actually be a range from 68 to 47 fps over 20 one-second intervals. It depends on how it's measured.
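For example, with made-up interval values (just to show how a single figure can hide a spread):

```python
# Hypothetical per-second fps values inside the reported "1% low" window.
interval_fps = [68, 67, 67, 66, 66, 66, 65, 65, 65, 65,
                64, 64, 64, 63, 63, 62, 61, 59, 53, 47]   # 20 one-second intervals
print(sum(interval_fps) / len(interval_fps))               # -> 63.0, the figure on the chart
```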
R7-1700 @3900/1.375V, H110iGT, ASRock x370 Taichi, GSkill 2x8GB 3200/14 @3466/14 1.38V, Samsung 250gb EVO (W10), 2TB Barracuda, 2 x R9 Nano, EVGA G2 750, Enthoo Pro, LG 34UM88. Aizo kb, M6580, HyperX Cloud
Last Edit: 2 months 5 days ago by i7Baby.