Benchmarking has long struggled to ensure its numbers relate back to what players actually feel when gaming. The issue mostly comes down to explaining precisely why stutters and hitching happen in games, not just that they exist. For the most part, “frametime” testing (or more accurately, frame-to-frame interval testing) has helped to tackle this. Tom Petersen, Scott Wasson, and Ryan Shrout furthered that discussion over a decade ago, and from there, the industry moved toward the now-familiar bar charts with 1% and 0.1% lows (with occasional frame interval plots to explore major problems). But under the surface, these game stutters have sometimes been misattributed to frametime pacing issues when the actual problem was animation error, also known as simulation time error. Think of this video as a research / whitepaper piece that presents some experiments and possible representations for animation error in benchmarks. It explains 1% lows, 0.1% lows, average FPS, “frametimes,” and animation error in depth. We will begin adopting these new charts in our benchmarks.
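
As a rough illustration of the distinction being drawn here, the sketch below (Python; the function names, inputs, and the simplified animation-error definition are assumptions for illustration, not the video's exact methodology) shows one common way 1% lows are derived from frame intervals, and how a run with perfectly even frame pacing can still carry animation error when the simulation time steps are uneven.

```python
# Minimal sketch, assuming animation error is modeled as the per-frame
# mismatch between simulated time advanced and time shown on screen.
from statistics import mean

def one_percent_low_fps(frametimes_ms, percentile=0.01):
    """Average FPS of the slowest `percentile` of frames (one common definition)."""
    slowest = sorted(frametimes_ms, reverse=True)
    cutoff = max(1, int(len(slowest) * percentile))
    return 1000.0 / mean(slowest[:cutoff])

def animation_error_ms(sim_deltas_ms, display_deltas_ms):
    """Per-frame mismatch between simulation time step and on-screen display time.

    A frame that advances the simulation by 10 ms but stays on screen for
    20 ms contributes 10 ms of animation error, even if frame pacing is even.
    """
    return [abs(sim - disp) for sim, disp in zip(sim_deltas_ms, display_deltas_ms)]

# Hypothetical run: perfectly paced 16.7 ms frames, but uneven simulation steps,
# so average FPS and 1% lows look clean while animation error does not.
frametimes = [16.7] * 6
sim_steps  = [16.7, 10.0, 23.4, 16.7, 16.7, 16.7]
print(f"Average FPS: {1000.0 / mean(frametimes):.1f}")
print(f"1% low FPS:  {one_percent_low_fps(frametimes):.1f}")
print(f"Animation error (ms): {animation_error_ms(sim_steps, frametimes)}")
```

In this toy example the frame-interval metrics report a smooth run, while the animation-error column exposes the uneven simulation steps that the viewer would perceive as stutter.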

  • BombOmOm@lemmy.world · 4 days ago
    It’s awesome we finally have a way to test and benchmark this. It has long been something one would notice, something that would make games feel like shit. But there was never a way to put a number on it before today.