g7r 38 seconds ago [-]
Technically, the methods with a queue drop up to an entire frame at the beginning of the window. Depending on how the averageProcessingTime() function is implemented, this can mean either faster recovery after a single heavy frame (if it divides by the sum of the durations of the frames in the window) or slightly lower than actual values overall (if it just divides by the duration of the window).
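To illustrate, here's a sketch of the two `averageProcessingTime()` variants I mean (the function names and window layout are my assumptions, not from the article):

```c
#include <stddef.h>

/* Variant A: divide summed work time by the summed durations of the
   frames actually present in the window. Dropping a heavy frame at the
   window's start shrinks numerator and denominator together, so the
   average recovers quickly after a single heavy frame. */
double avg_by_frame_sum(const double *work_ms, const double *frame_ms, size_t n) {
    double work = 0.0, total = 0.0;
    for (size_t i = 0; i < n; i++) {
        work += work_ms[i];
        total += frame_ms[i];
    }
    return total > 0.0 ? work / total : 0.0;
}

/* Variant B: divide summed work time by the fixed wall-clock duration
   of the window. A dropped frame leaves a gap in the numerator only,
   biasing the result slightly below the actual value. */
double avg_by_window(const double *work_ms, size_t n, double window_ms) {
    double work = 0.0;
    for (size_t i = 0; i < n; i++) work += work_ms[i];
    return window_ms > 0.0 ? work / window_ms : 0.0;
}
```

With 4 ms of work in each of three 16 ms frames, variant A reports 25% load; if the window actually spans four frame slots (one dropped), variant B reports only 18.75%.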
But that's just the nerd in me talking. The article is great!
flohofwoe 38 seconds ago [-]
...and don't just smooth your measured frame duration for displaying the FPS, but also use it as actual frame time.
For this smoothing/filtering purpose, I found an EMA (Exponential Moving Average) more useful than a simple sliding window average. It reacts more quickly and 'less harshly' to frame duration changes (like moving a window to a display with a different refresh rate), and it's also easier to implement since it doesn't require a ring buffer of previous frame durations.
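An EMA needs only the previous smoothed value, no ring buffer. A minimal sketch (the smoothing factor 0.1 is an assumption, tune to taste):

```c
/* Exponential moving average of the measured frame duration.
   Only the previous smoothed value is stored. */
typedef struct {
    double smoothed_ms;
    int initialized;
} frame_ema_t;

double ema_update(frame_ema_t *ema, double measured_ms) {
    const double alpha = 0.1;   /* weight of the newest sample */
    if (!ema->initialized) {
        /* seed with the first measurement to avoid a ramp-up from zero */
        ema->smoothed_ms = measured_ms;
        ema->initialized = 1;
    } else {
        /* move a fraction 'alpha' toward the new measurement */
        ema->smoothed_ms += alpha * (measured_ms - ema->smoothed_ms);
    }
    return ema->smoothed_ms;
}
```

A larger alpha reacts faster but lets more jitter through; a smaller alpha smooths harder but lags longer after a refresh-rate change.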
The measured frame duration will have jitter of up to 1 or even 2 milliseconds for various reasons, and what you are measuring is actually the duration of the last frame, but you'll use that duration for the timing-dependent computations of the current frame. E.g. if the last frame was 'long' but the current frame is 'short', you'll overshoot and introduce visible micro-stuttering.
The measurement jitter has several causes; most of it is probably the process/thread scheduler, which doesn't care about a thread being scheduled with a millisecond of wiggle-room here or there. On web browsers all time sources have reduced precision, but thankfully the 'precision jitter' goes both ways, and averaging over enough frames gives you back the exact frame duration (e.g. 8.333 or 16.667 milliseconds).
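A quick sketch of why symmetric jitter averages out (the alternating +/-1 ms pattern is a made-up illustration, not how any particular browser quantizes its timers):

```c
#include <stddef.h>

/* Plain mean over a window of measured frame durations. If the jitter
   is symmetric around the true duration, the mean converges on the
   exact value (e.g. 16.667 ms at 60 Hz) even though no single
   measurement is accurate. */
double mean_frame_ms(const double *samples, size_t n) {
    double sum = 0.0;
    for (size_t i = 0; i < n; i++) sum += samples[i];
    return sum / (double)n;
}
```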
On some 3D APIs you can also query a 'presentation timestamp', but so far the only one I've found to be completely jitter-free is the timestamp provided by CAMetalLayer on macOS and iOS.
TL;DR: frame timing for games is a surprisingly complex topic.