DeltaTime is a frame behind
27/09/2024
Here is something I didn't realize until embarrassingly long into my gamedev career: when programming animation, physics or gameplay systems, the DeltaTime that gets passed through the game engine is a frame behind the actual DeltaTime we should be using.
I spent many years programming under the impression that as long as I accounted for DeltaTime properly (such as when dealing with springs), the game should run smoothly without any visual hitches - and that any visual hitches I saw were probably due to someone not accounting for DeltaTime properly in their subsystem.
But this isn't actually the case at all. If you get frames that take longer than expected, visual glitches are basically unavoidable - and in fact, sometimes making use of the DeltaTime you get passed through the game engine can give even worse results than if you ignored it.
That might sound surprising to some of you, but the reason is obvious once it clicks: the DeltaTime you get passed by the engine is the time it took between the last two flips of the screen, while when we are doing gameplay, animation, or physics programming, the DeltaTime we generally should be using is the time between the last screen flip, and the upcoming screen flip.
Perhaps a visual diagram will help explain. Take a look at this basic timeline of a game running, with Frames, Flips (a.k.a. presents or vsyncs), and DeltaTimes labeled.
     DeltaTime1  DeltaTime2         DeltaTime3         DeltaTime4
 - -|- - - - - -|- - - - - -|- - - - - - - - - - - - -|- - - - - -|- -
             Flip 2      Flip 3                    Flip 4
----+-----------+-----------+-------------------------+-----------+----
    |  Frame 1  |  Frame 2  |         Frame 3         |  Frame 4  |
----+-----------+-----------+-------------------------+-----------+----
                      ^                  ^                  ^
                 AnimUpdate2        AnimUpdate3        AnimUpdate4
We start on Frame 1 with the game running at a regular 60Hz, and a regular flip happening every 0.016666 seconds. On Frame 2, at some point within the frame, AnimUpdate2 will get called with the DeltaTime recorded from the previous frame, DeltaTime1 = 0.016666. It will advance the animation forward by 0.016666 seconds and generate a pose for the character, which will be sent to the rendering system to be displayed on Flip 2 (ignoring triple buffering). So far so good.
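To make the "frame behind" structure concrete, here is a minimal sketch of where that DeltaTime comes from. The names here (SimulateDeltaTimesPassedToUpdate, FrameDurations) are illustrative, not from any particular engine - the point is just that the value handed to the update on frame N is the measured duration of frame N-1:

```cpp
#include <cassert>
#include <vector>

// Sketch: the engine measures DeltaTime between the previous two frame
// boundaries, so the update on frame N receives frame N-1's duration.
std::vector<float> SimulateDeltaTimesPassedToUpdate(
    const std::vector<float>& FrameDurations)
{
    std::vector<float> DeltaTimesSeen;
    float PreviousFrameDuration = 1.0f / 60.0f; // assumed startup value

    for (float CurrentFrameDuration : FrameDurations)
    {
        // What AnimUpdate receives this frame: last frame's duration...
        DeltaTimesSeen.push_back(PreviousFrameDuration);

        // ...even though the pose it produces will actually be on screen
        // for CurrentFrameDuration seconds.
        PreviousFrameDuration = CurrentFrameDuration;
    }

    return DeltaTimesSeen;
}
```

Note how a long frame only shows up in the DeltaTime passed to the *following* frame's update - which is exactly the problem the next section walks through.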
When we get to Frame 3, the problems begin. On Frame 3, AnimUpdate3 gets called with the DeltaTime from the previous frame, DeltaTime2 = 0.016666. So it advances the animation forward 0.016666 seconds in time, and then sends the new pose to the rendering system, ready to present on Flip 3. Unfortunately, for whatever reason, Frame 3 takes longer to prepare than expected (let's say it takes 0.0333333 seconds). This introduces our first visual glitch. In the real world, 0.0333333 seconds of time has elapsed since the previous image was shown to the player, but the animation of the character has only advanced 0.016666 seconds. In other words, to the player, once they see the new image of the game on Flip 3, the character will be perceived to be moving slower than usual for a single frame.
The problems continue on Frame 4. Here AnimUpdate4 gets called with the DeltaTime from the previous frame, DeltaTime3 = 0.0333333. So it advances the animation forward by 0.0333333 seconds and sends the pose to the rendering system, ready for Flip 4. This frame does not take longer than expected and completes in 0.016666 seconds as usual. Now we have the opposite problem. In the real world, only 0.016666 seconds of time has elapsed, but the animation of the character has advanced by 0.0333333 seconds. So now, to the player, when they see the image presented at Flip 4, the character will be moving faster than expected for a single frame!
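The arithmetic in the walkthrough can be checked directly. Between two flips, the real time that elapses is the current frame's duration, while the animation advance applied was the previous frame's duration - so the perceived speed for that interval is the ratio of the two (a tiny helper written for this post, not an engine function):

```cpp
#include <cassert>
#include <cmath>

// Perceived playback speed over one flip interval when the update
// advances by the previous frame's duration instead of the current one.
float PerceivedSpeedRatio(float PreviousFrameDuration,
                          float CurrentFrameDuration)
{
    return PreviousFrameDuration / CurrentFrameDuration;
}
```

With the numbers above: on Flip 3 the character moves at half speed (advanced 0.016666 over 0.0333333 of real time), and on Flip 4 at double speed (advanced 0.0333333 over 0.016666 of real time).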
The reason such visual glitches are almost unavoidable is that the DeltaTime we actually need in the animation system is something we don't know yet: we need the amount of time we expect the current frame to take - something that can't be known until the frame is actually done! For example, we would need to call AnimUpdate3 with DeltaTime3 - but of course we can't do that, because DeltaTime3 isn't actually known yet.
I like to think of it like this: the DeltaTime argument we want as input to our animation system should actually be called something like TimeBetweenLastAndNextFlip, and the DeltaTime we get from the engine should actually be called something like TimeBetweenLastTwoFlips. Effectively, we are making an assumption that we can estimate TimeBetweenLastAndNextFlip from TimeBetweenLastTwoFlips:
float EstimateTimeBetweenLastAndNextFlipUsingTimeBetweenLastTwoFlips(
    float TimeBetweenLastTwoFlips)
{
    return TimeBetweenLastTwoFlips;
}
Perhaps there is a reason such names did not catch on! But nonetheless, when we think about things like this it makes it clear that there are plenty of other options for handling this, and why in certain cases we might even get better results ignoring the DeltaTime coming from the engine.
For example, the glitch on Frame 3 may be difficult to avoid, but we can avoid the glitch on Frame 4 by not using the long DeltaTime we get from the previous frame. For instance, we could simply use some fixed DeltaTime such as 0.016666.
float EstimateTimeBetweenLastAndNextFlipUsingFixedDeltaTime(
    float FixedDeltaTime)
{
    return FixedDeltaTime;
}
That would mean no glitch on Frame 4, but it does require our game to run pretty reliably at a fixed frame rate. If our game often ticks at irregular frame rates or has long periods of slowdown, we could instead maintain some kind of basic rolling average for the DeltaTime:
float EstimateTimeBetweenLastAndNextFlipUsingRollingAverage(
    float& CurrentAverage, float TimeBetweenLastTwoFlips)
{
    CurrentAverage = lerp(CurrentAverage, TimeBetweenLastTwoFlips, 0.1f);
    return CurrentAverage;
}
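Here lerp is assumed to be plain linear interpolation (equivalent to C++20's std::lerp for these arguments). A self-contained version, to show how the estimate reacts to a single long frame:

```cpp
#include <cassert>
#include <cmath>

// Plain linear interpolation, as assumed by the rolling average above.
float lerp(float a, float b, float t)
{
    return a + (b - a) * t;
}

float EstimateTimeBetweenLastAndNextFlipUsingRollingAverage(
    float& CurrentAverage, float TimeBetweenLastTwoFlips)
{
    CurrentAverage = lerp(CurrentAverage, TimeBetweenLastTwoFlips, 0.1f);
    return CurrentAverage;
}
```

A single 0.0333333-second spike only moves the estimate 10% of the way toward the spike, so one unusually long frame barely disturbs the prediction - which is exactly the behaviour we want if long frames are rare outliers.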
But if we expect individual frames to often take longer than usual, then both of these options could look worse than just using the previous frame's DeltaTime as an estimate. This is because the glitch on Frame 4 (in a way) counter-compensates for the glitch on Frame 3. If you consider both frames together, the character does actually appear to have moved the right amount - just in a jittery way. If you frequently have single frames that take longer than normal and always try to remove the glitch on Frame 4 without doing anything about the glitch on Frame 3, the character will appear to be moving more slowly overall.
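This counter-compensation is easy to verify with the numbers from the diagram. A small helper (written for this post) totals the animation advance over a run of frames when each update uses the previous frame's duration as its estimate:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Total animation advance over a run of frames when each update uses the
// previous frame's duration as its DeltaTime estimate.
float TotalAdvanceUsingPreviousFrameDelta(
    const std::vector<float>& FrameDurations,
    float DurationBeforeFirstFrame)
{
    float Total = 0.0f;
    float Previous = DurationBeforeFirstFrame;
    for (float Duration : FrameDurations)
    {
        Total += Previous; // advance applied on this frame
        Previous = Duration;
    }
    return Total;
}
```

Over Frames 3 and 4 (durations 0.0333333 and 0.016666, preceded by Frame 2 at 0.016666), the total advance is 0.016666 + 0.0333333 = 0.05 seconds - exactly the 0.05 seconds of real time that elapsed. A fixed 0.016666 estimate would only advance the animation 0.0333333 seconds over the same period, which is where the cumulative slowdown comes from.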
Finally, we might want to try to predict TimeBetweenLastAndNextFlip from some other information about the frame. There are many ways we could approach this, using all sorts of information about the game and what has happened so far in the frame - and finding a method that works well depends heavily on the exact setup of the game.
Here is a stupidly simple idea which might actually work remarkably well and be pretty fool-proof, if your animation update runs relatively late in the frame and vsync is enabled: we just round up the elapsed time at the point the animation update gets called to the nearest multiple of 0.016666:
float EstimateTimeBetweenLastAndNextFlipUsingElapsedTime(
    float ElapsedFrameTime, float MinFrameTime, float MaxFrameTime)
{
    return clampf(ceilf(ElapsedFrameTime * 60.0f) / 60.0f, MinFrameTime, MaxFrameTime);
}
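The logic: with vsync at 60Hz, flips can only happen on 0.016666-second boundaries, so if the animation update is called 0.020 seconds into the frame, the first vsync has already been missed and the frame must present at the second one. A self-contained version of the function, with clampf assumed to be an ordinary float clamp:

```cpp
#include <cassert>
#include <cmath>

// Ordinary float clamp, assumed to match the clampf helper used above.
float clampf(float x, float lo, float hi)
{
    return x < lo ? lo : (x > hi ? hi : x);
}

// Round the elapsed frame time up to the next 1/60 second vsync boundary,
// clamped to a sensible range of frame times.
float EstimateTimeBetweenLastAndNextFlipUsingElapsedTime(
    float ElapsedFrameTime, float MinFrameTime, float MaxFrameTime)
{
    return clampf(std::ceil(ElapsedFrameTime * 60.0f) / 60.0f,
                  MinFrameTime, MaxFrameTime);
}
```

So a call 0.010 seconds into the frame predicts a 0.016666-second frame, while a call 0.020 seconds in predicts 0.0333333 - exactly the long-frame case from the diagram, but predicted *before* the flip rather than discovered a frame later.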
So perhaps next time you notice jittery animation caused by frames taking longer than usual, have some more sympathy for the developers. To completely remove that jitter would require solving one of two almost impossible challenges: a game that runs perfectly at 60Hz every single frame without fault, or the ability to perfectly predict the future!
Appendix: This whole situation gets a lot more complicated once we start taking into account triple buffering, GPU pipelining, and input handling. If you want to read into it in a bit more depth, I highly recommend Unity's blog post on the problem, as well as Activision's talk on reducing input latency, and this fun blog post on frame timing.