[LibGDX] Quite noticeable difference from delta-time

So I was noticing quite a difference in the height my player jumps. I ran a little test, and these were the results for the jump height:

This uses a unit scale (1u = 32px), so the results confused me. The height differs considerably on nearly every jump, but my jumping is based on delta time, which should give me very similar results every time.

Here is my jumping algorithm:


if (jump && !jumping) {
	jumping = true;
	velocity.y = MAX_VELOCITY * deltaTime;
} else if (jumping) {
	position.y += velocity.y;
	velocity.y -= ACCELERATION * deltaTime;
	if (velocity.y < 0f) {
		velocity.y = 0;
		fall = true;
		jumping = false;
	}
} 

I don’t think I’m doing anything wrong. Could anyone suggest a possible cause of the differences?

Try this:


if (jump && !jumping) {
   jumping = true;
   velocity.y = MAX_VELOCITY;
} else if (jumping) {
   position.y += velocity.y * deltaTime;
   velocity.y -= ACCELERATION * deltaTime;
   if (velocity.y < 0f) {
      velocity.y = 0;
      fall = true;
      jumping = false;
   }
} 
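To see why the original version misbehaves: scaling the initial velocity by deltaTime makes the launch speed itself depend on the frame rate, while the fixed version keeps velocity in units per second and only scales it by deltaTime when integrating it into position. A standalone sketch (the MAX_VELOCITY and ACCELERATION values are made up for illustration) that simulates a full jump under both versions at two frame rates:

```java
public class JumpHeightDemo {
    static final float MAX_VELOCITY = 10f;  // illustrative units/second
    static final float ACCELERATION = 20f;  // illustrative units/second^2

    // Buggy version: initial velocity is scaled by deltaTime, and position
    // is advanced by raw velocity, so jump height scales with frame time.
    static float buggyJumpHeight(float dt) {
        float velocity = MAX_VELOCITY * dt;
        float position = 0f, peak = 0f;
        while (velocity > 0f) {
            position += velocity;
            velocity -= ACCELERATION * dt;
            peak = Math.max(peak, position);
        }
        return peak;
    }

    // Fixed version: velocity stays in units/second and is integrated
    // into position with dt each frame, so height is frame-rate independent
    // (up to small Euler-integration error).
    static float fixedJumpHeight(float dt) {
        float velocity = MAX_VELOCITY;
        float position = 0f, peak = 0f;
        while (velocity > 0f) {
            position += velocity * dt;
            velocity -= ACCELERATION * dt;
            peak = Math.max(peak, position);
        }
        return peak;
    }

    public static void main(String[] args) {
        System.out.printf("buggy  60fps: %.3f  30fps: %.3f%n",
                buggyJumpHeight(1f / 60f), buggyJumpHeight(1f / 30f));
        System.out.printf("fixed  60fps: %.3f  30fps: %.3f%n",
                fixedJumpHeight(1f / 60f), fixedJumpHeight(1f / 30f));
    }
}
```

Running it shows the buggy height doubling when the frame time doubles, while the fixed heights stay nearly identical.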

Oh stupid me. I actually feel embarrassed to have posted this. :L

We’ve all been there my friend.

I’d strongly suggest using fixed time-steps for simulation in future projects.

So I should just base it on 60fps for example? Should I also do that for moving left/right?

If you’re not too far into your game, run all simulation (player and otherwise) at a fixed rate. 60Hz is probably a reasonable choice, and just let the game slow down if too much computation is happening. Keep it simple to start with.
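The usual way to do this is an accumulator loop: collect each frame's delta, and run the simulation in constant-size steps whenever enough time has built up. A minimal sketch, not LibGDX API; the step rate, clamp value, and class/method names here are all arbitrary choices for illustration:

```java
// Fixed-timestep loop (accumulator pattern).
public class FixedStepLoop {
    static final float STEP = 1f / 60f;  // simulate at 60Hz
    private float accumulator = 0f;
    int stepsRun = 0;                    // counter, just for demonstration

    // Call once per rendered frame with that frame's delta time.
    void update(float frameDelta) {
        // Clamp huge deltas (e.g. after a pause or hitch) so the loop
        // doesn't try to catch up forever; 0.25s is an arbitrary cap.
        accumulator += Math.min(frameDelta, 0.25f);
        while (accumulator >= STEP) {
            simulate(STEP);              // always advances by a constant dt
            accumulator -= STEP;
        }
    }

    void simulate(float dt) {
        stepsRun++;
        // player movement, jumping, physics, etc. all use this constant dt
    }

    public static void main(String[] args) {
        FixedStepLoop loop = new FixedStepLoop();
        // One second of uneven frame times still yields ~60 fixed steps.
        float[] frames = {0.1f, 0.05f, 0.2f, 0.15f, 0.25f, 0.25f};
        for (float f : frames) loop.update(f);
        System.out.println(loop.stepsRun);
    }
}
```

With this structure the rendering rate can vary freely while the simulation always sees the same dt, so jump heights and movement speeds stay consistent.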