Camera "Tweening"

I’m trying to get a smooth camera effect where the camera lags slightly behind the player and takes a few moments to catch up to him. For example, the player will sit toward the left side of the screen while the camera is moving left, and vice versa. Apparently the effect is called tweening, and the algorithm for it would be:

Where x is the position of the camera:

x += (target - x) * CONSTANT

It’s explained in this video: https://youtu.be/6cIN6iW5ois

The thing is, it’s explained in GameMaker while I’m making my game in Java. The code in GameMaker is:

view_yview[0] += ((y - (view_hview[0]/2)) - view_yview[0]) * 0.1

How would I translate this over to Java? I think I have most of it down; I’m just trying to find out what “target” would be.

I would guess that the target position would be the current camera position plus an offset for movement.

What that tutorial describes is called interpolation. For this, let’s assume that your camera position is a vector and that you have two cameras. The first camera is adjusted directly in the code; it doesn’t tween. The second camera is the one you render your game with.

Every frame, after moving the first camera, you interpolate the second camera a fraction of the way toward the first camera, which is the target. The interpolation formula is this:


public Vector3 lerpSelf(Vector3 target, float alpha)
{
    // this = this * (1 - alpha) + target * alpha
    // Borrow a temporary vector from the pool to avoid allocating one.
    Vector3 temp = Vector3.REUSABLE_STACK.pop();
    scaleSelf(1f - alpha).addSelf(temp.set(target).scaleSelf(alpha));
    Vector3.REUSABLE_STACK.push(temp);

    return this;
}

Note that scaling by (1 - alpha) and adding target * alpha is algebraically the same as the x += (target - x) * CONSTANT form from your question. The same formula is also used for the rotation. This is the lerp method on my camera; all it does is interpolate the position and the rotation:


public PerspCam lerp(PerspCam p, float alpha)
{
    // Move this camera a fraction (alpha) of the way toward camera p.
    position.lerpSelf(p.position, alpha);
    rotation.lerpSelf(p.rotation, alpha);

    return this;
}
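
Per frame, the usage is just a couple of calls. A minimal sketch, where logicCam, renderCam, and playerPosition are hypothetical names for the two cameras and the player's position vector described above:

// Sketch: move the first camera directly, then ease the render camera
// toward it. logicCam, renderCam, playerPosition are assumed names.
logicCam.position.set(playerPosition); // set(Vector3), as used in lerpSelf above
renderCam.lerp(logicCam, 0.1f);        // small alpha = lazy camera; 1.0 snaps instantly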

The effect is demonstrated in this GIF (5 MB).
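
And to translate this back to the 2D case in your question: the target is the player's position minus half the view size, so that the player ends up centred in the view. A minimal Java sketch, where cameraY, playerY, and viewHeight are hypothetical names standing in for view_yview[0], y, and view_hview[0]:

// Hypothetical 2D version of the GameMaker line above.
float target = playerY - viewHeight / 2f; // centre the view on the player
cameraY += (target - cameraY) * 0.1f;     // 0.1f is the catch-up constant

The x axis works the same way, using the view width (view_wview[0] in GameMaker) instead of the height.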

Hope this helps.