Hi All,
I’ve written a camera class to handle FPS-style movement through the game world, but I don’t seem to travel the correct distance along the Y axis when looking up or down. Here’s the code that calculates a ‘move forward’ step along the current view vector:
protected void moveForward(float distance)
{
    // track the requested step (for debugging)
    lastMoveForward = distance;
    speedX -= distance * (float) Math.sin(Math.toRadians(yaw));
    speedZ += distance * (float) Math.cos(Math.toRadians(yaw));
    if (flyMode) { speedY += distance * (float) Math.tan(Math.toRadians(pitch)); }
}
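For comparison, here’s a minimal sketch of the spherical-coordinate version I’ve seen other FPS cameras use (this is NOT my engine code, just an illustration with hypothetical names): the horizontal components get an extra cos(pitch) factor and the vertical component uses sin(pitch), so the overall step length stays equal to `distance` at any pitch.

```java
// Sketch only: a forward step built from yaw/pitch in degrees, using
// spherical coordinates. Horizontal terms are scaled by cos(pitch),
// the vertical term by sin(pitch), so |(dx, dy, dz)| == distance.
public class ForwardStep {
    /** Returns { dx, dy, dz } for a step of `distance` along yaw/pitch (degrees). */
    public static float[] step(float distance, float yaw, float pitch) {
        double yawR = Math.toRadians(yaw);
        double pitchR = Math.toRadians(pitch);
        float dx = -(float) (distance * Math.sin(yawR) * Math.cos(pitchR));
        float dy =  (float) (distance * Math.sin(pitchR));
        float dz =  (float) (distance * Math.cos(yawR) * Math.cos(pitchR));
        return new float[] { dx, dy, dz };
    }
}
```

At pitch = 0 the cos(pitch) factor is 1, so this reduces to exactly my current X/Z code — which would explain why level movement works fine.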
Some quick code notes:
yaw: rotation in degrees, between 0 and 359
pitch: degrees, clamped between -80 (looking down) and 80 (looking up)
speedX/Y/Z: per-tick deltas added to the camera’s X, Y and Z positions respectively, then zeroed out ready for the next tick.
Moving along the X and Z axes (when looking along those axes) works beautifully. The distance travelled is ~4 units per second.
However, when flyMode is switched on (for a free-flying camera) and the camera is pitched up or down, the distance travelled is vastly different: more like 20-30 units per second.
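For what it’s worth, I did a quick back-of-the-envelope check on what my current formula produces at full pitch (scratch arithmetic, not engine code): the yaw terms contribute sin² + cos² = 1, and the tan(pitch) term adds tan²(pitch) on top, so the step length is distance × √(1 + tan²(pitch)) = distance / cos(pitch).

```java
// Scratch check: how much the tan(pitch) version inflates the step
// length at a given pitch (degrees). sqrt(1 + tan^2 p) == 1 / cos p.
public class SpeedCheck {
    public static double magnitudeFactor(double pitchDegrees) {
        double p = Math.toRadians(pitchDegrees);
        return Math.sqrt(1 + Math.tan(p) * Math.tan(p));
    }

    public static void main(String[] args) {
        System.out.println(magnitudeFactor(80));     // ~5.76x
        System.out.println(4 * magnitudeFactor(80)); // ~23 units/s at my 4 units/s base
    }
}
```

At pitch = 80 that factor is about 5.76, which turns my 4 units/s into roughly 23 units/s — right in the 20-30 range I’m seeing, so the tan() term looks like where the extra speed comes from.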
Am I calculating the Y-axis speed increment completely wrong, or is the problem in how the three components combine?
Many thanks for any help anyone can offer. I’m already learning how tremendously resourceful this community can be : )