I know this may seem like a dumb question, but I’m having some odd problems when trying to calculate the position of a point in space along a vector.
My objective is to grab the XYZ coordinates of a point in space directly in front of the camera, at a set distance. I already know the camera's pitch, roll and yaw angles, and my existing calculation looks like this (the angles are stored as degrees and converted to radians):
// Horizontal offset, at a fixed distance of 5 units, from the yaw angle
float addX = 5 * (float)Math.sin(Math.toRadians(camera.yaw));
float addZ = 5 * (float)Math.cos(Math.toRadians(camera.yaw));
// Vertical offset from the pitch angle
float addY = 5 * (float)Math.sin(Math.toRadians(camera.pitch));

// Offset the camera's position to get the projected point
float newX = camera.x + addX;
float newZ = camera.z + addZ;
float newY = camera.y + addY;
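In case it helps to run it, here's a minimal self-contained version of the calculation. The Camera class below is just a simplified stand-in for my real camera code, and the pitch sweep in main is only there to print where the clamp shows up:

public class ProjectPoint {
    static class Camera {
        float x, y, z;
        float pitch, yaw; // degrees
    }

    public static void main(String[] args) {
        Camera camera = new Camera(); // camera at the origin, facing +Z
        camera.yaw = 0;

        // Sweep the pitch and print the apparent pitch of the projected
        // point; this is where the +/-45 degree clamp shows up for me.
        for (int p = -90; p <= 90; p += 15) {
            camera.pitch = p;

            float addX = 5 * (float)Math.sin(Math.toRadians(camera.yaw));
            float addZ = 5 * (float)Math.cos(Math.toRadians(camera.yaw));
            float addY = 5 * (float)Math.sin(Math.toRadians(camera.pitch));

            float newX = camera.x + addX;
            float newY = camera.y + addY;
            float newZ = camera.z + addZ;

            double horizontal = Math.hypot(newX - camera.x, newZ - camera.z);
            double apparent = Math.toDegrees(Math.atan2(newY - camera.y, horizontal));
            System.out.printf("pitch=%4d  point=(%.2f, %.2f, %.2f)  apparent pitch=%.1f%n",
                    p, newX, newY, newZ, apparent);
        }
    }
}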
The problem is that the new point - which does project out in front of the camera as intended - doesn't behave correctly along the Y axis: its pitch 'clamps' between -45° and +45°.
So, rotating the camera around the Y axis (yawing) works great - the point 'fires' off from the camera in the right direction - but the plotting of the point around the camera's pitch arc is out of whack!
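If I do the trigonometry on my own formula (assuming I haven't slipped up somewhere), the clamp actually seems to follow from it: the vertical offset is 5·sin(pitch), while the horizontal offset always has length sqrt(addX² + addZ²) = 5, so the apparent pitch of the point works out to

atan( 5·sin(pitch) / 5 ) = atan(sin(pitch)), which can never exceed ±45°

That matches exactly what I'm seeing, but I can't work out what the calculation should be instead.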
For reference, this code is intended to be used as a line-of-sight calculation for detecting what the player is looking at.
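Roughly, the projected point gets used like this. The Entity class and the isLookingAt method here are simplified stand-ins for my real code, not the actual implementation:

public class LineOfSightDemo {
    static class Entity { float x, y, z; }

    // The player "sees" the entity if the projected point lands inside the
    // entity's bounding sphere; radius is just a rough hit size I picked.
    static boolean isLookingAt(float px, float py, float pz, Entity e, float radius) {
        float dx = e.x - px;
        float dy = e.y - py;
        float dz = e.z - pz;
        return dx * dx + dy * dy + dz * dz <= radius * radius;
    }

    public static void main(String[] args) {
        Entity e = new Entity();
        e.z = 5; // entity 5 units straight ahead of a camera at the origin
        // With yaw = 0 and pitch = 0, the formulas above project the point
        // to (0, 0, 5), so this prints true:
        System.out.println(isLookingAt(0, 0, 5, e, 1.0f));
    }
}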
Any help would be greatly appreciated. Thank you!