Trouble with 3D hitscan

This isn’t programmed in Java, so that’s why I put it here, but since it’s math, people here should be able to understand it. I’m trying to make a hitscan script in a 3D space, but it doesn’t work right at all: it always “detects” that I’m aiming at my feet, not where I’m actually aiming. So I want to know: is the problem in the math, or somewhere else?

lv_x = Sin(lv_pitch) * Cos(lv_yaw);
lv_y = Sin(lv_pitch) * Sin(lv_yaw);
lv_z = Cos(lv_pitch);

lv_x = PointGetX(CameraGetTarget(lp_player)) + (lv_x * lv_distance);
lv_y = PointGetY(CameraGetTarget(lp_player)) + (lv_y * lv_distance);
lv_z = lv_cameraHeight + (lv_z * lv_distance);

lv_distance increases by 1 in a while loop until it hits something.
lv_pitch and lv_yaw are both in degrees.
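For reference, here is the posted direction math in Python (used only for illustration, since the original is map-editor script). This assumes pitch is measured from the +z axis (a polar angle) and yaw from the +x axis; if pitch is actually measured from the horizon, the sin/cos on the pitch terms would need to be swapped.

```python
import math

def ray_direction(pitch_deg, yaw_deg):
    """Unit direction vector matching the posted formulas.

    Assumes pitch is the angle from the +z axis (polar angle) and
    yaw is the angle from the +x axis, both given in degrees.
    """
    pitch = math.radians(pitch_deg)
    yaw = math.radians(yaw_deg)
    x = math.sin(pitch) * math.cos(yaw)
    y = math.sin(pitch) * math.sin(yaw)
    z = math.cos(pitch)
    return (x, y, z)
```

With that convention, pitch 90° and yaw 0° gives a horizontal ray along +x, and pitch 0° gives a ray straight up.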

Are you hitting your own player?

Oops, clicked wrong when I was going to click “quote”. Anyway: sometimes, but other times the shot goes straight up in the air.

It looks like the math would work as you’d expect, so I’m guessing the problem is with inputs. Set a breakpoint in the code and check if the values are what you’re expecting. Also make sure your lv_ variables are not integers, but of float/double types.

Maybe it’s too obvious to be worth mentioning, but I assume your trig functions take degrees and not radians? (Apologies if it seems silly to ask, but it is a common source of errors.)
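To illustrate why this matters: most standard trig functions (Python shown here as an example) take radians, so passing degrees straight in gives silently wrong directions.

```python
import math

# Most standard trig functions take radians, not degrees.
wrong = math.sin(90)                 # treats 90 as radians, not 1.0
right = math.sin(math.radians(90))   # convert degrees first
```

If the engine’s Sin/Cos take degrees, the posted code is fine on this point; if they take radians, every direction component is wrong.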

Also, I’m not sure how your code is structured, but if you’re overwriting the original values of lv_x/y/z in the while loop, it doesn’t seem like the arithmetic would make sense past the first iteration (because you’d be adding a point to a point rather than a scaled direction vector to a point). Maybe I’m missing something there though.
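A minimal sketch of what I mean, in Python for illustration: keep the original start point and unit direction fixed, and recompute the current point from them on every iteration (the names here are hypothetical, not from the original code).

```python
def march_ray(start, direction, max_distance, hit_test):
    """Step along a ray one unit at a time until hit_test reports a hit."""
    sx, sy, sz = start
    dx, dy, dz = direction
    for distance in range(1, max_distance + 1):
        # Recompute from the original start each time -- never overwrite
        # the start with the current point, or later iterations add a
        # point to a point instead of a scaled direction to a point.
        point = (sx + dx * distance, sy + dy * distance, sz + dz * distance)
        if hit_test(point):
            return point, distance
    return None, None
```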

In any case, as suggested above, some debug output or using the debugger would probably provide some hints as to what’s going wrong.

Well, the idea is that I overwrite x, y, and z in each iteration because I advance the point. So the first iteration would be right in front of me, while every later iteration would be a bit farther along. Then I return the point where I hit something, so I end up with yaw, pitch, and distance to the object I’m aiming at.

Sure, I understand what you mean about advancing along the ray.

Here’s a suggestion. Set up a situation where you know what the values should be. That is, put the player (or whatever) at some nice integer coordinates, and set up the angles so that the direction vector is parallel to a cardinal axis and has a length of one. Then, either follow along in the debugger or print some debug output to see what the coordinates are each time through the loop.

Presumably, if it’s working, you should see something like:

(3,4,5)->(4,4,5)->(5,4,5)->(6,4,5)->etc.

So try it out and see if that’s actually what you get (factoring in any numerical error issues, of course).
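The sanity check above can be sketched like this (Python for illustration; the coordinates and axis are just the example values from the previous post):

```python
start = (3.0, 4.0, 5.0)
direction = (1.0, 0.0, 0.0)  # parallel to the x axis, unit length

points = []
for distance in range(4):
    # Each step should advance exactly one unit along x.
    points.append(tuple(s + d * distance for s, d in zip(start, direction)))
    print(points[-1])
```

If the real code doesn’t produce this kind of sequence for the same setup, the bug is in the loop or the inputs, not the trig.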

Also, as soon as you get this sorted (or maybe even before) I’d suggest switching to using a vector class of some sort rather than tracking each element separately. Although there might be some arguments that using separate elements is preferable, I think overall using a class would probably make your code simpler to understand and debug.
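A minimal vector class along those lines might look like this (Python for illustration; whether the target scripting language supports something similar is another question):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Vec3:
    x: float
    y: float
    z: float

    def __add__(self, other):
        return Vec3(self.x + other.x, self.y + other.y, self.z + other.z)

    def __mul__(self, scalar):
        return Vec3(self.x * scalar, self.y * scalar, self.z * scalar)
```

The ray march then reads as a single expression, `point = origin + direction * distance`, which is much harder to get wrong than three parallel variables.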

Don’t attempt this. It’s broken beyond repair. You’re attempting to test an intersection with a ray via points on the ray… that doesn’t work (or is prohibitively expensive). Consider a ray passing through the corner of a box: no choice of step distance along the ray is guaranteed to return an (approximately) correct result.
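For comparison, an analytic ray/box test avoids the sampling problem entirely. Below is a sketch of the standard slab method for an axis-aligned box (Python for illustration; this is one common alternative, not the original poster’s code):

```python
def ray_aabb(origin, direction, box_min, box_max):
    """Slab-method ray vs. axis-aligned-box intersection.

    Returns the entry distance t along the ray, or None on a miss.
    Because it solves for t exactly, a ray grazing a corner is still
    handled correctly -- unlike sampling points at fixed steps.
    """
    t_near, t_far = float("-inf"), float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:
            if o < lo or o > hi:
                return None  # parallel to this slab and outside it
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            if t1 > t2:
                t1, t2 = t2, t1
            t_near, t_far = max(t_near, t1), min(t_far, t2)
            if t_near > t_far or t_far < 0:
                return None
    return max(t_near, 0.0)
```

That said, if the engine already exposes a line-of-sight or traceline function, using it is almost certainly better than hand-rolling either approach.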