I’m working on a collision method that finds the time at which a moving circle first makes contact with a line over the interval [0, 1] (it returns 1.0 if there’s no collision). I can already determine whether the circle is moving toward the line, and whether the circle’s velocity could ever carry it close enough to the line for a collision. What I can’t figure out is how to find the distance the circle would have to travel to hit the line, which is what I’d derive the time from.
Here’s pseudocode for the part of my method that works. I assume the line is infinite:
double calculateCollisionTime(Line2D line, Circle circle) {
    // Get the unit normal of the line pointing toward the side the circle is on,
    // as well as the shortest distance between the line and the circle's centre.
    if (circle.velocityMagnitude() < circle_line_distance - circle.radius())
        return 1.0; // Can't possibly reach the line this step.
    // Get the unit vector of the circle's velocity.
    // Get the dot product of the circle's unit velocity vector and the line's
    // unit normal negated (i.e. the unit vector pointing from the circle toward
    // the line). This is the cosine of the angle between the velocity and the
    // direction straight toward the line.
    if (dotprod <= 0) // Circle moving parallel to or away from the line.
        return 1.0;
    // This is the part I'm having trouble with.
    // Find the distance between the circle and the line measured along the
    // circle's velocity. If it's less than the velocity's magnitude, expect a
    // collision. Divide that distance by the velocity's magnitude to get the time.
    return time;
}
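To make the working part concrete, here is the same logic as runnable Java. The Vec2 helper, the class name, and the parameter list are stand-ins of my own (my real Line2D and Circle expose the same quantities), and the step I'm stuck on is still just a comment:

// Hypothetical minimal vector type; my real classes expose the same operations.
final class Vec2 {
    final double x, y;
    Vec2(double x, double y) { this.x = x; this.y = y; }
    Vec2 minus(Vec2 o)    { return new Vec2(x - o.x, y - o.y); }
    Vec2 scaled(double s) { return new Vec2(x * s, y * s); }
    double dot(Vec2 o)    { return x * o.x + y * o.y; }
    double length()       { return Math.sqrt(x * x + y * y); }
    Vec2 normalized()     { double l = length(); return new Vec2(x / l, y / l); }
}

final class SweptCircleLine {
    // p0 and p1 define the infinite line, c is the circle's centre,
    // v is its displacement over the whole interval [0, 1].
    static double collisionTime(Vec2 p0, Vec2 p1, Vec2 c, double radius, Vec2 v) {
        Vec2 dir = p1.minus(p0).normalized();
        Vec2 normal = new Vec2(-dir.y, dir.x); // one of the two unit normals
        // Signed distance from the line to the centre; flip the normal so it
        // points toward the circle's side and the distance comes out positive.
        double dist = c.minus(p0).dot(normal);
        if (dist < 0) { normal = normal.scaled(-1); dist = -dist; }
        if (v.length() < dist - radius)
            return 1.0; // too far away to reach the line this step
        // Cosine of the angle between the velocity and the direction
        // straight toward the line (hence the negated normal).
        double dotprod = v.normalized().dot(normal.scaled(-1));
        if (dotprod <= 0)
            return 1.0; // parallel to or moving away from the line
        // ... the step I can't work out goes here ...
        return 1.0;
    }
}

As a sanity check of what I expect: with p0 = (0, 0), p1 = (1, 0), centre (0, 2), radius 0.5, and velocity (0, -3), the gap is 1.5 and the circle heads straight at the line, so the finished method should return 0.5; it's the general rule for that last step that I can't work out.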
I’ve searched the internet, and the few articles I found didn’t help. Does anyone have any suggestions?
Thank you.