I'm not sure whether this belongs in the geometry forum, but I'm using the methods of linear algebra, so here it is.
Anyway, I have a problem I need to solve for a piece of software I'm writing, and I think I've got it, but it would be great if somebody could take a quick look at this proof and see if I've overlooked anything. Thanks in advance.
Here is the problem: We're working in $\mathbb{R}^3$ here. Given a line segment $S$ defined by endpoints $\mathbf{a}$ and $\mathbf{b}$ (position vectors from the origin), and a unit vector $\mathbf{u}$, what is the minimum angle between $\mathbf{u}$ and any point on the line segment? We can assume the segment does not pass through the origin.
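Just to be precise about what I mean by the angle to a point: writing the segment as $\mathbf{p}(t) = \mathbf{a} + t(\mathbf{b} - \mathbf{a})$ for $t \in [0, 1]$ and treating each point as a vector from the origin, I want $\theta_{\min} = \min_{t \in [0,1]} \angle(\mathbf{u}, \mathbf{p}(t))$.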
And here's what I came up with: If the line defined by $\mathbf{a}$ and $\mathbf{b}$ passes through the origin (outside the segment, per the assumption above), then every point of the segment lies in the same direction from the origin, so all the angles are the same and clearly $\theta_{\min}$ = the angle between $\mathbf{u}$ and $\mathbf{a}$ (or $\mathbf{b}$). Otherwise $\mathbf{a}$, $\mathbf{b}$ and the origin define a plane $P$. We can reduce the problem to a two-dimensional analysis as follows. Let $\mathbf{u}'$ be the projection of $\mathbf{u}$ into $P$. If $\mathbf{u}'$ is zero, then all the vectors in $P$ will be orthogonal to $\mathbf{u}$, and so $\theta_{\min} = \pi/2$.
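For what it's worth, here is roughly how I plan to compute that projection in the software. This is just a sketch under my assumptions above (3D vectors, NumPy); the function and variable names are my own placeholders:

```python
import numpy as np

def project_onto_plane(u, a, b, eps=1e-12):
    """Project u onto the plane P through the origin, a, and b."""
    n = np.cross(a, b)                 # normal to P
    n_norm = np.linalg.norm(n)
    if n_norm < eps:
        # a, b and the origin are collinear: the degenerate case
        # handled separately above (line through the origin).
        return None
    n_hat = n / n_norm
    return u - np.dot(u, n_hat) * n_hat   # drop the out-of-plane component
```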
For nonzero $\mathbf{u}'$, check if $\mathbf{u}'$ lies between $\mathbf{a}$ and $\mathbf{b}$. That is, check whether $\angle(\mathbf{u}', \mathbf{a})$ and $\angle(\mathbf{u}', \mathbf{b})$ are both $\leq \angle(\mathbf{a}, \mathbf{b})$. If so, then there exists a vector to some point in $S$ that is just a positive scalar multiple of $\mathbf{u}'$, so $\theta_{\min}$ is simply the angle between $\mathbf{u}$ and $\mathbf{u}'$, i.e. $\angle(\mathbf{u}, \mathbf{u}')$ (which will be less than $\pi/2$). Similarly, if $-\mathbf{u}'$ lies between $\mathbf{a}$ and $\mathbf{b}$, then $\theta_{\min}$ is the angle between $\mathbf{u}$ and $-\mathbf{u}'$.
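In code, that "between" test would look something like this (again only a sketch with made-up names; `angle` is just the usual arccos formula, which I spell out at the end of the post):

```python
import numpy as np

def angle(x, y):
    """Angle between x and y via the usual arccos formula (cosine clamped
    to [-1, 1] to guard against floating-point rounding)."""
    c = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
    return np.arccos(np.clip(c, -1.0, 1.0))

def lies_between(v, a, b):
    """My criterion: v is 'between' a and b if its angles to both a and b
    are no larger than the angle between a and b themselves."""
    ab = angle(a, b)
    return angle(v, a) <= ab and angle(v, b) <= ab
```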
If $\mathbf{u}'$ and $-\mathbf{u}'$ both lie outside of the angle between $\mathbf{a}$ and $\mathbf{b}$, then the angle with $\mathbf{u}$ will be monotonic while traversing the line segment (this is correct, right?). Therefore the minimum angle will occur at one of the endpoints, and so $\theta_{\min}$ is the minimum of $\angle(\mathbf{u}, \mathbf{a})$ and $\angle(\mathbf{u}, \mathbf{b})$ (which could be greater than $\pi/2$).
Where I have written $\angle(\mathbf{x}, \mathbf{y})$, I will be calculating it according to the usual formula $\arccos\!\left(\dfrac{\mathbf{x} \cdot \mathbf{y}}{\lVert\mathbf{x}\rVert\,\lVert\mathbf{y}\rVert}\right)$. I'm interested in the direction of $\mathbf{u}$ but not its length, which is why I specified a unit vector.
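Putting the cases together, the routine I'm planning to write looks roughly like this. It's only a sketch of my own reasoning above, reusing the `project_onto_plane`, `angle`, and `lies_between` helpers from my earlier sketches, and assuming `u` is already normalized:

```python
import numpy as np

def min_angle_to_segment(u, a, b, eps=1e-12):
    """Minimum angle between unit vector u and any point of segment [a, b],
    following the case analysis above."""
    u_p = project_onto_plane(u, a, b)
    if u_p is None:
        # Line through a and b passes through the origin: every point of
        # the segment points the same way, so any endpoint will do.
        return angle(u, a)
    if np.linalg.norm(u_p) < eps:
        # u is orthogonal to the plane P, hence to every vector in it.
        return np.pi / 2
    if lies_between(u_p, a, b):
        return angle(u, u_p)           # some segment point lies along u'
    if lies_between(-u_p, a, b):
        return angle(u, -u_p)          # some segment point lies along -u'
    # Otherwise the angle is monotonic along the segment (I hope!),
    # so the minimum is attained at an endpoint.
    return min(angle(u, a), angle(u, b))
```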
Okay, if you're still with me and have any thoughts for me, thank you!!