I need a sanity check on a problem I’m trying to solve. I know the positions of a number of moving objects at t=6 and t=7, and I know the position of the destination at t=7. My goal is to pick the “best” object, i.e. the object moving most directly towards the destination.
So for every object I’m doing the following:
I know the position of the object at t=7 is (x1, y1).
I know the position of the destination at t=7 is (x2, y2).
dot_product = x1*x2 + y1*y2
length_a = sqrt(x1*x1 + y1*y1)
length_b = sqrt(x2*x2 + y2*y2)
result_radians = acos(dot_product / (length_a*length_b))
result_degrees = result_radians * 180 / PI
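In case it helps, here is the calculation above as a small self-contained Python sketch (the function name `angle_degrees` is just mine; the variable names mirror the steps, and I clamp before acos only to guard against floating-point drift):

```python
import math

def angle_degrees(x1, y1, x2, y2):
    """Angle between vectors (x1, y1) and (x2, y2), in degrees."""
    dot_product = x1 * x2 + y1 * y2
    length_a = math.sqrt(x1 * x1 + y1 * y1)
    length_b = math.sqrt(x2 * x2 + y2 * y2)
    # Clamp to [-1, 1] so rounding error can't push acos out of its domain
    cos_theta = max(-1.0, min(1.0, dot_product / (length_a * length_b)))
    return math.degrees(math.acos(cos_theta))
```

This returns, e.g., 90 degrees for perpendicular vectors like (1, 0) and (0, 1).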
So my first question is: do you think this is correct? It doesn’t seem to give me the right answer. For example, if an object A is at (2,5) at t=6 and (2,2) at t=7, and the destination is at (4,2) at t=7, I seem to get an angle of approximately 57 degrees, when I would expect an angle of 90 degrees.
Also, am I approaching the problem correctly? After I determine each object’s angle with the destination, is it correct to say that the “best” object, i.e. the one moving towards the destination, is the one with the smallest angle?
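To make the selection step I have in mind concrete, here is a sketch in Python (the object names and angle values are made-up placeholders; assume the per-object angles in degrees have already been computed):

```python
# Hypothetical per-object angles to the destination, in degrees
angles = {"A": 57.0, "B": 12.5, "C": 88.0}

# Pick the object whose angle is smallest, i.e. most aligned with the destination
best = min(angles, key=angles.get)
```

So with these placeholder values, `best` would be "B".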
Thanks in advance.