Hi there,
For my final-year project at Uni I am coding an AI football team. The pitch provided is set up using a coordinate-based system. In order for the players to find their location on the pitch, they need to calculate it from the 'flags' that they can see. These flags have known position vectors, and each player knows how far away each flag is and its angle relative to the player.
So... to tackle this problem, what I did was create a right-angled triangle between the player, the corner flag, and the side line. From the information given, I can work out the lengths of all the sides of the triangle and all of its angles. From this information I intended to calculate the player's position by taking the flag's position vector and subtracting/adding the x and y components to find the position of the player.
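The triangle step above might be sketched like this. Note the angle convention here is an assumption on my part (the post doesn't spell it out): I take `alphaDegrees` to be the angle between the player's line of sight to the flag and the side line, so the two legs of the right triangle are the components parallel and perpendicular to that line.

```java
// Hypothetical sketch of the right-triangle decomposition, assuming
// alphaDegrees is the angle between the sight line to the flag and
// the side line (an assumption, not necessarily the original setup).
public class FlagTriangle {

    /** Lengths of the two legs of the right triangle:
     *  component along the side line, and component across it. */
    static double[] components(double distance, double alphaDegrees) {
        double a = Math.toRadians(alphaDegrees);
        double dx = distance * Math.cos(a); // leg parallel to the side line
        double dy = distance * Math.sin(a); // leg perpendicular to the side line
        return new double[] { dx, dy };
    }
}
```

Both legs come out as positive lengths here, which is exactly why the sign question below arises: the lengths alone don't tell you which side of the flag the player is on.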
This is all very well... but as I am trying to write this in Java, it is not as easy as I hoped.
The problem I have is that, since the point (0, 0) is the centre spot of the pitch, the player can easily be in a quadrant that gives him coordinates (-1, 1) while seeing a flag with coordinates (52, -34). In this scenario, the x component is calculated as 53 and the y component as 35...
This is fine, but when you do (52 - 53, -34 - 35) you get (-1, -69). To counter this, I wrote a method that computes the -- case above, as well as -+, ++ and +-. This gives me the answers (-1, -69), (-1, 1), (105, 1) and (105, -69). Now, as the bounds of the pitch are x = ±52 and y = ±34, only one of these is possible.
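The four-candidate idea above might look something like this. The class and method names are my own, not the original code; the approach is the one described: try every sign combination against the flag's position and keep the candidate that lies inside the pitch bounds, returning null if none (or more than one) does.

```java
// Sketch of the four-candidate approach: apply every +/- combination
// of the component lengths to the flag's position and keep the unique
// candidate inside the pitch bounds |x| <= 52, |y| <= 34.
public class PositionFinder {

    static final double HALF_LENGTH = 52.0; // pitch bounds: x in [-52, 52]
    static final double HALF_WIDTH  = 34.0; // pitch bounds: y in [-34, 34]

    /** Returns the unique in-bounds candidate, or null if none or ambiguous. */
    static double[] locate(double flagX, double flagY, double dx, double dy) {
        double[] found = null;
        for (int sx = -1; sx <= 1; sx += 2) {       // sign of the x component
            for (int sy = -1; sy <= 1; sy += 2) {   // sign of the y component
                double x = flagX + sx * dx;
                double y = flagY + sy * dy;
                if (Math.abs(x) <= HALF_LENGTH && Math.abs(y) <= HALF_WIDTH) {
                    if (found != null) return null; // two candidates in bounds
                    found = new double[] { x, y };
                }
            }
        }
        return found;
    }

    public static void main(String[] args) {
        // The example from the post: flag at (52, -34), components 53 and 35.
        double[] p = locate(52, -34, 53, 35);
        System.out.println(p[0] + " " + p[1]); // prints -1.0 1.0
    }
}
```

The null-on-ambiguity branch is there defensively, precisely because the question below is whether two candidates can ever land in bounds at once.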
However, would this always be the case for any position of the player on the pitch, looking at any of the corner flags? It is really important that I know this works in every case!
Thanks for any help!
P.S. Please excuse the crude Paint sketch and the unusual orientation of the graph axes. This is how it works on the server, and we just have to use it...