A 2D object has an x/y position, x/y direction, and scalar velocity.
The direction vector is at most one unit long; its x and y components each lie in the range [-1.0, +1.0].
So determining its next position is as simple as: position + (direction * velocity)
Since each component of the direction vector can be anywhere between -1.0 and +1.0, it always "points" in a direction relative to the current position.
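To make the setup concrete, here is a minimal Python sketch of that update step. The variable names and example values are my own, purely illustrative:

```python
# Hypothetical example values, just to illustrate the update rule.
position = (3.0, 4.0)    # current x/y position
direction = (0.6, 0.8)   # direction vector, at most one unit long
velocity = 5.0           # scalar velocity

# next position = position + (direction * velocity)
next_position = (position[0] + direction[0] * velocity,
                 position[1] + direction[1] * velocity)
print(next_position)  # (6.0, 8.0)
```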

With this information, how do you determine the angle between where it currently is, and where it will be, taking all four quadrants into consideration?
This is so the angle may be converted into degrees (from 0, being north, to 360, like on a compass).

What I've tried, so far:
- Find the dot product of the displacement vector with north. The displacement is the change in position described above (displacement = direction * velocity, i.e. next position minus current position)
- Take the inverse cosine (acos) of that dot product, giving an angle in radians
- Convert that angle to degrees (degrees = angle * 180.0 / PI)

... The problem here is that this only takes the first quadrant into consideration, so the resulting value only ever falls between 0 and 90 degrees.
But I want the full range of 0 to 360 degrees... How is this achieved?
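To show the symptom concretely: because cosine is symmetric, acos of the dot product with north cannot distinguish mirrored headings. This small sketch (the north vector (0, 1) and the helper name are my own, for illustration) demonstrates east and west collapsing onto the same angle:

```python
import math

def heading_via_acos(dx, dy):
    # acos of the normalized dot product with north (0, 1)
    length = math.hypot(dx, dy)
    return math.degrees(math.acos(dy / length))

# East and west produce the same result: the sign of dx is never
# consulted, and acos itself only returns values in [0, 180].
print(heading_via_acos(1.0, 0.0))   # approximately 90 (east)
print(heading_via_acos(-1.0, 0.0))  # approximately 90 (west, reported as east)
```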