Originally Posted by **SterlingM**

I have a robot that has some linear and angular velocity.

I receive odometry information, which gives the position (x, y, theta) and the velocity (linear and angular) in the robot's frame.

When the robot has linear and angular velocity, v and w, I want to predict the trajectory it is moving on. This should be a circle (or arc if we only consider part of the trajectory).

When I receive the odometry information, I start by finding the angle from the robot frame origin to its position, polar_theta. Then I find the distance of the vector from the origin to its position, polar_r. With these values, I have the current polar coordinates.

Then, I can find polar_theta' (polar theta prime) by displacing polar_theta by w*t, where t is some number of seconds. I have been using 0.25 seconds. Then to find the new x, y, and theta values in the robot's frame, I do:

x_r' = polar_r * cos(polar_theta')

y_r' = polar_r * sin(polar_theta')

theta_r' = theta_r + w*t

Then I can convert these values to the world coordinate system.
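To make the steps above concrete, here is a minimal Python sketch of the procedure as described (function names are mine, and the world-frame conversion assumes the robot frame's pose in the world, (X, Y, Theta), is known):

```python
import math

def predict_in_robot_frame(x, y, theta, w, t=0.25):
    """Displace the current position about the frame origin by w*t,
    using the polar form described in the post."""
    polar_theta = math.atan2(y, x)       # angle from frame origin to position
    polar_r = math.hypot(x, y)           # distance from frame origin to position
    polar_theta_p = polar_theta + w * t  # polar_theta displaced by w*t
    x_p = polar_r * math.cos(polar_theta_p)
    y_p = polar_r * math.sin(polar_theta_p)
    theta_p = theta + w * t
    return x_p, y_p, theta_p

def to_world(x_r, y_r, theta_r, X, Y, Theta):
    """Standard rigid transform from the robot frame to the world frame,
    assuming the robot frame sits at (X, Y) with heading Theta in the world."""
    x_w = X + x_r * math.cos(Theta) - y_r * math.sin(Theta)
    y_w = Y + x_r * math.sin(Theta) + y_r * math.cos(Theta)
    return x_w, y_w, theta_r + Theta
```

For example, a point at (1, 0) rotated by w = pi/2 rad/s for t = 1 s ends up at roughly (0, 1), as expected for a quarter turn about the origin.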

My question is: why can I not use the radius given by r = v/w? I feel like that should be the radius for the polar coordinates I am using. However, when I use that value, I get incorrect results:

x_r' = r * cos(polar_theta') //Incorrect results!

y_r' = r * sin(polar_theta') //Incorrect results!

The radius r found from v/w and the radius polar_r found as the distance from the origin to the position are different. Why does the radius r = v/w not work here? I feel like I should be using it, since the linear and angular velocity should define the circle.
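A quick numerical check (the velocity and position values here are made up, purely for illustration) shows the two radii are generally not the same quantity:

```python
import math

# Hypothetical odometry values, just to compare the two radii:
v, w = 0.5, 1.0             # linear and angular velocity
x, y = 1.0, 0.2             # position reported by odometry

r = v / w                   # radius of the circle defined by v and w
polar_r = math.hypot(x, y)  # distance from the frame origin to the position

print(r, polar_r)           # 0.5 vs. about 1.02 -- different values
```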

Any help is appreciated.