Suppose you take two unit-length steps from the origin, and suppose that the respective angles $A$ and $B$ of each step with the $x$-axis are independent random variables uniformly distributed between $0$ and $2\pi$. What's the expected distance you'll end up from the origin?

It's easy to see visually that the answer is independent of the direction of the first step; so if we assume our first step goes one unit in the $y$-direction (i.e., with $A = \pi/2$), then at the end of the second step the distance from the origin is given by
$$\sqrt{\cos^2 B + (1 + \sin B)^2} = \sqrt{2 + 2\sin B}.$$
So the expectation in question should be equal to
$$\frac{1}{2\pi}\int_0^{2\pi} \sqrt{2 + 2\sin B}\,dB.$$
To evaluate this integral directly, I had to use the online Wolfram Integrator and then apply l'Hopital's rule multiple times to messy functions. The answer I ended up with is $4/\pi$, which I'm reasonably confident is correct; but (assuming it is correct) is there a better way to go about finding the solution?
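As a quick sanity check on the value $4/\pi \approx 1.2732$ (not a derivation, just a stdlib-only Monte Carlo sketch):

```python
import math
import random

random.seed(0)

trials = 200_000
total = 0.0
for _ in range(trials):
    # Two independent unit steps at uniform random angles A and B.
    a = random.uniform(0, 2 * math.pi)
    b = random.uniform(0, 2 * math.pi)
    x = math.cos(a) + math.cos(b)
    y = math.sin(a) + math.sin(b)
    total += math.hypot(x, y)  # distance from the origin

estimate = total / trials
print(estimate, 4 / math.pi)
```

With 200,000 trials the estimate lands well within about $0.01$ of $4/\pi$, which at least supports the closed form.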

Also, and probably much harder, is there a way to calculate the expected distance from the origin after n steps? Or a way to find the probability distribution for the distance after n steps?
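Even without a closed form, the $n$-step case is easy to explore numerically. The sketch below (my own illustration, not an answer) estimates the expected distance by simulation and prints it next to $\sqrt{\pi n}/2$, the mean of the Rayleigh distribution that the distance should approach for large $n$ by the central limit theorem:

```python
import math
import random

random.seed(1)

def walk_distance(n):
    """Distance from the origin after n unit steps at uniform random angles."""
    x = y = 0.0
    for _ in range(n):
        theta = random.uniform(0, 2 * math.pi)
        x += math.cos(theta)
        y += math.sin(theta)
    return math.hypot(x, y)

def mean_distance(n, trials=50_000):
    """Monte Carlo estimate of the expected distance after n steps."""
    return sum(walk_distance(n) for _ in range(trials)) / trials

for n in (2, 10, 100):
    print(n, mean_distance(n), math.sqrt(math.pi * n) / 2)
```

For $n = 2$ the simulation reproduces $4/\pi$, and the Rayleigh approximation $\sqrt{\pi n}/2$ is already quite close by $n = 100$.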