> **Originally Posted by rn443:** Suppose you take two unit-length steps from the origin, and suppose that the respective angles A and B of each step with the x-axis are independent random variables, each uniformly distributed between 0 and $2\pi$. What's the expected distance you'll end up from the origin?

It's easy to see visually that the answer is independent of the direction of the first step, so assume the first step goes one unit in the y-direction (i.e., $A = \pi/2$). At the end of the second step, the distance from the origin is $\sqrt{(\cos B)^2 + (1 + \sin B)^2} = \sqrt{2 + 2\sin B} = \sqrt{2}\sqrt{\sin B + 1},$ so the expectation in question should be $\displaystyle \frac{\sqrt{2}}{2\pi}\int_{0}^{2\pi}\sqrt{\sin t + 1}\,\mathrm dt.$ To evaluate this integral directly, I had to use the online Wolfram Integrator and then apply l'Hôpital's rule multiple times to messy functions. The answer I ended up getting is $4/\pi$, which I'm reasonably confident is correct; but (assuming it is correct) is there a better way to go about finding the solution?
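As a quick sanity check on that value (this is just my own Monte Carlo sketch, not part of the derivation), one can sample B uniformly and average the two-step distance $\sqrt{2}\sqrt{1+\sin B}$; the sample size and seed below are arbitrary choices:

```python
import math
import random

random.seed(0)
N = 200_000  # arbitrary sample size for the Monte Carlo estimate
total = 0.0
for _ in range(N):
    b = random.uniform(0.0, 2.0 * math.pi)
    # distance after two unit steps with angle gap b: sqrt(2) * sqrt(1 + sin b)
    total += math.sqrt(2.0 * (1.0 + math.sin(b)))
estimate = total / N
print(estimate, 4 / math.pi)  # both should be close to 1.2732
```

With this many samples the estimate lands within a few thousandths of $4/\pi \approx 1.2732$, which supports the closed-form answer.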

Also, and probably much harder: is there a way to calculate the expected distance from the origin after n steps? Or a way to find the probability distribution of the distance after n steps?
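For what it's worth, here is a simulation sketch for the n-step case (my own check; the step count, trial count, and seed are arbitrary). The central-limit heuristic suggests that for large n the endpoint coordinates are approximately normal with variance $n/2$ each, so the distance is approximately Rayleigh distributed with mean $\sqrt{\pi n}/2$; the code compares the empirical mean distance against that approximation:

```python
import math
import random

random.seed(0)
n, trials = 100, 20_000  # arbitrary choices for this sketch
total = 0.0
for _ in range(trials):
    x = y = 0.0
    for _ in range(n):
        # each step: unit length, uniformly random direction
        a = random.uniform(0.0, 2.0 * math.pi)
        x += math.cos(a)
        y += math.sin(a)
    total += math.hypot(x, y)  # distance from the origin after n steps
mean_r = total / trials
rayleigh_mean = math.sqrt(math.pi * n) / 2  # CLT / Rayleigh approximation
print(mean_r, rayleigh_mean)
```

For n = 100 the two numbers agree to within a few percent, so the Rayleigh approximation seems like a reasonable large-n answer, though it says nothing about an exact formula for small n.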