Use spherical polar coordinates $\displaystyle (r, \theta, \phi)$ to show that the length of a path joining two points on a sphere of radius $\displaystyle R$ is $\displaystyle L=R\int_{\theta_1}^{\theta_2}\sqrt{1+\sin^2\theta\,\phi'(\theta)^2}\,d\theta$.

I was initially told that $\displaystyle \phi$ should be the rotation around the z-axis measured from the x-axis, and $\displaystyle \theta$ the angle up out of the x-y plane. Working it this way, I set up a triangle with hypotenuse $\displaystyle dL$ and legs $\displaystyle R\,d\theta$ and $\displaystyle |\overrightarrow{r}_{proj}|\,d\phi$. The following picture shows how I have the coordinates set up:

This gives me:

$\displaystyle |\overrightarrow{r}_{proj}|=R\cos\theta$

$\displaystyle dL^2=R^2\,d\theta^2+|\overrightarrow{r}_{proj}|^2\,d\phi^2=R^2\,d\theta^2+R^2\cos^2\theta\,d\phi^2$

Substituting $\displaystyle d\phi=\phi'(\theta)\,d\theta$:

$\displaystyle L=\int_{1}^{2}dL=R\int_{\theta_1}^{\theta_2}\sqrt{d\theta^2+\cos^2\theta\,\phi'(\theta)^2\,d\theta^2}=R\int_{\theta_1}^{\theta_2}\sqrt{1+\cos^2\theta\,\phi'(\theta)^2}\,d\theta$

This is what I was supposed to get, except with cosines instead of sines. The discrepancy disappears if $\displaystyle \theta$ is measured down from the z-axis instead of up from the x-y plane, but someone had told me to set up the coordinates as specified above. I was hoping someone could either verify that what I've done is correct or point out where I messed up. Thanks.
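For what it's worth, the two conventions can be checked numerically. The sketch below (plain NumPy, using an arbitrary sample path $\displaystyle \phi(\theta)=\theta/2$ that I made up purely for the test) integrates the latitude form with $\displaystyle \cos^2\theta$, the colatitude form with $\displaystyle \sin^2 u$ under the substitution $\displaystyle u=\pi/2-\theta$, and the raw 3D arc length of the same curve on the sphere. All three lengths should agree, which would be consistent with the cosine form being correct for the latitude convention.

```python
import numpy as np

R = 1.0

# Arbitrary sample path (hypothetical, just for this check):
# phi as a function of latitude theta, measured up from the x-y plane.
phi  = lambda t: 0.5 * t
dphi = lambda t: 0.5 * np.ones_like(t)

t = np.linspace(0.1, 1.2, 200001)      # latitude range in radians

def trap(f, x):
    """Trapezoidal rule for samples f over grid x."""
    return np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x))

# Latitude form: dL^2 = R^2 dtheta^2 + R^2 cos^2(theta) dphi^2
L_lat = trap(R * np.sqrt(1 + np.cos(t)**2 * dphi(t)**2), t)

# Colatitude u = pi/2 - theta (measured down from the z-axis):
# sin(u) = cos(theta), so the textbook sin^2 form covers the same path.
u = (np.pi / 2 - t)[::-1]              # ascending grid in u
L_colat = trap(R * np.sqrt(1 + np.sin(u)**2 * dphi(np.pi / 2 - u)**2), u)

# Direct 3D arc length of the same curve on the sphere, as a sanity check.
x = R * np.cos(t) * np.cos(phi(t))
y = R * np.cos(t) * np.sin(phi(t))
z = R * np.sin(t)
L_3d = np.sum(np.sqrt(np.diff(x)**2 + np.diff(y)**2 + np.diff(z)**2))

print(L_lat, L_colat, L_3d)            # all three should agree closely
```

The agreement of the colatitude integral follows directly from the substitution: $\displaystyle \sin(\pi/2-\theta)=\cos\theta$ and $\displaystyle du=-d\theta$, so the sine and cosine forms are the same integral written in different variables.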