Originally Posted by **jabernathy**

Hello!

I'd like to create a probability density function where the probability of observing a random variable is proportional to the distance from a circle.

Let the pair $\displaystyle (r, \theta)$ represent any location in polar coordinates and

let the pair $\displaystyle (r_1, \theta_1)$ be the centre of the circle of radius $\displaystyle R$.

The distance between any point and the centre of the circle can be found using the law of cosines: $\displaystyle d = \sqrt{r^2 + r_1^2 -2r r_1\cos(\theta - \theta_1)}$

This means the distance from the point to the closest point on the circle is $\displaystyle D = |d - R|$.
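The two formulas above can be sketched as a small helper in Python (the function and argument names are my own, not from the post):

```python
import math

def dist_to_circle(r, theta, r1, theta1, R):
    """Distance from the polar point (r, theta) to the nearest point
    on a circle of radius R centred at (r1, theta1)."""
    # Law of cosines: distance d from the point to the circle's centre.
    d = math.sqrt(r**2 + r1**2 - 2 * r * r1 * math.cos(theta - theta1))
    # The closest point on the circle lies on the ray through the centre,
    # so the distance to the circle itself is |d - R|.
    return abs(d - R)
```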

The distribution is Gaussian-shaped, so I set the pdf (up to a normalizing constant) as:

$\displaystyle p(r, \theta) = e^{-\frac{D^2}{2 \sigma^2}}$

The probability $\displaystyle P_A$ of the random variable falling in a polar region $\displaystyle A$ is found by integration:

$\displaystyle P_A = \int_{\theta_a}^{\theta_b}\int_{r_a}^{r_b} p(r, \theta)\, r\, dr\, d\theta = \int_{\theta_a}^{\theta_b}\int_{r_a}^{r_b} e^{-\frac{\left(\sqrt{r^2 + r_1^2 - 2 r r_1 \cos(\theta - \theta_1)} - R\right)^2}{2 \sigma^2}}\, r\, dr\, d\theta$
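Even without a closed form, the double integral can be evaluated numerically. A minimal sketch, assuming SciPy is available (function and parameter names are mine):

```python
import math
from scipy.integrate import dblquad

def prob_in_region(r_a, r_b, th_a, th_b, r1, theta1, R, sigma):
    """Numerically approximate P_A over [r_a, r_b] x [th_a, th_b]."""
    def integrand(r, theta):
        d = math.sqrt(r**2 + r1**2 - 2 * r * r1 * math.cos(theta - theta1))
        D = abs(d - R)
        # The factor r is the Jacobian from the polar area element r dr dtheta.
        return math.exp(-D**2 / (2 * sigma**2)) * r
    # dblquad integrates the first argument (r) over the inner limits,
    # the second argument (theta) over the outer limits.
    val, err = dblquad(integrand, th_a, th_b, r_a, r_b)
    return val
```

Note the missing normalizing constant: to get a true probability, divide by the same integral taken over all of the plane (in practice, a radius large enough that the Gaussian tail is negligible).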

But neither Maxima nor Mathematica was able to find a closed form for that integral.

I started a Taylor expansion about a fixed point, but it got ugly fast.

Any suggestions for another method of solving or approximating this integral are appreciated!