# Rayleigh distribution and unbiased estimator

• Jun 19th 2010, 01:57 PM
losm1
Rayleigh distribution and unbiased estimator
$X_1 ... X_n$ is random sample from Rayleigh distribution $f(x;\Theta)$

1. Show that $E(X^2) = 2\Theta$ and then construct an unbiased estimator of the parameter $\Theta$ based on $\sum_{k=1}^n X_k^2$

2. Estimate the parameter $\Theta$ from the following $n=10$ observations:
Code:

16.88 10.23 4.59 6.66 13.68 14.23 19.87 9.40 6.51 10.95
---

1. I have just plugged $2\Theta$ into the Rayleigh variance formula and it checks out, but I'm not sure about the correct way to construct the unbiased estimator $\hat{\Theta}$

$Var(X)=E(X^2) - E^2(X)$

$Var(X)=2\Theta - E^2(X)$

$\frac{4-\pi}{2}\Theta=2\Theta - \sqrt{\Theta\cdot\frac{\pi}{2}}^2$

$4\Theta - \pi\Theta = 2(2\Theta - \frac{\pi\Theta}{2})$

$\Theta = \Theta$
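
For completeness, $E(X^2)=2\Theta$ can also be shown directly by integration. This is a sketch assuming the usual parameterization $f(x;\Theta)=\frac{x}{\Theta}e^{-x^2/(2\Theta)}$ for $x\ge 0$ (check it against your course's form of the density):

```latex
\begin{align*}
E(X^2) &= \int_0^\infty x^2\,\frac{x}{\Theta}\,e^{-x^2/(2\Theta)}\,dx
  && \text{substitute } u = \tfrac{x^2}{2\Theta},\ du = \tfrac{x}{\Theta}\,dx \\
&= \int_0^\infty 2\Theta u\, e^{-u}\,du \\
&= 2\Theta\,\Gamma(2) = 2\Theta .
\end{align*}
```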

2. I need help with this one
• Jun 19th 2010, 03:05 PM
mr fantastic
Quote:

Originally Posted by losm1
$X_1 ... X_n$ is random sample from Rayleigh distribution $f(x;\Theta)$

1. Show that $E(X^2) = 2\Theta$ and then construct an unbiased estimator of the parameter $\Theta$ based on $\sum_{k=1}^n X_k^2$

2. Estimate the parameter $\Theta$ from the following $n=10$ observations:
Code:

16.88 10.23 4.59 6.66 13.68 14.23 19.87 9.40 6.51 10.95
---

1. I have just plugged $2\Theta$ into the Rayleigh variance formula and it checks out, but I'm not sure about the correct way to construct the unbiased estimator $\hat{\Theta}$

$Var(X)=E(X^2) - E^2(X)$

$Var(X)=2\Theta - E^2(X)$

$\frac{4-\pi}{2}\Theta=2\Theta - \sqrt{\Theta\cdot\frac{\pi}{2}}^2$

$4\Theta - \pi\Theta = 2(2\Theta - \frac{\pi\Theta}{2})$

$\Theta = \Theta$

2. I need help with this one

Use the unbiased estimator found in part 1!
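
Since part 1 gives $E(X^2)=2\Theta$, the estimator $\hat{\Theta}=\frac{1}{2n}\sum_k X_k^2$ has expectation $\Theta$. A minimal Python sketch applying it to the ten observations (variable names are mine, not from the problem):

```python
# Unbiased estimator of Theta for the Rayleigh parameterization with
# E(X^2) = 2*Theta:  theta_hat = sum(x_k^2) / (2n).
data = [16.88, 10.23, 4.59, 6.66, 13.68, 14.23, 19.87, 9.40, 6.51, 10.95]

n = len(data)
theta_hat = sum(x ** 2 for x in data) / (2 * n)
print(round(theta_hat, 2))  # 74.51
```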
• Jun 20th 2010, 03:07 AM
losm1
First, how do I know that it is unbiased?
Second, should I basically just calculate $E(X^2)$ from the given dataset?

Thanks
• Jun 20th 2010, 03:26 AM
SpringFan25
An estimator is unbiased if its expected value equals the true value of the parameter being estimated, so you can confirm the estimator is unbiased by taking its expectation.

So, assuming your estimator was
$\hat{\Theta} = \frac{\sum x_k^2}{2n}$
$E(\hat{\Theta}) = E \left(\frac{\sum x_k^2}{2n} \right)$
$E(\hat{\Theta}) = \frac{1}{2n} E \left(\sum x_k^2 \right)$
$E(\hat{\Theta}) = \frac{1}{2n} \sum E(x_k^2)$
$E(\hat{\Theta}) = \frac{1}{2n} \cdot n\,E(X^2)$
$E(\hat{\Theta}) = \tfrac{1}{2} E(X^2) = \tfrac{1}{2} \cdot 2\Theta$
$E(\hat{\Theta}) = \Theta$

And the estimator is unbiased.
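
As a sanity check, the unbiasedness can also be verified by simulation. Here is a minimal sketch (my own code, not from the thread) that draws Rayleigh samples with $\Theta = 1$ via inverse-transform sampling, using the fact that $X = \sqrt{-2\Theta \ln U}$ is Rayleigh-distributed when $U$ is uniform on $(0,1)$:

```python
import math
import random

random.seed(1)

def rayleigh_sample(n, theta=1.0):
    # Inverse transform: X = sqrt(-2 * Theta * ln U) satisfies E(X^2) = 2 * Theta.
    return [math.sqrt(-2.0 * theta * math.log(random.random())) for _ in range(n)]

def theta_hat(xs):
    # Unbiased estimator from part 1: sum(x^2) / (2n).
    return sum(x * x for x in xs) / (2 * len(xs))

trials = 20000
n = 5
avg = sum(theta_hat(rayleigh_sample(n)) for _ in range(trials)) / trials
print(avg)  # close to 1.0 even for small n
```

The average of $\hat{\Theta}$ over many trials should sit near the true $\Theta$ even for small $n$, which is exactly what unbiasedness promises.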

I have been told, but I never managed to prove, that all method-of-moments estimators are unbiased and you don't need to check each time. But I'm not convinced...
• Oct 27th 2013, 09:49 AM
dbooksta
Re: Rayleigh distribution and unbiased estimator
Guys, what am I missing here? I ran Monte Carlo simulations for Rayleigh samples and this estimator does not come close for small $n$. For example, with $\sigma = 1.0$ and sample sizes of 2, the average value of the estimate is 0.50 (over 1MM iterations). Even with sample sizes of 10 the estimate is still only 0.90! Shouldn't an unbiased estimator work better than this for small $n$?