Thread: Rayleigh distribution and unbiased estimator

1. Rayleigh distribution and unbiased estimator

$\displaystyle X_1, \ldots, X_n$ is a random sample from a Rayleigh distribution $\displaystyle f(x;\Theta)$.

1. Show that $\displaystyle E(X^2) = 2\Theta$ and then construct an unbiased estimator of the parameter $\displaystyle \Theta$ based on $\displaystyle \sum_{k=1}^n X_k^2$.

2. Estimate the parameter $\displaystyle \Theta$ from the following $\displaystyle n=10$ observations:
Code:
16.88 10.23 4.59 6.66 13.68 14.23 19.87 9.40 6.51 10.95
---

1. I have just plugged $\displaystyle 2\Theta$ into the Rayleigh variance formula and it checks out, but I'm not sure about the correct way of constructing the unbiased estimator $\displaystyle \hat{\Theta}$:

$\displaystyle Var(X)=E(X^2) - E^2(X)$

$\displaystyle Var(X)=2\Theta - E^2(X)$

$\displaystyle \frac{4-\pi}{2}\Theta=2\Theta - \left(\sqrt{\frac{\pi\Theta}{2}}\right)^2$

$\displaystyle 4\Theta - \pi\Theta = 2(2\Theta - \frac{\pi\Theta}{2})$

$\displaystyle \Theta = \Theta$
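For what it's worth, the identity $\displaystyle E(X^2) = 2\Theta$ can also be spot-checked numerically. A minimal sketch, assuming the parameterization $\displaystyle \Theta = \sigma^2$ (so NumPy's `scale` argument is $\sqrt{\Theta}$) and a hypothetical value $\Theta = 3$:

```python
import numpy as np

# Sanity check of E(X^2) = 2*Theta by simulation.
# Assumed parameterization: Theta = sigma^2, so NumPy's `scale` is sqrt(Theta).
rng = np.random.default_rng(0)
theta = 3.0                               # hypothetical true parameter
x = rng.rayleigh(scale=np.sqrt(theta), size=1_000_000)

print(x.mean() ** 2)                      # should be near pi*Theta/2
print((x ** 2).mean())                    # should be near 2*Theta
```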

2. I need help with this one

2. Originally Posted by losm1
$\displaystyle X_1, \ldots, X_n$ is a random sample from a Rayleigh distribution $\displaystyle f(x;\Theta)$ [...]

2. I need help with this one
Use the unbiased estimator found in part 1!

3. First, how do I know that it is unbiased?
Second, should I basically just calculate $\displaystyle E(X^2)$ from the given dataset?

Thanks
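For the second part, the plug-in computation with the thread's data can be sketched as follows (using the estimator $\displaystyle \hat{\Theta} = \sum_{k=1}^n X_k^2 / (2n)$ based on part 1):

```python
# Plug-in estimate of Theta from the n = 10 observations in the thread,
# using theta_hat = sum(x_k^2) / (2n).
data = [16.88, 10.23, 4.59, 6.66, 13.68, 14.23, 19.87, 9.40, 6.51, 10.95]

n = len(data)
theta_hat = sum(x ** 2 for x in data) / (2 * n)
print(theta_hat)   # roughly 74.5
```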

4. An estimator is unbiased if its expected value equals the true value of the parameter being estimated. So you can confirm the estimator is unbiased by taking its expectation.

$\displaystyle \hat{\Theta} = \frac{\sum{X_k^2}}{2n}$
$\displaystyle E(\hat{\Theta}) = E \left(\frac{\sum{X_k^2}}{2n} \right)$
$\displaystyle E(\hat{\Theta}) = \frac{1}{2n} E \left(\sum{X_k^2} \right)$
$\displaystyle E(\hat{\Theta}) = \frac{1}{2n} \sum{E(X_k^2)}$
$\displaystyle E(\hat{\Theta}) = \frac{1}{2n} \cdot n\,E(X^2)$
$\displaystyle E(\hat{\Theta}) = \frac{1}{2} E(X^2) = \frac{1}{2} \cdot 2\Theta$
$\displaystyle E(\hat{\Theta}) = \Theta$

And the estimator is unbiased.
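The expectation argument above can also be spot-checked by simulation for several small sample sizes. A sketch, again assuming $\displaystyle \Theta = \sigma^2$ (so NumPy's `scale` is $\sqrt{\Theta}$):

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 1.0                                  # assumed true Theta = sigma^2
for n in (2, 5, 10):
    # 200,000 replicates of a size-n Rayleigh sample
    x = rng.rayleigh(scale=np.sqrt(theta), size=(200_000, n))
    est = (x ** 2).sum(axis=1) / (2 * n)     # theta_hat for each replicate
    print(n, est.mean())                     # stays near theta even for n = 2
```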

I have been told, but I never managed to prove it, that all method-of-moments estimators are unbiased, so you don't need to check each time. But I'm not convinced...

5. Re: Rayleigh distribution and unbiased estimator

Guys, what am I missing here? I ran Monte Carlo simulations for Rayleigh samples, and this estimator does not come close for small n. For example, with sigma = 1.0 and sample sizes of 2, the average value of the estimate is 0.50 (over 1 million iterations). Even with sample sizes of 10 the estimate is still only 0.90! Shouldn't an unbiased estimator work better than this for small n?
