Rayleigh distribution and unbiased estimator

Apr 2010
\(\displaystyle X_1, \ldots, X_n\) is a random sample from a Rayleigh distribution \(\displaystyle f(x;\Theta)\)

1. Show that \(\displaystyle E(X^2) = 2\Theta\) and then construct an unbiased estimator of the parameter \(\displaystyle \Theta\) based on \(\displaystyle \sum_{k=1}^n X_k^2\)

2. Estimate the parameter \(\displaystyle \Theta\) from the following \(\displaystyle n=10\) observations:
Code:
16.88 10.23 4.59 6.66 13.68 14.23 19.87 9.40 6.51 10.95
---

1. I have just plugged \(\displaystyle 2\Theta\) into the Rayleigh variance formula and it checks out, but I'm not sure about the correct way to construct the unbiased estimator \(\displaystyle \hat{\Theta}\)

\(\displaystyle Var(X)=E(X^2) - E^2(X)\)

\(\displaystyle Var(X)=2\Theta - E^2(X)\)

\(\displaystyle \frac{4-\pi}{2}\Theta=2\Theta - \left(\sqrt{\Theta\cdot\frac{\pi}{2}}\right)^2\)

\(\displaystyle 4\Theta - \pi\Theta = 2(2\Theta - \frac{\pi\Theta}{2})\)

\(\displaystyle \Theta = \Theta \)
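As an extra sanity check, \(\displaystyle E(X^2) = 2\Theta\) can also be verified by simulation. A minimal Python sketch, assuming the parameterization \(\displaystyle f(x;\Theta) = \frac{x}{\Theta}e^{-x^2/(2\Theta)}\), i.e. \(\displaystyle \Theta = \sigma^2\) for the usual Rayleigh scale \(\displaystyle \sigma\) (this is an assumption, but it is the one consistent with \(\displaystyle E(X^2) = 2\Theta\)):

```python
import math
import random

# Sanity check of E(X^2) = 2*Theta by simulation, assuming the
# parameterization f(x; Theta) = (x/Theta) * exp(-x^2 / (2*Theta)),
# i.e. Theta = sigma^2 for the usual Rayleigh scale sigma.
rng = random.Random(0)
sigma = 2.0
theta = sigma ** 2              # Theta = 4, so E(X^2) should be about 8

# Inverse-CDF sampling: the Rayleigh CDF is F(x) = 1 - exp(-x^2 / (2 sigma^2)),
# so if U ~ Uniform(0,1), then sigma * sqrt(-2 ln(1 - U)) is Rayleigh(sigma).
n = 200_000
samples = [sigma * math.sqrt(-2.0 * math.log(1.0 - rng.random())) for _ in range(n)]

mean_x2 = sum(x * x for x in samples) / n
print(mean_x2)                  # close to 2*theta = 8
```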


2. I need help with this one
 

mr fantastic

MHF Hall of Fame
Dec 2007
Use the unbiased estimator found in part 1!
 
Apr 2010
First, how do I know that it is unbiased?
Second, should I basically just estimate \(\displaystyle E(X^2)\) from the given dataset?

Thanks
 
May 2010
An estimator is unbiased if its expected value equals the true value of the parameter being estimated. So you can confirm that the estimator is unbiased by taking its expectation.

So, assuming your estimator is
\(\displaystyle \hat{\Theta} = \frac{\sum x_i^2}{2n}\)
\(\displaystyle E(\hat{\Theta}) = E \left(\frac{\sum x_i^2}{2n} \right)\)
\(\displaystyle E(\hat{\Theta}) = \frac{1}{2n} E \left(\sum x_i^2 \right)\)
\(\displaystyle E(\hat{\Theta}) = \frac{1}{2n} \sum E(x_i^2)\)
\(\displaystyle E(\hat{\Theta}) = \frac{1}{2n} \cdot n E(X^2)\)
\(\displaystyle E(\hat{\Theta}) = \frac{1}{2} E(X^2) = \frac{1}{2} \cdot 2\Theta = \Theta\)

And the estimator is unbiased.

I have been told, but I never managed to prove, that all method-of-moments estimators are unbiased, so you don't need to check each time. But I'm not convinced...
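Applying the unbiased estimator to the ten observations in part 2 is then just arithmetic; a quick sketch:

```python
# Part 2: evaluate the unbiased estimator Theta_hat = sum(x^2) / (2n)
# on the ten observations from the problem statement.
data = [16.88, 10.23, 4.59, 6.66, 13.68, 14.23, 19.87, 9.40, 6.51, 10.95]
n = len(data)
theta_hat = sum(x * x for x in data) / (2 * n)
print(round(theta_hat, 4))  # 74.5053
```

So \(\displaystyle \hat{\Theta} \approx 74.51\).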
 
Oct 2013
Guys, what am I missing here? I ran Monte Carlo simulations for Rayleigh samples, and this estimator does not come close for small n. For example, with sigma = 1.0 and sample sizes of 2, the average value of the estimate is 0.50 (over 1MM iterations). Even with sample sizes of 10, the estimate averages only 0.90! Shouldn't an unbiased estimator do better than this for small n?
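The estimator \(\displaystyle \hat{\Theta} = \frac{\sum x^2}{2n}\) is unbiased at every \(\displaystyle n\) (the proof above makes no large-sample approximation), so a simulation that averages it should land on \(\displaystyle \Theta\) even for \(\displaystyle n = 2\). A minimal pure-Python check, using inverse-CDF Rayleigh sampling with \(\displaystyle \sigma = 1\) so \(\displaystyle \Theta = \sigma^2 = 1\) (this is my own sketch, not the code from the simulation described above):

```python
import math
import random

# Monte Carlo check of the average value of Theta_hat = sum(x^2) / (2n),
# with sigma = 1.0 so the true Theta = sigma^2 = 1.0.
rng = random.Random(0)

def rayleigh(sigma):
    # Inverse-CDF sampling: X = sigma * sqrt(-2 ln(1 - U)), U ~ Uniform(0,1)
    return sigma * math.sqrt(-2.0 * math.log(1.0 - rng.random()))

def theta_hat(xs):
    return sum(x * x for x in xs) / (2 * len(xs))

reps = 50_000
averages = {}
for n in (2, 10):
    total = 0.0
    for _ in range(reps):
        total += theta_hat([rayleigh(1.0) for _ in range(n)])
    averages[n] = total / reps
    print(n, round(averages[n], 2))  # both averages land near 1.0
```

Incidentally, 0.50 at \(\displaystyle n = 2\) and 0.90 at \(\displaystyle n = 10\) are exactly \(\displaystyle (n-1)/n\), which suggests the quantity being averaged differs from \(\displaystyle \sum x^2/(2n)\); perhaps a sample-mean subtraction or an \(\displaystyle n-1\) normalization crept into the simulation. Just a guess, of course.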