hello there,
I need some help with this question.
$\displaystyle X_1,\dots,X_n \sim f(x,\theta)$
$\displaystyle f(x,\theta)=\exp\{-(x-\theta)\}\exp\{-\exp\{-(x-\theta)\}\}$
(see the attached picture)
Find the Cramér-Rao lower bound for an unbiased estimator of $\displaystyle \theta$.
cheers !
Hello,
Assuming the $\displaystyle X_i$ are independent... This is how I've been taught; your course may present it a bit differently...
The joint pdf of the $\displaystyle X_i$ is:
$\displaystyle g(x,\theta)=\prod_{i=1}^n \exp\left(-(x_i-\theta)-\exp\left(-(x_i-\theta)\right)\right)=\prod_{i=1}^n \exp\left(\theta-x_i-\exp\left(\theta-x_i\right)\right)$
$\displaystyle =\exp\left(\sum_{i=1}^n [\theta-x_i-e^{\theta-x_i}]\right)$
$\displaystyle =\exp\left(n\theta-\sum_{i=1}^n x_i-\sum_{i=1}^n e^{\theta-x_i}\right)$
$\displaystyle \Rightarrow \log(g(x,\theta))=n\theta-\sum_{i=1}^n x_i-\sum_{i=1}^n e^{\theta-x_i}$
$\displaystyle \Rightarrow \frac{\partial \log g(x,\theta)}{\partial \theta}=n-\sum_{i=1}^n e^{\theta-x_i}=n-e^{\theta} \sum_{i=1}^n e^{-x_i}$
The maximum-likelihood estimator $\displaystyle \hat{\theta}$ is a solution of $\displaystyle \left.\frac{\partial \log g(x,\theta)}{\partial \theta}\right|_{\theta=\hat{\theta}}=0$
Thus $\displaystyle e^{\hat{\theta}}\sum_{i=1}^n e^{-x_i}=n \Rightarrow \hat{\theta}=\log\left(\frac{n}{\sum_{i=1}^n e^{-x_i}}\right)$
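As a quick sanity check on that formula: the pdf above is a Gumbel distribution with location $\displaystyle \theta$ and scale 1, so we can simulate it with NumPy and see whether $\displaystyle \hat{\theta}$ lands near the true value (the choice $\theta=2$ and the sample size are mine, just for illustration):

```python
import numpy as np

# f(x, theta) = exp(-(x-theta)) * exp(-exp(-(x-theta))) is a Gumbel
# distribution with location theta and scale 1.
rng = np.random.default_rng(0)
theta_true = 2.0  # assumed value, for illustration only
n = 100_000
x = rng.gumbel(loc=theta_true, scale=1.0, size=n)

# The MLE derived above: theta_hat = log(n / sum(exp(-x_i)))
theta_hat = np.log(n / np.sum(np.exp(-x)))
print(theta_hat)  # should be close to theta_true for large n
```

With $n$ this large, the printed estimate sits within a few thousandths of the true $\theta$.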
Wow...now, that's ugly !
I've been taught to find Fisher's information $\displaystyle I(\theta)$ and a function $h$ such that $\displaystyle h(\theta)=\mathbb{E}(\hat{\theta})$
(if it's an unbiased estimator, then h=Id)
And then the Cramér-Rao lower bound is $\displaystyle \frac{[h'(\theta)]^2}{I(\theta)}$
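For completeness (still assuming the $X_i$ are i.i.d. from this pdf), the bound actually comes out very cleanly. Differentiating the score once more:

$\displaystyle \frac{\partial^2 \log g(x,\theta)}{\partial \theta^2}=-\sum_{i=1}^n e^{\theta-x_i} \Rightarrow I(\theta)=\mathbb{E}\left(\sum_{i=1}^n e^{\theta-X_i}\right)=n\,\mathbb{E}\left(e^{\theta-X_1}\right)=n$

since $\displaystyle Y=e^{-(X_1-\theta)}$ has a standard exponential distribution under this pdf, so $\displaystyle \mathbb{E}(Y)=1$. Hence for an unbiased estimator ($h=$ Id, $h'=1$) the Cramér-Rao lower bound is $\displaystyle \frac{1}{n}$.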