1. ## Cramér-Rao

hello there,

I need some help with this question.

$X_1,\dots,X_n \sim f(x,\theta)$, where

$f(x,\theta)=\exp\{-(x-\theta)\}\exp\{-\exp\{-(x-\theta)\}\}$

(see the picture attached)

Find the Cramér-Rao lower bound for an unbiased estimator of $\theta$.

cheers !

2. Hello,

Assuming the $X_i$ are independent... This is how I've been taught; your course may present it differently.

We have the joint pdf of the $X_i$ equal to:

$g(x,\theta)=\prod_{i=1}^n \exp\left(-(x_i-\theta)-\exp\left(-(x_i-\theta)\right)\right)=\prod_{i=1}^n \exp\left(\theta-x_i-\exp\left(\theta-x_i\right)\right)$

$=\exp\left(\sum_{i=1}^n [\theta-x_i-e^{\theta-x_i}]\right)$

$=\exp\left(n\theta-\sum_{i=1}^n x_i-\sum_{i=1}^n e^{\theta-x_i}\right)$

$\Rightarrow \log(g(x,\theta))=n\theta-\sum_{i=1}^n x_i-\sum_{i=1}^n e^{\theta-x_i}$

$\Rightarrow \frac{\partial \log g(x,\theta)}{\partial \theta}=n-\sum_{i=1}^n e^{\theta-x_i}=n-e^{\theta} \sum_{i=1}^n e^{-x_i}$

The maximum likelihood estimator $\hat{\theta}$ is a solution of $\left.\frac{\partial \log g(x,\theta)}{\partial \theta}\right|_{\theta=\hat{\theta}}=0$

Thus $e^{\hat{\theta}}\sum_{i=1}^n e^{-x_i}=n \Rightarrow \hat{\theta}=\log\left(\frac{n}{\sum_{i=1}^n e^{-x_i}}\right)$
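It's not as ugly as it looks, actually. As a quick numerical sanity check of that closed form (a sketch using NumPy, whose standard Gumbel sampler with `scale=1.0` has exactly this density with location $\theta$):

```python
import numpy as np

rng = np.random.default_rng(42)
theta = 2.0   # true location parameter (arbitrary choice for the check)
n = 10_000

# Sample from f(x, theta) = exp(-(x - theta)) * exp(-exp(-(x - theta))),
# i.e. the standard Gumbel distribution with location theta and scale 1.
x = rng.gumbel(loc=theta, scale=1.0, size=n)

# The closed-form MLE derived above: theta_hat = log(n / sum(exp(-x_i)))
theta_hat = np.log(n / np.sum(np.exp(-x)))
print(theta_hat)   # should land close to theta for large n
```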

Wow...now, that's ugly !

I've been taught to find the Fisher information $I(\theta)$ and a function $h$ such that $h(\theta)=\mathbb{E}(\hat{\theta})$
(if the estimator is unbiased, then $h=\mathrm{Id}$).

And then the Cramér-Rao lower bound is $\frac{[h'(\theta)]^2}{I(\theta)}$
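The Fisher information can be computed explicitly here. A sketch of the remaining steps (starting from the score already derived above; assuming no sign slips):

```latex
% Differentiate the score once more:
\frac{\partial^2 \log g(x,\theta)}{\partial \theta^2} = -\sum_{i=1}^n e^{\theta-x_i}

% Fisher information (expectation of minus the second derivative):
I(\theta) = -\mathbb{E}\!\left[\frac{\partial^2 \log g(X,\theta)}{\partial \theta^2}\right]
          = n\,\mathbb{E}\!\left[e^{\theta-X_1}\right]

% Let U = e^{-(X_1-\theta)}. Using the cdf F(x) = \exp(-e^{-(x-\theta)}):
%   P(U \le u) = P(X_1 \ge \theta - \log u) = 1 - F(\theta - \log u) = 1 - e^{-u},
% so U \sim \mathrm{Exp}(1) and \mathbb{E}[U] = 1. Hence
I(\theta) = n, \qquad \text{CRLB} = \frac{1}{I(\theta)} = \frac{1}{n}
```

So, if this is right, the bound for an unbiased estimator is simply $1/n$.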

3. It is ugly!

Does anyone know how to finish it?
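A simulation may help finish it off. Assuming the Fisher information for this model works out to $I(\theta)=n$ (so the bound would be $1/n$), the variance of the MLE from post 2 should be close to $1/n$; a minimal Monte Carlo sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 1.0   # true parameter (arbitrary)
n = 200       # sample size per replication
reps = 2000   # number of Monte Carlo replications

# Replicate the closed-form MLE theta_hat = log(n / sum(exp(-x_i)))
# over many samples and compare its variance to the conjectured bound 1/n.
estimates = np.empty(reps)
for r in range(reps):
    x = rng.gumbel(loc=theta, scale=1.0, size=n)
    estimates[r] = np.log(n / np.sum(np.exp(-x)))

var_hat = estimates.var(ddof=1)
print(var_hat, 1 / n)   # the two should be of comparable size
```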