Find the Fisher information

Sep 2009
\(\displaystyle X \text{ has distribution } \mathrm{Exp}(\lambda), \quad p(x\mid\lambda)=\lambda e^{-\lambda x}, \quad \text{and} \quad \phi=\Pr(X>x\mid\lambda). \)

Compute the Fisher information for phi.


I think \(\displaystyle \phi=e^{-\lambda x} \)

and therefore \(\displaystyle p(x\mid\phi)=\phi\,\frac{\log{\phi}}{-x}, \quad L(x,\phi)=\log{\phi}+\log{\frac{\log{\phi}}{-x}}, \)

but from there the calculation becomes too complex for me.
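
One possible shortcut (not the direct route above) would be the change-of-variables identity \(I(\phi)=I(\lambda)\left(\frac{d\lambda}{d\phi}\right)^2\), using \(I(\lambda)=1/\lambda^2\) for the exponential. Here is a rough sympy sketch of that check; I'm writing the fixed threshold as x0 to keep it separate from the observation, so the variable names are only illustrative:

[code]
import sympy as sp

# phi = P(X > x0) = exp(-lambda*x0) for a fixed threshold x0
# (x0 is an illustrative name, kept separate from the observation x).
lam, x0, phi = sp.symbols('lambda x0 phi', positive=True)

# Fisher information of Exp(lambda) in the lambda-parametrization: I(lambda) = 1/lambda^2.
I_lam = 1 / lam**2

# Invert the reparametrization: phi = exp(-lambda*x0)  =>  lambda = -log(phi)/x0.
lam_of_phi = -sp.log(phi) / x0
dlam_dphi = sp.diff(lam_of_phi, phi)      # = -1/(x0*phi)

# Change-of-variables identity: I(phi) = I(lambda) * (d lambda / d phi)^2.
I_phi = sp.simplify(I_lam.subs(lam, lam_of_phi) * dlam_dphi**2)
print(I_phi)                              # expected: 1/(phi**2*log(phi)**2)
[/code]

If that is right, the answer should come out to \(\displaystyle I(\phi)=\frac{1}{\phi^2(\log\phi)^2}\), which at least gives something to check the direct differentiation against.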
 
Nov 2009
You want the second derivative w.r.t. phi?
 
Nov 2009
\(\displaystyle L(x,\phi)=\log{\phi}+\log{\log{\phi}}\underbrace{\,-\,\log{(-x)}}\)
\(\displaystyle \color{red}{\text{[EDIT] You have to incorporate the } -\log{(-x)} \text{ part, since } \phi=e^{-\lambda x} \implies x = \frac{\log\phi}{-\lambda}.}\)

\(\displaystyle \frac{d}{d\phi}L(x,\phi)=\frac{1}{\phi}+\frac{1}{\phi}\cdot\frac{1}{\log{\phi}}\color{red}{+...}\)

\(\displaystyle \frac{d^2}{d\phi^2}L(x,\phi)=-\frac{1}{\phi^2}+\left\{\frac{1}{\phi}\cdot\frac{d}{d\phi}[\frac{1}{\log{\phi}}] + \frac{1}{\log{\phi}}\cdot\frac{d}{d\phi}[\frac{1}{\phi}] \right\}\color{red}{+...}\)

...
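
Carrying the remaining \(+...\) terms through and taking \(-E[\,\cdot\,]\) at the end, everything should collapse to \(\displaystyle \frac{1}{\phi^2(\log\phi)^2}\). Here is a small sympy sketch of that check, done with the threshold written as a separate fixed symbol x0 (so \(\lambda=-\log\phi/x_0\)); the names are illustrative, not part of the original problem statement:

[code]
import sympy as sp

# X is the observation, x0 the fixed threshold (illustrative name),
# and lambda = -log(phi)/x0 so that phi = P(X > x0).
X, x0, phi = sp.symbols('X x0 phi', positive=True)
lam = -sp.log(phi) / x0

# Log-likelihood of one observation, written in terms of phi.
L = sp.log(lam) - lam * X

# Second derivative w.r.t. phi, then minus the expectation using E[X] = 1/lambda.
d2L = sp.diff(L, phi, 2)
I_phi = sp.simplify(-d2L.subs(X, 1 / lam))
print(I_phi)                              # expected: 1/(phi**2*log(phi)**2)
[/code]

The \(\log\phi\) pieces from the two terms cancel against each other once \(E[X]=1/\lambda\) is substituted, which is why the messy intermediate expressions simplify in the end.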
 