# Thread: Cramér-Rao lower bound for the Binomial distribution

1. ## Cramér-Rao lower bound for the Binomial distribution

I have a random variable X with a $\displaystyle Bin(12,\theta)$ distribution.

I take n independent samples of this r.v., each with a $\displaystyle Bin(12,\theta)$ distribution, and I want to find the Fisher information and hence the Cramér-Rao lower bound.

So, obviously, I have the pmf of the binomial distribution:

$\displaystyle p(x) = {n \choose x} \theta^x (1-\theta)^{n-x}$

So the first step would be to take the log of the pmf, correct?

If so, how do I do this? How do I take the log of $\displaystyle {n \choose x}$?

2. Hello,

$\displaystyle {n\choose x}=\frac{n!}{x!(n-x)!}$

So if you take the log, that gives something like $\displaystyle \log {n\choose x}=\sum_{k=2}^n \log(k)-\sum_{k=2}^x \log(k)-\sum_{k=2}^{n-x} \log(k)$

But actually, the pmf of X is $\displaystyle f(x)={12 \choose x} \theta^x (1-\theta)^{12-x}$
Then the "global" (joint) pmf of the sample of n observations is $\displaystyle \prod_{i=1}^n f(x_i)$
And then you have to take the log of this product.
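The product-to-sum step can be sketched numerically: the log of the joint pmf is the sum of the individual log-pmfs. A minimal Python check (the sample values and $\theta = 0.4$ are made up for illustration):

```python
import math

def binom_logpmf(x, n, theta):
    """Log of the Bin(n, theta) pmf at x."""
    return (math.log(math.comb(n, x))
            + x * math.log(theta)
            + (n - x) * math.log(1.0 - theta))

def sample_loglik(xs, theta, n=12):
    """Log-likelihood of an i.i.d. Bin(n, theta) sample:
    the log of the product is the sum of the per-observation log-pmfs."""
    return sum(binom_logpmf(x, n, theta) for x in xs)

# verify log(product of pmfs) == sum of log-pmfs on a toy sample
xs = [3, 7, 5]
theta = 0.4
prod = math.prod(math.comb(12, x) * theta**x * (1 - theta)**(12 - x)
                 for x in xs)
```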

But I feel like there'll be a problem when differentiating, since it's a discrete distribution... and googling "discrete Fisher information" didn't yield anything interesting. Maybe you know...

I hope that helps.

3. Oh sorry, I left out that I only need to get the Fisher information for a single observation X. So I don't need the "global pmf", right? And then I can just take the log of the $\displaystyle Bin(12,\theta)$ pmf and differentiate that...

And my eventual answer for the Fisher information is

$\displaystyle I(\theta) = \frac{n^2}{\theta} + \frac{12n - n^2\theta}{(1 - \theta)^2}$

EDIT: Hang on, I know what n is... so sticking n = 12 in gives...

$\displaystyle I(\theta) = 144(\frac{1}{\theta} + \frac{1}{1 - \theta})$

Is this right?

4. Yes, n=12.

Oh, ****** !!!
$\displaystyle {12 \choose x}$ is a constant with respect to $\displaystyle \theta$, so there is no problem for differentiation.

Sorry for any confusion I could've caused...

Your Fisher info is weird... How did you get it?

After taking the logarithm (the binomial coefficient is constant in $\displaystyle \theta$), we have:

$\displaystyle \log f(x,\theta)=\log{12 \choose x}+x\log \theta+(12-x) \log(1-\theta)$

After differentiating once, we get
$\displaystyle \frac{\partial}{\partial \theta} \log f(x,\theta)=\frac x\theta-\frac{12-x}{1-\theta}$

After differentiating twice, we get
$\displaystyle \frac{\partial^2}{\partial \theta^2} \log f(x,\theta)=-\frac{x}{\theta^2}-\frac{12-x}{(1-\theta)^2}$

Under some conditions (which I think hold here), $\displaystyle \mathcal{I}(\theta)=-\mathbb{E}\left[\frac{\partial^2}{\partial \theta^2} \log f(X,\theta)\right]$

Which gives, if I'm not mistaken, $\displaystyle 12 \left(\frac 1\theta+\frac{1}{1-\theta}\right)$
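As a sanity check, that expectation can be computed directly by summing $\displaystyle \frac{x}{\theta^2}+\frac{12-x}{(1-\theta)^2}$ against the pmf over the 13 possible values of X. A minimal Python sketch (the test value $\theta = 0.3$ is arbitrary, chosen only for illustration):

```python
import math

N = 12  # number of trials in Bin(12, theta)

def pmf(x, theta):
    """Bin(N, theta) probability mass function at x."""
    return math.comb(N, x) * theta**x * (1 - theta)**(N - x)

def fisher_info_single(theta):
    """i(theta) = -E[ d^2/dtheta^2 log f(X, theta) ], computed by
    direct summation of x/theta^2 + (N - x)/(1 - theta)^2 over the pmf."""
    return sum(pmf(x, theta) * (x / theta**2 + (N - x) / (1 - theta)**2)
               for x in range(N + 1))

theta = 0.3
closed_form = N / theta + N / (1 - theta)  # 12(1/theta + 1/(1-theta))
```

Since $\mathbb{E}[X] = 12\theta$, the direct sum reduces to $12/\theta + 12/(1-\theta)$, so the two quantities should agree to floating-point precision.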

5. Originally Posted by Moo

$\displaystyle {12 \choose x}$ is a constant with respect to $\displaystyle \theta$, so there is no problem for differentiation. [...] Which gives, if I'm not mistaken, $\displaystyle 12 \left(\frac 1\theta+\frac{1}{1-\theta}\right)$
I figured out eventually that the $\displaystyle {12 \choose x}$ plays no part in the differentiation, lol!

It's odd, because I've been given the definition of Fisher information with an extra step, involving $\displaystyle i(\theta)$. Sometimes I see that step omitted, but here it says exactly the following:

The Fisher information about a real parameter $\displaystyle \theta$ in the independent sample $\displaystyle X_{1},...,X_{n}$ is given by $\displaystyle \mathcal{I}(\theta) = ni(\theta)$, where

$\displaystyle i(\theta)=\mathbb{E}\left[-\frac{\partial^2}{\partial \theta^2} \log f(X,\theta)\right]$

and $\displaystyle f(X;\theta)$ is the probability density or mass function of a single observation X.

Hence that's where I got the 144(...) rather than the 12(...): I multiplied in an extra n. Am I not supposed to? In which case, is there some extra condition I have to take into account regarding that definition I've been given?

EDIT: Oh no, my mistake. The question specifically asks me for $\displaystyle i(\theta)$, so the 12(...) would be correct.
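To see the bound itself in action: with $\mathcal{I}(\theta)=n\,i(\theta)=12n/(\theta(1-\theta))$, the Cramér-Rao lower bound is $1/\mathcal{I}(\theta)=\theta(1-\theta)/(12n)$, and the natural estimator $\hat\theta=\bar X/12$ attains it. A Monte Carlo sketch (the sample size, $\theta$, and replication count are arbitrary choices for illustration, not from the thread):

```python
import random
import statistics

random.seed(0)
N_TRIALS = 12   # trials per binomial observation
n = 50          # sample size (hypothetical)
theta = 0.3     # true parameter (hypothetical)
REPS = 5000     # Monte Carlo replications

def draw_binom(n_trials, p):
    """One Bin(n_trials, p) draw as a sum of Bernoulli trials."""
    return sum(random.random() < p for _ in range(n_trials))

# the estimator theta_hat = sample mean / 12, replicated many times
estimates = []
for _ in range(REPS):
    xs = [draw_binom(N_TRIALS, theta) for _ in range(n)]
    estimates.append(sum(xs) / (N_TRIALS * n))

crlb = theta * (1 - theta) / (N_TRIALS * n)  # 1 / (n * i(theta))
var_hat = statistics.pvariance(estimates)
# var_hat should be close to crlb, since the estimator attains the bound
```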

Thanks so much for your help

6. Yes, sorry, there's some confusion between the i and the I.
I've always dealt with $\displaystyle \mathcal{I}(\theta)$ and the "global pdf" (not sure what else to call it),
and kept that notation for your situation, where there was only a single pmf, with $\displaystyle i(\theta)$.

I'm glad you got there... painfully, though, because of all those mistakes and misunderstandings!

