
Cramér–Rao lower bound for the Binomial distribution

  1. #1
    Zenter (Junior Member)

    Cramér–Rao lower bound for the Binomial distribution

    I have a random variable X with a Bin(12,\theta) distribution.

    I take n independent samples of this r.v., each with distribution Bin(12,\theta), and I want to find the Fisher information and hence the Cramér–Rao lower bound.

    So, obviously, I have the pmf of the Bin dist:

    p(x) = {n \choose x} \theta^x (1-\theta)^{n-x}

    So the first step would be to take the log of the pmf, correct?

    If so, how do I do this? How do I take the log of {n \choose x}?

  2. #2
    Moo
    Hello,

    {n\choose x}=\frac{n!}{x!(n-x)!}

    So if you take the log, that gives \log {n \choose x}=\sum_{k=2}^n \log k-\sum_{k=2}^x \log k-\sum_{k=2}^{n-x} \log k
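
    A quick numerical check of that identity, as a minimal Python sketch (the variable values are arbitrary examples, not from the thread):

        from math import comb, log

        n, x = 12, 5  # example values; any 0 <= x <= n works
        lhs = log(comb(n, x))
        rhs = (sum(log(k) for k in range(2, n + 1))          # log n!
               - sum(log(k) for k in range(2, x + 1))        # minus log x!
               - sum(log(k) for k in range(2, n - x + 1)))   # minus log (n-x)!
        print(abs(lhs - rhs) < 1e-12)  # True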


    But actually, the pmf of X is f(x)={12 \choose x} \theta^x (1-\theta)^{12-x}
    Then the joint pmf of the sample of n independent observations (the "global pmf") is \prod_{i=1}^n f(x_i)
    And then you have to take the log of this product.
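
    Spelling that step out, the log turns the product into a sum:

    \log \prod_{i=1}^n f(x_i)=\sum_{i=1}^n \log f(x_i)=\sum_{i=1}^n \log {12 \choose x_i}+\left(\sum_{i=1}^n x_i\right)\log \theta+\left(\sum_{i=1}^n (12-x_i)\right)\log(1-\theta)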

    But I feel like there might be a problem with differentiating, since it's a discrete distribution... and googling "discrete Fisher information" didn't yield anything interesting. Maybe you know...

    I hope that helps.

  3. #3
    Zenter (Junior Member)
    Oh, sorry, I left out that I only need the Fisher information for a single observation X. So I don't need the "global pmf", right? Then I can just take the log of the Bin(12,\theta) pmf and differentiate that...

    And my eventual answer for the Fisher information is

    I(\theta) = \frac{n^2}{\theta} + \frac{12n - n^2\theta}{(1 - \theta)^2}

    EDIT: Hang on, I know what n is... so substituting n = 12 gives...

    I(\theta) = 144\left(\frac{1}{\theta} + \frac{1}{1 - \theta}\right)

    Is this right?

  4. #4
    Moo
    Yes, n=12.

    Oh, ****** !!!
    {12 \choose x} is a constant with respect to \theta, so there is no problem with differentiation.

    Sorry for any confusion I may have caused...

    Your Fisher info is weird... How did you get it?

    After taking the logarithm, we have:

    \log f(x,\theta)=\log {12 \choose x}+x\log \theta+(12-x) \log(1-\theta)

    After differentiating once, we get
    \frac{\partial}{\partial \theta} \log f(x,\theta)=\frac{x}{\theta}-\frac{12-x}{1-\theta}

    After differentiating twice, we get
    \frac{\partial^2}{\partial \theta^2} \log f(x,\theta)=-\frac{x}{\theta^2}-\frac{12-x}{(1-\theta)^2}

    Under some conditions (which I think hold here), \mathcal{I}(\theta)=-\mathbb{E}\left[\frac{\partial^2}{\partial \theta^2} \log f(X,\theta)\right]

    Which gives, since \mathbb{E}[X]=12\theta and if I'm not mistaken, \frac{12\theta}{\theta^2}+\frac{12-12\theta}{(1-\theta)^2}=12 \left(\frac 1\theta+\frac{1}{1-\theta}\right)=\frac{12}{\theta(1-\theta)}
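
    As a sanity check of that value, here is a minimal Python sketch (the function names are mine, not from the thread) that computes the expectation exactly by summing over the 13 possible outcomes of X ~ Bin(12,\theta):

        from math import comb

        def binom_pmf(x, n, theta):
            # P(X = x) for X ~ Bin(n, theta)
            return comb(n, x) * theta**x * (1 - theta)**(n - x)

        def single_obs_info(theta, n=12):
            # i(theta) = -E[ d^2/dtheta^2 log f(X, theta) ],
            # computed exactly by summing over the n+1 possible outcomes
            return sum(binom_pmf(x, n, theta)
                       * (x / theta**2 + (n - x) / (1 - theta)**2)
                       for x in range(n + 1))

        theta = 0.3
        print(single_obs_info(theta))      # 57.142857...
        print(12 / (theta * (1 - theta)))  # 57.142857..., i.e. 12(1/theta + 1/(1-theta))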

  5. #5
    Zenter (Junior Member)
    Quote Originally Posted by Moo View Post
    (quoting post #4 above)
    I figured out eventually that the {12 \choose x} plays no part in the differentiation, lol!

    It's odd, because the definition of Fisher information I've been given has an extra step: it involves i(\theta). Sometimes I see that step left out, but here it says exactly the following:

    The Fisher information about a real parameter  \theta in the independent sample X_{1},...,X_{n} is given by \mathcal{I}(\theta) = ni(\theta), where

    i(\theta)=\mathbb{E}\left[-\frac{\partial^2}{\partial \theta^2} \log f(X,\theta)\right]


    and f(X,\theta) is the probability density or mass function of a single observation X.
    That's where my 144(...) rather than 12(...) came from: I multiplied in an extra factor of n. Am I not supposed to? If not, is there some extra condition I have to take into account in that definition I've been given?

    EDIT: Oh no, my mistake. The question specifically asks me for i(\theta), so the 12(...) would be correct.
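
    (And to connect this back to the Cramér–Rao bound in the title, if I have this right: for a sample of n independent observations, \mathcal{I}(\theta)=n\,i(\theta)=\frac{12n}{\theta(1-\theta)}, so any unbiased estimator \hat\theta satisfies \operatorname{Var}(\hat\theta) \geq \frac{1}{\mathcal{I}(\theta)} = \frac{\theta(1-\theta)}{12n}.)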

    Thanks so much for your help

  6. #6
    Moo
    Yes, sorry, there's some confusion between the i and the \mathcal{I}.
    I've always dealt with \mathcal{I}(\theta) and the joint pdf of the whole sample (the "global pdf"),
    and I kept that notation for your situation, where it was only a single pmf, with i(\theta).

    I'm glad you got there... painfully though, because of all those mistakes and misunderstandings!
