
Math Help - Information Entropy Problem

  1. #1
    Newbie
    Joined
    Sep 2011
    Posts
    4

    Information Entropy Problem

    Can anyone show me a proof of how the logarithmic series on the left becomes the expression on the right-hand side? Kindly see the attached file. Thanks in advance for any help.

    regards

    Attached Files
    Last edited by CaptainBlack; September 5th 2011 at 06:33 AM.

  2. #2
    MHF Contributor Siron's Avatar
    Joined
    Jul 2011
    From
    Norway
    Posts
    1,250
    Thanks
    20

    Re: Logarithmic series

    Are there any summation bounds given?

  3. #3
    Newbie
    Joined
    Sep 2011
    Posts
    4

    Re: Logarithmic series

    Quote Originally Posted by Siron View Post
    Are there any summation bounds given?
    s is not the base in the attached file. s^n is a positive real number, and the summation runs from 1 to a finite positive term.

  4. #4
    MHF Contributor Siron's Avatar
    Joined
    Jul 2011
    From
    Norway
    Posts
    1,250
    Thanks
    20

    Re: Logarithmic series

    Quote Originally Posted by mathstudentcpm View Post
    s is not the base in the attached file. s^n is a positive real number, and the summation runs from 1 to a finite positive term.
    \sum_{n=1}^{k} \frac{1}{s^2}\log_s(n)=\log_s(n)

    Is this what you have to prove?

  5. #5
    Newbie
    Joined
    Sep 2011
    Posts
    4

    Re: Logarithmic series

    Quote Originally Posted by Siron View Post
    \sum_{n=1}^{k} \frac{1}{s^2}\log_s(n)=\log_s(n)

    Is this what you have to prove?
    Not exactly. OK, I have attached the file mutu_inf_best.pdf (it's a research paper). Kindly see the second page of the paper; I have marked the expression with a double-headed arrow ("hanging on forum"). That's the thing I have to prove.
    Attached Files

  6. #6
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4

    Information Entropy Problem

    Quote Originally Posted by mathstudentcpm View Post
    Not exactly. OK, I have attached the file mutu_inf_best.pdf (it's a research paper). Kindly see the second page of the paper; I have marked the expression with a double-headed arrow ("hanging on forum"). That's the thing I have to prove.
    There is no variable of summation; it cannot be n or s, as those appear on the right as well as under the summation sign. You need to look at the meanings of the symbols, not the formula, to see why this is true.

    You are interested in messages of length n in an alphabet with s symbols; there are s^n such messages.

    Then, as by assumption each message is equally likely:

    p_i=\frac{1}{s^n},\quad i=1,\dots,s^n

    So the entropy becomes:

    H=-\sum_{i=1}^{s^n} \frac{1}{s^n} \log\left(\frac{1}{s^n}\right)=\sum_{i=1}^{s^n} \frac{1}{s^n} \log\left(s^n\right)=s^n\left(\frac{1}{s^n} \log(s^n)\right)

    .... = \log(s^n)

    CB
    Last edited by CaptainBlack; September 5th 2011 at 06:37 AM.
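
    For anyone who wants to sanity-check the result above numerically, here is a minimal Python sketch; the helper name uniform_entropy and the example values s = 4, n = 5 are illustrative assumptions, not from the thread or the attached paper:

    [code]
    import math

    def uniform_entropy(num_outcomes, base=math.e):
        # Shannon entropy of a uniform distribution over num_outcomes outcomes:
        # every p_i = 1/N, so H = -sum_i p_i*log(p_i) = -N*(1/N)*log(1/N) = log(N).
        p = 1.0 / num_outcomes
        return -num_outcomes * p * math.log(p, base)

    s, n = 4, 5        # alphabet size s, message length n (illustrative values)
    N = s ** n         # number of equally likely messages of length n
    H = uniform_entropy(N)
    print(H, math.log(N))                      # both equal log(s^n)
    assert math.isclose(H, n * math.log(s))    # log(s^n) = n*log(s)
    [/code]

    With base=2 the same computation gives n*log2(s), i.e. log2(s) bits per symbol.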

  7. #7
    Newbie
    Joined
    Sep 2011
    Posts
    4

    Re: Information Entropy Problem

    Quote Originally Posted by CaptainBlack View Post
    There is no variable of summation; it cannot be n or s, as those appear on the right as well as under the summation sign. You need to look at the meanings of the symbols, not the formula, to see why this is true.

    You are interested in messages of length n in an alphabet with s symbols; there are s^n such messages.

    Then, as by assumption each message is equally likely:

    p_i=\frac{1}{s^n},\quad i=1,\dots,s^n

    So the entropy becomes:

    H=-\sum_{i=1}^{s^n} \frac{1}{s^n} \log\left(\frac{1}{s^n}\right)=\sum_{i=1}^{s^n} \frac{1}{s^n} \log\left(s^n\right)=s^n\left(\frac{1}{s^n} \log(s^n)\right)

    .... = \log(s^n)

    CB
    Thanks Guru!

