
Math Help - Entropy

  1. #1
    Newbie
    Joined
    Oct 2008
    Posts
    3

    Entropy

    Hi. I am a little confused about the entropy problem.

    How many bits of information are required to express 32 equally probable alternatives? Is it
    2^(5*32) = 2^160
    160 bits of information


    Is the entropy 5 or 2.32?

    Thanks!

  2. #2
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4
    Quote Originally Posted by frozz View Post
    Hi. I am a little confused about the entropy problem.

    How many bits of information are required to express 32 equally probable alternatives? Is it
    2^(5*32) = 2^160
    160 bits of information


    Is the entropy 5 or 2.32?

    Thanks!
    5 bits are needed for 32 equally likely messages.

Binary entropy is:

H(x) = -\sum_{i=1}^{32} p(x_i)\log_2(p(x_i)) = ???

where p(x_i) = 1/32 for i = 1, \dots, 32

    RonL
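The sum above can be checked numerically; a minimal sketch in Python (standard library only), plugging p(x_i) = 1/32 into RonL's formula:

```python
import math

# 32 equally probable alternatives, each with probability 1/32
p = [1 / 32] * 32

# Shannon entropy in bits: H = -sum_i p(x_i) * log2(p(x_i))
H = -sum(pi * math.log2(pi) for pi in p)

print(H)  # 5.0
```

Since log2(1/32) = -5 exactly, every one of the 32 terms contributes 5/32 bit, and the sum comes out to exactly 5 bits, matching the answer above.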

  3. #3
    Newbie
    Joined
    Oct 2008
    Posts
    3

    Maximum entropy

    Thank you so much. Can I ask a little further?

    Is the maximum entropy of the scenario above

    H = - ([0.2 (log 0.2)] * 5)
    = -[0.2(-2.32)*5]
    = 2.32
    ?
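The arithmetic in that calculation can be verified numerically. Note that it uses p = 0.2 over 5 terms, which is the entropy of 5 equally likely alternatives (log2 5 ≈ 2.32), whereas the 32-alternative scenario above has 32 terms with p = 1/32. A quick check in Python:

```python
import math

# The posted calculation: H = -(5 * 0.2 * log2(0.2))
H = -(5 * 0.2 * math.log2(0.2))
print(round(H, 2))  # 2.32

# This value equals log2(5), the entropy of 5 equally likely
# outcomes. For the 32-outcome case the analogous sum gives
# log2(32) = 5 bits.
print(math.log2(32))  # 5.0
```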

