Hi. I am a little confused about an entropy problem. How many bits of information are required to express 32 equally probable alternatives? Is it 2^(5*32) = 2^160, i.e. 160 bits of information? Is the entropy 5 or 2.32? Thanks!
Originally Posted by frozz: Hi. I am a little confused about an entropy problem. How many bits of information are required to express 32 equally probable alternatives? Is it 2^(5*32) = 2^160, i.e. 160 bits of information? Is the entropy 5 or 2.32? Thanks! 5 bits are needed for 32 equally likely messages. The binary entropy is: $\displaystyle H(x)=-\sum_{i=1}^{32}p(x_i)\log_2(p(x_i))=\log_2(32)=5$ where $\displaystyle p(x_i)=1/32; \ i=1, \ ..\ ,\ 32$ RonL
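As a quick numerical check of the sum above (a minimal sketch using Python's standard library), summing all 32 terms with p(x_i) = 1/32 gives 5 bits:

```python
import math

# Entropy of 32 equally likely alternatives: p(x_i) = 1/32 for each i.
p = 1 / 32
H = -sum(p * math.log2(p) for _ in range(32))
print(H)  # 5.0, i.e. log2(32)
```

Each term contributes -(1/32) * log2(1/32) = 5/32, and 32 such terms sum to 5.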
Thank you so much. Can I ask a little further? Is the maximum entropy of the scenario above H = -([0.2 (log 0.2)] * 5) = -[0.2 * (-2.32) * 5] = 2.32?