# Information Entropy Problem

• Sep 4th 2011, 11:59 AM
mathstudentcpm
Information Entropy Problem
Can anyone show me how the logarithmic series on the left becomes the expression on the right-hand side? Kindly see the attached file. Thanks in advance for any help.

regards

• Sep 4th 2011, 12:17 PM
Siron
Re: Logarithmic series
Are there any summation bounds given?
• Sep 5th 2011, 01:53 AM
mathstudentcpm
Re: Logarithmic series
Quote:

Originally Posted by Siron
Are there any summation bounds given?

s is not the base in the attached file; s^n is a positive real number, and the summation runs from 1 to a finite positive bound.
• Sep 5th 2011, 02:08 AM
Siron
Re: Logarithmic series
Quote:

Originally Posted by mathstudentcpm
s is not the base in the attached file; s^n is a positive real number, and the summation runs from 1 to a finite positive bound.

$\displaystyle \sum_{n=1}^{k} \frac{1}{s^2}\log_s(n)=\log_s(n)$

Is this what you have to prove?
• Sep 5th 2011, 03:54 AM
mathstudentcpm
Re: Logarithmic series
Quote:

Originally Posted by Siron
$\displaystyle \sum_{n=1}^{k} \frac{1}{s^2}\log_s(n)=\log_s(n)$

Is this what you have to prove?

Not exactly. I have attached the file (mutu_inf_best.pdf); it is a research paper. Kindly see the second page, where I have marked the expression with a double-headed arrow ("hanging on forum"). That is the thing I have to prove.
• Sep 5th 2011, 05:16 AM
CaptainBlack
Information Entropy Problem
Quote:

Originally Posted by mathstudentcpm
Not exactly. I have attached the file (mutu_inf_best.pdf); it is a research paper. Kindly see the second page, where I have marked the expression with a double-headed arrow ("hanging on forum"). That is the thing I have to prove.

There is no variable of summation; it cannot be $\displaystyle n$ or $\displaystyle s$, as they appear on the right-hand side as well as under the summation sign. You need to look at the meanings of the symbols, not the formula, to see why this is true.

You are interested in messages of length $\displaystyle n$ in an alphabet with $\displaystyle s$ symbols, there are $\displaystyle s^n$ such messages.

Then as by assumption each message is equally likely:

$\displaystyle p_i=\frac{1}{s^n},\ i=1...s^n$

So the entropy becomes:

$\displaystyle H=-\sum_{i=1}^{s^n} \frac{1}{s^n} \log\left(\frac{1}{s^n}\right)=\sum_{i=1}^{s^n} \frac{1}{s^n} \log\left({s^n}\right)=s^n\left(\frac{1}{s^n} \log({s^n} )\right)$

.... $\displaystyle = \log(s^n)$
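The derivation can also be checked numerically. The sketch below (with small illustrative values of $s$ and $n$, not taken from the paper) computes the Shannon entropy of the uniform distribution over the $s^n$ equally likely messages and compares it with the closed form $\log(s^n) = n\log s$:

```python
import math

# Illustrative values: alphabet of s symbols, messages of length n.
s, n = 3, 4
N = s ** n          # number of equally likely messages
p = 1.0 / N         # probability of each message

# Shannon entropy H = -sum_i p_i log(p_i) over all s^n messages
H = -sum(p * math.log(p) for _ in range(N))

# Closed form from the derivation above: H = log(s^n) = n*log(s)
assert math.isclose(H, math.log(s ** n))
assert math.isclose(H, n * math.log(s))
```

Since every term of the sum is the same constant $\frac{1}{s^n}\log(s^n)$, the sum is just that constant times the number of terms, which is exactly the step marked with the arrow.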

CB
• Sep 5th 2011, 10:01 AM
mathstudentcpm
Re: Information Entropy Problem
Quote:

Originally Posted by CaptainBlack
There is no variable of summation; it cannot be $\displaystyle n$ or $\displaystyle s$, as they appear on the right-hand side as well as under the summation sign. You need to look at the meanings of the symbols, not the formula, to see why this is true.

You are interested in messages of length $\displaystyle n$ in an alphabet with $\displaystyle s$ symbols, there are $\displaystyle s^n$ such messages.

Then as by assumption each message is equally likely:

$\displaystyle p_i=\frac{1}{s^n},\ i=1...s^n$

So the entropy becomes:

$\displaystyle H=-\sum_{i=1}^{s^n} \frac{1}{s^n} \log\left(\frac{1}{s^n}\right)=\sum_{i=1}^{s^n} \frac{1}{s^n} \log\left({s^n}\right)=s^n\left(\frac{1}{s^n} \log({s^n} )\right)$

.... $\displaystyle = \log(s^n)$

CB

Thanks Guru!