1. ## Information Entropy Problem

Can anyone show me a proof of how the logarithmic series on the left becomes the expression on the right-hand side? Kindly see the attached file. Thanks in advance for any help.

regards


2. ## Re: Logarithmic series

Are there any summation bounds given?

3. ## Re: Logarithmic series

Originally Posted by Siron
Are there any summation bounds given?
$s$ is not the base in the attached file; $s^n$ is a positive real number, and the summation runs from 1 to a finite positive bound.

4. ## Re: Logarithmic series

Originally Posted by mathstudentcpm
$s$ is not the base in the attached file; $s^n$ is a positive real number, and the summation runs from 1 to a finite positive bound.
$\sum_{n=1}^{k} \frac{1}{s^2}\log_s(n)=\log_s(n)$

Is this what you have to prove?

5. ## Re: Logarithmic series

Originally Posted by Siron
$\sum_{n=1}^{k} \frac{1}{s^2}\log_s(n)=\log_s(n)$

Is this what you have to prove?
Not exactly. I have attached the file (mutu_inf_best.pdf); it's a research paper. Kindly see the second page of the paper, where I have marked the expression with a double-headed arrow ("hanging on forum"). That's the thing I have to prove.

6. ## Information Entropy Problem

Originally Posted by mathstudentcpm
Not exactly. I have attached the file (mutu_inf_best.pdf); it's a research paper. Kindly see the second page of the paper, where I have marked the expression with a double-headed arrow ("hanging on forum"). That's the thing I have to prove.
There is no variable of summation; it cannot be $n$ or $s$, as those appear on the right as well as under the summation sign. You need to look at the meanings of the symbols, not the formula, to see why this is true.

You are interested in messages of length $n$ in an alphabet with $s$ symbols, there are $s^n$ such messages.

Then, since by assumption each message is equally likely:

$p_i=\frac{1}{s^n},\ i=1,\dots,s^n$

So the entropy becomes:

$H=-\sum_{i=1}^{s^n} \frac{1}{s^n} \log\left(\frac{1}{s^n}\right)=\sum_{i=1}^{s^n} \frac{1}{s^n} \log\left(s^n\right)=s^n\left(\frac{1}{s^n} \log\left(s^n\right)\right)$

.... $= \log(s^n)$
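The derivation above can be checked numerically. A minimal sketch (not from the attached paper; the names `entropy`, `s`, and `n` are illustrative): compute the Shannon entropy of a uniform distribution over the $s^n$ equally likely messages and compare it with $\log(s^n) = n\log s$.

```python
import math

def entropy(probabilities):
    """Shannon entropy H = -sum_i p_i * log(p_i), in nats."""
    return -sum(p * math.log(p) for p in probabilities if p > 0)

s, n = 3, 4                       # alphabet of s symbols, messages of length n
num_messages = s ** n             # there are s^n equally likely messages
p = [1 / num_messages] * num_messages

H = entropy(p)
# The sum collapses to s^n identical terms, giving H = log(s^n) = n*log(s).
assert abs(H - math.log(s ** n)) < 1e-9
```

The choice of logarithm base only rescales $H$; the identity $H=\log(s^n)$ holds in any base as long as the same base is used on both sides.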

CB

7. ## Re: Information Entropy Problem

Originally Posted by CaptainBlack
There is no variable of summation; it cannot be $n$ or $s$, as those appear on the right as well as under the summation sign. You need to look at the meanings of the symbols, not the formula, to see why this is true.

You are interested in messages of length $n$ in an alphabet with $s$ symbols, there are $s^n$ such messages.

Then, since by assumption each message is equally likely:

$p_i=\frac{1}{s^n},\ i=1,\dots,s^n$

So the entropy becomes:

$H=-\sum_{i=1}^{s^n} \frac{1}{s^n} \log\left(\frac{1}{s^n}\right)=\sum_{i=1}^{s^n} \frac{1}{s^n} \log\left(s^n\right)=s^n\left(\frac{1}{s^n} \log\left(s^n\right)\right)$

.... $= \log(s^n)$

CB
Thanks Guru!