# Thread: "Information function" help? Thanks

1. ## "Information function" help? Thanks

Here's the question
---------------------------------

Let I be the unit interval, i.e. $I = \{x\ |\ 0 \leqslant x \leqslant 1\}$, and let $I^N$ be the N-fold cross product of I with itself, i.e., the unit N-box. Let P be that subset of $I^N$ consisting of points with positive coordinates summing to 1, i.e.

$P=\{(p_{1},p_{2},...,p_{N})\in I^N\ |\ \sum\limits_{i=1}^N p_{i}=1,\ p_{i}>0\}$

Let R be the real numbers and define the information function $H : P \to R$ by

$H(p_{1},p_{2},...,p_{N})=-\sum\limits_{i=1}^{N} p_{i}\log{p_{i}}$

H gives the amount of information in a communication system with N alternative messages where the ith message is transmitted with probability pi. (Interesting fact: If the base of the logarithm is chosen to be 2, then the unit of information is the bit, and corresponds to the amount of information in one yes-no question.) Prove that if all the messages are transmitted with equal probability, then the amount of information is equal to log N.

--------------

Here's what I did

Let $p_{i}=\frac{1}{N}$

$-\sum\limits_{i=1}^{N} p_{i}\log{p_{i}}=[(-\frac{1}{2}\log\frac{1}{2})+(-\frac{1}{3}\log\frac{1}{3})+...+(-\frac{1}{N}\log\frac{1}{N})]$

$=[(\log 2^{\frac{1}{2}})+(\log 3^{\frac{1}{3}})+...+(\log N^{\frac{1}{N}})]$

Then I got stuck here.
Did I head in the wrong direction?
How should I solve this?
Thank you.

2. At this point, you might use the fact that log(a) + log(b) = log(ab).

3. For $p_{i}= \frac{1}{N}$ we have $\ln p_{i} = - \ln N$, so that...

$\displaystyle H= - \sum_{i=1}^{N} p_{i}\ \ln p_{i} = \ln N\ \sum_{i=1}^{N} \frac{1}{N} = \ln N$ (1)

Kind regards

$\chi$ $\sigma$

4. Originally Posted by mrsi
Here's the question
---------------------------------

Let I be the unit interval, i.e. $I = \{x\ |\ 0 \leqslant x \leqslant 1\}$, and let $I^ N$ be the N-fold cross product of I with itself, i.e., the unit N-box. Let P be that subset of $I^ N$ consisting of points with positive coordinates summing to 1, i.e.

$P=\{(p_{1},p_{2},...,p_{N})\ |\ \sum\limits_{i=1}^Np_i=1,\ p_{i}>0\}$

Let R be the real numbers and define the information function $H : P \to R$ by

$H(p_{1},p_{2},...,p_{N})=-\sum\limits_{i=1}^{N} p_{i}\log{p_{i}}$

[I have modified that to correct several typos.]

H gives the amount of information in a communication system with N alternative messages where the ith message is transmitted with probability pi. (Interesting fact: If the base of the logarithm is chosen to be 2, then the unit of information is the bit, and corresponds to the amount of information in one yes-no question.) Prove that if all the messages are transmitted with equal probability, then the amount of information is equal to log N.

--------------

Here's what I did

Let $p_{i}=\frac{1}{N}$

$-\sum\limits_{i=1}^{N} p_{i}\log{p_{i}}=[(-\frac{1}{2}\log\frac{1}{2})+(-\frac{1}{3}\log\frac{1}{3})+...+(-\frac{1}{N}\log\frac{1}{N})]$
That last line is wrong [edit: as chisigma has just pointed out]. If each $p_i$ is equal to 1/N then the sum consists of N terms, each of which is equal to $-\frac1N\log\frac1N$. (In other words, instead of terms with 2,3,...,N, each term should be the same as the last term.) Then the sum is equal to $N\bigl(-\frac1N\log\frac1N\bigr) = -\log \frac1N = \log N$.
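If it helps to see the identity numerically, here is a quick sketch in Python (the function name `H` and the choice N = 8 are mine, just for illustration):

```python
import math

def H(ps, base=math.e):
    """Information function H(p1,...,pN) = -sum of p_i * log(p_i)."""
    return -sum(p * math.log(p, base) for p in ps)

N = 8
uniform = [1.0 / N] * N  # all messages equally probable: p_i = 1/N

# Uniform probabilities give H = log N, in whatever base the log uses.
print(abs(H(uniform) - math.log(N)) < 1e-12)  # True (natural log)
print(abs(H(uniform, 2) - 3) < 1e-12)         # True: log2(8) = 3 bits
```

With the base-2 logarithm this also illustrates the "interesting fact" from the problem statement: 8 equally likely messages carry 3 bits, i.e. three yes-no questions' worth of information.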