# Relative increase in entropy is always less than relative increase in maximum entropy

• Aug 3rd 2010, 07:02 PM
EmpSci
Hello,

Suppose we have a set of $\displaystyle N$ states whose probabilities are estimated by a frequency approach (relative frequencies, with $N$ the grand total of counts), and consider the following entropy function:

$\displaystyle H_1 = -\sum_{i=1}^{N}p_i \log_2 p_i .$

In this case, the maximal entropy is $\displaystyle H_{{max}_{1}}=\log_2 N$.
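(For reference, this maximum follows from a standard one-line Jensen's inequality argument, not part of the original question: by concavity of $\displaystyle \log_2$,

$\displaystyle H_1 = \sum_{i=1}^{N} p_i \log_2 \frac{1}{p_i} \le \log_2 \sum_{i=1}^{N} p_i \cdot \frac{1}{p_i} = \log_2 N,$

with equality iff $\displaystyle p_i = 1/N$ for all $i$, i.e. the uniform distribution.)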

Now suppose we increase the number of states from $\displaystyle N$ to $\displaystyle M$, $\displaystyle M>N$, and re-evaluate the probabilities. The entropy function becomes:

$\displaystyle H_2 = -\sum_{i=1}^{M}q_i \log_2 q_i .$

Now, the maximal entropy is $\displaystyle H_{{max}_{2}}=\log_2 M$.

Based on information theory, increasing the number of states (from $N$ to $M$) will increase the entropy. My question concerns the claim in the title of this thread: does the following inequality hold?

$\displaystyle \frac{H_2 - H_1}{H_1} < \frac{H_{{max}_{2}}-H_{{max}_{1}}}{H_{{max}_{1}}}$

That is, is the relative increase in entropy always less than the relative increase in the maximum entropy when the number of states increases?
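The two sides of the inequality are easy to compare numerically before attempting a proof. Below is a small sketch that evaluates both sides for one concrete pair of distributions; the particular vectors `p` and `q` are my own illustrative choices, not from the question:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical example distributions (chosen for illustration):
# start with N = 2 states, then extend to M = 4 states.
p = [0.9, 0.1]                 # N = 2 states, fairly concentrated
q = [0.25, 0.25, 0.25, 0.25]   # M = 4 states, uniform

H1, H2 = entropy(p), entropy(q)
Hmax1, Hmax2 = math.log2(len(p)), math.log2(len(q))

lhs = (H2 - H1) / H1             # relative increase in entropy
rhs = (Hmax2 - Hmax1) / Hmax1    # relative increase in maximum entropy

print(f"LHS = {lhs:.3f}, RHS = {rhs:.3f}, inequality holds: {lhs < rhs}")
```

Trying several choices of `p` and `q` this way is a quick sanity check: note that when $H_1$ is very small (a highly concentrated initial distribution), the left-hand side can become arbitrarily large, which is worth keeping in mind before looking for a general proof.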

Thanks.