Hello,

Suppose we have a set of N states whose probabilities p_i are estimated by a frequency approach (each count divided by the grand total), and the entropy is given by:

H_1 = -\sum_{i=1}^{N} p_i \log_2 p_i .

In this case, the maximal entropy is H_{\max_1} = \log_2 N.
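Just to spell out where this comes from: the maximum is attained at the uniform distribution p_i = 1/N, which gives

H_{\max_1} = -\sum_{i=1}^{N} \frac{1}{N} \log_2 \frac{1}{N} = \log_2 N .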


Suppose we now increase the number of states from N to M, with M > N, and re-evaluate the probabilities. The entropy becomes:


H_2 = -\sum_{i=1}^{M} q_i \log_2 q_i .

Now, the maximal entropy is H_{\max_2} = \log_2 M.

From information theory, increasing the number of states from N to M increases the entropy. My question concerns the claim in the title of this thread: does the following inequality hold?

\frac{H_2 - H_1}{H_1} < \frac{H_{\max_2} - H_{\max_1}}{H_{\max_1}}

That is, is the relative increase in entropy less than the relative increase in maximum entropy when the number of states is increased?
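In case it helps to experiment, here is a minimal numerical sketch in Python. It assumes the M-state description is a refinement of the N-state one (each coarse state split into M/N finer states), which is how I read the frequency setup; the values N = 4, M = 8 and the random counts are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy_bits(p):
    """Shannon entropy in bits (zero-probability states contribute nothing)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

N, M = 4, 8            # illustrative choice; M must be a multiple of N here
trials = 10_000
violations = 0

for _ in range(trials):
    # Random positive counts for the M fine states (frequency approach).
    counts = rng.integers(1, 100, size=M)
    q = counts / counts.sum()               # M-state probabilities
    p = q.reshape(N, -1).sum(axis=1)        # merge groups of M//N fine states into N coarse states
    H1, H2 = entropy_bits(p), entropy_bits(q)
    lhs = (H2 - H1) / H1
    rhs = (np.log2(M) - np.log2(N)) / np.log2(N)
    if lhs >= rhs:
        violations += 1

print(f"cases where (H2-H1)/H1 >= (Hmax2-Hmax1)/Hmax1: {violations} of {trials}")
```

Running this over many random count vectors only probes the claim empirically, of course; it does not prove anything either way.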

Thanks.