Relative increase in entropy is always less than relative increase in maximum entropy
Suppose we have a set of states whose probabilities $p_i$ are calculated by a frequency approach (relative counts), where $N$ is the grand total number of states, and the entropy function is given by
$$H_N = -\sum_{i=1}^{N} p_i \log p_i.$$
In this case, the maximal entropy is $\log N$, attained when all states are equally likely ($p_i = 1/N$).
Now, suppose we increase the number of states from $N$ to $M$, $M > N$, and we re-evaluate the probabilities $q_i$. Now, we get the following entropy function:
$$H_M = -\sum_{i=1}^{M} q_i \log q_i.$$
Now, the maximal entropy is $\log M$.
Based on information theory, increasing the number of states (from $N$ to $M$) increases the maximal entropy. My question is the conclusion given as the title of this thread: does the following inequality hold,
$$\frac{H_M - H_N}{H_N} \le \frac{\log M - \log N}{\log N}?$$
That is, is the relative increase in entropy less than the relative increase in maximum entropy when the number of states increases?
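To make the question concrete, here is a minimal Python sketch that computes both relative increases for one frequency-based example; the counts (`counts_N`, `counts_M`) are made up for illustration and are not from the original post:

```python
import math

def entropy(probs):
    """Shannon entropy in nats; zero-probability states contribute nothing."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical frequency data: N = 4 states observed with these counts.
counts_N = [10, 20, 30, 40]
p = [c / sum(counts_N) for c in counts_N]
H_N = entropy(p)
H_N_max = math.log(len(counts_N))  # maximal entropy log N (uniform case)

# Hypothetical re-evaluated counts after increasing to M = 6 states.
counts_M = [10, 20, 15, 15, 20, 20]
q = [c / sum(counts_M) for c in counts_M]
H_M = entropy(q)
H_M_max = math.log(len(counts_M))  # maximal entropy log M

# The two relative increases compared in the question.
rel_entropy = (H_M - H_N) / H_N
rel_max = (H_M_max - H_N_max) / H_N_max
print(f"relative entropy increase:     {rel_entropy:.4f}")
print(f"relative max-entropy increase: {rel_max:.4f}")
```

Running this for many choices of counts (in particular, very skewed distributions with small $H_N$) is a quick way to probe whether the inequality can fail before attempting a proof.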