## Asymptotic equipartition property, using an incorrect PMF

Problem setup:

$X_1, X_2, \dots$ are i.i.d. random variables drawn according to the PMF $p(x)$, $x \in \{1, 2, \dots, m\}$. Thus, $p(x_1, x_2, \dots, x_n) = \prod_{i=1}^{n} p(x_i)$. We know that $-\frac{1}{n}\log p(X_1, X_2, \dots, X_n) \to H(X)$ in probability. Let $q(x_1, x_2, \dots, x_n) = \prod_{i=1}^{n} q(x_i)$, where $q$ is another PMF on $\{1, 2, \dots, m\}$.

Evaluate $\lim -\frac{1}{n}\log q(X_1, X_2, \dots, X_n)$, where $X_1, X_2, \dots$ are i.i.d. $\sim p(x)$.

So the tricky part is that the $X_i$ are still distributed according to $p(x)$, but now I should compute the limit using the wrong PMF $q$ instead. Or is what I'm computing even an entropy? It looks more like the entropy rate.
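Before working it out analytically, a quick Monte Carlo sketch gives a numerical value to test any candidate answer against. The 3-symbol alphabet and the particular PMFs below are arbitrary illustrative choices, not part of the problem: draw a long i.i.d. sample from $p$ and score it under both $p$ (which, by the AEP, should come out near $H(X)$) and $q$.

```python
import math
import random

# Illustrative PMFs on a 3-symbol alphabet (arbitrary, not from the problem).
p = [0.5, 0.3, 0.2]   # true PMF: the X_i are drawn from p
q = [0.2, 0.3, 0.5]   # mismatched PMF used to score the same sample

random.seed(0)
n = 200_000
xs = random.choices(range(3), weights=p, k=n)

# -(1/n) log p(X^n): by the AEP this should be close to H(X).
avg_p = -sum(math.log2(p[x]) for x in xs) / n
# -(1/n) log q(X^n): the quantity the problem asks about.
avg_q = -sum(math.log2(q[x]) for x in xs) / n

H = -sum(pi * math.log2(pi) for pi in p)  # entropy H(X) in bits

print(f"-(1/n) log p(X^n) ~ {avg_p:.4f}  (H(X) = {H:.4f})")
print(f"-(1/n) log q(X^n) ~ {avg_q:.4f}")
```

With these PMFs the $q$-scored average settles noticeably above $H(X)$, which already suggests the limit depends on how far $q$ is from $p$.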

Solution:

I tried to solve the problem as follows. Do you agree, and if so, can you perhaps see an alternative solution?

$\lim -\frac{1}{n}\log q(X_1, X_2, \dots, X_n) = -\frac{1}{n} \sum_{i=1}^{n} \log q(x_i)$ // need to introduce a relation between $q$ and $p$, since we know what this expression evaluates to when $p(x)$ is plugged in

$= \left\{ q(x_i) = p(x_i) + r_i \right\}$ // $r_i$ is the difference between the incorrect and the real probability of random variable number $i$

$= - \frac{1}{n} \sum_{i=1}^{n} \log \left( p(x_i) + r_i \right)$

$= - \frac{1}{n} \sum_{i=1}^{n} \log p(x_i) - \frac{1}{n} \sum_{i=1}^{n} r_i$

$= \left\{\frac{1}{n} \sum_{i=1}^{n} r_i = R\right\}$

$= H(X) - R$
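As a sanity check on the proposed answer $H(X) - R$, here is a small simulation (the alphabet and PMFs are arbitrary illustrative choices, not given in the problem): it estimates $-\frac{1}{n}\log q(X_1,\dots,X_n)$ directly from a sample drawn from $p$, and separately computes $H(X) - R$ with $R = \frac{1}{n}\sum_i r_i$ and $r_i = q(x_i) - p(x_i)$, so the two can be compared.

```python
import math
import random

# Illustrative PMFs on {0, 1, 2} (arbitrary choices, not from the problem).
p = [0.5, 0.3, 0.2]   # true PMF: the X_i are drawn from p
q = [0.2, 0.3, 0.5]   # "wrong" PMF used to score the sample

random.seed(1)
n = 200_000
xs = random.choices(range(3), weights=p, k=n)

# Direct estimate of -(1/n) log q(X_1, ..., X_n).
avg_q = -sum(math.log2(q[x]) for x in xs) / n

# Proposed answer H(X) - R, with R = (1/n) sum_i (q(x_i) - p(x_i)).
H = -sum(pi * math.log2(pi) for pi in p)  # entropy H(X) in bits
R = sum(q[x] - p[x] for x in xs) / n

print(f"-(1/n) log q(X^n) ~ {avg_q:.4f}")
print(f"H(X) - R          ~ {H - R:.4f}")
```

If the derivation is right, the two printed values should agree as $n$ grows.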