The Kullback-Leibler divergence, also known as relative entropy in information theory, measures the "distance" between two distributions. Usually, one of the distributions is theoretical and the other is empirical. By Gibbs' inequality, relative entropy is nonnegative; it is zero only when the two distributions are identical. Below is the general formula for discrete distributions:

$$D(P \| Q) = \sum_{i} P(i) \log_2 \frac{P(i)}{Q(i)}$$
So, for each value we take the logarithm of the ratio of p to q, multiply it by p, and sum over all the values.
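As a minimal sketch, the formula above can be computed directly; the distributions below are hypothetical examples, not my actual data:

```python
import math

def kl_divergence(p, q, base=2):
    """Compute D(P||Q) = sum_i p_i * log(p_i / q_i).

    Assumes p and q are normalized probability vectors (each sums to 1)
    and q_i > 0 wherever p_i > 0; terms with p_i == 0 contribute 0.
    """
    return sum(pi * math.log(pi / qi, base)
               for pi, qi in zip(p, q) if pi > 0)

# Hypothetical 8-value distributions (for illustration only):
p = [0.10, 0.20, 0.15, 0.05, 0.10, 0.10, 0.20, 0.10]
q = [0.125] * 8  # uniform reference distribution
d = kl_divergence(p, q)
```

With both vectors summing to 1, Gibbs' inequality guarantees `d >= 0`, with equality exactly when `p == q`.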
My problem is the following. I have two empirical distributions, each having 8 values:
Now, when I calculate the relative entropy D(P||Q), I get a negative value, -0.0444, which violates Gibbs' inequality. Can you think of any possible reason why?
It seems to me that the terms with indices 3, 4, and 7 yield negative logarithms (to base 2) in the summation, which drives the result negative. What am I doing wrong here?
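To check my intuition about the negative terms, here is a small sketch with made-up normalized distributions (again, not my data): some individual terms of the sum are indeed negative wherever p_i < q_i, yet the total stays nonnegative:

```python
import math

# Hypothetical normalized distributions (not the original data):
p = [0.05, 0.05, 0.30, 0.10, 0.10, 0.10, 0.20, 0.10]
q = [0.125] * 8

# Individual summands of D(P||Q); negative wherever p_i < q_i.
terms = [pi * math.log(pi / qi, 2) for pi, qi in zip(p, q)]
total = sum(terms)
```

So negative summands alone do not violate anything; only the full sum is constrained by Gibbs' inequality, and that constraint assumes both vectors sum to 1.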