(Solved) Derivative of relative entropy

Hello,

I am currently confused over relative entropy as it is defined here: Relative Entropy -- from Wolfram MathWorld

Rewritten in a more tidy fashion, I have:

\[ D(p \,\|\, q) = \sum_k p_k \ln\frac{p_k}{q_k} \]

I just computed the first and second derivatives, because I want to demonstrate how a marginal alteration of a probability p_j causes a growth of relative entropy that becomes larger the more p_j differs from q_j. At least this is what my textbook says about the intuition of relative entropy. Now I ended up with this:

\[ \frac{\partial D}{\partial p_j} = \ln\frac{p_j}{q_j} + 1 \]

and

\[ \frac{\partial^2 D}{\partial p_j^2} = \frac{1}{p_j} \]

I am very much confused about the "+1" in the first formula. Why is it there? Should setting p_j = q_j not yield a partial minimum? According to what I know, the derivative should then be zero, not 1.
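To make sure the "+1" is really there and not an algebra slip, here is a small finite-difference check (my own sketch, not part of the original question; the distributions and index are arbitrary illustrative choices):

```python
import numpy as np

# D(p||q) = sum_k p_k * ln(p_k / q_k), treated as a function of the p_k
# as free variables (deliberately ignoring the constraint sum_k p_k = 1).
def rel_entropy(p, q):
    return float(np.sum(p * np.log(p / q)))

p = np.array([0.2, 0.3, 0.5])   # illustrative distribution
q = np.array([0.25, 0.25, 0.5]) # illustrative reference distribution
j, h = 0, 1e-6                  # coordinate to perturb, step size

# Central finite difference in p_j alone
p_plus, p_minus = p.copy(), p.copy()
p_plus[j] += h
p_minus[j] -= h
numeric = (rel_entropy(p_plus, q) - rel_entropy(p_minus, q)) / (2 * h)

# Claimed analytic partial derivative: ln(p_j / q_j) + 1
analytic = np.log(p[j] / q[j]) + 1

print(numeric, analytic)  # the two values agree
```

Note that with p_j = q_j the analytic expression gives ln(1) + 1 = 1, exactly the puzzling nonzero value.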

Help is very appreciated!

Best, Rafael

Re: Derivative of relative entropy

I think writing the problem down just answered my question. Of course, the bare partial derivative does not help me, since the minimization must take the interdependency of all the p_k into account: they are tied together by the constraint \sum_k p_k = 1. By differentiating with respect to a single p_j in isolation, I drop that constraint, which is why my derivative indicates something different from what I intended. Thanks for reading!
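For completeness, the standard way to make this precise (my own sketch of the usual Lagrange-multiplier argument, not quoted from any textbook) is to minimize subject to the normalization constraint:

```latex
% Minimize D(p||q) subject to \sum_k p_k = 1 via a Lagrange multiplier:
\[
  \frac{\partial}{\partial p_j}
  \Bigl( \sum_k p_k \ln\frac{p_k}{q_k}
       - \lambda \bigl( \textstyle\sum_k p_k - 1 \bigr) \Bigr)
  = \ln\frac{p_j}{q_j} + 1 - \lambda = 0
  \quad\Rightarrow\quad p_j = q_j \, e^{\lambda - 1}.
\]
% Normalization \sum_k p_k = 1 then forces e^{\lambda - 1} = 1,
% i.e. p_j = q_j for all j: the "+1" is absorbed by the multiplier,
% and the constrained minimum sits at p = q, where D(p||q) = 0.
```

So the "+1" never causes trouble; it simply shifts the multiplier.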