Hello, I am having trouble setting up the equations for maximum entropy given mean and variance.

The entropy is defined as $\displaystyle f(x) = -\int p(x) \log p(x) dx$. The problem statement is "use lagrangian optimization to find the pdf of maximal entropy if the mean is $\displaystyle \mu$ and the variance is $\displaystyle \sigma^2$".

I started with

(a) function to maximize $\displaystyle f(x) = -\int p(x) \log p(x) dx$

(b) first constraint (normalization: the density must integrate to one) $\displaystyle h_1(x) = \int p(x)dx = 1 $

(c) second constraint (expectation constraint) $\displaystyle h_2(x) = \int xp(x) dx = \mu $

(d) third constraint (variance constraint) $\displaystyle h_3(x) = \int (x-\mu)^2p(x)dx = \sigma^2 $
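Just as a sanity check (assuming the answer really is the Gaussian $\displaystyle N(\mu, \sigma^2)$), I verified numerically that the Gaussian density satisfies all three constraints; the values of mu and sigma here are arbitrary choices of mine:

```python
import numpy as np

# arbitrary values picked for the check
mu, sigma = 1.5, 0.8

# fine grid covering essentially all of the Gaussian's mass
x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 200001)
dx = x[1] - x[0]
p = np.exp(-(x - mu) ** 2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

total = p.sum() * dx                     # constraint (b): should be ~1
mean = (x * p).sum() * dx                # constraint (c): should be ~mu
var = ((x - mu) ** 2 * p).sum() * dx     # constraint (d): should be ~sigma^2
entropy = -(p * np.log(p)).sum() * dx    # the objective (a) at this density

print(total, mean, var, entropy)
```

The numerical entropy also agrees with the known closed form $\displaystyle \frac{1}{2}\log(2\pi e \sigma^2)$ for a Gaussian, which is reassuring.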

So the lagrangian looks something like:

$\displaystyle L(p,\lambda_1, \lambda_2, \lambda_3) = -\int p(x) \log p(x) dx + \lambda_1(h_1(x) - 1) + \lambda_2( h_2(x) - \mu) + \lambda_3( h_3(x) - \sigma^2 ) $

(Sorry for writing h_1, h_2, h_3 instead of their definitions in the lagrangian; I get a "latex error, image too big" if I substitute the actual integrals into the formula.)
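If I naively treat the value of $p(x)$ at each fixed $x$ as a separate variable (the way I would treat each $p_i$ in the discrete case) and differentiate the integrand of $L$ with respect to it, I get the pointwise condition

$\displaystyle -\log p(x) - 1 + \lambda_1 + \lambda_2 x + \lambda_3 (x-\mu)^2 = 0, $

which would give

$\displaystyle p(x) = \exp\!\left( \lambda_1 - 1 + \lambda_2 x + \lambda_3 (x-\mu)^2 \right), $

i.e. something Gaussian-shaped as long as $\displaystyle \lambda_3 < 0$. But I'm not sure this step is legitimate.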

Now I know that for the discrete case it's fairly easy to do, and you end up with the normal/gaussian distribution. I'm assuming the answer will be the same in the continuous case, but it appears that I need to use functional derivatives, because I am taking the derivative with respect to a function $p(x)$, not a variable (at least that's what my math major friend seems to think), and I don't know how to do that. Is there an easier way of doing this?
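To convince myself about the discrete case, I also ran a quick numerical experiment (a sketch, not a proof): discretize $p$ on a grid, maximize the discrete entropy under the three constraints with scipy's SLSQP solver, and compare against the Gaussian density. The values of mu, sigma, and the grid are arbitrary choices of mine:

```python
import numpy as np
from scipy.optimize import minimize

mu, sigma = 0.0, 1.0         # arbitrary target moments
x = np.linspace(-6, 6, 121)  # grid truncating the real line
dx = x[1] - x[0]

def neg_entropy(p):
    # negative discretized entropy (minimizing this maximizes entropy);
    # clip avoids log(0) at the boundary of the feasible region
    p = np.clip(p, 1e-300, None)
    return (p * np.log(p)).sum() * dx

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() * dx - 1.0},                        # normalization
    {"type": "eq", "fun": lambda p: (x * p).sum() * dx - mu},                   # mean
    {"type": "eq", "fun": lambda p: ((x - mu) ** 2 * p).sum() * dx - sigma**2}, # variance
]

p0 = np.full(x.shape, 1.0 / (x[-1] - x[0]))  # start from the uniform density
res = minimize(neg_entropy, p0, method="SLSQP",
               constraints=constraints, bounds=[(1e-12, None)] * x.size,
               options={"maxiter": 1000, "ftol": 1e-10})

gauss = np.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
print(res.success, np.abs(res.x - gauss).max())
```

When the solver converges, the maximizer comes out pointwise close to the Gaussian density, which is what made me suspect the continuous answer is the same.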