The following is from Wikipedia. I know this involves the Hessian matrix, but I still can't understand how it works. Could you help me expand these two formulas and explain how this process works?

----------------------------------

http://en.wikipedia.org/wiki/Newton%...n_optimization

This iterative scheme can be generalized to several dimensions by replacing the derivative with the gradient, ∇f(x), and the reciprocal of the second derivative with the inverse of the Hessian matrix, [Hf(x)]^(-1). One obtains the iterative scheme

x_{n+1} = x_n − [Hf(x_n)]^(-1) ∇f(x_n),  n ≥ 0.

Usually Newton's method is modified to include a small step size γ > 0 instead of γ = 1:

x_{n+1} = x_n − γ [Hf(x_n)]^(-1) ∇f(x_n).
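To make the process concrete, here is a minimal sketch of the damped Newton iteration in two dimensions. The test function (the Rosenbrock function, with minimum at (1, 1)) and all function names are my own choices for illustration, not anything from the quoted article; each step solves the 2x2 linear system H·step = ∇f instead of explicitly inverting the Hessian.

```python
def grad(x, y):
    """Gradient of f(x, y) = (1 - x)^2 + 100 (y - x^2)^2."""
    return (-2 * (1 - x) - 400 * x * (y - x * x),
            200 * (y - x * x))

def hessian(x, y):
    """Hessian of the same f, as a 2x2 tuple ((h11, h12), (h21, h22))."""
    return ((2 - 400 * (y - 3 * x * x), -400 * x),
            (-400 * x, 200))

def newton(x, y, gamma=1.0, tol=1e-10, max_iter=100):
    for _ in range(max_iter):
        gx, gy = grad(x, y)
        if gx * gx + gy * gy < tol * tol:   # stop once the gradient is ~0
            break
        (a, b), (c, d) = hessian(x, y)
        det = a * d - b * c                 # assumes the Hessian is invertible
        # Solve H @ (sx, sy) = (gx, gy) by Cramer's rule (2x2 case only)
        sx = (d * gx - b * gy) / det
        sy = (a * gy - c * gx) / det
        # The update x_{n+1} = x_n - gamma * [Hf(x_n)]^(-1) grad f(x_n)
        x, y = x - gamma * sx, y - gamma * sy
    return x, y

print(newton(0.0, 0.0))  # converges to the minimum at (1.0, 1.0)
```

With γ = 1 and a quadratic-like local model this converges very quickly (here, in two iterations); a smaller γ trades speed for robustness when the full Newton step overshoots.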