Multidimensional Newton's method optimization

The passage below involves the Hessian matrix, but I still don't understand how it works. Could you help me expand these two formulas and explain how the process works?

----------------------------------

http://en.wikipedia.org/wiki/Newton%...n_optimization

This iterative scheme can be generalized to several dimensions by replacing the derivative with the gradient, $\nabla f(\mathbf{x})$, and the reciprocal of the second derivative with the inverse of the Hessian matrix, $H f(\mathbf{x})$. One obtains the iterative scheme

$$\mathbf{x}_{n+1} = \mathbf{x}_n - [H f(\mathbf{x}_n)]^{-1} \nabla f(\mathbf{x}_n), \quad n \ge 0.$$
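Written out componentwise (a sketch of what I understand the formula to mean, for the two-variable case $\mathbf{x} = (x_1, x_2)$):

```latex
\nabla f(\mathbf{x}) =
\begin{pmatrix}
  \partial f / \partial x_1 \\
  \partial f / \partial x_2
\end{pmatrix},
\qquad
H f(\mathbf{x}) =
\begin{pmatrix}
  \partial^2 f / \partial x_1^2 & \partial^2 f / \partial x_1 \partial x_2 \\
  \partial^2 f / \partial x_2 \partial x_1 & \partial^2 f / \partial x_2^2
\end{pmatrix}
```

so each iteration solves the linear system $H f(\mathbf{x}_n)\,\Delta = \nabla f(\mathbf{x}_n)$ and sets $\mathbf{x}_{n+1} = \mathbf{x}_n - \Delta$, rather than forming the matrix inverse explicitly.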

Usually Newton's method is modified to include a small step size $0 < \gamma \le 1$ instead of $\gamma = 1$:

$$\mathbf{x}_{n+1} = \mathbf{x}_n - \gamma\, [H f(\mathbf{x}_n)]^{-1} \nabla f(\mathbf{x}_n).$$
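To make the process concrete, here is a minimal sketch of the damped iteration in NumPy. The test function $f(x, y) = (x-2)^2 + (y+1)^2 + xy$ and the starting point are my own choices, not from the article; since $f$ is quadratic, the Hessian is constant and pure Newton ($\gamma = 1$) lands on the minimum in a single step.

```python
import numpy as np

# Hypothetical test function: f(x, y) = (x - 2)^2 + (y + 1)^2 + x*y
def grad(p):
    """Gradient vector: first partial derivatives of f."""
    x, y = p
    return np.array([2 * (x - 2) + y, 2 * (y + 1) + x])

def hessian(p):
    """Hessian matrix: second partial derivatives of f (constant here)."""
    return np.array([[2.0, 1.0], [1.0, 2.0]])

def newton(p0, gamma=1.0, tol=1e-10, max_iter=50):
    """Damped Newton iteration x_{n+1} = x_n - gamma * H^{-1} grad."""
    p = np.asarray(p0, dtype=float)
    for _ in range(max_iter):
        g = grad(p)
        if np.linalg.norm(g) < tol:
            break
        # Solve H * step = g instead of inverting H explicitly
        step = np.linalg.solve(hessian(p), g)
        p = p - gamma * step
    return p

minimum = newton([0.0, 0.0])
```

Solving the linear system with `np.linalg.solve` instead of computing `np.linalg.inv` is the standard way to apply the "inverse Hessian" in practice; a step size $\gamma < 1$ trades speed for stability when the starting point is far from the minimum.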