Assuming you're correctly using the Hessian method as referenced here in the "Higher Dimensions" section, it's entirely possible that, given your starting point, you converged on a local minimum. This is a problem with many optimization techniques - Newton's method in particular is susceptible to converging on a local extremum instead of the global one, because, like many other methods, it is highly dependent on the starting point. Try a few other starting points and see whether they give you different answers.
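To illustrate the multi-start idea, here is a minimal sketch (not your actual problem - the function and names are made up for the example) of Newton's method in one dimension applied to a function with two local minima. Two different starting points converge to two different stationary points:

```python
def newton_min(f_prime, f_double_prime, x0, tol=1e-10, max_iter=100):
    """Newton's method on f'(x) = 0.

    Converges to whichever stationary point is near the start -
    it makes no distinction between local and global minima.
    """
    x = x0
    for _ in range(max_iter):
        step = f_prime(x) / f_double_prime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: f(x) = x^4 - 3x^2 + x has two distinct local minima.
fp = lambda x: 4 * x**3 - 6 * x + 1    # f'(x)
fpp = lambda x: 12 * x**2 - 6          # f''(x)

# Different starting points land in different basins of attraction:
left = newton_min(fp, fpp, -2.0)   # converges near x = -1.30
right = newton_min(fp, fpp, 2.0)   # converges near x = +1.13
```

Here f(-1.30) is lower than f(1.13), so only the first start finds the global minimum - which is exactly the behavior you may be seeing in higher dimensions.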

Alternatively, try evolutionary algorithms or simulated annealing, which are designed to escape local minima. One excellent book I read recently was this one. Perhaps your library has it.
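For a flavor of how simulated annealing sidesteps the local-minimum trap, here is a bare-bones sketch (parameters and the test function are illustrative, not tuned for any real problem). Unlike Newton's method, it occasionally accepts uphill moves, with a probability that shrinks as the "temperature" cools, which lets it hop between basins:

```python
import math
import random

def simulated_annealing(f, x0, t0=5.0, cooling=0.99, steps=5000, seed=0):
    """Minimal 1-D simulated annealing.

    Proposes Gaussian random steps; always accepts downhill moves and
    accepts uphill moves with probability exp(-delta / temperature).
    """
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(steps):
        candidate = x + rng.gauss(0.0, 0.5)
        fc = f(candidate)
        # Metropolis acceptance rule: uphill moves allowed while t is large.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = candidate, fc
        if fx < best_f:
            best_x, best_f = x, fx
        t *= cooling  # geometric cooling schedule
    return best_x, best_f

# Same two-minima function as before; start in the *wrong* basin.
f = lambda x: x**4 - 3 * x**2 + x
x_best, f_best = simulated_annealing(f, x0=2.0)
```

Even starting at x = 2.0, next to the shallower minimum, the early high-temperature phase lets the walk cross the barrier and settle near the deeper minimum around x = -1.30. The cooling schedule and step size here are arbitrary choices; in practice both need tuning for your problem.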

Hope this helps.