Proof of convergence theory in optimization

The question is:

Suppose that $\displaystyle \lim_{k\to\infty} x_k=x_*$, where $\displaystyle x_*$ is a local minimizer of the nonlinear function $\displaystyle f$. Assume that $\displaystyle \nabla^2 f(x_*)$ is symmetric positive definite. Prove that the sequence $\displaystyle \left \{ f(x_k)-f(x_*) \right \}$ converges linearly if and only if $\displaystyle \left \{ ||x_k-x_*|| \right \}$ converges linearly. Prove that the two sequences converge at the same rate, regardless of what the rate is. What is the relationship between the rate constants of the two sequences?

(I guess we may use the orthogonal diagonalization of a symmetric matrix, together with the Taylor expansion $\displaystyle f(x_k)-f(x_*)=\nabla f(x_*)^T(x_k-x_*)+\frac{1}{2}(x_k-x_*)^T\nabla^2 f(\xi)(x_k-x_*)$ and the fact that $\displaystyle \nabla f(x_*)=0$... But I got stuck here. So what's your answer?)
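As a numerical sanity check of the claimed relationship, here is a small sketch. The quadratic model $\displaystyle f(x)=\frac{1}{2}x^THx$, the particular matrix $H$, and the rate $r$ are illustrative choices, not part of the problem; the iterates $x_k = r^k x_0$ are constructed to converge linearly with rate $r$ in norm.

```python
import math

# Illustrative quadratic model: f(x) = 0.5 * x^T H x, minimizer x_* = 0,
# H symmetric positive definite. These choices are assumptions for the demo.
H = [[2.0, 0.0], [0.0, 5.0]]
x0 = [1.0, 1.0]
r = 0.5  # linear rate of ||x_k - x_*||

def f(x):
    # Compute 0.5 * x^T H x
    Hx = [sum(H[i][j] * x[j] for j in range(2)) for i in range(2)]
    return 0.5 * sum(x[i] * Hx[i] for i in range(2))

def norm(x):
    return math.sqrt(sum(v * v for v in x))

# Iterates x_k = r^k * x0 converge to x_* = 0 linearly with rate r.
xs = [[r**k * v for v in x0] for k in range(12)]

# Successive ratios: ||x_{k+1}|| / ||x_k|| and f(x_{k+1}) / f(x_k).
norm_ratios = [norm(xs[k + 1]) / norm(xs[k]) for k in range(11)]
f_ratios = [f(xs[k + 1]) / f(xs[k]) for k in range(11)]

# For this exact quadratic: norm ratios equal r = 0.5, while the
# f-gap ratios equal r**2 = 0.25 -- the rate constants relate by squaring.
print(norm_ratios[-1], f_ratios[-1])
```

So both sequences converge linearly, but the rate constant of $\{f(x_k)-f(x_*)\}$ comes out as the square of the rate constant of $\{||x_k-x_*||\}$ in this constructed example.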

Re: Proof of convergence theory in optimization

Hey ianchenmu.

What did you get for the second term, taking into account that the Hessian is symmetric positive definite? I haven't looked at this in a while, but what identities do you get for $\displaystyle a^TXa$ where $X$ is symmetric (in addition to positive definite)?
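For what it's worth, a sketch of the key inequality: for symmetric positive definite $X$ with orthogonal diagonalization $X = Q\Lambda Q^T$ and eigenvalues $0 < \lambda_{\min} \le \dots \le \lambda_{\max}$,

$$\lambda_{\min}\,\|a\|^2 \;\le\; a^TXa \;\le\; \lambda_{\max}\,\|a\|^2.$$

Applied with $X=\nabla^2 f(\xi)$ and $a=x_k-x_*$ in the Taylor expansion, this squeezes $f(x_k)-f(x_*)$ between constant multiples of $\|x_k-x_*\|^2$, which is the step that lets you compare the two convergence rates.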