This is a small part of a much larger question to do with optimization, but it's the only part I can't get!

Apply Newton's method (without linesearch) to minimize the univariate function

$f(x)$,

starting from a given $x_0$, and let $(x_k)$ be the generated sequence of iterates.

Prove that the limit points of the sequence of iterates are $+1$ and $-1$ as $k \to \infty$.

This is the sequence...

And Newton's method (for minimization) is

$$x_{k+1} = x_k - \frac{f'(x_k)}{f''(x_k)}.$$
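To sanity-check the iterates numerically, here is a quick sketch of that update $x_{k+1} = x_k - f'(x_k)/f''(x_k)$ in Python. Since the actual $f$ from the question isn't reproduced here, I'm using a textbook stand-in, $f(x) = \sqrt{1+x^2}$, whose Newton iterates from $x_0 = 1$ happen to oscillate between $+1$ and $-1$ (the update simplifies to $x_{k+1} = -x_k^3$); it is only an illustration, not the function from the question:

```python
# Newton's method without linesearch: x_{k+1} = x_k - f'(x_k)/f''(x_k).
# Stand-in example (NOT the f from the question): f(x) = sqrt(1 + x^2),
# with f'(x) = x / sqrt(1 + x^2) and f''(x) = (1 + x^2)**(-3/2), so the
# update simplifies to x_{k+1} = -x_k**3; from x0 = 1 the iterates cycle.

def newton_no_linesearch(fprime, fsecond, x0, num_steps):
    """Return the iterate list [x0, x1, ..., x_{num_steps}]."""
    xs = [x0]
    for _ in range(num_steps):
        x = xs[-1]
        xs.append(x - fprime(x) / fsecond(x))
    return xs

fprime = lambda x: x / (1 + x**2) ** 0.5    # f'(x) for the stand-in f
fsecond = lambda x: (1 + x**2) ** -1.5      # f''(x) for the stand-in f

iterates = newton_no_linesearch(fprime, fsecond, 1.0, 6)
print(iterates)  # alternates in sign, magnitude stays at 1
```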

So how would I do this? Some examples in class just gave the sequence of iterates with the statement "clearly the sequence converges to ...", but I think I should be giving a proper proof.
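For what it's worth, I think the statement I actually have to establish, written out properly, is the following (this is just the general definition of the limit points of a sequence applied here, not anything specific to $f$):

```latex
% "The limit points of (x_k) are exactly +1 and -1" unpacks to:
% (i)  some subsequence converges to +1 and some subsequence converges to -1;
% (ii) no subsequence converges to anything else.
\[
  \exists\, k_1 < k_2 < \cdots :\; x_{k_j} \to +1,
  \qquad
  \exists\, m_1 < m_2 < \cdots :\; x_{m_j} \to -1,
\]
\[
  \text{and every convergent subsequence of } (x_k)
  \text{ has its limit in } \{+1,\,-1\}.
\]
```

So presumably the proof would identify the two subsequences explicitly (e.g. by parity of the index) and show each converges, rather than just listing iterates.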