It's hard to give hints or advice without knowing what you know or can do with eigenvalues and eigenvectors of matrices. What are your thoughts on this problem?
I am stuck and not sure I understand this part of my course. I'm working through some problems, and this is one I'm stuck on.
Here's the problem:
I have the quadratic form f(x, y) = ax^2 + 2bxy + cy^2
and the matrix A.
Let λ (lambda) and μ (mu) be the eigenvalues of the matrix A.
I now have to demonstrate that
λ + μ = a + c = tr(A)
Any help would be great
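A hint, assuming A is the usual symmetric matrix attached to the quadratic form, A = (a b; b c): write out the characteristic polynomial of A and compare it with its factored form. A sketch of the computation:

```latex
\det(A - t I)
  = \det\begin{pmatrix} a - t & b \\ b & c - t \end{pmatrix}
  = (a - t)(c - t) - b^2
  = t^2 - (a + c)\,t + (ac - b^2).

% Since \lambda and \mu are the eigenvalues, they are the roots of this
% polynomial, so it also factors as
(t - \lambda)(t - \mu) = t^2 - (\lambda + \mu)\,t + \lambda\mu.

% Matching the coefficients of t in the two expressions gives
\lambda + \mu = a + c = \operatorname{tr}(A).
```

Comparing the constant terms in the same way also gives λμ = ac − b² = det(A), which is the companion fact usually proved alongside this one.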
Well, I've just come onto this section. I understand most of the topics prior to this one, but this part is all new to me. I've looked through some books and understand the formulas and what's going on, but the thing that throws me is how the eigenvalues fit into the equation.
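If it helps to see the identity concretely before proving it, here is a quick numerical sanity check. It assumes A is the symmetric matrix [[a, b], [b, c]] of the quadratic form, and the coefficient values a, b, c are arbitrary examples, not from the problem:

```python
import numpy as np

# Assumption: A is the standard symmetric matrix of the quadratic form
# f(x, y) = a*x^2 + 2*b*x*y + c*y^2, i.e. A = [[a, b], [b, c]].
a, b, c = 2.0, 1.0, 3.0  # arbitrary example coefficients

A = np.array([[a, b],
              [b, c]])

# eigvalsh returns the eigenvalues of a symmetric matrix
lam, mu = np.linalg.eigvalsh(A)

print("lambda + mu =", lam + mu)   # sum of the eigenvalues
print("a + c       =", a + c)      # the trace a + c; matches up to rounding
print("tr(A)       =", np.trace(A))
```

Trying a few different values of a, b, c and seeing λ + μ land on a + c every time can make the algebraic proof feel much less mysterious.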