Hi,
I recently learnt that the trace (the sum of the diagonal elements) of a matrix is equal to the sum of its eigenvalues. I was wondering why this holds true. Could anyone explain? (I do not necessarily need a rigorous proof; an explanation by means of geometry, or of what is happening in a vector space where all the columns are vectors, would do too.)
Thank you!
We know that the sum of the zeros of a polynomial $a_n x^n + a_{n-1}x^{n-1} + \cdots + a_0$ is $-\frac{a_{n-1}}{a_n}$. Now, the eigenvalues of a matrix $A$ are the zeros of the characteristic polynomial $p(x) = \det(xI - A)$, so we only need to prove that the coefficient of $x^{n-1}$ in $p(x)$ is equal to $-\operatorname{tr}(A)$. This can be easily proved: if $A = [a_{ij}]$ is an $n \times n$ matrix, then:

$$p(x) = \det(xI - A) = \det\begin{pmatrix} x - a_{11} & -a_{12} & \cdots & -a_{1n} \\ -a_{21} & x - a_{22} & \cdots & -a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ -a_{n1} & -a_{n2} & \cdots & x - a_{nn} \end{pmatrix}$$

It's easy to see that in the expansion of this determinant, all terms are polynomials (in $x$) of degree at most $n-2$, except for the term $(x - a_{11})(x - a_{22})\cdots(x - a_{nn})$.

Thus the coefficient of $x^{n-1}$ in $p(x)$ is equal to the coefficient of $x^{n-1}$ in $(x - a_{11})(x - a_{22})\cdots(x - a_{nn})$, which clearly is $-(a_{11} + a_{22} + \cdots + a_{nn}) = -\operatorname{tr}(A)$.
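As a quick sanity check (a sketch in Python, not part of the thread), for a $2 \times 2$ matrix the characteristic polynomial is $x^2 - (a+d)x + (ad - bc)$, so the two eigenvalues from the quadratic formula must sum to the trace. The sample matrix below is my own choice (symmetric, so the eigenvalues are real):

```python
from math import sqrt

# Sample 2x2 matrix [[a, b], [c, d]]; symmetric, so eigenvalues are real.
a, b, c, d = 2.0, 1.0, 1.0, 3.0
tr = a + d            # trace
det = a * d - b * c   # determinant

# Characteristic polynomial: x^2 - tr*x + det; solve via quadratic formula.
disc = sqrt(tr * tr - 4 * det)
lam1 = (tr + disc) / 2
lam2 = (tr - disc) / 2

# The eigenvalue sum equals the trace.
assert abs((lam1 + lam2) - tr) < 1e-12
```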
What, exactly, is the question? You have essentially two equations, one in three unknowns, the other in two unknowns. They can't be solved for specific numbers.
By the way, you have committed three major sins here:
1) you posted a question that made no sense.
2) you posted a question in "Linear Algebra and Abstract Algebra" that has nothing to do with either.
3) you "hijacked" someone else's thread to ask a completely unrelated question.
Let $A = [a_{ij}]$ be an $n \times n$ matrix.

Define $E = xI - A = [e_{ij}]$, so that $e_{ii} = x - a_{ii}$ and $e_{ij} = -a_{ij}$ for $i \neq j$.

Note that $e_{ij}$ contains $x$ iff $e_{ij}$ is an entry on $E$'s diagonal (i.e., satisfies $i = j$).

Set $p(x) = \det E = \det(xI - A)$. Applying the Fundamental Theorem of Algebra to $p(x)$ repeatedly, we have $p(x) = (x - \lambda_1)(x - \lambda_2)\cdots(x - \lambda_n)$, where $\lambda_1, \ldots, \lambda_n$ are the eigenvalues of $A$ (with possible repetitions). Multiplying out the factors and comparing the coefficients of $x^{n-1}$, we deduce that the coefficient of $x^{n-1}$ in $p(x)$ is $-(\lambda_1 + \lambda_2 + \cdots + \lambda_n)$.
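For concreteness, multiplying out the factors (Vieta's formulas) gives:

$$(x - \lambda_1)(x - \lambda_2)\cdots(x - \lambda_n) = x^n - (\lambda_1 + \lambda_2 + \cdots + \lambda_n)\,x^{n-1} + \cdots + (-1)^n \lambda_1 \lambda_2 \cdots \lambda_n$$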
Let $N = \{1, 2, \ldots, n\}$ and let $S_n$ denote the symmetric group of degree $n$. Let $\mathrm{id}$ be the identity element of $S_n$.

Using the Leibniz formula for determinants,

$$\det E = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i \in N} e_{i,\sigma(i)}$$
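As an aside (a sketch of my own, not from the thread), the Leibniz formula can be implemented directly with `itertools.permutations`, computing the sign of each permutation from its inversion count:

```python
from itertools import permutations
from math import prod

def sign(p):
    # sgn(p) = (-1)^(number of inversions)
    inv = sum(1 for i in range(len(p))
                for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def det_leibniz(M):
    # det M = sum over permutations sigma of sgn(sigma) * prod_i M[i][sigma(i)]
    n = len(M)
    return sum(sign(p) * prod(M[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

assert det_leibniz([[1, 2], [3, 4]]) == -2
```

This is $O(n \cdot n!)$, so it is only a teaching device, but it makes the sum-over-permutations structure of the argument below explicit.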
For $\sigma \neq \mathrm{id}$, consider an arbitrary term $\operatorname{sgn}(\sigma) \prod_{i \in N} e_{i,\sigma(i)}$. There exists a $j \in N$ such that $\sigma(j) \neq j$. Since $\sigma(j) \in N$ and $\sigma$ is injective, $\sigma(\sigma(j)) \neq \sigma(j)$. Hence, the product contains at least two distinct non-diagonal entries of $E$: $e_{j,\sigma(j)}$ and $e_{\sigma(j),\sigma(\sigma(j))}$; therefore $n - 2$ is the highest power of $x$ it can have.

We deduce that the product of $E$'s diagonal entries, $\prod_{i \in N} e_{ii} = (x - a_{11})(x - a_{22})\cdots(x - a_{nn})$, contains all of the $x^n$ and $x^{n-1}$ terms of $\det E$.
As shown previously, the coefficient of $x^{n-1}$ in $(x - a_{11})(x - a_{22})\cdots(x - a_{nn})$ is $-(a_{11} + a_{22} + \cdots + a_{nn}) = -\operatorname{tr}(A)$.

Comparing coefficients in the two expressions for $p(x)$, we conclude:

$$\lambda_1 + \lambda_2 + \cdots + \lambda_n = a_{11} + a_{22} + \cdots + a_{nn} = \operatorname{tr}(A)$$
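The whole argument can be checked mechanically (a sketch of my own, with a sample matrix of my choosing): expand $\det(xI - A)$ over all permutations with polynomial arithmetic, and verify that the coefficient of $x^{n-1}$ is $-\operatorname{tr}(A)$.

```python
from itertools import permutations

def poly_mul(p, q):
    # Multiply polynomials given as coefficient lists, lowest degree first.
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def char_poly(A):
    # Coefficients of det(xI - A), lowest degree first, via the Leibniz formula.
    n = len(A)
    # Entry (i, j) of E = xI - A as a polynomial in x.
    E = [[[-A[i][j], 1] if i == j else [-A[i][j]] for j in range(n)]
         for i in range(n)]
    total = [0] * (n + 1)
    for p in permutations(range(n)):
        inv = sum(1 for i in range(n)
                    for j in range(i + 1, n) if p[i] > p[j])
        sgn = -1 if inv % 2 else 1
        term = [1]
        for i in range(n):
            term = poly_mul(term, E[i][p[i]])
        for k, c in enumerate(term):
            total[k] += sgn * c
    return total

A = [[2, 1, 0], [1, 3, 1], [0, 1, 4]]
coeffs = char_poly(A)
trace = sum(A[i][i] for i in range(3))
# Coefficient of x^(n-1) equals -tr(A), as proved above.
assert coeffs[2] == -trace
```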