A vector $v$ is an eigenvector of $A$ with eigenvalue $\lambda$ if:
$$Av = \lambda v.$$
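As a quick numerical illustration of this definition (a hypothetical example matrix, not one from the thread; assumes NumPy is available):

```python
import numpy as np

# A is upper triangular, so its eigenvalues are the diagonal entries 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
v = np.array([1.0, 1.0])   # candidate eigenvector
lam = 3.0                  # candidate eigenvalue

# v is an eigenvector with eigenvalue lam exactly when A @ v equals lam * v.
print(np.allclose(A @ v, lam * v))  # True
```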
So, calculate each side of the equation $AP = PD$.
No typo. $P$ is the matrix of column vectors $v_1, \dots, v_n$, so it is an $n \times n$ matrix as well.
There are many ways to show a matrix is invertible. Can you show these eigenvectors are linearly independent? The key is the fact that each eigenvector spans an eigenspace that is invariant under $A$. Assume there is a dependency and show that the "invariant" space goes to zero for at most one eigenvalue.
Can you see if my proof is correct? You asked: "Can you show these eigenvectors are linearly independent?"
Suppose some linear combination of the eigenvectors $v_1, \dots, v_n$ with weights $c_1, \dots, c_n$ equals zero:
$$c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = 0.$$
If we multiply the above expression on the left by the matrix $A$, we get:
$$c_1 \lambda_1 v_1 + c_2 \lambda_2 v_2 + \cdots + c_n \lambda_n v_n = 0.$$
We can also multiply the first equation term by term by the respective eigenvalues, which yields the same identity as the second equation:
$$c_1 \lambda_1 v_1 + c_2 \lambda_2 v_2 + \cdots + c_n \lambda_n v_n = 0.$$
Now multiply the first equation by just $\lambda_1$:
$$\lambda_1 c_1 v_1 + \lambda_1 c_2 v_2 + \cdots + \lambda_1 c_n v_n = 0.$$
Subtracting the two expressions, we get:
$$c_2(\lambda_2 - \lambda_1) v_2 + \cdots + c_n(\lambda_n - \lambda_1) v_n = 0.$$
From the expression above, the equality could hold with nonzero weights only if $\lambda_i = \lambda_1$ for some $i \geq 2$. However, since we assume from the question that the eigenvalues of the matrix are indeed distinct, it follows that for the equality to hold we must have $c_2 = c_3 = \cdots = c_n = 0$. With this established, we can relate back to the original equation:
Since $c_2 = \cdots = c_n = 0$, the original equation reduces to $c_1 v_1 = 0$, and because the eigenvector $v_1 \neq 0$, this forces $c_1 = 0$, hence showing the linear independence of the eigenvectors.
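As a numerical sanity check of the conclusion (using the same hypothetical $2 \times 2$ example; assumes NumPy), linear independence of the columns of $P$ is equivalent to $P$ having full rank:

```python
import numpy as np

# Eigenvectors of A = [[2, 1], [0, 3]] for the distinct eigenvalues 2 and 3.
v1 = np.array([1.0, 0.0])  # eigenvalue 2
v2 = np.array([1.0, 1.0])  # eigenvalue 3

P = np.column_stack([v1, v2])

# Independence <=> the only solution of P @ c = 0 is c = 0 <=> P has full rank.
print(np.linalg.matrix_rank(P))  # 2, so v1 and v2 are linearly independent
```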
Is this proof sufficient to show that the eigenvectors are indeed linearly independent? And even if I have established that the eigenvectors are linearly independent, how does that relate to the invertibility of the matrix $P$?
Here are some basics about matrix multiplication. Suppose you have a matrix $A$ and two sets of vectors $x_1, \dots, x_n$ and $y_1, \dots, y_n$ with $Ax_i = y_i$ for each $i$. Let $X$ be the matrix whose columns are $x_1, \dots, x_n$. Let $Y$ be the matrix whose columns are $y_1, \dots, y_n$. Then $AX = Y$. I recommend trying that out in the $2 \times 2$ case for $X$ and $Y$. It follows almost trivially from how matrix multiplication works.
So, $P$ is just a matrix whose columns are $v_1, \dots, v_n$. So, the multiplication will give $AP = [Av_1 \mid Av_2 \mid \cdots \mid Av_n]$. The $i$-th column is just the result of the matrix $A$ times the $i$-th column of $P$. That is how matrix multiplication is defined.
Now, consider the $i$-th column. It is $Av_i$. Because $v_i$ is an eigenvector, $Av_i = \lambda_i v_i$. So, we can rewrite the matrix:
$$AP = [\lambda_1 v_1 \mid \lambda_2 v_2 \mid \cdots \mid \lambda_n v_n].$$
Now, consider $PD$, where $D$ is the diagonal matrix with $\lambda_1, \dots, \lambda_n$ on its diagonal. Break it down column by column: the $i$-th column of $PD$ is $P$ times the $i$-th column of $D$, which is $\lambda_i v_i$.
So, $PD = [\lambda_1 v_1 \mid \lambda_2 v_2 \mid \cdots \mid \lambda_n v_n]$.
Hence, $AP = PD$, as desired.
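The whole derivation can be checked numerically on the same hypothetical example (assumes NumPy): $AP = PD$ holds, and since $P$ is invertible we also recover $A = PDP^{-1}$:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
# Columns of P are the eigenvectors; D holds the matching eigenvalues.
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
D = np.diag([2.0, 3.0])

print(np.allclose(A @ P, P @ D))                 # True: AP = PD
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True: A = P D P^{-1}
```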