$V$ is an $n$-dimensional vector space over $F$, and $A$ is a linear transformation from $V$ to itself.
Prove that if $V$ has a basis of eigenvectors for $A$, then the matrix representing $A$ with respect to this basis is diagonal, with the eigenvalues as the diagonal entries.
Prove that the matrix representing $A$ with respect to an arbitrary basis for $V$ is similar to a diagonal matrix if and only if $V$ has a basis of eigenvectors for $A$.
For ($\Rightarrow$): I know that there exists an invertible $P$ such that $PAP^{-1} = D$, where $D$ is a diagonal matrix. I feel like if $V$ did not have a basis of eigenvectors of $A$, then such a $P$ could not exist, but I know I'm missing something.
For ($\Leftarrow$): can I explicitly construct a $P$ using the eigenvalue-diagonal matrix from the part you solved?
Before we prove this, let $v_1, \dots, v_n$ be a basis of eigenvectors of $A$ with $Av_i = \lambda_i v_i$, and let $P = (v_1 \mid v_2 \mid \cdots \mid v_n)$ be the matrix whose $i$-th column is $v_i$. By the definition of the matrix product, the $i$-th column of $AP$ is $Av_i$, and since $v_i$ is an eigenvector, $Av_i = \lambda_i v_i$. However, the $i$-th column of $PD$, where $D = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$, is also $\lambda_i v_i$. Therefore the two products agree in every column, which means for any particular $i$ the $i$-th columns of $AP$ and $PD$ coincide. Thus, we can form the following matrix equality (just set $D = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$):

$$AP = PD.$$

Thus $P$ is invertible, since its columns form a basis of $V$, where the $i$-th column of $P$ is the eigenvector $v_i$.

Hence $P^{-1}AP = D$, where the $i$-th diagonal entry of $D$ is the eigenvalue $\lambda_i$ corresponding to the $i$-th column of $P$. Conversely, if $P^{-1}AP = D$ for some invertible $P$ and diagonal $D$, then reading $AP = PD$ column by column shows that each column of $P$ is an eigenvector of $A$, and the columns of an invertible matrix form a basis of $V$, which gives the ($\Rightarrow$) direction as well.
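As a sanity check (not part of the proof), here is a small NumPy sketch of the construction above for a hypothetical $2 \times 2$ matrix chosen to have distinct eigenvalues: the columns of $P$ are eigenvectors, $D$ holds the eigenvalues, and both $AP = PD$ and $P^{-1}AP = D$ hold numerically.

```python
import numpy as np

# Hypothetical example matrix with distinct eigenvalues (5 and 2),
# so it is diagonalizable and P below is invertible.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

# AP = PD, column by column: A v_i = lambda_i v_i.
print(np.allclose(A @ P, P @ D))                 # True

# Hence P^{-1} A P = D.
print(np.allclose(np.linalg.inv(P) @ A @ P, D))  # True
```

Note that `eig` normalizes each eigenvector column to unit length, but any nonzero scaling of the columns works: scaling a column of $P$ scales the same column of $AP$ and of $PD$ identically.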