1. ## Basis and Eigenvalues

V is an n-dimensional vector space over F, and A is a linear transformation from V to itself.

Prove that if V has a basis of eigenvectors for A, then the matrix representing A with respect to this basis is diagonal with the eigenvalues as diagonal entries.

Prove that the matrix representing A with respect to an arbitrary basis for V is similar to a diagonal matrix if and only if V has a basis of eigenvectors for A.

2. Originally Posted by robeuler
V is an n-dimensional vector space over F, and A is a linear transformation from V to itself.

Prove that if V has a basis of eigenvectors for A, then the matrix representing A with respect to this basis is diagonal with the eigenvalues as diagonal entries.

Prove that the matrix representing A with respect to an arbitrary basis for V is similar to a diagonal matrix if and only if V has a basis of eigenvectors for A.
Let $B=\{v_1,v_2,...,v_n\}$ be a set of eigenvectors for $A$ which forms a basis for $V$. By definition $Av_j = \ell_j v_j$ for some $\ell_j \in F$. Therefore, $[Av_j]_B = (0,0,...,\ell_j,...,0)$, where $\ell_j$ appears in the $j$-th coordinate. Therefore, the matrix corresponding to $A$ with respect to this basis is:
$\begin{bmatrix} \ell_1 & 0 & \cdots & 0 \\ 0 & \ell_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \ell_n \end{bmatrix}$
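As a quick numerical sanity check (a NumPy sketch; the $2\times 2$ matrix and its eigenvectors are illustrative, not from the thread): writing the eigenvectors as the columns of a change-of-basis matrix $B$, the matrix of $A$ in that basis is $B^{-1}AB$, and it comes out diagonal with the eigenvalues on the diagonal.

```python
import numpy as np

# Illustrative example (not from the thread): a 2x2 map with
# eigenvectors v1 = (1, 1), eigenvalue 5, and v2 = (1, -2), eigenvalue 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of B are the eigenvector basis.
B = np.array([[1.0, 1.0],
              [1.0, -2.0]])

# Matrix of A with respect to the eigenvector basis: B^{-1} A B.
M = np.linalg.inv(B) @ A @ B
print(np.round(M, 10))  # diagonal, with the eigenvalues 5 and 2 on the diagonal
```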

3. Originally Posted by ThePerfectHacker
Let $B=\{v_1,v_2,...,v_n\}$ be a set of eigenvectors for $A$ which forms a basis for $V$. By definition $Av_j = \ell_j v_j$ for some $\ell_j \in F$. Therefore, $[Av_j]_B = (0,0,...,\ell_j,...,0)$, where $\ell_j$ appears in the $j$-th coordinate. Therefore, the matrix corresponding to $A$ with respect to this basis is:
$\begin{bmatrix} \ell_1 & 0 & \cdots & 0 \\ 0 & \ell_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \ell_n \end{bmatrix}$
Thank you! For the second part...
for ($\Rightarrow$) I know that there exists a $P$ so that $PAP^{-1} = D$, where $D$ is a diagonal matrix. I feel that if $V$ did not have a basis of eigenvectors of $A$, then such a $P$ should not exist, but I know I'm missing something.

for ($\Leftarrow$) can I explicitly construct a $P$ using the diagonal matrix of eigenvalues from the part you solved?
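For the ($\Leftarrow$) direction, the standard construction is to take the matrix $Q$ whose columns are the eigenvectors, so that $Q^{-1}AQ = D$; to match the convention $PAP^{-1} = D$ used in this thread, one sets $P = Q^{-1}$. A NumPy sketch of that construction (the $2\times 2$ matrix is illustrative, not from the thread):

```python
import numpy as np

# Illustrative 2x2 example (not from the thread); its eigenvalues are 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of Q are eigenvectors of A: (1, 1) for 5 and (1, -2) for 2.
Q = np.array([[1.0, 1.0],
              [1.0, -2.0]])

# Q^{-1} A Q = D; with the convention P A P^{-1} = D, take P = Q^{-1}.
P = np.linalg.inv(Q)
D = P @ A @ np.linalg.inv(P)
print(np.round(D, 10))  # diagonal matrix with the eigenvalues 5 and 2
```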

4. Originally Posted by robeuler
Thank you! For the second part...
for ($\Rightarrow$) I know that there exists a $P$ so that $PAP^{-1} = D$, where $D$ is a diagonal matrix. I feel that if $V$ did not have a basis of eigenvectors of $A$, then such a $P$ should not exist, but I know I'm missing something.
I think the following observation will help you. Let $A$ be an $n\times n$ matrix and $P$ an $n\times n$ invertible matrix so that $PAP^{-1} = D$, where $D$ is a diagonal matrix with $(i,i)$-th entry $\ell_i$. The observation is that the $i$-th row of $P$ is an eigenvector of $A^T$ with eigenvalue $\ell_i$; equivalently, the $i$-th column of $P^{-1}$ is an eigenvector of $A$ with eigenvalue $\ell_i$.

Before we prove this, write $A = (a_{ij})$, $P=(p_{ij})$, and $D = (d_{ij})$; note that $d_{ij} = 0$ if $i\not = j$ and $d_{ii} = \ell_i$. From $PAP^{-1} = D$ we get $PA = DP$. By the definition of the matrix product, $PA = \left( \sum_k p_{ik}a_{kj} \right)$ and $DP = \left( \sum_k d_{ik}p_{kj} \right)$. However, $\sum_k d_{ik}p_{kj} = \ell_i p_{ij}$. Therefore $\sum_k p_{ik}a_{kj} = \ell_i p_{ij}$ for all $i,j$, which means that for any particular $1\leq i\leq n$ we can form the following matrix equality (just set $j=1,2,...,n$):
$\begin{bmatrix} \sum_k p_{ik}a_{k1} \\ \sum_k p_{ik}a_{k2}\\ \vdots \\ \sum_k p_{ik}a_{kn} \end{bmatrix} = \begin{bmatrix} \ell_i p_{i1} \\ \ell_i p_{i2} \\ \vdots \\ \ell_i p_{in} \end{bmatrix}\implies$ $\begin{bmatrix} a_{11} & a_{21} & \cdots & a_{n1}\\ a_{12} & a_{22} & \cdots & a_{n2} \\ \vdots & \vdots & \ddots & \vdots \\ a_{1n} & a_{2n} & \cdots & a_{nn} \end{bmatrix}\begin{bmatrix}p_{i1}\\p_{i2} \\ \vdots \\ p_{in} \end{bmatrix} = \ell_i \begin{bmatrix}p_{i1} \\ p_{i2} \\ \vdots \\ p_{in}\end{bmatrix}$

Thus, $A^T\bold{p}_{(i)} = \ell_i \bold{p}_{(i)}$, where $\bold{p}_{(i)}$ is the $i$-th row of $P$ written as a column vector; that is, the rows of $P$ are eigenvectors of $A^T$ (left eigenvectors of $A$).
Equivalently, rewriting $PAP^{-1} = D$ as $AP^{-1} = P^{-1}D$ and comparing columns gives $A\bold{q}_i = \ell_i \bold{q}_i$, where $\bold{q}_i$ is the $i$-th column of $P^{-1}$, so the columns of $P^{-1}$ form a basis of eigenvectors of $A$.
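A quick NumPy check of this observation (the $2\times 2$ matrix is illustrative, not from the thread): with $PAP^{-1} = D$, each row of $P$ satisfies $A^T\bold{p}_{(i)} = \ell_i\bold{p}_{(i)}$, and each column of $P^{-1}$ is an eigenvector of $A$.

```python
import numpy as np

# Illustrative example (not from the thread); eigenvalues are 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
Q = np.array([[1.0, 1.0],        # columns of Q are eigenvectors of A,
              [1.0, -2.0]])      # with eigenvalues 5 and 2
P = np.linalg.inv(Q)             # chosen so that P A P^{-1} = D
Pinv = np.linalg.inv(P)
D = P @ A @ Pinv
ells = np.diag(D)                # the eigenvalues l_1, ..., l_n

for i in range(2):
    row = P[i, :]                # i-th row of P
    col = Pinv[:, i]             # i-th column of P^{-1}
    assert np.allclose(A.T @ row, ells[i] * row)  # A^T p_(i) = l_i p_(i)
    assert np.allclose(A @ col, ells[i] * col)    # A q_i = l_i q_i
print("both checks pass")
```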