
Math Help - Basis and Eigenvalues

  1. #1
    Member
    Joined
    Oct 2008
    Posts
    130

    Basis and Eigenvalues

V is an n-dimensional vector space over F. A is a linear transformation from V to itself.

Prove that if V has a basis of eigenvectors for A, then the matrix representing A with respect to this basis is diagonal, with the eigenvalues as the diagonal entries.

Prove that the matrix of A with respect to an arbitrary basis for V is similar to a diagonal matrix if and only if V has a basis of eigenvectors for A.

  2. #2
    Global Moderator

    Joined
    Nov 2005
    From
    New York City
    Posts
    10,616
    Thanks
    9
    Quote Originally Posted by robeuler View Post
    V is an n-dimensional vector space over F. A is a linear transformation from V to itself.

    Prove that if V has a basis of eigenvectors for A, then the matrix representing A with respect to this basis is diagonal, with the eigenvalues as the diagonal entries.

    Prove that the matrix of A with respect to an arbitrary basis for V is similar to a diagonal matrix if and only if V has a basis of eigenvectors for A.
    Let B=\{v_1,v_2,...,v_n\} be a basis of V consisting of eigenvectors of A. By definition, Av_j = \ell_j v_j for some \ell_j \in F. Therefore, [Av_j]_B = (0,0,...,\ell_j,...,0), where \ell_j appears in the j-th coordinate. Therefore, the matrix representing A with respect to this basis is:
    \begin{bmatrix} \ell_1 & 0 & ... & 0 \\ 0 & \ell_2 & ... & 0 \\ ...&...&...&...\\0& 0 & ... & \ell_n \end{bmatrix}
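    As a quick numerical sketch of this fact (my own example, using numpy; the matrix A below is just an illustrative 2\times 2 case with eigenvalues 2 and 5), expressing a transformation in coordinates with respect to an eigenvector basis produces a diagonal matrix with the eigenvalues on the diagonal:

    ```python
    import numpy as np

    # Illustrative matrix; its eigenvalues are 2 and 5
    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    # np.linalg.eig returns the eigenvectors as the columns of Q,
    # so the columns of Q are an eigenvector basis for R^2
    eigvals, Q = np.linalg.eig(A)

    # The matrix of the transformation with respect to that basis is
    # Q^{-1} A Q; it should be diagonal with the eigenvalues on the diagonal
    D = np.linalg.inv(Q) @ A @ Q
    print(np.round(D, 10))
    ```

    The same check works for any diagonalizable matrix, since the change-of-basis matrix into the eigenvector basis is exactly the inverse of the matrix whose columns are the eigenvectors.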

  3. #3
    Member
    Joined
    Oct 2008
    Posts
    130
    Quote Originally Posted by ThePerfectHacker View Post
    Let B=\{v_1,v_2,...,v_n\} be a basis of V consisting of eigenvectors of A. By definition, Av_j = \ell_j v_j for some \ell_j \in F. Therefore, [Av_j]_B = (0,0,...,\ell_j,...,0), where \ell_j appears in the j-th coordinate. Therefore, the matrix representing A with respect to this basis is:
    \begin{bmatrix} \ell_1 & 0 & ... & 0 \\ 0 & \ell_2 & ... & 0 \\ ...&...&...&...\\0& 0 & ... & \ell_n \end{bmatrix}
    Thank you! For the second part...
    for (=>) I know that there exists an invertible P so that PAP^{-1} = D, where D is a diagonal matrix. I feel that if V did not have a basis of eigenvectors for A, then no such P could exist, but I know I'm missing something.

    for (<=) can I explicitly construct P using the eigenvalue-diagonal matrix from the part you solved?

  4. #4
    Global Moderator

    Joined
    Nov 2005
    From
    New York City
    Posts
    10,616
    Thanks
    9
    Quote Originally Posted by robeuler View Post
    Thank you! For the second part...
    for (=>) I know that there exists an invertible P so that PAP^{-1} = D, where D is a diagonal matrix. I feel that if V did not have a basis of eigenvectors for A, then no such P could exist, but I know I'm missing something.
    I think the following observation will help you. Let A be an n\times n matrix and P an n\times n invertible matrix so that PAP^{-1} = D, where D is a diagonal matrix with ii-th entry \ell_i. The observation is that the i-th row of P is a left eigenvector of A with eigenvalue \ell_i, or equivalently, that the i-th column of P^{-1} is an eigenvector of A with eigenvalue \ell_i.

    Before we prove this, write A = (a_{ij}), P = (p_{ij}), and D = (d_{ij}), noting that d_{ij} = 0 if i\not = j and d_{ii} = \ell_i. Since PAP^{-1} = D, we have PA = DP. By the definition of the matrix product, PA = \left( \Sigma_k p_{ik}a_{kj} \right) = \left( \Sigma_k d_{ik}p_{kj} \right). However, \Sigma_k d_{ik}p_{kj} = \ell_i p_{ij}. Therefore, \left( \Sigma_k p_{ik}a_{kj} \right) = (\ell_i p_{ij}), which means for any particular 1\leq i\leq n we have \Sigma_k p_{ik}a_{kj} = \ell_i p_{ij} for every j. Thus, we can form the following matrix equality (just set j=1,2,...,n):
    \begin{bmatrix} \Sigma_k p_{ik}a_{k1} \\ \Sigma_k p_{ik}a_{k2}\\ ... \\ \Sigma_k p_{ik}a_{kn} \end{bmatrix} = \begin{bmatrix} \ell_i p_{i1} \\ \ell_i p_{i2} \\ ... \\ \ell_i p_{in} \end{bmatrix}\implies \begin{bmatrix} a_{11} & a_{21} & ... & a_{n1}\\ a_{12} & a_{22} & ... & a_{n2} \\ ... & ... & ... & ... \\ a_{1n} & a_{2n} & ... & a_{nn} \end{bmatrix}\begin{bmatrix}p_{i1}\\p_{i2} \\ ... \\ p_{in} \end{bmatrix} = \ell_i \begin{bmatrix}p_{i1} \\ p_{i2} \\ ... \\ p_{in}\end{bmatrix}

    Thus, A^T\bold{p}_{(i)} = \ell_i \bold{p}_{(i)}, where \bold{p}_{(i)} is the i-th row of P written as a column vector; that is, each row of P is a left eigenvector of A.
    Equivalently, rewriting PAP^{-1} = D as AP^{-1} = P^{-1}D and comparing i-th columns gives A\bold{q}_i = \ell_i \bold{q}_i, where \bold{q}_i is the i-th column of P^{-1}. So the columns of P^{-1} form a basis of eigenvectors for A, which also answers your (<=) question: given a basis of eigenvectors, take P^{-1} to be the matrix with those eigenvectors as its columns.
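    To make the observation concrete (a small numerical sketch of my own, with an arbitrary 2\times 2 example rather than anything from the problem), one can diagonalize a matrix under the convention PAP^{-1} = D and check that the columns of P^{-1} are eigenvectors of A while the rows of P are left eigenvectors:

    ```python
    import numpy as np

    # Example diagonalizable matrix (eigenvalues 2 and 5)
    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    # np.linalg.eig puts the eigenvectors in the columns of V, so
    # taking P = V^{-1} gives the convention P A P^{-1} = D used above
    eigvals, V = np.linalg.eig(A)
    P = np.linalg.inv(V)
    D = P @ A @ V
    assert np.allclose(D, np.diag(eigvals))

    for i, l in enumerate(eigvals):
        # i-th column of P^{-1} (= V) is an eigenvector: A q_i = l_i q_i
        assert np.allclose(A @ V[:, i], l * V[:, i])
        # i-th row of P is a left eigenvector: p_i A = l_i p_i
        assert np.allclose(P[i, :] @ A, l * P[i, :])
    ```

    The asserts pass for any diagonalizable A, which is exactly the content of the observation: it is P^{-1} (not P itself) whose columns carry the eigenvector basis under this convention.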

