
Proof about eigenvalues/vectors of a linear transformation

  1. #1 arbolis (MHF Contributor)

    Proof about eigenvalues/vectors of a linear transformation

    I had to say whether the following statement was true or not and explain. I fell in love with the proof I gave (I hope there are no flaws in it):
    Let T be a linear transformation and let v_1 and v_2 be two eigenvectors of T with eigenvalues \lambda_1 and \lambda_2 respectively. If \lambda_1 \neq \lambda_2, then \{ v_1, v_2 \} is linearly independent.
    I say true. Proof:
    I have to prove A \Rightarrow B, so I'll prove the contrapositive \neg B \Rightarrow \neg A, which is equivalent. In other words, I have to prove that \{ v_1, v_2 \} linearly dependent implies \lambda_1 = \lambda_2.
    So we have \{ v_1, v_2 \} linearly dependent \Rightarrow \exists c \neq 0 such that v_1 = cv_2.
    We have that Tv_1 = \lambda_1 v_1 \Leftrightarrow T(cv_2) = \lambda_1 v_1 \Leftrightarrow Tv_2 = \frac{\lambda_1 v_1}{c} = \lambda_2 v_2 \Leftrightarrow \lambda_1 v_2 = \lambda_2 v_2 \Leftrightarrow \lambda_1 = \lambda_2. \square
    Is there any flaw?
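    By the way, here's a quick numerical sanity check of the statement (just a sketch in Python/numpy, not part of the proof; the matrix A is an arbitrary example I picked with distinct eigenvalues):
    Code:
    import numpy as np

    # An arbitrary 2x2 example with distinct eigenvalues (2 and 5).
    A = np.array([[2.0, 1.0],
                  [0.0, 5.0]])

    eigenvalues, eigenvectors = np.linalg.eig(A)  # columns of `eigenvectors` are v_1, v_2
    assert eigenvalues[0] != eigenvalues[1]       # lambda_1 != lambda_2

    # {v_1, v_2} is linearly independent iff the matrix [v_1 | v_2] has rank 2.
    print(np.linalg.matrix_rank(eigenvectors))    # 2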

  2. #2 NonCommAlg (MHF Contributor)
    Quote Originally Posted by arbolis
    I had to say whether the following statement was true or not and explain. I fell in love with the proof I gave (I hope there are no flaws in it):
    Let T be a linear transformation and let v_1 and v_2 be two eigenvectors of T with eigenvalues \lambda_1 and \lambda_2 respectively. If \lambda_1 \neq \lambda_2, then \{ v_1, v_2 \} is linearly independent.
    I say true. Proof:
    I have to prove A \Rightarrow B, so I'll prove the contrapositive \neg B \Rightarrow \neg A, which is equivalent. In other words, I have to prove that \{ v_1, v_2 \} linearly dependent implies \lambda_1 = \lambda_2.
    So we have \{ v_1, v_2 \} linearly dependent \Rightarrow \exists c \neq 0 such that v_1 = cv_2.
    We have that Tv_1 = \lambda_1 v_1 \Leftrightarrow T(cv_2) = \lambda_1 v_1 \Leftrightarrow Tv_2 = \frac{\lambda_1 v_1}{c} = \lambda_2 v_2 \Leftrightarrow \lambda_1 v_2 = \lambda_2 v_2 \Leftrightarrow \lambda_1 = \lambda_2. \square
    Is there any flaw?
    It's correct! At the end of your proof you should mention that \lambda_1 v_2 = \lambda_2 v_2 implies \lambda_1 = \lambda_2 because v_2 \neq 0 (eigenvectors are always non-zero).

  3. #3 arbolis (MHF Contributor)
    Quote Originally Posted by NonCommAlg
    It's correct! At the end of your proof you should mention that \lambda_1 v_2 = \lambda_2 v_2 implies \lambda_1 = \lambda_2 because v_2 \neq 0 (eigenvectors are always non-zero).
    Ah yes, I assumed it. Glad it works.
    It's worth noting that here we have A \Rightarrow B and not A \Leftrightarrow B. As a counterexample to the converse, take v_1 = (1,0) and v_2 = (0,1): they are clearly linearly independent, yet if T is the identity transformation then \lambda_1 = \lambda_2, so linear independence does not force the eigenvalues to be distinct.
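    The counterexample is easy to check numerically too (a sketch, with T taken to be the 2x2 identity):
    Code:
    import numpy as np

    T = np.eye(2)                      # the identity transformation
    v1 = np.array([1.0, 0.0])
    v2 = np.array([0.0, 1.0])

    # v1 and v2 are linearly independent: [v1 | v2] has rank 2.
    assert np.linalg.matrix_rank(np.column_stack([v1, v2])) == 2

    # Yet both are eigenvectors with the same eigenvalue 1, so the converse fails.
    assert np.allclose(T @ v1, 1.0 * v1) and np.allclose(T @ v2, 1.0 * v2)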

  4. #4 Random Variable (Super Member)
    I attempted to prove it directly by showing that if \alpha v_1 + \beta v_2 = 0, then \alpha and \beta must both be zero. I didn't get very far. Proof by contradiction seems like the way to go.

  5. #5 MHF Contributor (Paris, France)
    Quote Originally Posted by Random Variable
    I attempted to prove it directly by showing that if \alpha v_1 + \beta v_2 = 0, then \alpha and \beta must both be zero. I didn't get very far. Proof by contradiction seems like the way to go.
    Apply T to your equation: it gives \alpha \lambda_1 v_1 + \beta \lambda_2 v_2 = 0. Together with the first one (\beta v_2 = -\alpha v_1), we deduce \alpha(\lambda_1 - \lambda_2) v_1 = 0. Since v_1 \neq 0 and \lambda_1 \neq \lambda_2, we must have \alpha = 0. Then \beta = 0 from the initial equation and the fact that v_2 \neq 0.
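    That elimination step can be replayed symbolically (a sketch using sympy; the component symbols x1, y1, x2, y2 are my own choice, standing for generic 2-vectors):
    Code:
    import sympy as sp

    # Scalars and eigenvalues as symbols; v1, v2 as generic 2-vectors.
    alpha, beta, l1, l2 = sp.symbols('alpha beta lambda1 lambda2')
    v1 = sp.Matrix(sp.symbols('x1 y1'))
    v2 = sp.Matrix(sp.symbols('x2 y2'))

    eq1 = alpha * v1 + beta * v2             # alpha*v1 + beta*v2 (assumed = 0)
    eq2 = alpha * l1 * v1 + beta * l2 * v2   # the same relation after applying T

    # Eliminate v2: subtract lambda_2 times (1) from (2); only a multiple of v1 survives.
    print((eq2 - l2 * eq1).applyfunc(sp.factor))
    # Matrix([[alpha*x1*(lambda1 - lambda2)], [alpha*y1*(lambda1 - lambda2)]])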

  6. #6 NonCommAlg (MHF Contributor)
    Quote Originally Posted by arbolis
    I had to say whether the following statement was true or not and explain. I fell in love with the proof I gave (I hope there are no flaws in it):
    Let T be a linear transformation and let v_1 and v_2 be two eigenvectors of T with eigenvalues \lambda_1 and \lambda_2 respectively. If \lambda_1 \neq \lambda_2, then \{ v_1, v_2 \} is linearly independent.
    I say true. Proof:
    I have to prove A \Rightarrow B, so I'll prove the contrapositive \neg B \Rightarrow \neg A, which is equivalent. In other words, I have to prove that \{ v_1, v_2 \} linearly dependent implies \lambda_1 = \lambda_2.
    So we have \{ v_1, v_2 \} linearly dependent \Rightarrow \exists c \neq 0 such that v_1 = cv_2.
    We have that Tv_1 = \lambda_1 v_1 \Leftrightarrow T(cv_2) = \lambda_1 v_1 \Leftrightarrow Tv_2 = \frac{\lambda_1 v_1}{c} = \lambda_2 v_2 \Leftrightarrow \lambda_1 v_2 = \lambda_2 v_2 \Leftrightarrow \lambda_1 = \lambda_2. \square
    Is there any flaw?
    The general case is much more interesting: suppose v_1, \cdots, v_n are n eigenvectors with corresponding pairwise distinct eigenvalues \lambda_1, \cdots, \lambda_n. Then v_1, \cdots, v_n are linearly independent.

    The proof is by induction on n: there's nothing to prove for n = 1, since an eigenvector is non-zero. So suppose the claim is true for n - 1 and let c_1 v_1 + \cdots + c_n v_n = 0 for some scalars c_j. Call this (1). We want to prove that c_1 = \cdots = c_n = 0. From (1) we have c_1 \lambda_1 v_1 + \cdots + c_n \lambda_n v_n = c_1 T(v_1) + \cdots + c_n T(v_n) = T(c_1 v_1 + \cdots + c_n v_n) = T(0) = 0. Call this (2). Now multiply (1) by \lambda_1 and subtract the result from (2) to get c_2(\lambda_2 - \lambda_1) v_2 + \cdots + c_n(\lambda_n - \lambda_1) v_n = 0. Hence, by the induction hypothesis applied to v_2, \cdots, v_n, we must have c_2(\lambda_2 - \lambda_1) = \cdots = c_n(\lambda_n - \lambda_1) = 0, which gives c_2 = \cdots = c_n = 0 because the \lambda_j are pairwise distinct. Then (1) reduces to c_1 v_1 = 0, and since v_1 \neq 0 we conclude c_1 = 0. This completes the proof.
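    Here's a quick numerical illustration of the general case (a sketch; the upper-triangular matrix is just one convenient way I chose to get pairwise distinct eigenvalues):
    Code:
    import numpy as np

    n = 5
    rng = np.random.default_rng(0)

    # An upper-triangular matrix has its diagonal entries as eigenvalues; setting
    # the diagonal to 1, ..., n makes them pairwise distinct by construction.
    A = np.triu(rng.standard_normal((n, n)))
    A[np.diag_indices(n)] = np.arange(1.0, n + 1)

    eigenvalues, eigenvectors = np.linalg.eig(A)
    # The n eigenvectors are linearly independent: the eigenvector matrix has rank n.
    assert np.linalg.matrix_rank(eigenvectors) == n
    print(np.sort(eigenvalues.real), "rank:", np.linalg.matrix_rank(eigenvectors))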
