# Thread: Proof about eigenvalues/vectors of a linear transformation

1. ## Proof about eigenvalues/vectors of a linear transformation

I had to say whether the following statement was true or false and explain why. I fell in love with the proof I gave (I hope there are no flaws in it):
Let $T$ be a linear transformation and let $v_1$ and $v_2$ be two eigenvectors of $T$ with eigenvalues $\lambda _1$ and $\lambda _2$ respectively. If $\lambda _1 \neq \lambda _2$, then $\{ v_1,v_2 \}$ is linearly independent.
I say it's true. Proof:
I have to prove $A\Rightarrow B$, so I'll prove the contrapositive $\neg B \Rightarrow \neg A$, which is equivalent. In other words, I have to prove that $\{ v_1, v_2 \}$ linearly dependent implies $\lambda _1 = \lambda _2$.
So $\{ v_1,v_2 \}$ linearly dependent $\Rightarrow \exists c \neq 0$ such that $v_1=cv_2$.
We have $Tv_1=\lambda _1 v_1 \Leftrightarrow T(cv_2)=\lambda _1 v_1 \Leftrightarrow Tv_2= \frac{\lambda _1 v_1}{c}=\lambda _1 v_2$, and since $Tv_2=\lambda _2 v_2$, this gives $\lambda _1 v_2 = \lambda _2 v_2 \Leftrightarrow \lambda _1 = \lambda _2. \square$
Is there any flaw?
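For what it's worth, the claim is easy to sanity-check numerically. A minimal sketch in plain Python; the matrix $T$ and its eigen-pairs below are my own hypothetical choices, not anything from the thread:

```python
# Numeric sanity check: a 2x2 matrix with distinct eigenvalues has
# linearly independent eigenvectors (checked via the determinant test).

def matvec(T, v):
    """Multiply a 2x2 matrix (given as a list of rows) by a 2-vector."""
    return [T[0][0]*v[0] + T[0][1]*v[1],
            T[1][0]*v[0] + T[1][1]*v[1]]

# T has eigenvalues 2 and 3 with eigenvectors (1, -1) and (0, 1).
T = [[2, 0],
     [1, 3]]
v1, lam1 = [1, -1], 2
v2, lam2 = [0, 1], 3

# Check the eigenvector equations T v = lambda v.
assert matvec(T, v1) == [lam1*x for x in v1]
assert matvec(T, v2) == [lam2*x for x in v2]

# {v1, v2} is linearly independent iff det[v1 | v2] != 0.
det = v1[0]*v2[1] - v1[1]*v2[0]
assert det != 0
```

Of course this checks one example, not the theorem, but it is a quick way to catch a wrongly stated claim before trying to prove it.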

2. Originally Posted by arbolis
it's correct! at the end of your proof you should mention that $\lambda _1 v_2 = \lambda _2 v_2$ implies $\lambda_1=\lambda_2,$ because $v_2 \neq 0$ (eigenvectors are always non-zero).

3. Originally Posted by NonCommAlg
Ah yes, I assumed it. Glad it works.
It's worth noticing that here we have $A \Rightarrow B$ and not $A \Leftrightarrow B$. As a counterexample to the converse, take $v_1=(1,0)$ and $v_2=(0,1)$: they are clearly linearly independent, yet if $T$ is the identity transformation then $\lambda _1= \lambda _2 = 1$, so independence does not force the eigenvalues to be distinct.
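The identity-map counterexample can be written out in a few lines — a sketch in plain Python, reusing the determinant test for independence:

```python
# Counterexample to the converse: with T = identity, (1, 0) and (0, 1)
# are linearly independent, yet both have the same eigenvalue 1.

def matvec(T, v):
    return [T[0][0]*v[0] + T[0][1]*v[1],
            T[1][0]*v[0] + T[1][1]*v[1]]

I = [[1, 0],
     [0, 1]]
v1, v2 = [1, 0], [0, 1]

# Both are eigenvectors with the same eigenvalue 1 ...
assert matvec(I, v1) == v1
assert matvec(I, v2) == v2

# ... yet they are independent: det[v1 | v2] != 0.
det = v1[0]*v2[1] - v1[1]*v2[0]
assert det != 0
```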

4. I attempted to prove it directly by showing that if $\alpha v_{1} + \beta v_{2} = 0$, then $\alpha$ and $\beta$ must both be zero. I didn't get very far. Proof by contradiction seems like the way to go.

5. Originally Posted by Random Variable
Apply $T$ to your equation: it gives $\alpha \lambda_1 v_1+\beta\lambda_2v_2=0$. Together with the original equation (which gives $\beta v_2=-\alpha v_1$), we deduce $\alpha (\lambda_1-\lambda_2) v_1=0$. Since $v_1\neq 0$ and $\lambda_1\neq\lambda_2$, we must have $\alpha=0$. Then $\beta=0$ from the initial equation and the fact that $v_2\neq 0$.
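The elimination step above can be traced numerically — a minimal sketch in plain Python, where the eigen-pairs and the test coefficients are hypothetical choices (subtracting $\lambda_2$ times the original combination is the same elimination as substituting $\beta v_2 = -\alpha v_1$):

```python
# Trace of the elimination in the direct proof: applying T to
# alpha*v1 + beta*v2 and subtracting lambda2 times the original
# combination kills the v2 term, leaving alpha*(lambda1 - lambda2)*v1.

lam1, lam2 = 2, 3          # distinct eigenvalues (hypothetical)
v1, v2 = [1, -1], [0, 1]   # corresponding eigenvectors (hypothetical)

for alpha, beta in [(1, 4), (-2, 5), (0, 0)]:
    combo    = [alpha*v1[i] + beta*v2[i] for i in range(2)]            # (1)
    T_combo  = [alpha*lam1*v1[i] + beta*lam2*v2[i] for i in range(2)]  # T applied to (1)
    residual = [T_combo[i] - lam2*combo[i] for i in range(2)]
    # The identity holds for arbitrary alpha, beta:
    assert residual == [alpha*(lam1 - lam2)*v1[i] for i in range(2)]
```

Since $v_1 \neq 0$ and $\lambda_1 \neq \lambda_2$, the residual can only vanish when $\alpha = 0$, which is exactly the point of the proof.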

6. Originally Posted by arbolis
the general case is much more interesting: suppose $v_1, \cdots , v_n$ are $n$ eigenvectors with corresponding pairwise distinct eigenvalues $\lambda_1, \cdots , \lambda_n.$ then $v_1, \cdots , v_n$ are linearly independent:

the proof is by induction on $n$: for $n = 1$ there's nothing to prove, since an eigenvector is non-zero. so suppose the claim is true for $n-1$ and let $c_1v_1 + \cdots + c_nv_n=0$ for some scalars $c_j.$ call this (1). we want to prove that $c_1= \cdots = c_n = 0.$

from (1) we have $c_1\lambda_1v_1 + \cdots + c_n \lambda_n v_n=c_1T(v_1) + \cdots + c_nT(v_n)=T(c_1v_1 + \cdots + c_nv_n)=T(0)=0.$ call this (2). now multiply (1) by $\lambda_1$ and subtract the result from (2) to get $c_2(\lambda_2-\lambda_1)v_2 + \cdots + c_n(\lambda_n - \lambda_1)v_n=0.$ the vectors $v_2, \cdots , v_n$ are $n-1$ eigenvectors with pairwise distinct eigenvalues, so by the induction hypothesis $c_2(\lambda_2 - \lambda_1) = \cdots = c_n(\lambda_n-\lambda_1)=0,$ which gives us $c_2= \cdots = c_n = 0,$ because the $\lambda_j$ are pairwise distinct. then (1) reduces to $c_1v_1=0,$ and since $v_1 \neq 0$ we get $c_1=0.$ this completes the proof.
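the general statement can also be sanity-checked for a small $n$ — a sketch in plain Python with $n = 3$; the triangular matrix and its eigenvectors are hypothetical choices worked out by hand:

```python
# n = 3 instance of the general claim: a triangular T with distinct
# diagonal entries 1, 2, 4; its eigenvectors, stacked as columns,
# form a nonsingular (hence linearly independent) set.

def matvec3(T, v):
    """Multiply a 3x3 matrix (list of rows) by a 3-vector."""
    return [sum(T[i][j]*v[j] for j in range(3)) for i in range(3)]

T = [[1, 1, 0],
     [0, 2, 1],
     [0, 0, 4]]
eigs = [
    ([1, 0, 0], 1),
    ([1, 1, 0], 2),
    ([1, 3, 6], 4),
]

# Check each eigenvector equation T v = lambda v.
for v, lam in eigs:
    assert matvec3(T, v) == [lam*x for x in v]

# Build the matrix M whose columns are the eigenvectors and expand
# its determinant along the first row; nonzero det => independent.
M = [[eigs[j][0][i] for j in range(3)] for i in range(3)]
det = (M[0][0]*(M[1][1]*M[2][2] - M[1][2]*M[2][1])
     - M[0][1]*(M[1][0]*M[2][2] - M[1][2]*M[2][0])
     + M[0][2]*(M[1][0]*M[2][1] - M[1][1]*M[2][0]))
assert det != 0
```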