Eigenvalues & Independence

• Nov 13th 2009, 12:05 PM
paolopiace
Eigenvalues & Independence

"Suppose that {λ1, . . . , λk} is a set of distinct eigenvalues and {x1, . . . , xk} is the corresponding set of eigenvectors. Then x1, . . . , xk are linearly independent; that is, eigenvectors associated with distinct eigenvalues are linearly independent."

Assuming λi = 0 for one certain i, that statement does not sound correct to me.

What do you guys think?
• Nov 13th 2009, 12:38 PM
Defunkt
Quote:

Originally Posted by paolopiace

"Suppose that {λ1, . . . , λk} is a set of distinct eigenvalues and {x1, . . . , xk} is the corresponding set of eigenvectors. Then x1, . . . , xk are linearly independent; that is, eigenvectors associated with distinct eigenvalues are linearly independent."

Assuming λi = 0 for one certain i, that statement does not sound correct to me.

What do you guys think?

It's well known that this theorem is correct (otherwise it wouldn't be a theorem! :P). I'm sure you can find many proofs of it; the simplest one is by induction. See if you can prove it by yourself.

But why do you think having 0 as an eigenvalue would make this incorrect?
• Nov 13th 2009, 12:57 PM
tonio
Quote:

Originally Posted by paolopiace

"Suppose that {λ1, . . . , λk} is a set of distinct eigenvalues and {x1, . . . , xk} is the corresponding set of eigenvectors. Then x1, . . . , xk are linearly independent; that is, eigenvectors associated with distinct eigenvalues are linearly independent."

Assuming λi = 0 for one certain i, that statement does not sound correct to me.

What do you guys think?

The book's correct, no matter what the eigenvalues are. For suppose $\displaystyle (**)\,\,\sum\limits_{i=1}^ka_ix_i=0\,,\,\,a_i\in\mathbb{F}=$ our base field, and that this is a minimal such vanishing combination, meaning: if $\displaystyle m<k$ then the only way to get $\displaystyle \sum\limits_{i=1}^mc_ix_{n_i}=0$ with $\displaystyle c_i\in\mathbb{F}$ and $\displaystyle \{x_{n_i}\}\subset\{x_1,...,x_k\}$ is with $\displaystyle c_1=...=c_m=0$.

As all the eigenvalues are different, multiplying relation $\displaystyle (**)$ by $\displaystyle \lambda_1$ and subtracting this from $\displaystyle 0=T(0)=T\left(\sum\limits_{i=1}^ka_ix_i\right)=\sum\limits_{i=1}^ka_iTx_i=\sum\limits_{i=1}^ka_i\lambda_ix_i$, we get:

$\displaystyle 0=\sum\limits_{i=1}^ka_i\lambda_ix_i-\sum\limits_{i=1}^ka_i\lambda_1x_i=\sum\limits_{i=1}^ka_i\left(\lambda_i-\lambda_1\right)x_i=\sum\limits_{i=2}^ka_i\left(\lambda_i-\lambda_1\right)x_i$ $\displaystyle \Longrightarrow$ by the minimality of $\displaystyle k$, we get $\displaystyle a_i\left(\lambda_i-\lambda_1\right)=0\,\,\forall\,\, 2\leq i\leq k\,\Longrightarrow\,a_2=...=a_k=0$, since

all the eigenvalues are different. Going back to $\displaystyle (**)$ and substituting $\displaystyle a_i=0$ for $\displaystyle 2\leq i\leq k$, we get that also $\displaystyle a_1=0$ (why? Remember the definition of eigenvector!). So ALL the coefficients are zero, and thus the eigenvectors are linearly independent.
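If it helps, here's a quick numerical sanity check of the theorem (a sketch in Python/NumPy; the matrix `A` below is just a made-up example whose distinct eigenvalues include 0):

```python
import numpy as np

# Made-up 3x3 example: lower triangular, so its eigenvalues are the
# diagonal entries 2, 3 and 0 -- distinct, and one of them is zero.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 0.0],
              [1.0, 1.0, 0.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)  # columns are x_1, ..., x_k
print(np.sort(eigenvalues))                   # [0. 2. 3.]

# Distinct eigenvalues => the eigenvectors are linearly independent,
# so the matrix whose columns are the eigenvectors has full rank,
# even though one eigenvalue is zero.
print(np.linalg.matrix_rank(eigenvectors))    # 3
```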

Tonio
• Nov 13th 2009, 02:46 PM
paolopiace
Quote:

...(why? Remember the definition of eigenvector) ALL the coefficients are zero and thus the eigenvectors are lin. indep.
Tonio
Clear. Thanks. Though it raises a further question:

$\displaystyle \lambda_i = 0 \;\Rightarrow\; |X| = \prod_{i=1}^k \lambda_i = 0$

Where X is the (k x k) eigensystem matrix (i.e. the matrix with the i-th eigenvector as row i)

How can rank(X) = k (which means linear independence) hold if |X| = 0?

I've been scanning through books and can't see the connection.
Please bear with me; it's been ages since I last touched these things.
• Nov 13th 2009, 04:32 PM
tonio
Quote:

Originally Posted by paolopiace
Clear. Thanks. Though it raises a further question:

$\displaystyle \lambda_i = 0 \;\Rightarrow\; |X| = \prod_{i=1}^k \lambda_i = 0$

Where X is the (k x k) eigensystem matrix.

How can rank(X) = k (linear independence) hold if |X| = 0?

I've no idea what that "eigensystem matrix" is: never heard of it. But if you meant the Vandermonde determinant, then remember it equals $\displaystyle \prod\limits_{i<j}(\lambda_j-\lambda_i)$, and thus it vanishes only if two of these roots are equal, not when one of them is zero.

Tonio
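That claim is easy to check numerically (a sketch in Python/NumPy; the values below are arbitrary distinct numbers, one of them zero):

```python
import numpy as np
from itertools import combinations

# Vandermonde determinant: det = product over i<j of (l_j - l_i), so it
# vanishes only when two of the values coincide -- having l = 0 is harmless.
lams = [0.0, 1.0, 3.0]                    # distinct; one value is zero
V = np.vander(lams, increasing=True)      # row i is [1, l_i, l_i^2]
det = np.linalg.det(V)
target = np.prod([b - a for a, b in combinations(lams, 2)])

# det matches the product formula (1-0)(3-0)(3-1) = 6, nonzero
# even though one of the lambdas is zero.
print(round(det, 6), round(float(target), 6))
```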


.
• Nov 13th 2009, 05:14 PM
Defunkt
Quote:

Originally Posted by tonio
.

I think he meant that X is the matrix which has 0 as one of its eigenvalues. In that case, obviously $\displaystyle \text{rank}(X) < k$ by definition. However, this makes no difference -- we are asking that eigenvectors of different eigenvalues be linearly independent. Do you remember the definition of an eigenvector, OP?
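The distinction can be seen in a few lines (a sketch in Python/NumPy; `A` below is a made-up singular example):

```python
import numpy as np

# Made-up 2x2 example with eigenvalues 0 and 5 (trace 5, det 0).
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

eigenvalues, V = np.linalg.eig(A)   # columns of V are the eigenvectors

# |A| = product of the eigenvalues = 0, so rank(A) < k ...
print(np.linalg.matrix_rank(A))     # 1
# ... but the eigenvector matrix V is still full rank: eigenvectors of
# the distinct eigenvalues 0 and 5 are linearly independent.
print(np.linalg.matrix_rank(V))     # 2
```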