Originally Posted by CaptainBlack:
Show that any set of m eigenvectors of an n x n matrix A, n >= m,
that correspond to distinct eigenvalues is independent.
Suppose m=2, and that X1 and X2 are e-vectors corresponding to
e-values lambda1, lambda2, with lambda1 != lambda2, and further suppose
that X1 and X2 are not linearly independent. Then there exist constants
a, b, not both zero, such that:
a.X1 + b.X2 = 0 ....(1)
(in fact both a != 0 and b != 0: e-vectors are non-zero, so if, say,
a = 0 then b.X2 = 0 would force b = 0 as well).
Now multiply (1) by A:
A(a.X1 + b.X2) = a.lambda1.X1 + b.lambda2.X2 = 0.
From (1) we have a.X1 = -b.X2, so substituting:
(lambda2 - lambda1).b.X2 = 0,
but as b != 0, X2 != 0 and lambda1 != lambda2, this is
a contradiction, so the desired result holds when m=2.
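A quick numeric sanity check of the m=2 case (pure Python; the matrix A, its e-values 2 and 3, and its e-vectors below are a made-up example, not from the original post):

```python
# A is a made-up 2x2 example with distinct eigenvalues 2 and 3,
# so by the result above its eigenvectors must be independent.
A = [[2, 1],
     [0, 3]]

def matvec(M, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

X1, lam1 = [1, 0], 2   # eigenvector for lambda1 = 2
X2, lam2 = [1, 1], 3   # eigenvector for lambda2 = 3

# Confirm A.Xi = lambdai.Xi for both pairs.
assert matvec(A, X1) == [lam1 * x for x in X1]
assert matvec(A, X2) == [lam2 * x for x in X2]

# Two vectors in the plane are independent iff the determinant of the
# matrix with X1, X2 as columns is non-zero.
det = X1[0] * X2[1] - X1[1] * X2[0]
print(det)  # non-zero => X1, X2 independent
```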
The above constitutes the base case for an inductive
proof that the result is true for any m<=n.
Suppose X(i), i=1..k (with k+1 <= n), are linearly independent e-vectors
corresponding to distinct e-values lambda(i), i=1..k. Further
suppose, for a contradiction, that X(i), i=1..k+1, are not linearly
independent e-vectors corresponding to distinct e-values lambda(i), i=1..k+1.
Then there exist constants a(i), i=1..k+1, not all zero, such that:
sum(i=1,k+1) a(i).X(i) = 0,
so:
a(k+1).X(k+1) = - sum(i=1,k) a(i).X(i) ...(2)
Now multiply this by A:
lambda(k+1).a(k+1).X(k+1) = - sum(i=1,k) a(i).lambda(i).X(i).
Substitute from (2):
-lambda(k+1).sum(i=1,k) a(i).X(i) = - sum(i=1,k) a(i).lambda(i).X(i)
or:
sum(i=1,k) a(i).(lambda(i)-lambda(k+1)).X(i)=0.
but as lambda(k+1) != lambda(i) for i=1..k, and at least one a(i), i=1..k,
is non-zero (else (2) would give a(k+1).X(k+1) = 0, and since X(k+1) != 0
this would force a(k+1) = 0 as well, making all the a(i) zero), this says
the X(i), i=1..k are not linearly independent - a contradiction.
This last result together with the base case proves the desired
result by induction (it does not prove it for all m, as A has at
most n distinct e-values, so necessarily m <= n).
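The induction step's key manipulation - apply A to the dependence, then subtract lambda(k+1) times it, which eliminates the X(k+1) term - can also be checked numerically. The 3x3 matrix, e-values 1, 2, 3, and e-vectors below are a made-up example for illustration:

```python
# Made-up upper-triangular example with distinct eigenvalues 1, 2, 3.
A = [[1, 1, 0],
     [0, 2, 1],
     [0, 0, 3]]
lams = [1, 2, 3]
X = [[1, 0, 0], [1, 1, 0], [1, 2, 2]]   # X[i] is the e-vector for lams[i]

def matvec(M, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def axpy(a, u, v):
    """Return a.u + v componentwise."""
    return [a * ui + vi for ui, vi in zip(u, v)]

# Check the eigenpairs first.
for lam, x in zip(lams, X):
    assert matvec(A, x) == [lam * xi for xi in x]

a = [1, 1, 1]                  # coefficients of a trial combination
y = [0, 0, 0]
for ai, x in zip(a, X):
    y = axpy(ai, x, y)         # y = a1.X1 + a2.X2 + a3.X3

Ay = matvec(A, y)
lhs = [u - lams[2] * v for u, v in zip(Ay, y)]   # A.y - lambda3.y

# As in the proof, the X3 term vanishes and what remains is
# a1.(lam1 - lam3).X1 + a2.(lam2 - lam3).X2.
rhs = [0, 0, 0]
for ai, lam, x in zip(a[:2], lams[:2], X[:2]):
    rhs = axpy(ai * (lam - lams[2]), x, rhs)
print(lhs == rhs)  # the X(k+1) component has been eliminated
```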
RonL