
Math Help - Eigenvalues & Independence

  1. #1
    Member
    Joined
    Dec 2007
    Posts
    80
    Awards
    1

    Eigenvalues & Independence

    In a book I read:

    "Suppose that {λ1, . . . , λk} is a set of distinct eigenvalues and {x1, . . . , xk} is the corresponding set of eigenvectors. Then x1, . . . , xk are linearly independent; that is, eigenvectors associated with distinct eigenvalues are linearly independent."

If λi = 0 for some i, that statement does not sound correct to me.

    What do you guys think?

  2. #2
    Super Member
    Joined
    Aug 2009
    From
    Israel
    Posts
    976
    Quote Originally Posted by paolopiace View Post
    In a book I read:

    "Suppose that {λ1, . . . , λk} is a set of distinct eigenvalues and {x1, . . . , xk} is the corresponding set of eigenvectors. Then x1, . . . , xk are linearly independent; that is, eigenvectors associated with distinct eigenvalues are linearly independent."

If λi = 0 for some i, that statement does not sound correct to me.

    What do you guys think?
It's well known that this theorem is correct (otherwise it wouldn't be a theorem! :P). You can find many proofs of it; the simplest is by induction. See if you can prove it yourself.

    But why do you think having 0 as an eigenvalue would make this incorrect?

  3. #3
    Banned
    Joined
    Oct 2009
    Posts
    4,261
    Thanks
    2
    Quote Originally Posted by paolopiace View Post
    In a book I read:

    "Suppose that {λ1, . . . , λk} is a set of distinct eigenvalues and {x1, . . . , xk} is the corresponding set of eigenvectors. Then x1, . . . , xk are linearly independent; that is, eigenvectors associated with distinct eigenvalues are linearly independent."

If λi = 0 for some i, that statement does not sound correct to me.

    What do you guys think?
The book is correct, no matter what the eigenvalues are. For suppose

(**)\quad \sum\limits_{i=1}^k a_i x_i = 0\,,\quad a_i\in\mathbb{F}= our field of definition,

and that this is a minimal such non-trivial combination, meaning: if m<k then the only way to get \sum\limits_{i=1}^m c_i x_{n_i}=0 with c_i\in\mathbb{F} and \{x_{n_i}\}\subset\{x_1,...,x_k\} is with c_1=...=c_m=0.

As all the eigenvalues are different, multiply relation (**) by \lambda_1 and subtract this from

0=T(0)=T\left(\sum\limits_{i=1}^k a_i x_i\right)=\sum\limits_{i=1}^k a_i T x_i=\sum\limits_{i=1}^k a_i\lambda_i x_i ,

to get:

0=\sum\limits_{i=1}^k a_i\lambda_i x_i-\sum\limits_{i=1}^k a_i\lambda_1 x_i=\sum\limits_{i=1}^k a_i\left(\lambda_i-\lambda_1\right)x_i=\sum\limits_{i=2}^k a_i\left(\lambda_i-\lambda_1\right)x_i .

By the minimality of k we get a_i\left(\lambda_i-\lambda_1\right)=0 for all 2\leq i\leq k, and since all the eigenvalues are different, a_2=...=a_k=0. Going back to (**) and substituting a_i=0 for 2\leq i\leq k, we get that a_1=0 as well (why? Remember the definition of eigenvector: an eigenvector is non-zero). So ALL the coefficients are zero and thus the eigenvectors are lin. indep.
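For what it's worth, the theorem is easy to sanity-check numerically. A quick sketch in Python with numpy (the matrix A below is an arbitrary example of mine, not from the book): it has the distinct eigenvalues 0, 2 and 3, and the corresponding eigenvectors still come out linearly independent.

```python
import numpy as np

# A singular 3x3 matrix: 0 is among its eigenvalues.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 0.0],
              [1.0, 1.0, 0.0]])

eigvals, eigvecs = np.linalg.eig(A)  # columns of eigvecs are eigenvectors
print(sorted(eigvals.real))          # the distinct eigenvalues 0, 2, 3

# Rank of the matrix whose columns are the eigenvectors:
# rank 3 means the eigenvectors are linearly independent,
# even though A itself is singular.
print(np.linalg.matrix_rank(eigvecs))
```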

    Tonio

  4. #4
    Member
    Joined
    Dec 2007
    Posts
    80
    Awards
    1
    ...(why? Remember the definition of eigenvector) ALL the coefficients are zero and thus the eigenvectors are lin. indep.
    Tonio
Clear, thanks. Though it raises one further question:

\lambda_i = 0 \;\Rightarrow\; |X| = \prod_{i=1}^k \lambda_i = 0

    Where X is the (k x k) eigensystem matrix (i.e. the matrix with the i-th eigenvector as row i)

How can rank(X) = k (which means linear independence) if |X| = 0?

I've been scanning through books and can't see the connection.
Please bear with me; it's been ages since I last touched these things.
    Last edited by paolopiace; November 13th 2009 at 06:57 PM.

  5. #5
    Banned
    Joined
    Oct 2009
    Posts
    4,261
    Thanks
    2
    Quote Originally Posted by paolopiace View Post
Clear, thanks. Though it raises one further question:

\lambda_i = 0 \;\Rightarrow\; |X| = \prod_{i=1}^k \lambda_i = 0

Where X is the (k x k) eigensystem matrix.

How can rank(X) = k (linear independence) if |X| = 0?

I've no idea what that "eigensystem matrix" is: I've never heard of it. But if you meant the Vandermonde determinant, then remember it equals \prod\limits_{i<j}(\lambda_i-\lambda_j), and thus it vanishes only if two of these roots are equal, not when one of them is zero.
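That Vandermonde identity can also be checked numerically; here is a small sketch in Python with numpy (the sample values, including a zero, are my own choice):

```python
import numpy as np
from itertools import combinations

lam = [0.0, 2.0, 3.0]          # distinct values, one of them zero

V = np.vander(lam)             # the Vandermonde matrix built from lam
det_V = np.linalg.det(V)

# |det V| equals |prod_{i<j} (lam_i - lam_j)|: it vanishes only when
# two of the values coincide, not when one of them is zero.
prod = np.prod([a - b for a, b in combinations(lam, 2)])
print(abs(det_V), abs(prod))   # both approximately 6 here
```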


    Tonio



  6. #6
    Super Member
    Joined
    Aug 2009
    From
    Israel
    Posts
    976
    Quote Originally Posted by tonio View Post
    .
I think he meant that X is the matrix which has 0 as one of its eigenvalues. In that case, obviously rank(X) < k, by definition. However, this makes no difference: the claim is that eigenvectors of different eigenvalues are linearly independent. Do you remember the definition of an eigenvector, OP?
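To make that distinction concrete, a short sketch in Python with numpy (the matrix A is an example of mine): A itself is singular because 0 is an eigenvalue, yet the matrix of its eigenvectors has full rank.

```python
import numpy as np

# A is upper triangular with distinct eigenvalues 1, 2 and 0.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 0.0]])

print(np.isclose(np.linalg.det(A), 0))  # True: det(A) = 0, so rank(A) < 3

# The matrix X whose columns are the eigenvectors of A is a
# different matrix entirely, and it is invertible:
_, X = np.linalg.eig(A)
print(np.linalg.matrix_rank(X))         # 3: the eigenvectors are independent
```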

