
Math Help - Linear Independence/Span Proof

  1. #1
    Member
    Joined
    Apr 2008
    Posts
    191

    Linear Independence/Span Proof

    Suppose that V is a vector space, that X=\{v_1,...,v_n\} \subset V and that w \in span X. Set Y=\{w, v_1,...,v_n\}. If necessary you may denote the field of scalars by F.

    (a) Show that Y is linearly dependent.

    (b) Show that span Y = span X

    This is my attempt for part (a): We know that since w is in the span of X, it is a linear combination of the vectors of X (i.e. w=a_1 v_1 +...+a_n v_n for some scalars a_1,...,a_n).

    Suppose there are scalars \lambda_1, \lambda_2,...,\lambda_m (m=n+1) such that:

    \lambda_1 w + \lambda_2 v_1 +...+ \lambda_m v_n = 0

    Then Y is linearly dependent if such scalars exist with not all of them equal to 0. Now I'm not sure how to prove this.

    Of course we can rewrite it as:

    \lambda_1 (a_1 v_1 +...+a_n v_n) + \lambda_2 v_1 +...+ \lambda_m v_n = 0

    But that doesn't seem to help...

  2. #2
    Super Member
    Joined
    Aug 2009
    From
    Israel
    Posts
    976
    Quote Originally Posted by Roam View Post
    Y=\left\{v_1,v_2,...,v_m,w \right\} is linearly dependent if there exist scalars a_1,a_2,...,a_m,a_n \in F (here n = m+1, and a_n is the coefficient of w), not all 0, such that a_1v_1 + a_2v_2 + ... + a_mv_m + a_nw = 0.

    Recall that w=\beta_1v_1 + \beta_2v_2 + ... + \beta_mv_m for some \beta_1,...,\beta_m \in F. Substituting this for w in the equation above, we get:

    a_1v_1 + a_2v_2 + ... + a_mv_m + a_n\beta_1v_1 + a_n\beta_2v_2 + ... + a_n\beta_mv_m = 0 \Rightarrow  (a_1+a_n\beta_1)v_1 + (a_2 + a_n\beta_2)v_2 + ... + (a_m + a_n\beta_m)v_m = 0.

    So if we choose a_n=1 and a_1 = -\beta_1, a_2 = -\beta_2,...,a_m = -\beta_m, every coefficient in parentheses vanishes, and we have scalars a_1,a_2,...,a_m,a_n \in F, not all 0 (since a_n = 1), such that a_1v_1 + a_2v_2 + ... + a_mv_m + a_nw = 0. Thus Y is linearly dependent.

    Can you follow on the second part from here?
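The explicit choice of coefficients above can be sanity-checked numerically. The vectors and beta values below are made-up assumptions for illustration, not data from the thread:

```python
import numpy as np

# Made-up instance of the construction: m = 3 vectors v_1, v_2, v_3 in R^4,
# and w defined as a linear combination of them with coefficients beta_i.
V = np.array([[1.0, 0.0, 2.0, 1.0],
              [0.0, 1.0, 1.0, 3.0],
              [2.0, 1.0, 0.0, 1.0]])   # rows are v_1, v_2, v_3
beta = np.array([2.0, -1.0, 3.0])
w = beta @ V                           # w = beta_1 v_1 + beta_2 v_2 + beta_3 v_3

# The proof's choice of coefficients: a_n = 1 for w, and a_i = -beta_i.
a = -beta
combo = a @ V + 1.0 * w                # a_1 v_1 + ... + a_m v_m + a_n w

print(np.allclose(combo, 0))          # a nontrivial combination that vanishes
```

Since the coefficient of w is 1 (nonzero by construction), the vanishing combination really is nontrivial, which is exactly the dependence the proof exhibits.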

  3. #3
    MHF Contributor Bruno J.'s Avatar
    Joined
    Jun 2009
    From
    Canada
    Posts
    1,266
    Thanks
    1
    Awards
    1
    Quote Originally Posted by Roam View Post
    There's your part (a)! You proved it but didn't realize.

    Since w=a_1 v_1 +...+a_n v_n, you have 0=-w+a_1 v_1 +...+a_n v_n. The coefficient of w here is -1 \neq 0, so this is a nontrivial linear combination and you're done; Y is linearly dependent!

  4. #4
    Member
    Joined
    Apr 2008
    Posts
    191
    Hi Defunkt! Thanks a lot, I get it now! Btw, it took me a moment to see that you relabelled the set as X=\{v_1,...,v_m\} with n = m+1, so that a_n is the coefficient of w in:
    (a_1+a_n\beta_1)v_1 + (a_2 + a_n\beta_2)v_2 + ... + (a_m + a_n\beta_m)v_m = 0

    Now, I don't know how to prove part (b)...

    But here is my attempt: We can try showing that each element of Y belongs to the span of X, and each element of X belongs to the span of Y.

    Since we know that w \in span X, we also know that w \in Y.

    Therefore span X \subset Y (?)

    Now we can take v_1 \in X, and we know that v_1 belongs to span Y.

    Therefore X \subset span Y (?)

    I'm very confused...
    Last edited by Roam; September 27th 2009 at 08:40 PM.

  5. #5
    Member
    Joined
    Apr 2008
    Posts
    191
    I don't get part (b)

  6. #6
    Super Member
    Joined
    Aug 2009
    From
    Israel
    Posts
    976
    Let x \in span(X). Then x = \alpha_1v_1 + \alpha_2v_2 + ... + \alpha_nv_n where \alpha_1, \alpha_2,...,\alpha_n \in F.

    We want to show that x \in span(Y), i.e. that there exist \beta_1, \beta_2,...,\beta_n,\beta_{n+1} \in F such that x = \beta_1v_1 + \beta_2v_2 + ... + \beta_nv_n + \beta_{n+1}w.

    It can easily be seen that if we let \beta_i = \alpha_i \ \forall 1 \leq i \leq n and \beta_{n+1} = 0, then x \in span(Y). So span(X) \subset span(Y).

    Now let y \in span(Y). Then y = \gamma_1v_1 + ... + \gamma_nv_n + \gamma_{n+1}w where \gamma_1,\gamma_2,...,\gamma_n,\gamma_{n+1} \in F.

    We want to show that there exist \lambda_1,\lambda_2,...,\lambda_n \in F such that y = \lambda_1v_1 + \lambda_2v_2 + ... + \lambda_nv_n. Recall that w \in span(X), so w = \delta_1v_1 + \delta_2v_2 + ... + \delta_nv_n for some \delta_1,...,\delta_n \in F.

    But y = \gamma_1v_1 + ... + \gamma_nv_n + \gamma_{n+1}w = \gamma_1v_1 + ... + \gamma_nv_n + \gamma_{n+1}\delta_1v_1 + \gamma_{n+1}\delta_2v_2 + ... + \gamma_{n+1}\delta_nv_n
     = (\gamma_1+\gamma_{n+1}\delta_1)v_1 + (\gamma_2+\gamma_{n+1}\delta_2)v_2 + ... + (\gamma_n + \gamma_{n+1}\delta_n)v_n

    So if we take \lambda_i = \gamma_i + \gamma_{n+1}\delta_i \ \forall 1 \leq i \leq n, then y = \lambda_1v_1 + \lambda_2v_2 + ... + \lambda_nv_n \Rightarrow y \in span(X).

    So  span(Y) \subset span(X) and  span(X) \subset span(Y) \Rightarrow span(X) = span(Y)
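The two inclusions above can also be checked numerically on a small made-up example (the vectors below are assumptions for illustration): adding a w that already lies in span X leaves the rank, and hence the span, unchanged.

```python
import numpy as np

# Made-up example: X = {v_1, v_2} in R^3, and w a combination of them.
V = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])        # rows are v_1, v_2
w = 3.0 * V[0] - 2.0 * V[1]            # w in span(X)

rank_X = np.linalg.matrix_rank(V)
rank_Y = np.linalg.matrix_rank(np.vstack([V, w]))

# Equal ranks, with the rows of X among the rows of Y, means span(X) = span(Y).
print(rank_X, rank_Y)
```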

  7. #7
    Newbie
    Joined
    Sep 2009
    Posts
    1

    lin dependence

    Alright, so I'm still new to the whole linear algebra thing.

    I'm trying to find an example of a subset S of R^3 that is linearly dependent and has span S = R^3...

    I would have dim = 3, right?

    But what about the vectors in the subset? Can I use something like S = \{1, x, x^2\}?

    What does it mean to be linearly dependent?
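For what it's worth, here is one made-up example of the kind of set asked about (staying with coordinate vectors in R^3 rather than polynomials): four vectors that span R^3 but are linearly dependent, since more than three vectors in R^3 can never be independent.

```python
import numpy as np

# Four vectors in R^3: the standard basis plus e_1 + e_2.
S = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])        # last row = first row + second row

rank = np.linalg.matrix_rank(S)
print(rank == 3)       # spans R^3 (rank equals dim R^3 = 3)
print(rank < len(S))   # fewer independent vectors than elements -> dependent
```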
