
Thread: Linear Independence/Span Proof

  1. #1
    Member
    Joined
    Apr 2008
    Posts
    191

    Linear Independence/Span Proof

Suppose that V is a vector space, that $\displaystyle X=\{v_1,...,v_n\} \subset V$ and that $\displaystyle w \in span X$. Set $\displaystyle Y=\{w, v_1,...,v_n\}$. If necessary you may denote the field of scalars by F.

    (a) Show that $\displaystyle Y$ is linearly dependent.

    (b) Show that $\displaystyle spanY = spanX$

This is my attempt for part (a): We know that since w is in the span of X, it is a linear combination of the vectors of X (i.e. $\displaystyle w=a_1 v_1 +...+a_n v_n$ for some scalars $\displaystyle a_1,...,a_n$).

Suppose there are scalars $\displaystyle \lambda_1, \lambda_2,...,\lambda_m \in F$ (where m=n+1) such that:

$\displaystyle \lambda_1 w+ \lambda_2 v_1 + ... + \lambda_m v_n = 0$

Then $\displaystyle Y$ is linearly dependent if such scalars exist with not all of the $\displaystyle \lambda_i$ equal to 0. Now I'm not sure how to prove this.

    Of course we can rewrite it as:

$\displaystyle \lambda_1 (a_1 v_1 +...+a_n v_n)+ \lambda_2 v_1 + ... + \lambda_m v_n = 0$

    But that doesn't seem to help...

  2. #2
    Super Member
    Joined
    Aug 2009
    From
    Israel
    Posts
    976
$\displaystyle Y=\left\{w, v_1,v_2,...,v_n \right\}$ is linearly dependent if there exist some scalars $\displaystyle a_1,a_2,...,a_n,a_{n+1} \in F$, not all 0, such that $\displaystyle a_1v_1 + a_2v_2 + ... + a_nv_n + a_{n+1}w = 0$.

Recall that $\displaystyle w=\beta_1v_1 + \beta_2v_2 + ... + \beta_nv_n$ for some $\displaystyle \beta_1,...,\beta_n \in F$, so we can exhibit such scalars directly.

Substituting this expression for $\displaystyle w$ into the equation, we get:

$\displaystyle a_1v_1 + a_2v_2 + ... + a_nv_n + a_{n+1}\beta_1v_1 + a_{n+1}\beta_2v_2 + ... + a_{n+1}\beta_nv_n = 0 \Rightarrow$ $\displaystyle (a_1+a_{n+1}\beta_1)v_1 + (a_2 + a_{n+1}\beta_2)v_2 + ... + (a_n + a_{n+1}\beta_n)v_n = 0$.

So if we choose, say, $\displaystyle a_{n+1}=1$, then we'll choose $\displaystyle a_1 = -\beta_1, a_2 = -\beta_2,...,a_n = -\beta_n$, and now we have $\displaystyle a_1,a_2,...,a_{n+1} \in F$, not all 0 (since $\displaystyle a_{n+1}=1$), such that $\displaystyle a_1v_1 + a_2v_2 + ... + a_nv_n + a_{n+1}w = 0$, and thus $\displaystyle Y$ is linearly dependent.
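
For a quick sanity check with concrete numbers (my own example, not part of the problem): take $\displaystyle V=\mathbb{R}^2$, $\displaystyle X=\{v_1,v_2\}$ with $\displaystyle v_1=(1,0)$, $\displaystyle v_2=(0,1)$, and $\displaystyle w=2v_1+3v_2=(2,3)$. The recipe above gives $\displaystyle a_3=1$, $\displaystyle a_1=-2$, $\displaystyle a_2=-3$, and indeed $\displaystyle -2v_1 - 3v_2 + w = (0,0)$, a linear combination with coefficients not all 0.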

    Can you follow on the second part from here?

  3. #3
    MHF Contributor Bruno J.'s Avatar
    Joined
    Jun 2009
    From
    Canada
    Posts
    1,266
    Thanks
    1
    Awards
    1
There's your part (a)! You proved it but didn't realize it.

Since $\displaystyle w=a_1 v_1 +...+a_n v_n$ you have $\displaystyle 0=-w+a_1 v_1 +...+a_n v_n$, and since the coefficient of $\displaystyle w$ in this combination is $\displaystyle -1 \neq 0$, the combination is nontrivial and you're done; $\displaystyle Y$ is linearly dependent!

  4. #4
    Member
    Joined
    Apr 2008
    Posts
    191
Hi Defunkt! Thanks a lot, I get it now!

    Now, I don't know how to prove part (b)...

But here is my attempt: we can try showing that every element of $\displaystyle Y$ belongs to the span of $\displaystyle X$, and every element of $\displaystyle X$ belongs to the span of $\displaystyle Y$.

    Since we know that $\displaystyle w \in spanX$, we also know that $\displaystyle w \in Y$.

    Therefore $\displaystyle spanX \subset Y$ (?)

Now we can take $\displaystyle v_1 \in X$, and we know that $\displaystyle v_1$ belongs to $\displaystyle spanY$,

    Therefore $\displaystyle X \subset spanY$ (?)

    I'm very confused...
    Last edited by Roam; Sep 27th 2009 at 07:40 PM.

  5. #5
    Member
    Joined
    Apr 2008
    Posts
    191
    I don't get part (b)

  6. #6
    Super Member
    Joined
    Aug 2009
    From
    Israel
    Posts
    976
Let $\displaystyle x \in span(X)$; then $\displaystyle x = \alpha_1v_1 + \alpha_2v_2 + ... + \alpha_nv_n$ where $\displaystyle \alpha_1, \alpha_2,...,\alpha_n \in F$.

We want to show that $\displaystyle x \in span(Y)$, i.e. that there exist $\displaystyle \beta_1, \beta_2,...,\beta_n,\beta_{n+1} \in F$ such that $\displaystyle x = \beta_1v_1 + \beta_2v_2 + ... + \beta_nv_n + \beta_{n+1}w$.

    It can easily be seen that if we let $\displaystyle \beta_i = \alpha_i \ \forall 1 \leq i \leq n, \ \beta_{n+1} = 0$ then $\displaystyle x \in span(Y)$. So $\displaystyle span(X) \subset span(Y)$

    Now let $\displaystyle y \in span(Y)$. Then $\displaystyle y = \gamma_1v_1 + ... + \gamma_nv_n + \gamma_{n+1}w$ where $\displaystyle \gamma_1,\gamma_2,...,\gamma_n,\gamma_{n+1} \in F$.

    We want to show that there exist $\displaystyle \lambda_1,\lambda_2,...,\lambda_n \in F$ such that $\displaystyle y = \lambda_1v_1 + \lambda_2v_2 + ... + \lambda_nv_n$.

But $\displaystyle w \in span(X)$, so $\displaystyle w = \delta_1v_1 + \delta_2v_2 + ... + \delta_nv_n$ for some $\displaystyle \delta_1,\delta_2,...,\delta_n \in F$. Then $\displaystyle y = \gamma_1v_1 + ... + \gamma_nv_n + \gamma_{n+1}w = \gamma_1v_1 + ... + \gamma_nv_n + \gamma_{n+1}\delta_1v_1 + \gamma_{n+1}\delta_2v_2 + ... + \gamma_{n+1}\delta_nv_n$
$\displaystyle = (\gamma_1+\gamma_{n+1}\delta_1)v_1 + (\gamma_2+\gamma_{n+1}\delta_2)v_2 + ... + (\gamma_n + \gamma_{n+1}\delta_n)v_n$

    So if we take $\displaystyle \lambda_i = \gamma_i + \gamma_{n+1}\delta_i \ \forall 1 \leq i \leq n$, then $\displaystyle y = \lambda_1v_1 + \lambda_2v_2 + ... + \lambda_nv_n \Rightarrow y \in span(X)$

    So $\displaystyle span(Y) \subset span(X)$ and $\displaystyle span(X) \subset span(Y) \Rightarrow span(X) = span(Y)$
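
To see part (b) in the same toy example as before (again my own numbers, just for illustration): with $\displaystyle v_1=(1,0)$, $\displaystyle v_2=(0,1)$ and $\displaystyle w=2v_1+3v_2$, any $\displaystyle y = \gamma_1v_1+\gamma_2v_2+\gamma_3w$ collapses to $\displaystyle (\gamma_1+2\gamma_3)v_1+(\gamma_2+3\gamma_3)v_2 \in span(X)$, so adjoining $\displaystyle w$ adds nothing new to the span.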

  7. #7
    Newbie
    Joined
    Sep 2009
    Posts
    1

Linear dependence

Alright, so I'm still new to the whole linear algebra thing, working with $\displaystyle span S = \mathbb{R}^3$.

I'm trying to find an example of a subset S of $\displaystyle \mathbb{R}^3$ that is linearly dependent and satisfies $\displaystyle span S = \mathbb{R}^3$...

I would have dim = 3, right?

But what about the elements of the subset? Can I use something like $\displaystyle S = \{1, x, x^2\}$?

What does it mean to be linearly dependent?
