Thread: Proving span and linear independence of a matrix.

1. Proving span and linear independence of a matrix.

Ok, I have two very similar questions for you guys:

1) Prove that if (v1, . . ., vn) spans R^n, then so does the list (v1 - v2, v2 - v3, . . ., v(n-1) - vn, vn).

2) Prove that if (v1, . . ., vn) is linearly independent in R^n, then so is the list (v1 - v2, v2 - v3, . . ., v(n-1) - vn, vn).

Proofs are new to me this year, and I seem to be struggling with a lot of them so far.

Any thoughts on this one?

Any help is very much appreciated

2. Originally Posted by joe909
Ok, I have two very similar questions for you guys:

1) Prove that if (v1, . . ., vn) spans R^n, then so does the list (v1 - v2, v2 - v3, . . ., v(n-1) - vn, vn).

2) Prove that if (v1, . . ., vn) is linearly independent in R^n, then so is the list (v1 - v2, v2 - v3, . . ., v(n-1) - vn, vn).

Proofs are new to me this year, and I seem to be struggling with a lot of them so far.

Any thoughts on this one?

Any help is very much appreciated

In this case both questions are one and the same IF you already know that dim IR^n = n and that any n lin. independent vectors in IR^n form a basis.

Now we know that {v_1,...,v_n} are lin. ind. This means that the linear combination a_1*v_1 + ... + a_n*v_n = 0 is possible iff a_1 = ... = a_n = 0, where the a_i are scalars in IR (real numbers).

Well, USING THIS, you have to prove that if b_1*(v_1 - v_2) + ... + b_(n-1)*(v_(n-1) - v_n) + b_n*v_n = 0 then IT MUST BE that the scalars b_1 = ... = b_n = 0.
Hint: expand the parentheses, collect the coefficient of each v_i, and USE THE FIRST PART...
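A sketch of where the hint leads (my own expansion, collecting the coefficient of each v_i):

```latex
b_1(v_1 - v_2) + b_2(v_2 - v_3) + \dots + b_{n-1}(v_{n-1} - v_n) + b_n v_n
  = b_1 v_1 + (b_2 - b_1)v_2 + \dots + (b_n - b_{n-1})v_n .
```

If this equals 0, linear independence of (v_1, ..., v_n) forces b_1 = 0, then b_2 - b_1 = 0, ..., then b_n - b_{n-1} = 0, so b_1 = ... = b_n = 0 one index at a time.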

Tonio

3. Originally Posted by joe909
Ok, I have two very similar questions for you guys:

1) Prove that if (v1, . . ., vn) spans R^n, then so does the list (v1 - v2, v2 - v3, . . ., v(n-1) - vn, vn).

2) Prove that if (v1, . . ., vn) is linearly independent in R^n, then so is the list (v1 - v2, v2 - v3, . . ., v(n-1) - vn, vn).

Proofs are new to me this year, and I seem to be struggling with a lot of them so far.

Any thoughts on this one?

Any help is very much appreciated

For #1, recall that the condition for a set $\displaystyle A = \{v_1,v_2,...,v_n\}$ to span some space V (over a field F) is that $\displaystyle \forall v \in V$, there exist some scalars $\displaystyle \alpha_1,\alpha_2,...,\alpha_n \in F$ such that $\displaystyle v = \alpha_1v_1 + \alpha_2v_2 + ... + \alpha_nv_n$. That is to say, any element of V can be expressed as a linear combination of the set's elements.

In this question we are given that this is correct for the set $\displaystyle B=\{v_1,v_2,...,v_n\}$ and we need to prove that it is also correct for $\displaystyle B' = \{v_1-v_2,v_2-v_3,...,v_{n-1}-v_n,v_n\}$.

Since it is correct for B, for any $\displaystyle v\in V$ there exist $\displaystyle \alpha_1,\alpha_2,...,\alpha_n \in F$ such that $\displaystyle v=\alpha_1v_1 + \alpha_2v_2 + ... + \alpha_nv_n$. We want to find scalars $\displaystyle \beta_1,\beta_2,...,\beta_n \in F$ such that $\displaystyle v = \beta_1(v_1-v_2) + \beta_2(v_2-v_3) + ... + \beta_{n-1}(v_{n-1}-v_n) + \beta_nv_n$. Try to see what happens if we let
$\displaystyle \beta_1 = \alpha_1$
$\displaystyle \beta_2 = \alpha_2 - \beta_1$
.
.
$\displaystyle \beta_i = \alpha_i - \beta_{i-1} \ \forall 2 \leq i \leq n$

Tonio: I think he has not reached the part of basis yet - this is why he is required to prove both claims... also, it is good practice to do so since he is only now starting with proofs!

4. I'd like to thank you a lot for your help. I'll show you what I have, and if you could let me know whether I'm on the right track, that would be great.

So I took your suggestion and I end up with the following:

let alpha = a

v = a1v1 + (a2 - 2a1)(v2) + (a3 - 2a2 - 2a1)(v3)....

I'm not really sure how this qualifies as a proof though. I see that all the a values are scalars, so we combine them into one variable value; is that enough to say that the span is R^n?

Also, I'm not sure how we can make the assumption that b1 = a1.

I'm sure these are probably really bad questions, but I'm having a lot of trouble getting this for some reason.

5. Originally Posted by joe909
I'd like to thank you a lot for your help. I'll show you what I have, and if you could let me know whether I'm on the right track, that would be great.

So I took your suggestion and I end up with the following:

let alpha = a

v = a1v1 + (a2 - 2a1)(v2) + (a3 - 2a2 - 2a1)(v3)....

I'm not really sure how this qualifies as a proof though. I see that all the a values are scalars, so we combine them into one variable value; is that enough to say that the span is R^n?

Also, I'm not sure how we can make the assumption that b1 = a1.

I'm sure these are probably really bad questions, but I'm having a lot of trouble getting this for some reason.

I don't really understand what you did in the first part...

Perhaps you misunderstood what I was saying. In order to prove that $\displaystyle B' = \{v_1-v_2, v_2-v_3,...,v_{n-1}-v_n,v_n\}$ spans $\displaystyle \mathbb{R}^n$, you need to show that for any v in $\displaystyle \mathbb{R}^n$ there exist scalars (it is up to you to find them) $\displaystyle \beta_1,\beta_2,...,\beta_{n-1},\beta_n \in F$ such that $\displaystyle v = \beta_1(v_1-v_2) + \beta_2(v_2-v_3) + ... + \beta_{n-1}(v_{n-1}-v_n) + \beta_nv_n$.

That is to say, any element in V can be expressed as a linear combination of the elements of B'. So the proof is done by finding those coefficients ($\displaystyle \beta_1,...,\beta_n$). We don't assume that $\displaystyle \beta_1=\alpha_1$ - we just let $\displaystyle \beta_1=\alpha_1$. Surely we can do that.

Now we simply need to show that those specific coefficients satisfy the condition we want: that for every $\displaystyle v \in \mathbb{R}^n$, given that $\displaystyle v = \alpha_1v_1 + \alpha_2v_2 + ... + \alpha_nv_n$, this statement holds: $\displaystyle \beta_1(v_1-v_2) + \beta_2(v_2-v_3) + ... + \beta_{n-1}(v_{n-1}-v_n)+\beta_nv_n = v$.

Now, if we let:
$\displaystyle \beta_1=\alpha_1$
$\displaystyle \beta_i = \alpha_i + \beta_{i-1}$ for all i between 2 and n (this was a mistake in my first post -- there should've been a plus sign here, not a minus)

And substitute them into $\displaystyle \beta_1(v_1-v_2) + \beta_2(v_2-v_3) + ... + \beta_{n-1}(v_{n-1}-v_n)+\beta_nv_n$, we get:

$\displaystyle \beta_1(v_1-v_2) + \beta_2(v_2-v_3) + ... + \beta_{n-1}(v_{n-1}-v_n)+\beta_nv_n = \alpha_1(v_1-v_2) + (\alpha_2+\alpha_1)(v_2-v_3) + ... + (\sum_{i=1}^{n-1}\alpha_i)(v_{n-1}-v_n) + (\sum_{i=1}^{n}\alpha_i)v_n$ (*)

I'll leave it to you to show that (*) = $\displaystyle \alpha_1v_1 + \alpha_2v_2 + ... + \alpha_nv_n$ and with that the proof will be done.
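If it helps to see the telescoping numerically, here is a quick sanity check (my own sketch, not part of the proof; plain Python with arbitrary made-up vectors and scalars) that the partial-sum coefficients reproduce v:

```python
import random

# Check: with beta_i = alpha_1 + ... + alpha_i (the partial sums from the
# corrected recursion beta_i = alpha_i + beta_{i-1}), the combination
# sum_i beta_i*(v_i - v_{i+1}) + beta_n*v_n equals sum_i alpha_i*v_i.

def combine(coeffs, vecs):
    """Linear combination of equal-length vectors."""
    out = [0.0] * len(vecs[0])
    for c, v in zip(coeffs, vecs):
        for k in range(len(out)):
            out[k] += c * v[k]
    return out

random.seed(0)
n = 5
vs = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
alphas = [random.uniform(-1, 1) for _ in range(n)]

# Partial sums: beta_i = alpha_1 + ... + alpha_i
betas, s = [], 0.0
for a in alphas:
    s += a
    betas.append(s)

# The differenced list: v1-v2, v2-v3, ..., v_{n-1}-v_n, and finally v_n
diffs = [[vs[i][k] - vs[i + 1][k] for k in range(n)] for i in range(n - 1)]
diffs.append(vs[-1])

lhs = combine(betas, diffs)
rhs = combine(alphas, vs)
print(all(abs(l - r) < 1e-9 for l, r in zip(lhs, rhs)))  # True
```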

6. Thank you very much. Not only was I able to finish the proof, but I now feel I better understand what I was trying to prove and why the proof works. The tip about the coefficients helped a lot.

I really appreciate you spending the time to help with me this problem.

7. Sorry, I'm new to this topic as well.

May I know how you know that beta_i can be expressed in terms of the alphas, and that it is

beta_i = alpha_i + beta_(i-1)?

Can it be expressed in any form of the alphas, or only like that?

Thanks!

8. Originally Posted by alexandrabel90
Sorry, I'm new to this topic as well.

May I know how you know that beta_i can be expressed in terms of the alphas, and that it is

beta_i = alpha_i + beta_(i-1)?

Can it be expressed in any form of the alphas, or only like that?

Thanks!
Well, you can simply write $\displaystyle \beta_i = \sum_{k=1}^{i} \alpha_k$

To prove that $\displaystyle \{v_1-v_2,...,v_{n-1}-v_n,v_n\}$ spans $\displaystyle \mathbb{R}^n$, we need to show the existence of those $\displaystyle \beta_i$s -- that is, we need to find them and show that they satisfy our condition. In other words, we want to find coefficients $\displaystyle \beta_1,\beta_2,...,\beta_n$ that solve this equation: $\displaystyle \beta_1(v_1-v_2) + \beta_2(v_2-v_3) + ... + \beta_{n-1}(v_{n-1}-v_n)+\beta_nv_n = \alpha_1v_1 + \alpha_2v_2+...+\alpha_nv_n$ (**)

My thought process on choosing those specific coefficients was this: Consider the fact that $\displaystyle v_1$ is only in one of the elements of $\displaystyle B'$, and its coefficient in the RHS of (**) is $\displaystyle \alpha_1$, so that will be the coefficient of its one element in the LHS as well. That's why $\displaystyle \beta_1=\alpha_1$. Now consider $\displaystyle v_2$ -- its coefficient is now $\displaystyle (-\alpha_1)$ and we want to complete it to $\displaystyle \alpha_2$ with our remaining coefficient. So we will let $\displaystyle \beta_2 = \alpha_2+\alpha_1$ ==> the coefficient of $\displaystyle v_2$ is $\displaystyle \alpha_2+\alpha_1-\alpha_1 = \alpha_2$ just like we wanted. The same process continues for the rest of the elements of B'.
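The equivalence between the recurrence and the partial-sum formula can also be checked mechanically (a small illustration with made-up numbers of my own):

```python
alphas = [3.0, -1.0, 4.0, 1.5]

# Recurrence from the thread: beta_1 = alpha_1, beta_i = alpha_i + beta_{i-1}
betas_rec, prev = [], 0.0
for a in alphas:
    prev = a + prev
    betas_rec.append(prev)

# Closed form from this post: beta_i = alpha_1 + ... + alpha_i
betas_sum = [sum(alphas[:i + 1]) for i in range(len(alphas))]

print(betas_rec == betas_sum)  # True
```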

9. 2) Prove that if (v1, . . ., vn) is linearly independent in R^n, then so is the list (v1 - v2, v2 - v3, . . ., v(n-1) - vn, vn).

For the second question, is it enough to just say that
0 = a1v1 + a2v2 + ... + anvn only when a1 = 0, a2 = 0, ..., an = 0.

And we know we can make
b1(v1 - v2) + b2(v2 - v3) + ... + b(n-1)(v(n-1) - vn) + bnvn
= a1v1 + a2v2 + ... + anvn
by letting
b1 = a1
b2 = a2 + b1
etc.
So therefore (v1 - v2, v2 - v3, . . ., v(n-1) - vn, vn) is also linearly independent?
Isn't this kind of cheating though, as we are effectively just letting all the b values = 0? Or is this enough for the proof, or am I possibly on the completely wrong track?

10. Originally Posted by joe909
2) Prove that if (v1, . . ., vn) is linearly independent in R^n, then so is the list (v1 - v2, v2 - v3, . . ., v(n-1) - vn, vn).

For the second question, is it enough to just say that
0 = a1v1 + a2v2 + ... + anvn only when a1 = 0, a2 = 0, ..., an = 0.

And we know we can make
b1(v1 - v2) + b2(v2 - v3) + ... + b(n-1)(v(n-1) - vn) + bnvn
= a1v1 + a2v2 + ... + anvn
by letting
b1 = a1
b2 = a2 + b1
etc.
So therefore (v1 - v2, v2 - v3, . . ., v(n-1) - vn, vn) is also linearly independent?
Isn't this kind of cheating though, as we are effectively just letting all the b values = 0? Or is this enough for the proof, or am I possibly on the completely wrong track?
Hi - I believe you are correct in your approach but wrong in your argument. You demonstrated one set of values of b1, b2, ..., bn which works, but there might be other values. For example:
0*v1 + 0*(2v1) = 0, but are v1 and 2v1 independent?
2*v1 + (-1)*(2v1) = 0 also works.
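The point above can be replayed numerically (my own toy numbers for the v1, 2v1 example):

```python
# For the dependent pair (v1, 2*v1), the zero vector has a trivial
# representation AND a nontrivial one -- so exhibiting the trivial
# combination alone proves nothing about independence.

v1 = [1.0, 2.0, 3.0]
v2 = [2.0 * x for x in v1]  # v2 = 2*v1, so the pair is dependent

def lin_comb(a, b):
    """Return a*v1 + b*v2, componentwise."""
    return [a * x + b * y for x, y in zip(v1, v2)]

trivial = lin_comb(0.0, 0.0)      # 0*v1 + 0*v2 = 0
nontrivial = lin_comb(2.0, -1.0)  # 2*v1 + (-1)*(2*v1) = 0 as well

print(trivial == [0.0, 0.0, 0.0], nontrivial == [0.0, 0.0, 0.0])  # True True
```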

11. Originally Posted by aman_cc
Hi - I believe you are correct in your approach but wrong in your argument. You demonstrated one set of values of b1, b2, ..., bn which works, but there might be other values. For example:
0*v1 + 0*(2v1) = 0, but are v1 and 2v1 independent?
2*v1 + (-1)*(2v1) = 0 also works.
So is this what you are saying:

since we know (v1, . . ., vn) is linearly independent, we know that
0 = a1v1 + a2v2 + ... + anvn only when a1 = 0, a2 = 0, ..., an = 0, because in order for the list to be linearly independent all the scalars must be 0, correct?

And from the first part we proved that we can convert
b1(v1 - v2) + b2(v2 - v3) + ... + b(n-1)(v(n-1) - vn) + bnvn
into a1v1 + a2v2 + ... + anvn by letting b1 = a1,
b2 = a2 + b1, etc.

However, you are saying that I need to prove that there aren't other possible beta values, other than b1 = a1, b2 = a2 + b1, etc., that will add up to 0? That makes sense to me; however, I'm not sure how I would even begin going about doing that. Any tips or suggestions?

12. Originally Posted by joe909
So is this what you are saying:

since we know (v1, . . ., vn) is linearly independent, we know that
0 = a1v1 + a2v2 + ... + anvn only when a1 = 0, a2 = 0, ..., an = 0, because in order for the list to be linearly independent all the scalars must be 0, correct?

And from the first part we proved that we can convert
b1(v1 - v2) + b2(v2 - v3) + ... + b(n-1)(v(n-1) - vn) + bnvn
into a1v1 + a2v2 + ... + anvn by letting b1 = a1,
b2 = a2 + b1, etc.

However, you are saying that I need to prove that there aren't other possible beta values, other than b1 = a1, b2 = a2 + b1, etc., that will add up to 0? That makes sense to me; however, I'm not sure how I would even begin going about doing that. Any tips or suggestions?
Hint - You need to use the fact that v1, v2, ..., vn are independent!
I would also structure your problem better/more clearly.
Known fact: a1v1 + ... + anvn = 0 => a1, a2, ..., an = 0
To prove:
b1(v1 - v2) + b2(v2 - v3) + ... + b(n-1)(v(n-1) - vn) + bnvn = 0 => b1, b2, ..., bn = 0

If you read it this way, the formulation is equivalent to what you are doing.
You have actually done it pretty much. It's trivial.