If u, v, and w are vectors that are linearly independent, will u - v, v - w, and u - w be linearly independent?


- September 12th 2007, 09:59 PM, lord12 (linear algebra)
If u, v, and w are vectors that are linearly independent, will u - v, v - w, and u - w be linearly independent?

- September 12th 2007, 10:22 PM, tukeywilliams
So you are given u, v, and w linearly independent.

Then you want to show whether a(u - v) + b(v - w) + c(u - w) = 0 implies a = b = c = 0 (true or false).

Expand out: (a + c)u + (b - a)v + (-b - c)w = 0.

Then, by the independence of u, v, and w: a + c = 0, b - a = 0, and -b - c = 0.

But that doesn't imply that a = b = c = 0. Only that a = b = t and c = -t, where t is any scalar. So they are linearly dependent.

- September 12th 2007, 10:38 PM, tukeywilliams

Linearly dependent.

- September 12th 2007, 10:58 PM, Jhevon
I believe they are dependent if the determinant of the matrix formed from the vectors is zero (the Wronskian is the analogous test for functions).

recall that a set of two or more vectors is linearly dependent if and only if at least one of the vectors in the set can be expressed as a linear combination of finitely many other vectors in the set.

now, note that:

u - w = (u - v) + (v - w)

and

u - v = (u - w) - (v - w), v - w = (u - w) - (u - v)

so we see that each of the vectors can be expressed as a linear combination of the other two; thus the vectors u - v, v - w, and u - w form a linearly dependent set of vectors.

- September 12th 2007, 11:04 PM, tukeywilliams
Yeah, I did it a different way and got that they were linearly dependent as well (my edit). You are correct; I forgot to edit the last sentence.

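The dependence relation derived above can be sanity-checked numerically. Here is a minimal sketch in Python, assuming one concrete choice of linearly independent u, v, w (the standard basis of R^3); the vector names and the `comb` helper are illustrative, not from the thread:

```python
# Assumption: take u, v, w to be the standard basis of R^3,
# one concrete example of a linearly independent triple.
u, v, w = (1, 0, 0), (0, 1, 0), (0, 0, 1)

def comb(a, b, c):
    # a(u - v) + b(v - w) + c(u - w), computed componentwise
    return tuple(a * (ui - vi) + b * (vi - wi) + c * (ui - wi)
                 for ui, vi, wi in zip(u, v, w))

# The nontrivial solution a = b = 1, c = -1 found above:
print(comb(1, 1, -1))  # (0, 0, 0): a nonzero combination vanishes
```

Since a nonzero choice of coefficients produces the zero vector, the three differences are linearly dependent.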
- September 12th 2007, 11:07 PM, Jhevon
- September 12th 2007, 11:10 PM, tukeywilliams
That was a different problem. My solution to the first problem is in post #2.

- September 12th 2007, 11:12 PM, Jhevon
- September 12th 2007, 11:13 PM, tukeywilliams
Yeah, I think he did.

- September 12th 2007, 11:17 PM, lord12
Find the coefficients of the linear combination.

- September 12th 2007, 11:40 PM, Jhevon
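One way to exhibit the coefficients asked for is to solve the system from post #2; a short sketch (the `coefficients` helper is illustrative, and the one-parameter form of the solution follows from that system):

```python
# The system from post #2 is a + c = 0, b - a = 0, -b - c = 0,
# whose general solution is (a, b, c) = (t, t, -t) for any scalar t.
def coefficients(t):
    # coefficients of u - v, v - w, u - w respectively
    return (t, t, -t)

a, b, c = coefficients(1)
# check all three equations from the expansion
assert a + c == 0 and b - a == 0 and -b - c == 0
print((a, b, c))  # (1, 1, -1), i.e. (u - v) + (v - w) - (u - w) = 0
```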
- September 12th 2007, 11:55 PM, tukeywilliams
In post #3, the determinant is 0, which implies that those vectors are linearly dependent.

He's asking how you would find the coefficients.

- September 12th 2007, 11:56 PM, Jhevon
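The determinant test mentioned in the thread can be sketched as follows: write u - v, v - w, and u - w in coordinates relative to the independent set {u, v, w}, and check whether the resulting 3x3 matrix is singular (the `det3` helper is an illustrative cofactor expansion, not from the thread):

```python
# Coordinates of u - v, v - w, u - w with respect to {u, v, w}
rows = [
    [1, -1, 0],   # u - v
    [0, 1, -1],   # v - w
    [1, 0, -1],   # u - w
]

def det3(m):
    # cofactor expansion along the first row of a 3x3 matrix
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

print(det3(rows))  # 0, so the vectors are linearly dependent
```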
- September 12th 2007, 11:58 PM, lord12
the linear combination of the values in the matrix

- September 13th 2007, 12:02 AM, Jhevon