# Linearly independent set.

• Apr 22nd 2009, 11:08 PM
scorpion007
Linearly independent set.
Given $\vec{x_1} = (1, 2, -1,0),\vec{x_2} = (15 , 1, 0, 3), \vec{x_3} = (-2, 13, 3, -2)$. Find a vector $\vec{x_4}$ such that $A=\lbrace \vec{x_1}, \vec{x_2}, \vec{x_3},\vec{x_4}\rbrace$ is a linearly independent set.

Response:
Let $\vec{x_4}=(a_1,a_2,a_3,a_4)$ for arbitrary numbers $a_1, ..., a_4$.

Then for $A$ to be linearly independent, $c_1\vec{x_1}+c_2\vec{x_2}+c_3\vec{x_3}+c_4\vec{x_4}=\vec{0}$ must imply $c_1=c_2=c_3=c_4=0$.

Therefore we have a homogeneous linear system in $c_1, \dots, c_4$, whose coefficient matrix is:
$\left( \begin{array}{cccc}
1&15&-2&a_1\\
2&1&13&a_2\\
-1&0&3&a_3\\
0&3&-2&a_4 \end{array}\right)
$

My question(s):

Do I really have to reduce it to reduced echelon form? It gets really messy!

Is there an easy way to solve this? (Easier than my approach?)
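(Not part of the original thread: a quick numerical sanity check of the setup above, sketched in Python and assuming numpy is available. It confirms the three given vectors already have rank 3, so a suitable $\vec{x_4}$ exists, and it shows that independence of a candidate set of 4 vectors is equivalent to the $4\times 4$ matrix having rank 4.)

```python
import numpy as np

# The three given vectors.
x1 = np.array([1, 2, -1, 0])
x2 = np.array([15, 1, 0, 3])
x3 = np.array([-2, 13, 3, -2])

# Stack them as columns of a 4x3 matrix; rank 3 means they are
# already linearly independent, so a suitable x4 exists.
A = np.column_stack([x1, x2, x3])
print(np.linalg.matrix_rank(A))  # 3

# Independence of {x1, x2, x3, x4} is equivalent to the 4x4 matrix
# [x1 x2 x3 x4] having rank 4 (i.e. nonzero determinant).
x4 = np.array([0, 0, 0, 1])  # just one candidate to try
M = np.column_stack([x1, x2, x3, x4])
print(np.linalg.matrix_rank(M))  # 4 -> linearly independent
```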
• Apr 22nd 2009, 11:18 PM
Gamma
Generalized Cross Product
Cross product - Wikipedia, the free encyclopedia

You can look in the section on generalizations of the cross product into higher dimensions. This will give you a vector that is perpendicular to the subspace spanned by the first 3 vectors. If they are linearly independent, the collection of all 4 should be as well.
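(A sketch of this suggestion, not from the thread itself, assuming numpy: component $i$ of the generalized cross product is, up to sign, the determinant of the $3\times 3$ minor obtained by deleting row $i$ of the $4\times 3$ matrix of the given vectors. The resulting vector is orthogonal to all three inputs.)

```python
import numpy as np

x1 = np.array([1, 2, -1, 0])
x2 = np.array([15, 1, 0, 3])
x3 = np.array([-2, 13, 3, -2])

# 4x3 matrix with the given vectors as columns.
A = np.column_stack([x1, x2, x3])

# Generalized cross product in R^4: component i is (-1)^i times the
# determinant of the 3x3 minor obtained by deleting row i.
v = np.array([(-1) ** i * np.linalg.det(np.delete(A, i, axis=0))
              for i in range(4)])

# v is orthogonal to x1, x2, x3, so {x1, x2, x3, v} is independent
# (provided x1, x2, x3 are).
print(np.round(v).astype(int))  # [-59  33   7 284]
print(v @ x1, v @ x2, v @ x3)   # all ~0
```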

Alternatively, guess and check works: just pick 4 numbers that give you a nonzero determinant, which shows your vectors are linearly independent.

Otherwise, yeah, that is probably the best way to do it.
• Apr 22nd 2009, 11:48 PM
scorpion007
Quote:

Originally Posted by Gamma
Otherwise, yeah, that is probably the best way to do it.

What is "that" referring to? The cross product, or reducing the matrix?

If it is the latter, I am actually having trouble reducing it with arbitrary variables in there.

Here is another (simpler) matrix exhibiting the same problem for me: How does one reduce it to reduced echelon form?

$\left( \begin{array}{ccc}1&4&-2\\-1&2&b\\2&3&1\\2&1&a \end{array} \right)$

for arbitrary a and b.

The problem for me is when I get to the third leading variable, whose column contains entries involving $a$ and $b$.

(In this case though, I need to find $a, b$ such that the system has many solutions, or equivalently, such that the vectors are linearly dependent.)
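(Not from the thread, but for this smaller matrix one illustrative pair can be worked out by hand: writing column 3 as $s \cdot (\text{column 1}) + t \cdot (\text{column 2})$, rows 1 and 3 force $s + 4t = -2$ and $2s + 3t = 1$, giving $s = 2$, $t = -1$, hence $b = -4$ and $a = 3$. A Python sketch, assuming numpy, confirming the rank drops for that choice:)

```python
import numpy as np

def mat(a, b):
    # The 4x3 matrix from the post, with the arbitrary entries filled in.
    return np.array([[ 1, 4, -2],
                     [-1, 2,  b],
                     [ 2, 3,  1],
                     [ 2, 1,  a]])

# A generic choice of (a, b): the three columns are independent (rank 3).
print(np.linalg.matrix_rank(mat(0, 0)))   # 3

# a = 3, b = -4 makes column 3 = 2*(column 1) - (column 2),
# so the columns become dependent (rank 2).
print(np.linalg.matrix_rank(mat(3, -4)))  # 2
```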
• Apr 23rd 2009, 01:24 AM
Gamma
Easy way
Okay, I thought about this a little more, and I think the easiest way to do it is as follows.

Basically you just need a vector that will make the set linearly independent. The way to check whether a set of 4 vectors in $\mathbb{R}^4$ is linearly independent is to take the determinant of the 4x4 matrix whose columns are those vectors: if it is nonzero, your set is linearly independent.

Here is my suggestion: take the 3 vectors given to you, line them up as the columns of a 4x3 matrix, and start taking determinants of the 3x3 minors. In other words, delete one row at a time and check whether the determinant of what remains is 0. At least one of these had better be nonzero (in fact I think all of them are), otherwise the three vectors themselves are linearly dependent and the problem is impossible.

I checked the first 3x3 minor, i.e. deleted the 4th row, and it had nonzero determinant. So if you pick your 4th vector to be $\langle 0,0,0,1 \rangle$, the cofactor expansion down that column shows the determinant of the 4x4 matrix equals the determinant of that 3x3 minor (the cofactor sign at position (4,4) is $+$, so they even agree in sign). In particular it is nonzero, so you are done.
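(A numerical version of this argument, not part of the thread, assuming numpy: check the minor with row 4 deleted, then append $\langle 0,0,0,1 \rangle$ and verify the 4x4 determinant matches it.)

```python
import numpy as np

x1 = np.array([1, 2, -1, 0])
x2 = np.array([15, 1, 0, 3])
x3 = np.array([-2, 13, 3, -2])

A = np.column_stack([x1, x2, x3])

# The 3x3 minor with row 4 deleted (the one checked above).
minor = np.delete(A, 3, axis=0)
print(round(np.linalg.det(minor)))  # -284, nonzero

# Append <0,0,0,1> as the 4th column; cofactor expansion down that
# column shows the 4x4 determinant equals that same minor.
M = np.column_stack([x1, x2, x3, [0, 0, 0, 1]])
print(round(np.linalg.det(M)))      # -284, nonzero -> independent
```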
• Apr 23rd 2009, 01:54 AM
scorpion007
Thanks for the response. It doesn't immediately ring a bell with me, but I'll re-read it a few more times and think about it some more.
• Apr 23rd 2009, 09:36 AM
Gamma
Here
All right, for your first one here is another way to look at it. Set $a_4=1$ and the rest 0. The first 3 are linearly independent by assumption (otherwise the problem is impossible; see why?). Now if this set of 4 were linearly DEpendent, you would be able to write the last column as a linear combination of the first 3.

Well, look at it. Compare the 4th row: the entries of columns 1-3 there are $(0, 3, -2)$, so to get the 1 you need $3c_2 - 2c_3 = 1$ (column 1 contributes nothing). Now compare row 3: the entries are $(-1, 0, 3)$, so to get 0 you need $-c_1 + 3c_3 = 0$ (column 2 contributes nothing).

Try $c_3 = 1$: that forces $c_2 = 1$ and $c_1 = 3$. So multiply column 1 by 3, column 2 by 1, column 3 by 1 and add. There is no way for the top two rows to get zeroed out; they come out 16 and 20 respectively. (Strictly speaking, rows 3 and 4 only force $c_1 = 3c_3$ and $3c_2 - 2c_3 = 1$; substituting these into rows 1 and 2 gives incompatible values of $c_3$, so no choice of coefficients works.)

Thus you see column 4 is in fact not a linear combination of the first 3 columns, so this set is linearly independent. Got it? But dude, learn to take a determinant; it is a million times easier in general to just compute it and see whether it is 0 or not. If it is 0 the set is linearly DEpendent; if it is not 0 the set is linearly INdependent.
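(An aside not in the thread: the "column 4 is not a combination of the first 3" claim can also be checked numerically, assuming numpy, by finding the best least-squares combination and seeing that its residual is nonzero.)

```python
import numpy as np

x1 = np.array([1, 2, -1, 0])
x2 = np.array([15, 1, 0, 3])
x3 = np.array([-2, 13, 3, -2])
e4 = np.array([0, 0, 0, 1])

A = np.column_stack([x1, x2, x3])

# Best least-squares attempt at writing e4 as a combination of the
# three columns; a nonzero residual means e4 is NOT in their span.
c, *_ = np.linalg.lstsq(A, e4, rcond=None)
residual = np.linalg.norm(A @ c - e4)
print(residual > 1e-8)  # True -> {x1, x2, x3, e4} is independent
```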
• Apr 23rd 2009, 05:12 PM
scorpion007
Well, the way I learned to take a determinant of a large matrix is by reducing it to echelon form, then taking the product of the main diagonal (of course this is a slightly simplified description).

But here I can't reduce it to echelon form. And I've never learned the co-factor expansion method.
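(Since the cofactor expansion hasn't come up in your course, here is a minimal sketch of it, not from the thread, in plain Python with no libraries; the function name is my own. It expands along the first row recursively.)

```python
def det(m):
    # Determinant by cofactor (Laplace) expansion along the first row.
    # m is a square matrix given as a list of lists.
    if len(m) == 1:
        return m[0][0]
    total = 0
    for j in range(len(m)):
        # Minor: delete row 0 and column j.
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        # Cofactor signs alternate: +, -, +, - ...
        total += (-1) ** j * m[0][j] * det(minor)
    return total

# The 4x4 matrix [x1 x2 x3 <0,0,0,1>] from this thread, columns left to right:
M = [[ 1, 15, -2, 0],
     [ 2,  1, 13, 0],
     [-1,  0,  3, 0],
     [ 0,  3, -2, 1]]
print(det(M))  # -284, nonzero -> linearly independent
```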

Please see my comment in the other thread regarding this column-wise manipulation. It intrigues me.

Anyhow, thanks, this does seem to make more sense.