- May 6th 2008, 09:57 PM, pakman: Spanning set
This is a really stupid question but for some reason I can't figure it out!!

Given [1,0,1],[2,1,3] (column vectors). How do I figure out if they are linearly independent or not?

- May 6th 2008, 10:05 PM, Mathnasium:
If I recall, for two vectors to be linearly independent, we must be UNABLE to write one as a scalar multiple of the other.

Let's assume they are linearly DEPENDENT. Then there exists some number a such that a[1,0,1] = [2,1,3].

Note that a[1,0,1] = [a,0,a].

So, we're assuming that [a,0,a] = [2,1,3]. Two vectors (or matrices) are equal only if each entry equals the entry in the corresponding position. So we get three equations:

a = 2

0 = 1

a = 3.

Clearly, the second equation is enough to demonstrate that they aren't linearly dependent. Also, equations 1 and 3, taken together, state that a must be both 2 and 3, also an impossibility.
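The same test can be sketched in code. A minimal illustration (my own, not from the thread), using nothing beyond the standard library: two vectors are scalar multiples of each other exactly when every 2x2 minor v[i]*w[j] - v[j]*w[i] vanishes, so any nonzero minor proves independence.

```python
def are_independent(v, w):
    """Return True if vectors v and w are linearly independent.

    Two vectors are dependent exactly when every 2x2 minor
    v[i]*w[j] - v[j]*w[i] is zero; one nonzero minor suffices
    to show independence.
    """
    n = len(v)
    return any(v[i] * w[j] - v[j] * w[i] != 0
               for i in range(n) for j in range(i + 1, n))

print(are_independent([1, 0, 1], [2, 1, 3]))  # prints True
```

Here the minor from the first two components is 1*1 - 0*2 = 1, which is nonzero, matching the contradiction 0 = 1 found above.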

So they aren't linearly dependent, which contradicts our assumption. Thus, they MUST be linearly independent.

- May 6th 2008, 10:05 PM, TheEmptySet: