1. ## ad-bc proof of linear independence

Given two vectors x1 = (a, b) and x2 = (c, d), prove that they are linearly independent if and only if ad - bc ≠ 0.
I'm familiar with ad - bc as the determinant of a 2x2 matrix; if it equals zero, the matrix is singular. I also know that |ad - bc| is the area of the parallelogram spanned by the two vectors.

2. ## Re: ad-bc proof of linear independence

Use the definition of "linearly dependent"! Two vectors, x1 and x2, are linearly *dependent* if and only if there exist numbers A and B, at least one non-zero, such that Ax1 + Bx2 = 0. Here, that means A(a, b) + B(c, d) = (Aa + Bc, Ab + Bd) = (0, 0), which is equivalent to Aa + Bc = 0 and Ab + Bd = 0. Obviously, one solution is A = B = 0. What must be true so that this is NOT the only solution? One way to answer is to try to solve the equations. Multiply the first equation by b to get Aab + Bbc = 0, multiply the second equation by a to get Aab + Bad = 0, and subtract the first from the second to eliminate A: B(ad - bc) = 0.

IF ad - bc is not 0, we can divide both sides by it, getting B = 0. Putting B = 0 into Aa + Bc = 0 gives Aa = 0; if a ≠ 0, then A = 0. If a = 0, put B = 0 into the other equation, Ab + Bd = 0, to get Ab = 0; if b ≠ 0, then A = 0 again. (If both a and b were 0, we would have ad - bc = 0, contradicting our assumption, so these cases are exhaustive.) In any case the only solution is A = B = 0, so the vectors are independent.

If ad - bc = 0, then B(ad - bc) = 0 holds for any value of B, including non-zero values, so a non-trivial solution exists and the vectors are dependent. (Strictly, one should exhibit a non-zero pair (A, B) satisfying both original equations; the case analysis below does exactly that.)

In the case of just two vectors we can also argue directly: if ad - bc = 0 and a ≠ 0, then d = bc/a, so (c, d) = (c, bc/a) = (c/a)(a, b), showing that (c, d) is just a multiple of (a, b). If a = 0, then ad - bc = 0 becomes bc = 0, so either b = 0 or c = 0. If b = 0, then (a, b) = (0, 0), which is linearly dependent with any other vector. If c = 0, we have (0, b) and (0, d) with b ≠ 0, so (0, d) = (d/b)(0, b), and again one vector is a multiple of the other.
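A quick numerical sanity check of this equivalence, for what it's worth (a Python sketch; the helper names `det2` and `is_multiple` are my own, not from the book):

```python
# Sketch: ad - bc = 0 exactly when one vector is a scalar multiple
# of the other (mirroring the case analysis above).

def det2(x1, x2):
    """ad - bc for x1 = (a, b), x2 = (c, d)."""
    (a, b), (c, d) = x1, x2
    return a * d - b * c

def is_multiple(x1, x2, eps=1e-12):
    """True if x2 = h * x1 for some scalar h, or x1 is the zero vector."""
    (a, b), (c, d) = x1, x2
    if abs(a) < eps and abs(b) < eps:
        return True  # zero vector: dependent with anything
    if abs(a) >= eps:
        h = c / a           # a != 0: h is forced by the first components
        return abs(d - h * b) < eps
    h = d / b               # a = 0, so b != 0: h is forced by the second
    return abs(c - h * a) < eps

pairs = [((1, 2), (3, 4)),   # independent
         ((1, 2), (2, 4)),   # (2, 4) = 2 * (1, 2)
         ((0, 0), (1, 1)),   # zero vector
         ((0, 3), (0, 5))]   # both on the y-axis
for x1, x2 in pairs:
    assert (det2(x1, x2) == 0) == is_multiple(x1, x2)
```

Of course this only spot-checks a few pairs; the proof above is what actually establishes the claim.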

3. ## Re: ad-bc proof of linear independence

Thanks very much! I guess the algebraic proof is stronger.
So you have to prove two things. 1. Given linear independence (A = B = 0 is the only solution), show that ad - bc ≠ 0. Is this the tricky direction because B(ad - bc) = 0 alone doesn't force ad - bc ≠ 0, since both B and ad - bc could be 0? In other words, from Aa + Bc = 0 and Ab + Bd = 0 with A = B = 0, it doesn't necessarily follow that ad - bc ≠ 0...?

2. Given ad - bc ≠ 0, show the vectors are linearly independent: B = 0 leads to Aa = 0 and Ab = 0. If A ≠ 0, then a = b = 0, and (0, 0) is linearly dependent on (c, d). If a and b are not both 0, then A = 0 and linear independence follows. But can't a and b both be 0 in the problem, since the vectors range over all of V2?

The problem seems to allude to this theorem: n vectors in Rn are linearly dependent if and only if the determinant of the matrix taking the vectors as its columns is 0. Or perhaps to Steinitz's theorem, and to the fact that linear independence is equivalent to the non-existence of non-trivial solutions.
Also, from (a, b) = h(c, d) you can derive ad - bc = 0 for any h, not just h = 1: a = hc and b = hd give ad - bc = hcd - hdc = 0.
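One small correction to the last step: the proportionality argument works for any scalar h, not only h = 1. A one-line check (Python sketch, integer arithmetic so the equalities are exact):

```python
# If (a, b) = h * (c, d), then a = h*c and b = h*d, so
# ad - bc = (h*c)*d - (h*d)*c = 0 for ANY scalar h, not just h = 1.
for h in (-2, 1, 3):
    c, d = 4, 7          # arbitrary sample vector (c, d)
    a, b = h * c, h * d  # (a, b) proportional to (c, d)
    assert a * d - b * c == 0
print("ad - bc vanishes for every h tested")
```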

I know we're all busy, but can someone please let me know if I've made any mistakes? Is there another way to prove this using matrix theory?

4. ## Re: ad-bc proof of linear independence

Apparently there is a Leibniz formula in which, in some cases, the identity permutation is the only one that gives a nonzero contribution, and from this (or from the Laplace expansion) you can deduce that the determinant, as an n-linear function of the columns, is an alternating form. This means that whenever two columns of a matrix are identical, or more generally some column can be expressed as a linear combination of the other columns (i.e. the columns of the matrix form a linearly dependent set), its determinant is 0.

So maybe you can prove the contrapositive this way: if the columns of the matrix form a linearly independent set, then its determinant is nonzero. But how do you get the 2x2 matrix? Take x1 and x2 as its columns. Geometrically, x1 and x2 span a parallelogram whose signed area is the determinant ad - bc: if det = 0, the vectors are linearly dependent and the parallelogram degenerates to area 0, while if the vectors are linearly independent, the parallelogram they span has nonzero area. Kind of a geometric proof.
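The geometric claim can be spot-checked numerically: the area |x1||x2| sin(θ) of the parallelogram spanned by x1 and x2 agrees with |ad - bc| (a Python sketch; the function name is my own):

```python
import math

def parallelogram_area(x1, x2):
    """Area |x1||x2|sin(theta) of the parallelogram spanned by x1, x2,
    computed from lengths and the angle, NOT from the determinant."""
    (a, b), (c, d) = x1, x2
    n1, n2 = math.hypot(a, b), math.hypot(c, d)
    if n1 == 0 or n2 == 0:
        return 0.0
    cos_t = (a * c + b * d) / (n1 * n2)
    cos_t = max(-1.0, min(1.0, cos_t))  # clamp rounding error
    return n1 * n2 * math.sin(math.acos(cos_t))

# independent pair: area matches |ad - bc| = 6
assert math.isclose(parallelogram_area((3, 0), (1, 2)), 6.0)
# dependent pair: the parallelogram collapses, area ~ 0
assert math.isclose(parallelogram_area((1, 2), (2, 4)), 0.0, abs_tol=1e-6)
```

This is only numerical evidence, not a proof, but it matches the picture: dependent vectors give a degenerate parallelogram.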

5. ## Re: ad-bc proof of linear independence

I wonder why he gives this problem before introducing matrix theory (I started at chapter 12). Can you prove this intuitively using only chapter 12 of Apostol? It seems impossible to prove without some linear algebra.

6. ## Re: ad-bc proof of linear independence

Originally Posted by mathnerd15
I wonder why he gives this problem before introducing matrix theory (I started at chapter 12). Can you prove this intuitively using only chapter 12 of Apostol?
You don't need matrix theory.
In $\mathbb{R}^2$ two vectors are linearly independent if and only if they are not parallel.

Two vectors are parallel if and only if one is a scalar multiple of the other.