
Math Help - Determine whether vectors are linearly dependent

  1. #1
    Craka (Member)

    Determine whether vectors are linearly dependent

    The vectors are (1,-2,3), (-1,3,2) and (-1,10,5). The question asks to determine whether they are linearly independent.

    So I've put them in as the columns of a matrix, as below

    \left[ \begin{array}{ccc} 1 & -1 & -1 \\ -2 & 3 & 10 \\ 3 & 2 & 5 \end{array} \right]

    Now start Gauss-Jordan elimination.

    new row2 = old row2 + 2*row1

    \left[ \begin{array}{ccc} 1 & -1 & -1 \\ 0 & 1 & 8 \\ 3 & 2 & 5 \end{array} \right]

    new row3 = old row3 - 3*row1

    \left[ \begin{array}{ccc} 1 & -1 & -1 \\ 0 & 1 & 8 \\ 0 & 5 & 8 \end{array} \right]

    new row3 = old row3 - 5*row2

    \left[ \begin{array}{ccc} 1 & -1 & -1 \\ 0 & 1 & 8 \\ 0 & 0 & -32 \end{array} \right]

    new row3 = old row3 divided by -32

    \left[ \begin{array}{ccc} 1 & -1 & -1 \\ 0 & 1 & 8 \\ 0 & 0 & 1 \end{array} \right]

    new row2 = old row2 - 8*row3

    \left[ \begin{array}{ccc} 1 & -1 & -1 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{array} \right]

    new row1 = old row1 + row3

    \left[ \begin{array}{ccc} 1 & -1 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{array} \right]

    new row1 = old row1 + row2

    \left[ \begin{array}{ccc} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{array} \right]

    Firstly, is my working correct? And secondly, I would think they are linearly independent because there is only one 1 in each column (the reduced matrix is the identity). Am I correct in thinking this?
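
    A minimal machine check of this reduction, assuming SymPy is available; the variable names are just for illustration:

    from sympy import Matrix

    # The three vectors as the columns of a 3x3 matrix, as in the working above.
    A = Matrix([[ 1, -1, -1],
                [-2,  3, 10],
                [ 3,  2,  5]])

    rref_form, pivot_cols = A.rref()  # reduced row echelon form and pivot columns
    print(rref_form)    # the 3x3 identity matrix, matching the hand computation
    print(pivot_cols)   # (0, 1, 2): a pivot in every column, so the vectors are independent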

  2. #2
    Moo
    Hello,
    Quote Originally Posted by Craka
    The vectors are (1,-2,3), (-1,3,2) and (-1,10,5). The question asks to determine whether they are linearly independent.

    Firstly, is my working correct? And secondly, I would think they are linearly independent because there is only one 1 in each column. Am I correct in thinking this?
    Everything is correct.
    You are correct in thinking this: the columns of the final matrix are independent (they form the canonical basis of \mathbb{R}^3).


    Another way of doing it is to calculate the determinant of the matrix
    \begin{pmatrix} 1 & -1 & -1 \\ -2 & 3 & 10 \\ 3 & 2 & 5 \end{pmatrix}
    (see the LaTeX code, it's much easier)
    If the determinant is not 0, then the vectors are linearly independent.
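
    A minimal sketch of that determinant check, assuming NumPy is available; the array layout mirrors the matrix above:

    import numpy as np

    # The vectors as the columns of a 3x3 matrix.
    A = np.array([[ 1, -1, -1],
                  [-2,  3, 10],
                  [ 3,  2,  5]], dtype=float)

    det = np.linalg.det(A)
    print(det)                       # about -32.0
    print(not np.isclose(det, 0.0))  # True, so the vectors are linearly independent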

  3. #3
    Craka (Member)
    Thanks Moo,

    It would seem that finding the determinant would be a little more straightforward than using row reduction. We haven't covered determinants in the course yet, though I know I have used them for Cramer's rule. Is it possible to use determinants for matrices that are not square, or for 4x4 matrices and above?

  4. #4
    Moo
    Quote Originally Posted by Craka
    Thanks Moo,

    It would seem that finding the determinant would be a little more straightforward than using row reduction. We haven't covered determinants in the course yet, though I know I have used them for Cramer's rule. Is it possible to use determinants for matrices that are not square, or for 4x4 matrices and above?
    Determinants are defined for square matrices. I don't know if there is a way to define determinants for non-square matrices, but you don't have to worry about that here.
    It is possible to use determinants for nxn matrices in general, so of course it works for 4x4. But beyond 3x3 (where Sarrus's rule applies), the computation becomes more difficult.
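
    To illustrate why the hand computation gets heavier beyond 3x3, here is a sketch of cofactor (Laplace) expansion in plain Python; the helper name det_cofactor is made up for this example. The expansion has n! terms for an nxn matrix, which is why row reduction is usually preferred for larger matrices.

    def det_cofactor(m):
        """Determinant by cofactor expansion along the first row (m is a list of rows)."""
        n = len(m)
        if n == 1:
            return m[0][0]
        total = 0
        for j in range(n):
            # Minor: remove row 0 and column j.
            minor = [row[:j] + row[j + 1:] for row in m[1:]]
            total += (-1) ** j * m[0][j] * det_cofactor(minor)
        return total

    print(det_cofactor([[1, -1, -1], [-2, 3, 10], [3, 2, 5]]))  # -32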


    As for using the determinant to prove that vectors are linearly independent... You can get an intuitive idea of it from the Gauss-Jordan elimination. There is indeed a step when you come to an upper triangular matrix. We know that in this case the determinant is equal to the product of the diagonal entries. So if the determinant is 0, there must be a 0 somewhere on the diagonal. Now picture the situation: if there is a 0 on the diagonal of an upper triangular matrix, can you write one of the rows as a combination of the other rows?
    It looks like you can.

    Unfortunately, I can't provide a formal proof (I don't even know if it is a theorem, but I'm quite sure it works xD); I just wanted to give you an intuitive picture.
    If you haven't been taught this, then stick to Gauss-Jordan elimination. It looks like you are doing it well.



    As a side note, once you reach an upper-triangular matrix in the Gauss-Jordan elimination, you can stop at that step: with the 0's in the lower left, you can easily see whether the vectors (in the rows) are independent or not.
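
    A sketch of that shortcut, assuming NumPy, using the upper-triangular matrix reached partway through the elimination above. The row operations used up to that point only add multiples of one row to another, which does not change the determinant, so the diagonal product already decides the question.

    import numpy as np

    # The upper-triangular matrix obtained just before the division step above.
    U = np.array([[1, -1, -1],
                  [0,  1,  8],
                  [0,  0, -32]], dtype=float)

    diag_product = np.prod(np.diag(U))
    print(diag_product)                      # -32.0
    print(not np.isclose(diag_product, 0.0)) # True: independent, no need to reduce further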
    I'm sorry, my explanations may look messy... I have never liked explaining linear algebra

