# Linearly Independent

• Nov 1st 2009, 08:24 PM
Alterah
Linearly Independent
I am having difficulties with the following problem. I need to find values of t that make the set linearly independent.

Problem:
$\displaystyle S = \{(t,1,1), (1,t,1), (1,1,t)\}$

The answer is all t other than 1 and -2. I am confused about how to approach this problem.

If I set up a matrix, I believe I see where the -2 comes from: we need t not to equal that value, since otherwise the homogeneous equation has another (nontrivial) solution. Matrix:

$\displaystyle \left(\begin{array}{cccc}t&1&1&0\\1&t&1&0\\1&1&t&0\end{array}\right)$

Beyond seeing where those two numbers come from, I am not sure of the procedure behind getting them. Thanks for any help.
• Nov 1st 2009, 08:32 PM
tonio
Quote:

Originally Posted by Alterah
I am having difficulties with the following problem. I need to find values of t that make the set linearly independent.

Problem:
$\displaystyle S = \{(t,1,1), (1,t,1), (1,1,t)\}$

The answer is all t other than 1 and -2. I am confused about how to approach this problem.

If I set up a matrix, I believe I see where the -2 comes from: we need t not to equal that value, since otherwise the homogeneous equation has another (nontrivial) solution. Matrix:

$\displaystyle \left(\begin{array}{cccc}t&1&1&0\\1&t&1&0\\1&1&t&0\end{array}\right)$

Beyond seeing where those two numbers come from, I am not sure of the procedure behind getting them. Thanks for any help.

Erase that last column of zeroes (it serves no purpose in this case), and bring your matrix to echelon form, perhaps first interchanging the first and second rows, so that you'll have 1 in the first entry...

Tonio
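
A sketch of that reduction (an editorial addition, assuming $\displaystyle t \neq 1$ so that dividing by $\displaystyle 1-t$ is legitimate): applying $\displaystyle R_2 \to R_2 - tR_1$ and $\displaystyle R_3 \to R_3 - R_1$, then dividing $\displaystyle R_2$ and $\displaystyle R_3$ by $\displaystyle 1-t$, gives

$\displaystyle \left(\begin{array}{ccc}1&t&1\\t&1&1\\1&1&t\end{array}\right)\rightarrow\left(\begin{array}{ccc}1&t&1\\0&1-t^2&1-t\\0&1-t&t-1\end{array}\right)\rightarrow\left(\begin{array}{ccc}1&t&1\\0&1+t&1\\0&1&-1\end{array}\right)$

Interchanging $\displaystyle R_2$ and $\displaystyle R_3$ and then replacing $\displaystyle R_3$ by $\displaystyle R_3-(1+t)R_2$ leaves $\displaystyle (0,\ 0,\ t+2)$ as the last row, so the matrix has three pivots exactly when $\displaystyle t\neq 1$ and $\displaystyle t\neq -2$.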
• Nov 1st 2009, 09:04 PM
Alterah
Quote:

Originally Posted by tonio
Erase that last column of zeroes (it serves no purpose in this case), and bring your matrix to echelon form, perhaps first interchanging the first and second rows, so that you'll have 1 in the first entry...

Tonio

If I try bringing it to RREF, I get a jumble for a matrix. Taking your advice:

$\displaystyle \left(\begin{array}{ccc}1&t&1\\t&1&1\\1&1&t\end{array}\right)$
$\displaystyle \left(\begin{array}{ccc}1-t&t-1&0\\t-1&0&1-t\\0&1-t&t-1\end{array}\right)$

If t = 1 here I get the trivial solution as everything becomes zero. I still think I am missing something. Not seeing where the -2 is coming from regarding the above matrix. What I have done seems to imply t = 2. Thanks for your patience.
• Nov 1st 2009, 10:25 PM
math2009
If $\displaystyle \det(A)\neq 0$, then the matrix A is invertible and its row and column vectors are linearly independent.
So compute $\displaystyle f(t)=\det\begin{bmatrix}t&1&1\\1&t&1\\1&1&t\end{bmatrix}=\det\begin{bmatrix}0&1-t^2&1-t\\0&t-1&1-t\\1&1&t\end{bmatrix}=\det\begin{bmatrix}1-t^2&1-t\\t-1&1-t\end{bmatrix}=t^3-3t+2$

If $\displaystyle f(t)=0 \rightarrow \{t=1,\ t=-2\}$, then the matrix A is not invertible, and therefore S is linearly dependent.

So when $\displaystyle t\neq 1$ and $\displaystyle t\neq -2$, $\displaystyle S$ is linearly independent.
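
This determinant computation can be double-checked symbolically (an editorial sketch, not part of the original thread; it assumes the sympy library is available):

```python
import sympy as sp

# Symbolic check of the determinant computation above
t = sp.symbols('t')
A = sp.Matrix([[t, 1, 1], [1, t, 1], [1, 1, t]])

f = sp.expand(A.det())   # t**3 - 3*t + 2
factored = sp.factor(f)  # (t - 1)**2*(t + 2)
roots = sp.solve(f, t)   # the roots are -2 and 1
```

The factored form $\displaystyle (t-1)^2(t+2)$ makes the two exceptional values visible at a glance: t = 1 is a double root and t = -2 a simple one.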
• Nov 2nd 2009, 03:31 AM
HallsofIvy
Quote:

Originally Posted by Alterah
I am having difficulties with the following problem. I need to find values of t that make the set linearly independent.

Problem:
$\displaystyle S = \{(t,1,1), (1,t,1), (1,1,t)\}$

The answer is all t other than 1 and -2. I am confused about how to approach this problem.

If I set up a matrix, I believe I see where the -2 comes from: we need t not to equal that value, since otherwise the homogeneous equation has another (nontrivial) solution. Matrix:

$\displaystyle \left(\begin{array}{cccc}t&1&1&0\\1&t&1&0\\1&1&t&0\end{array}\right)$

Beyond seeing where those two numbers come from, I am not sure of the procedure behind getting them. Thanks for any help.

I would be inclined to go back to the basic definition of "linearly independent": for what t does a(t,1,1) + b(1,t,1) + c(1,1,t) = (0,0,0) have only the solution a = b = c = 0? That gives the three equations at + b + c = 0, a + bt + c = 0, and a + b + ct = 0 (which, of course, have exactly the coefficient matrix you used). Subtracting the second equation from the first gives (t-1)a + (1-t)b = 0, or (t-1)a = (t-1)b, so a = b for all t except t = 1. Similarly, subtracting the third equation from the first gives (t-1)a + (1-t)c = 0, or (t-1)a = (t-1)c, so a = c for all t except t = 1. Putting b = a and c = a into the first equation, ta + a + a = (t+2)a = 0, so we must have a = b = c = 0 unless t = -2.
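
The same elimination can be verified symbolically (an editorial sketch, not part of the original thread; it assumes the sympy library):

```python
import sympy as sp

a, b, c, t = sp.symbols('a b c t')
# The system at + b + c = 0, a + bt + c = 0, a + b + ct = 0
eqs = [t*a + b + c, a + t*b + c, a + b + t*c]

# At t = -2 a nontrivial solution exists (a = b = c), so S is dependent
sol_dep = sp.linsolve([e.subs(t, -2) for e in eqs], a, b, c)

# At a generic value such as t = 3, only the trivial solution remains
sol_indep = sp.linsolve([e.subs(t, 3) for e in eqs], a, b, c)
```

For example, at t = -2 the vector (a, b, c) = (1, 1, 1) satisfies all three equations, which is exactly the nontrivial dependence the argument above predicts.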