
Lin. Dep. Matrix
Find all values of $\displaystyle a$ such that the set $\displaystyle \{\left[ \begin {array}{c} a\\\noalign{\medskip}1\end {array} \right], \left[ \begin {array}{c} a+2\\\noalign{\medskip}a\end {array}\right]\}$ is linearly dependent.
Work for this problem:
We have $\displaystyle \left[ \begin {array}{cc} a&a+2\\\noalign{\medskip}1&a\end {array}\right]$
Row 2 $\to$ Row 1 $-$ Row 2 yields:
$\displaystyle \left[ \begin {array}{cc} a&a+2\\\noalign{\medskip}a-1&2\end {array}\right]$
Now I'm not sure how to proceed.

I believe $a = 1$ works just from playing with it, but I don't know how to tell whether that's the only solution or whether there are others... anyone?!

Hello,
I'm sorry no one is replying to your threads :s
Hmm, I'll try this one. Seems easy :P
Two vectors $u$ and $v$ are linearly dependent if there are scalars $m$ and $n$, not both zero, such that $mu+nv=0$, i.e. $mu=-nv$.
So you have to find $a$ such that the coordinates of the two vectors have a common ratio :
$\displaystyle \frac a1=\frac{a+2}{a}$
$\displaystyle \implies a^2=a+2 \implies a^2-a-2=0$
so now find a :)
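As a quick numerical sanity check (not part of the original reply), the roots of $a^2-a-2=0$ can be found with the quadratic formula:

```python
import math

# Roots of a^2 - a - 2 = 0 (coefficients a=1, b=-1, c=-2) via the quadratic formula.
b, c = -1.0, -2.0
disc = b * b - 4 * c                      # discriminant = 1 + 8 = 9
roots = sorted([(-b - math.sqrt(disc)) / 2,
                (-b + math.sqrt(disc)) / 2])
print(roots)  # [-1.0, 2.0]
```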
Note : this is the same as finding $a$ such that the $2\times 2$ matrix you've got is non-invertible (determinant $=0$).
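To illustrate the determinant remark, here's a minimal sketch (the candidate values $a=-1$ and $a=2$ come from the quadratic above; $a=1$ is included only to compare against the earlier guess) that evaluates $\det\begin{pmatrix} a & a+2 \\ 1 & a\end{pmatrix} = a^2-(a+2)$:

```python
def det(a):
    # Determinant of [[a, a+2], [1, a]]: a*a - (a+2)*1
    return a * a - (a + 2)

for a in (-1, 1, 2):
    print(a, det(a))
# a = -1 and a = 2 give determinant 0 (vectors dependent);
# a = 1 gives -2, so those two vectors are actually independent.
```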