Linearly independent vectors

• April 5th 2011, 09:19 PM
cottontails
Linearly independent vectors
(Note: the bold letters are vectors.)
"Suppose that v and w are vectors which are not parallel (so are linearly independent) and the following vector equation holds for some scalars α and β.

$\mathbf{v} + \alpha(\mathbf{w} - \mathbf{v}) = \beta\left(\mathbf{v} + \tfrac{1}{2}\mathbf{w}\right)$.

Find α and β."

With my working out:
-> (1 − α − β) v + (α − β/2)w = 0
-> 1 − α − β = 0 = α − β/2 (by linear independence)
...and from this, I attempted to solve the equations simultaneously but did not get the right answer. (Which was, α = 1/3, β = 2/3)

I really don't understand how you could get that answer either. So, my only problem here is just solving the equations to get the final answer.
• April 5th 2011, 09:30 PM
TheEmptySet
Quote:

Originally Posted by cottontails
(Note: the bold letters are vectors.)
"Suppose that v and w are vectors which are not parallel (so are linearly independent) and the following vector equation holds for some scalars α and β.

$\mathbf{v} + \alpha(\mathbf{w} - \mathbf{v}) = \beta\left(\mathbf{v} + \tfrac{1}{2}\mathbf{w}\right)$.

Find α and β."

With my working out:
-> (1 − α − β) v + (α − β/2)w = 0
-> 1 − α − β = 0 = α − β/2 (by linear independence)
...and from this, I attempted to solve the equations simultaneously but did not get the right answer. (Which was, α = 1/3, β = 2/3)

I really don't understand how you could get that answer either. So, my only problem here is just solving the equations to get the final answer.

Your system of equations is correct; you have

$1-\alpha-\beta=0 \iff \alpha+\beta=1$
$\displaystyle \alpha-\frac{\beta}{2}=0$

Now subtract the bottom equation from the top to get

$\displaystyle \frac{3\beta}{2}=1 \iff \beta=\frac{2}{3} \implies \alpha=\frac{1}{3}$
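If you want to double-check the algebra, here is a quick sketch in Python with NumPy: it solves the $2\times 2$ system $\alpha+\beta=1$, $\alpha-\beta/2=0$ directly, then verifies the original vector equation using two arbitrary non-parallel vectors (the specific vectors chosen here are just an example, any linearly independent pair works).

```python
import numpy as np

# Solve the system from linear independence:
#   alpha + beta     = 1
#   alpha - beta / 2 = 0
A = np.array([[1.0, 1.0],
              [1.0, -0.5]])
b = np.array([1.0, 0.0])
alpha, beta = np.linalg.solve(A, b)
print(alpha, beta)  # 1/3 and 2/3

# Verify with any two non-parallel vectors, e.g. the standard basis.
v = np.array([1.0, 0.0])
w = np.array([0.0, 1.0])
lhs = v + alpha * (w - v)
rhs = beta * (v + 0.5 * w)
print(np.allclose(lhs, rhs))  # True
```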
• April 5th 2011, 10:14 PM
cottontails
Oh, I actually tried solving it by just making both equations equal to 0 and solving them simultaneously that way (in a way that didn't result in α + β = 1...). So, I now realise where I went wrong. It makes complete sense now though. Thank you!