# Linear Algebra Proofs regarding subspaces and spans

• Feb 8th 2013, 11:04 AM
zachoon
Linear Algebra Proofs regarding subspaces and spans
1. Prove or give a counterexample to the following claim:
Claim: Let V be a vector space over the field F and suppose that W1, W2 and W3
are subspaces of V such that W1 + W3 = W2 + W3. Then W1 = W2.

2. Consider the following subspaces of the vector space R^3
over the field R of real numbers:
subspace U1, which is the plane x + y + z = 0 and subspace U2, which is the yz-plane.
a) Can R^3 be written as a sum of U1 and U2? Justify your answer.
b) Can R^3 be written as a direct sum of U1 and U2? Justify your answer.
Here x, y and z denote the usual Cartesian coordinates.

3. Let V be a vector space over the field F and suppose (v1, v2, ... , vn) is
a linearly independent set of vectors in V . Now suppose there exists w in V such
that (v1 + w, v2 + w, ... , vn + w) is a linearly dependent set of vectors in V . Prove
that w is in span(v1, v2, ... , vn).

Thank you.
• Feb 8th 2013, 11:46 AM
HallsofIvy
Re: Linear Algebra Proofs regarding subspaces and spans
I would really like to see how you would at least attempt these. For example, to show that W1 = W2, you must show "if vector v is in W1 then v is in W2" and "if vector v is in W2 then it is in W1". If vector v is in W1 then, for any vector u in W3, v + u is in W1 + W3. Because W1 + W3 = W2 + W3, v + u is in W2 + W3. Therefore, ...
• Feb 8th 2013, 12:23 PM
jakncoke
Re: Linear Algebra Proofs regarding subspaces and spans
For
1)
Take $\displaystyle W_1 = Span(\begin{bmatrix}1\\0\\0 \end{bmatrix},\begin{bmatrix}0\\1\\0\end{bmatrix}) , W_3 = Span(\begin{bmatrix}1\\0\\0 \end{bmatrix}), W_2 = Span(\begin{bmatrix}0\\1\\0 \end{bmatrix})$

Now it is true $\displaystyle W_1 + W_3 = W_2 + W_3$ but is $\displaystyle W_1 = W_2$ ?
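
As a quick numeric sanity check of this counterexample (a sketch I'm adding, not part of the proof; the `rank` helper is my own, not from the thread), note that two spans are equal exactly when stacking their generators together does not increase the rank:

```python
# Sanity-check the counterexample: W1 = span(e1, e2), W3 = span(e1),
# W2 = span(e2) in R^3. (Illustration only; `rank` is a hand-rolled helper.)

def rank(rows, eps=1e-9):
    """Rank of a list of row vectors, via Gaussian elimination."""
    rows = [list(map(float, r)) for r in rows]
    r = 0
    for col in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if abs(rows[i][col]) > eps), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and abs(rows[i][col]) > eps:
                f = rows[i][col] / rows[r][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

e1, e2 = [1, 0, 0], [0, 1, 0]
W1, W2, W3 = [e1, e2], [e2], [e1]

sum13 = W1 + W3   # generators of W1 + W3
sum23 = W2 + W3   # generators of W2 + W3

# Equal spans: combining the two generating sets does not raise the rank.
assert rank(sum13) == rank(sum23) == rank(sum13 + sum23)  # W1 + W3 = W2 + W3
assert rank(W1) != rank(W2)                                # but W1 != W2
```

So the sums agree (both are the xy-plane) while W1 and W2 differ, refuting the claim.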

For 3)

Consider what it means to be linearly dependent: the homogeneous equation $\displaystyle Ax = 0$ has at least one non-trivial solution (a solution in which at least one coordinate or entry is non-zero).

So you have $\displaystyle c_1(v_1+w) + \dots + c_n(v_n+w) = 0$ where not all of the $\displaystyle c_i$ are zero. Expanding gives $\displaystyle c_1 v_1 + \dots + c_n v_n + (c_1 + \dots + c_n)w = 0$, or $\displaystyle c_1 v_1 + \dots + c_n v_n = -(c_1 + \dots + c_n)w$. Divide both sides by $\displaystyle -(c_1 + \dots + c_n)$. Why can we say with a guarantee that $\displaystyle c_1 + \dots + c_n$ won't equal zero? Think about that one.
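
To make this concrete, here is a small numeric instance I cooked up (not from the thread): take $v_1 = e_1$, $v_2 = e_2$ in $\mathbb{R}^3$ and $w = -(v_1+v_2)/2$. The shifted vectors $v_1 + w$ and $v_2 + w$ come out dependent, and $w$ does indeed lie in span$(v_1, v_2)$:

```python
# Concrete instance of problem 3 (my own illustration; `rank` is a
# hand-rolled helper, not part of the original argument).

def rank(rows, eps=1e-9):
    """Rank of a list of row vectors, via Gaussian elimination."""
    rows = [list(map(float, r)) for r in rows]
    r = 0
    for col in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if abs(rows[i][col]) > eps), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and abs(rows[i][col]) > eps:
                f = rows[i][col] / rows[r][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

v1, v2 = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]
w = [-(a + b) / 2 for a, b in zip(v1, v2)]   # w = -(v1 + v2)/2

shift1 = [a + b for a, b in zip(v1, w)]      # v1 + w = (v1 - v2)/2
shift2 = [a + b for a, b in zip(v2, w)]      # v2 + w = (v2 - v1)/2

assert rank([v1, v2]) == 2            # (v1, v2) is independent
assert rank([shift1, shift2]) == 1    # the shifted set is dependent
assert rank([v1, v2, w]) == 2         # w adds nothing: w is in span(v1, v2)
```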

For
2)
a)yes
b)no

You have to justify it yourself; write it out here and I can guide you.
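
One way to see both answers numerically (a sketch of mine, with bases I picked by hand: $U_1 = \{x+y+z=0\}$ has basis $(1,-1,0), (1,0,-1)$, and $U_2$, the yz-plane, has basis $e_2, e_3$):

```python
# Numeric check for problem 2 (illustration only; `rank` is a hand-rolled
# helper and the bases below are my own choice of generators).

def rank(rows, eps=1e-9):
    """Rank of a list of row vectors, via Gaussian elimination."""
    rows = [list(map(float, r)) for r in rows]
    r = 0
    for col in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if abs(rows[i][col]) > eps), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and abs(rows[i][col]) > eps:
                f = rows[i][col] / rows[r][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

U1 = [[1, -1, 0], [1, 0, -1]]   # basis of the plane x + y + z = 0
U2 = [[0, 1, 0], [0, 0, 1]]     # basis of the yz-plane

# a) U1 + U2 = R^3 iff the combined generators have rank 3.
assert rank(U1 + U2) == 3

# b) The sum is direct iff U1 and U2 meet only in 0. Here
# dim U1 + dim U2 = 4 > 3 = dim(U1 + U2), so the intersection is
# nontrivial; for example v = (0, 1, -1) lies in both subspaces.
v = [0, 1, -1]
assert rank(U1 + [v]) == 2 and rank(U2 + [v]) == 2   # v adds nothing to either
```

The dimension count dim U1 + dim U2 = dim(U1 + U2) + dim(U1 ∩ U2) is the cleanest way to justify b) in writing.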