# Math Help - Some Basis and Subspace problem/s

1. ## Some Basis and Subspace problem/s

1. Find a basis for the subspace of $M_{3}$ defined as $V = \{\begin{bmatrix}x & 0 & 0 \\0 & x & 0\\0 & 0 & x\end{bmatrix}: x \in R\}$

2. Given the following subset, B, of $P_{2}$, show that B is a basis for $P_{2}$ or show that B is not a basis: $B = \{f_{1}, \ f_{2}, \ f_{3} \}$, where $f_{1}(x) = x^2 - 1$, $f_{2}(x) = x + 1$, $f_{3}(x) = x^2 + x + 1$.

3. Given that $A \in M_{3}$ and $V = \{X \in R^3 : AX = 0 \}$, show that V is a subspace of $R^3$.

Here's what I think:

1.
(x $\begin{bmatrix}1 & 0 & 0 \\0 & 0 & 0\\0 & 0 & 0\end{bmatrix}$, x $\begin{bmatrix}0 & 0 & 0 \\0 & 1 & 0\\0 & 0 & 0\end{bmatrix}$, x $\begin{bmatrix}0 & 0 & 0 \\0 & 0 & 0\\0 & 0 & 1\end{bmatrix}$) (not sure if the x's belong)

2. $(f_{3}-f_{2}, \ \ f_{2}-(f_{3} -(f_{2}+f_{1})), \ \ f_{3} - (f_{2}+f_{1}))$ (I just learned about bases, so I'm not really sure whether I can do that, but since the standard basis for $P_{2}$ is $(x^2, \ x, \ 1)$, I used the given functions to show that each of those is reachable from what's given.)

3. Could I just use the 0 matrix? It IS a subspace, right?

2. The first one can be expressed as the span $\{ \alpha_1(1,0,0), \alpha_2(0,1,0), \alpha_3(0,0,1)\}$
(check it, though).

The second one can be expanded in the basis $P=(1,x,x^2)$,
so to check whether $f_1,f_2,f_3$ are linearly independent and therefore form a basis, you should write out their coordinate vectors:
$[f_1]_P , [f_2]_P, [f_3]_P$
$[f_1]_P=(-1,0,1), [f_2]_P=(1,1,0) ,[f_3]_P=(1,1,1)$
so form the matrix $\begin{bmatrix}-1 & 1 & 1 \\0 & 1 & 1\\1 & 0 &1\end{bmatrix}$ whose columns are these coordinate vectors, get it to reduced row echelon form, and you'll see whether the vectors are linearly independent or not.
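If you want to double-check the row reduction by machine, here is a small sketch using Python's sympy library (an assumption on my part; any CAS with an RREF routine works the same way):

```python
from sympy import Matrix

# Columns are the coordinate vectors [f1]_P, [f2]_P, [f3]_P
# in the basis P = (1, x, x^2), constant term first.
M = Matrix([[-1, 1, 1],
            [ 0, 1, 1],
            [ 1, 0, 1]])

rref, pivots = M.rref()
print(rref)         # the 3x3 identity: every column has a pivot
print(len(pivots))  # 3 pivots => rank 3, so f1, f2, f3 are independent
```

Since three independent vectors in a 3-dimensional space automatically span it, a full-rank RREF here settles the basis question in one computation.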

In the third one all you need to do is check whether it's a subspace or not; it's rather easy.
You need to check that if $X \in V$ and $Y \in V$, then $X+Y \in V$ and $\lambda X \in V$.

3. 1) Wicked is incorrect on the first one: since there is only the single parameter, x, the space is one-dimensional and a basis consists of a single matrix. Rewrite this as
$\begin{bmatrix}x & 0 & 0 \\ 0 & x & 0 \\ 0 & 0 & x\end{bmatrix}= x\begin{bmatrix}1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1\end{bmatrix}$
and the basis should be obvious.

(Wicked's answer would be correct for the space of matrices $\begin{bmatrix}x & 0 & 0 \\ 0 & y & 0 \\ 0 & 0 & z\end{bmatrix}= x\begin{bmatrix}1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0\end{bmatrix}$ $+ y\begin{bmatrix}0 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0\end{bmatrix}$ $+ z\begin{bmatrix}0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 1\end{bmatrix}$, where x, y, and z can be any real numbers.)

2) A "basis" for a vector space is defined by two properties:
a) It spans the space.
b) The vectors in it are independent.

To show that this set spans the space, you must show that any polynomial in it, $y= ax^2+ bx+ c$ for any numbers a, b, and c, can be written as a linear combination of $x^2-1$, x+ 1, and $x^2+ x+ 1$. That is, given that $ax^2+ bx+ c= \alpha(x^2- 1)+ \beta(x+1)+ \gamma(x^2+ x+ 1)$, show that you can solve for $\alpha$, $\beta$, and $\gamma$ in terms of a, b, c.

To show that these are independent, show that the only way you can have $\alpha(x^2- 1)+ \beta(x+1)+ \gamma(x^2+ x+ 1)= 0$ is if $\alpha= \beta= \gamma= 0$.

You can do both of those by using the fact that if two polynomials are equal for all x, then "corresponding coefficients" must be equal.
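Equating corresponding coefficients as described turns the spanning question into a 3-by-3 linear system. As a sketch of that computation (assuming Python's sympy is available):

```python
from sympy import symbols, solve, Eq

a, b, c, al, be, ga = symbols('a b c alpha beta gamma')

# Equate coefficients of x^2, x, and the constant term in
# a*x^2 + b*x + c = alpha*(x^2 - 1) + beta*(x + 1) + gamma*(x^2 + x + 1)
sol = solve([Eq(al + ga, a),            # x^2:  alpha + gamma = a
             Eq(be + ga, b),            # x:    beta + gamma = b
             Eq(-al + be + ga, c)],     # 1:   -alpha + beta + gamma = c
            [al, be, ga])
print(sol)  # {alpha: b - c, beta: -a + 2*b - c, gamma: a - b + c}
```

A unique solution for every a, b, c shows B spans $P_{2}$; setting $a = b = c = 0$ in the same system gives only $\alpha = \beta = \gamma = 0$, which is exactly the independence check.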

Wicked only suggests showing that the functions are independent. That is because a basis has three properties: the two I mentioned above,
a) It spans the space.
b) The vectors in it are independent.
and
c) The number of vectors in it is the same as the dimension of the space.
Further, any two of those are sufficient to prove the third.

So if you know that the dimension of $P_{2}$ is 3, then showing that the vectors are independent is enough.

3. A is a 3 by 3 matrix and $V= \{X\in R^3| AX= 0\}$
No, you cannot "just use the 0 matrix" because you have no reason to believe that AX= 0 only for X=0!

Wicked is completely correct here. A nonempty subset of a vector space is a subspace if and only if it is closed under addition and scalar multiplication (and V is nonempty, since $A0 = 0$ puts the zero vector in V).

IF X and Y are in V, then AX= 0 and AY= 0. What is A(X+ Y)?

If X is in V then AX= 0. If $\lambda$ is a number, what is $A(\lambda X)$?

In both of those use the definition of "linear operator".
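Written out, the two checks above are one-liners, using only the linearity of matrix multiplication:

```latex
A(X + Y) = AX + AY = 0 + 0 = 0, \qquad
A(\lambda X) = \lambda(AX) = \lambda \cdot 0 = 0,
```

so both $X + Y$ and $\lambda X$ satisfy the defining condition $AX = 0$ and therefore lie in V.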

4. Originally Posted by HallsofIvy

So if you know that the dimension of $P_{2}$ is 3, then showing that the vectors are independent is enough.
Yes, I realized my mistake in the first one, but since he said he knows that $P=(1,x,x^2)$ and it's obvious that its dimension is 3, I don't think I misled him in the second question.

5. Originally Posted by HallsofIvy
1) Wicked is incorrect on the first one: Since there is only the single parameter, x, the space is one dimensional and a basis consists of a single matrix. Rewrite this as
$\begin{bmatrix}x & 0 & 0 \\ 0 & x & 0 \\ 0 & 0 & x\end{bmatrix}= x\begin{bmatrix}1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1\end{bmatrix}$
and the basis should be obvious.
So...is my answer correct or do I just write it as one matrix?

$(x \begin{bmatrix}1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0\end{bmatrix} , x \begin{bmatrix}0 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0\end{bmatrix} , x \begin{bmatrix}0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 1\end{bmatrix})$

EDIT: Is it just $\begin{bmatrix}1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1\end{bmatrix}$?