You can check your answer:

dim W = 5 − dim span{a, b, c}, so assuming {a, b, c} are linearly independent, dim W = 2. If your two proposed basis vectors for W are linearly independent (as they obviously are), then you need only check that both of those vectors, dotted with each of a, b, and c, give 0.

But the first one, (1, 0, 2, -1, 1), when dotted with a = (1, 0, 2, -1, -1), gives 5, not 0. So you have a mistake somewhere. (Actually, it turns out that dim span{a, b, c} = 2, so dim W = 3, and you didn't have enough basis vectors.)
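If you want to run that check by machine, here's a minimal sketch in Python (the vector a is as stated above; the helper name `dot` is my own):

```python
# Quick orthogonality check for a proposed basis vector of W.
a = [1, 0, 2, -1, -1]

def dot(u, v):
    """Standard dot product on R^5."""
    return sum(ui * vi for ui, vi in zip(u, v))

proposed = [1, 0, 2, -1, 1]
print(dot(proposed, a))  # prints 5, not 0 -- so this vector is not in W
```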

Since I can't type matrices correctly here, I'll show how to do it algebraically:

Let v = (x1, x2, x3, x4, x5) be in W. Then v.a = v.b = v.c = 0, which gives the 3 equations you have in your solution. Treat it as 3 equations in 5 unknowns: solve for one unknown, plug it into the other equations, and repeat. In the usual case of linear independence, you'd be left with 2 "free parameters", with 3 of the unknowns expressed as functions of those 2 parameters, and the 2 basis vectors for W would drop out when those expressions are put back into v. However, it's going to turn out that one of the 3 equations is redundant, thanks to the linear dependence of {a, b, c}, so it's really 2 equations in 5 unknowns, leaving 3 "free parameters" with the other 2 unknowns solved as functions of them. The situation is the same either way: plug back into v, and the basis vectors drop out.

It goes like this:

v.a = 0 implies x1 + 2x3 − x4 − x5 = 0

v.b = 0 implies 2x1 + x2 + x3 + x4 = 0

v.c = 0 implies 4x1 + 3x2 − x3 + 5x4 + 2x5 = 0

Now from the first equation, solve for x5, and plug that into the 2nd and 3rd:

x5 = x1 + 2x3 − x4. The 2nd equation is unchanged. The 3rd becomes 4x1 + 3x2 − x3 + 5x4 + 2(x1 + 2x3 − x4) = 0, i.e. 6x1 + 3x2 + 3x3 + 3x4 = 0, or (dividing through by 3) 2x1 + x2 + x3 + x4 = 0. So now we have:

Eq1: x5 = x1 + 2x3 − x4

Eq2: 2x1 + x2 + x3 + x4 = 0

Eq3: 2x1 + x2 + x3 + x4 = 0

The duplication signals the linear dependence among {a, b, c}. (It turns out that 2a − 3b + c = 0, but you don't need to know that to proceed.)
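You can confirm that dependence relation directly. A one-off sketch in Python (a is as stated earlier; b and c are my reading of the dot-product equations above, so treat them as a transcription):

```python
# Verify the claimed dependence 2a - 3b + c = 0.
a = [1, 0, 2, -1, -1]
b = [2, 1, 1, 1, 0]
c = [4, 3, -1, 5, 2]
combo = [2 * ai - 3 * bi + ci for ai, bi, ci in zip(a, b, c)]
print(combo)  # prints [0, 0, 0, 0, 0]
```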

Now ignore Eq3 as redundant. Solve Eq2 for x4, getting x4 = −2x1 − x2 − x3. Plug this into Eq1 (which we want to keep solved for x5): x5 = x1 + 2x3 − (−2x1 − x2 − x3).

So we have x5 = 3x1 + x2 + 3x3 and x4 = −2x1 − x2 − x3. That's it: 2 equations (after throwing out the redundant one) in 5 unknowns, solving 2 unknowns in terms of the other 3.

That's 2 dependent parameters and 3 independent parameters. Now here's where the basis vectors for W magically fall out. Plug this back into v to get:

v = (x1, x2, x3, x4, x5) = ( x1, x2, x3, ( -2x1 - x2 - x3 ), ( 3x1 + x2 + 3x3 ) ) (and now write that as the sum of 3 vectors, EACH INVOLVING ONLY ONE FREE PARAMETER)

= ( x1, 0, 0, -2x1, 3x1 ) + ( 0, x2, 0, - x2, x2 ) + ( 0, 0, x3, -x3, 3x3 ) = x1 * ( 1, 0, 0, -2, 3 ) + x2 * ( 0, 1, 0, -1, 1 ) + x3 * ( 0, 0, 1, -1, 3 ).

We've shown that if v is in W, then there are real numbers x1, x2, x3 such that v = x1 * ( 1, 0, 0, -2, 3 ) + x2 * ( 0, 1, 0, -1, 1 ) + x3 * ( 0, 0, 1, -1, 3 ).

Likewise, for any real numbers s1, s2, s3, doing the same calculations in reverse shows that the vector u = s1 * ( 1, 0, 0, -2, 3 ) + s2 * ( 0, 1, 0, -1, 1 ) + s3 * ( 0, 0, 1, -1, 3 )

will satisfy u.a = u.b = u.c = 0, and thus u will be in W.
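That reverse direction is easy to spot-check numerically. A minimal sketch in Python, assuming my transcription of a, b, c from the equations above; it tries many random integer choices of s1, s2, s3:

```python
# Spot-check: any combination u of the 3 basis vectors dots to 0
# against each of a, b, and c.
import random

a = [1, 0, 2, -1, -1]
b = [2, 1, 1, 1, 0]
c = [4, 3, -1, 5, 2]
basis = [[1, 0, 0, -2, 3], [0, 1, 0, -1, 1], [0, 0, 1, -1, 3]]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

for _ in range(100):
    s = [random.randint(-10, 10) for _ in range(3)]
    u = [sum(s[i] * basis[i][j] for i in range(3)) for j in range(5)]
    assert dot(u, a) == dot(u, b) == dot(u, c) == 0
print("all checks passed")
```

Of course a random check isn't a proof; the algebra above is the proof, and this just guards against arithmetic slips.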

That proves that W = Linear Span { ( 1, 0, 0, -2, 3 ), ( 0, 1, 0, -1, 1 ), ( 0, 0, 1, -1, 3 ) }, because the elements of W are exactly all the R-linear combinations of those 3 vectors.

You can check quickly that those 3 vectors are linearly independent by looking at their first 3 coordinates (though of course, we already knew dim W = 3, so they had to be independent).
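To make that observation concrete (just a sketch, nothing deep): the first 3 coordinates of the 3 vectors form the 3x3 identity, so any dependence relation forces all three coefficients to be 0.

```python
# The first 3 coordinates of the basis vectors form the identity matrix.
basis = [[1, 0, 0, -2, 3], [0, 1, 0, -1, 1], [0, 0, 1, -1, 3]]
first3 = [v[:3] for v in basis]
print(first3)  # prints [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```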

ANSWER: Thus those 3 vectors, { ( 1, 0, 0, -2, 3 ), ( 0, 1, 0, -1, 1 ), ( 0, 0, 1, -1, 3 ) }, are a basis for W, and so dim W = 3.

CHECK: Do 9 dot products: each of the 3 proposed basis vectors with each of a, b, and c. (Actually you only need 6 dot products; there's no need to check against c, since c is a linear combination of a and b.) Doing that, it checks out, and the problem is solved.
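Here's that CHECK step as a short Python sketch (again, b and c are my transcription from the dot-product equations above):

```python
# All 9 dot products: each proposed basis vector against each of a, b, c.
a = [1, 0, 2, -1, -1]
b = [2, 1, 1, 1, 0]
c = [4, 3, -1, 5, 2]
basis = [[1, 0, 0, -2, 3], [0, 1, 0, -1, 1], [0, 0, 1, -1, 3]]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

results = [dot(w, x) for w in basis for x in (a, b, c)]
print(results)  # prints nine zeros
```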

To do it with matrices:

I'll just describe it. Dot products are row-on-column matrix multiplications. Thus a.v = b.v = c.v = 0 means that the 3x5 matrix with row1 = a, row2 = b, and row3 = c, when applied to the vector v (treated as a 5x1 matrix), multiplies out to the 3x1 zero matrix. Call that 3x5 matrix A. Then Av = 0, so v is in the kernel of A ("nullspace" is often used instead of "kernel", with an identical meaning). Thus W = Kernel(A).

So this procedure is EXACTLY the same as the procedure for finding the basis vectors for the kernel of a matrix (or linear map, depending on how one is conceptualizing the situation - it amounts to the same thing).
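To make the matrix route concrete, here's a minimal pure-Python sketch: row-reduce A (rows a, b, c, per my transcription above) with exact fraction arithmetic, then read off one nullspace basis vector per free column. The function names and the free-variable convention are my own choices, not from your solution; note this convention takes x3, x4, x5 as free, so it produces a *different* basis for the same W than the algebraic route above did.

```python
# Nullspace of A via reduced row echelon form, with exact arithmetic.
from fractions import Fraction

def rref(rows):
    """Reduced row echelon form; returns (matrix, pivot column indices)."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots = []
    r = 0
    for col in range(len(m[0])):
        # Find a row at or below r with a nonzero entry in this column.
        pivot_row = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if pivot_row is None:
            continue
        m[r], m[pivot_row] = m[pivot_row], m[r]
        m[r] = [x / m[r][col] for x in m[r]]          # scale pivot to 1
        for i in range(len(m)):                        # clear the column
            if i != r and m[i][col] != 0:
                factor = m[i][col]
                m[i] = [x - factor * y for x, y in zip(m[i], m[r])]
        pivots.append(col)
        r += 1
    return m, pivots

def nullspace_basis(rows):
    """One basis vector per free column: set that free variable to 1."""
    m, pivots = rref(rows)
    n = len(rows[0])
    free = [col for col in range(n) if col not in pivots]
    basis = []
    for f in free:
        v = [Fraction(0)] * n
        v[f] = Fraction(1)
        for r, p in enumerate(pivots):
            v[p] = -m[r][f]
        basis.append(v)
    return basis

A = [[1, 0, 2, -1, -1], [2, 1, 1, 1, 0], [4, 3, -1, 5, 2]]
for v in nullspace_basis(A):
    print([int(x) for x in v])
# prints [-2, 3, 1, 0, 0], [1, -3, 0, 1, 0], [1, -2, 0, 0, 1]:
# three vectors, so dim W = 3, spanning the same W as before.
```

Doing this by hand, side by side with the algebraic elimination, is exactly the exercise suggested below.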

Your error was in not interpreting the result of your row-reduced matrix. Assuming you did the arithmetic correctly, it was correct to set up that matrix and row-reduce it. (To really understand this, do the problem both ways, with matrices and algebraically, side by side, step by step.) You just didn't carry the problem through, and tried to read off the answer without understanding exactly what was going on.

Hope that helps...