In a past paper I'm stuck on this question.
Do I have to separate them according to linear dependence?
Are those vectors defined as being normal to the given planes? If so, the vectors are not unique, unless it's understood that they pass through the origin and are unit vectors, or something like that.
Part i) does not parse grammatically; I have no idea what's being asked there. "de" is not an English word, as far as I'm aware.
Part ii) makes sense only if you've correctly solved part i).
A little more clarification would be nice!
The problem says "being a basis". Yes, every basis is independent; if these vectors are not already a basis for whatever set they span (I presume that these span some subspace of ℝ³), then they must be dependent.
In each of those equations, you can solve for one of the variables, say z, in terms of the other two:
z = -3x - 2y,  z = -(1/3)x - (2/3)y,  z = -(5/2)x + (1/2)y,  and  z = (1/4)x + (3/4)y. That is, the subspaces spanned by each one separately are
(x, y, -3x - 2y) = x(1, 0, -3) + y(0, 1, -2)
(x, y, -(1/3)x - (2/3)y) = x(1, 0, -1/3) + y(0, 1, -2/3)
(x, y, -(5/2)x + (1/2)y) = x(1, 0, -5/2) + y(0, 1, 1/2)
(x, y, (1/4)x + (3/4)y) = x(1, 0, 1/4) + y(0, 1, 3/4).
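If you want a quick sanity check of that derivation, a few lines of Python confirm that each pair of spanning vectors really does satisfy its plane equation z = a·x + b·y (the coefficients (a, b) are read off the four equations for z above):

```python
# Each entry pairs the plane's (a, b) coefficients with the two
# spanning vectors derived for that plane.
planes = [
    ((-3.0, -2.0), [(1, 0, -3), (0, 1, -2)]),
    ((-1/3, -2/3), [(1, 0, -1/3), (0, 1, -2/3)]),
    ((-5/2, 1/2),  [(1, 0, -5/2), (0, 1, 1/2)]),
    ((1/4, 3/4),   [(1, 0, 1/4),  (0, 1, 3/4)]),
]

for (a, b), vectors in planes:
    for (x, y, z) in vectors:
        # Every spanning vector must satisfy z = a*x + b*y.
        assert abs(z - (a * x + b * y)) < 1e-12

print("all spanning vectors lie in their planes")
```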
Of course, if it is true that you can separate these into sets that each span the same subspace, then that subspace must have dimension 2, since you can separate them into two sets of two vectors each. I would suggest just trying pairs: do (v1, v2) and (v3, v4) span the same subspace? Do (v1, v3) and (v2, v4)? What about (v1, v4) and (v2, v3)?
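That pair-testing can be mechanized with a rank check: two pairs span the same plane exactly when stacking all four vectors into one matrix does not raise the rank. Here's a sketch using NumPy, with the spanning pairs derived above standing in for the OP's actual v1..v4 (which we don't have):

```python
import numpy as np

def same_span(pair_a, pair_b, tol=1e-10):
    """True iff the two pairs of vectors span the same subspace:
    rank(A) == rank(B) == rank(A stacked on B)."""
    a = np.array(pair_a, dtype=float)
    b = np.array(pair_b, dtype=float)
    combined = np.vstack([a, b])
    ra = np.linalg.matrix_rank(a, tol)
    rb = np.linalg.matrix_rank(b, tol)
    return ra == rb == np.linalg.matrix_rank(combined, tol)

# Spanning pairs for the first two planes worked out earlier:
p1 = [(1, 0, -3), (0, 1, -2)]      # plane z = -3x - 2y
p2 = [(1, 0, -1/3), (0, 1, -2/3)]  # plane z = -(1/3)x - (2/3)y

print(same_span(p1, p2))  # False: the two planes differ
print(same_span(p1, p1))  # True: a pair trivially shares a span with itself
```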
A set A of vectors spans a space if you can write any vector in the space as a linear combination of vectors in A. The vectors in A need not be linearly independent (i.e., you might have superfluous vectors in A). So, A spanning a space does not necessarily imply that the vectors in A are linearly independent.
A set B of vectors that are linearly independent might not span a particular vector space. In particular, if the number of vectors in B is less than the dimension of the vector space, you're guaranteed not to span it. So, linear independence does not necessarily imply that the vectors in B span the vector space.
However, if you have a set C of n linearly independent vectors in a vector space of dimension n, then C is a basis for the vector space, and spans it. A basis is kind of the "minimum required" set of vectors that spans a space.
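All three of those facts can be illustrated with small made-up examples and a rank computation (the rank of a stack of vectors equals the dimension of their span, and the vectors are independent exactly when the rank equals their count):

```python
import numpy as np

def rank(vectors):
    return np.linalg.matrix_rank(np.array(vectors, dtype=float))

# (1) Spanning does not imply independence: three vectors in R^2
# that span it but contain a superfluous vector.
A = [(1, 0), (0, 1), (1, 1)]
assert rank(A) == 2        # spans R^2 (rank equals the dimension)...
assert rank(A) < len(A)    # ...but the vectors are dependent

# (2) Independence does not imply spanning: two independent
# vectors cannot span R^3.
B = [(1, 0, 0), (0, 1, 0)]
assert rank(B) == len(B)   # independent...
assert rank(B) < 3         # ...but they do not span R^3

# (3) n independent vectors in an n-dimensional space are a basis.
C = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
assert rank(C) == len(C) == 3

print("all three checks pass")
```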
Hopefully, this will clear up any confusion about the relationship between spanning and linear independence.
I'm confused, too. The way the question in the OP is worded makes no sense to me. The way they define vectors is highly unconventional. Until it is made clear exactly which vectors they mean, it's not exactly productive to start doing calculations. Incidentally, in the real world, problem definition often takes up at least half of the time spent solving problems. The real world is vague and difficult to pin down sometimes. In this case, however, as HallsofIvy has already pointed out, this problem is very poorly edited, if it was edited at all.
However, once you do know exactly what the vectors are in the original set, it's fairly straightforward to do what needs to be done: partition the set into two pieces, one of which spans the same subspace as the whole set, and the other of which is linearly dependent on the first. The relative sizes of those two pieces will depend on the dimension of the span of the original set.
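That partition can be sketched as a greedy pass over the vectors: keep a vector if it raises the rank of what you've kept so far, otherwise it's dependent on the keepers. The vectors below are made up purely for illustration (v3 = v1 + v2):

```python
import numpy as np

def partition(vectors, tol=1e-10):
    """Split `vectors` into a maximal independent subset (which
    spans the same subspace as the whole set) and the leftovers,
    each of which is dependent on the first piece."""
    independent, dependent = [], []
    for v in vectors:
        candidate = independent + [v]
        m = np.array(candidate, dtype=float)
        if np.linalg.matrix_rank(m, tol) == len(candidate):
            independent.append(v)   # v adds a new direction
        else:
            dependent.append(v)     # v is a combination of keepers
    return independent, dependent

# Hypothetical example vectors; the third is the sum of the first two.
vs = [(1, 0, -3), (0, 1, -2), (1, 1, -5), (0, 0, 1)]
ind, dep = partition(vs)
print(ind)  # [(1, 0, -3), (0, 1, -2), (0, 0, 1)]
print(dep)  # [(1, 1, -5)]
```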
My advice: look for context in the book from which you have taken this problem. See how they define vectors in a similar fashion to this. See if you can derive a clearer method of defining the vectors that is equivalent to this method.