I have this problem, and I think I've solved the first part. I'd like you to check it, because I'm not sure whether I've proceeded correctly.
The problem says: Let $X$ be a subset of an inner product space. Find $X^\perp$ if:
a) $X = \{x\}$, where $x$ is the given vector
b) $X$ is a maximal linearly independent set
So, basically, what I did in a) was to state the orthogonality condition $\langle x, y \rangle = 0$, and then solve it for $y$.
Is this right in the first place?
And for b) I've stated a similar condition, where here $x$ and $y$ are vectors in $X$.
I know there is no other possible vector linearly independent of those in $X$, so I think there are no other orthogonal complements. But I don't know how to proceed from here.
Bye, and thank you for your help, which is always useful.
Don't forget that an orthogonal complement is always a linear subspace. In (a), for example, $X^\perp$ cannot consist of a single vector: it will have to be the one-dimensional subspace consisting of all scalar multiples of that vector.
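A quick numerical sketch of that point. The problem's actual vector didn't survive in the post, so $x = (1, 2)$ in $\mathbb{R}^2$ below is just an assumed example; the orthogonal complement of $\{x\}$ is the null space of the $1 \times 2$ row matrix $[x]$, which here comes out one-dimensional:

```python
import numpy as np
from scipy.linalg import null_space

# Hypothetical example vector; the problem's actual x was lost from the post.
x = np.array([[1.0, 2.0]])  # treat x as a 1x2 row matrix

# X-perp = {v : <x, v> = 0} = null space of the row matrix [x].
perp_basis = null_space(x)   # columns form an orthonormal basis of X-perp
print(perp_basis.shape)      # (2, 1): one basis vector, so dim X-perp = 1

# Every element of X-perp is a scalar multiple of this single basis vector.
v = perp_basis[:, 0]
print(np.isclose((x @ v).item(), 0.0))  # confirm v is orthogonal to x
```

So the answer to (a) is the whole line $\{\lambda v : \lambda \in \mathbb{R}\}$, not the single vector $v$.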
Your answer to (b) is essentially on the right lines, except that there is one vector, the zero vector, that lies in every linear subspace. So the answer here will be the (zero-dimensional) subspace $\{0\}$ consisting of that one vector.
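A sketch of the missing step, assuming (as your remark about there being no further independent vector suggests) that $X = \{e_1, \dots, e_n\}$ is a maximal linearly independent set in a real inner product space, so that it spans the space:

```latex
Let $v \in X^\perp$. Since $X$ spans the space, write $v = \sum_{i=1}^{n} c_i e_i$.
Then
\[
  \langle v, v \rangle
  = \Big\langle \sum_{i=1}^{n} c_i e_i,\; v \Big\rangle
  = \sum_{i=1}^{n} c_i \langle e_i, v \rangle
  = 0,
\]
because $v$ is orthogonal to every $e_i \in X$. Hence $v = 0$, and so
$X^\perp = \{0\}$.
```

(In the complex case the same argument goes through with conjugates on the coefficients.)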