# Thread: Basis and Derivative Transformation Question

1. ## Basis and Derivative Transformation Question

True or False: Provide a QUICK proof or a COUNTER-EXAMPLE to each of the following:

1.) If v1,...,vn are any vectors in a vector space V, then the set {v1,...,vn} is a basis for the subspace span {v1,...,vn}.

I believe that this is false because if you had n vectors which were scalar multiples of each other, then the set wouldn't be a basis because it wouldn't be linearly independent, which a basis requires. Is that correct / on the right track?

2.) The derivative transformation d/dx: R[x] ---> R[x] has no eigenvalues or eigenvectors.

I believe that this is false, but I'm less confident on this one. I was thinking that if you took the derivative of a degree-0 polynomial, i.e. any constant, then the statement wouldn't hold? Any help on this one would be appreciated!

2. ## Re: Basis and Derivative Transformation Question

Hey TimsBobby2.

For the first one: you need to consider the three subspace conditions, namely a) the zero vector being in the subspace, b) closure under addition, and c) closure under scalar multiplication.

A basis for a subspace does not have to be the same as a basis for the whole space.

For number 2, write down what an eigenvector would be: a nonzero polynomial p with d/dx p = λp for some scalar λ.

Compare the degrees of the two sides: if p is nonconstant and λ is nonzero, can the degrees ever match? (Hint: also think about what happens when λ = 0 and p is a constant.)

3. ## Re: Basis and Derivative Transformation Question

Originally Posted by TimsBobby2
True or False: Provide a QUICK proof or a COUNTER-EXAMPLE to each of the following:

1.) If v1,...,vn are any vectors in a vector space V, then the set {v1,...,vn} is a basis for the subspace span {v1,...,vn}.

I believe that this is false because if you had n vectors which were scalar multiples of each other, then the set wouldn't be a basis because it wouldn't be linearly independent, which a basis requires. Is that correct / on the right track?
Yes, that is on the right track. Now, write out what you have said: Let $v_1$ be some vector v, then $v_2= 2v$, etc. A simpler example is just to include the 0 vector in the set!
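For concreteness, the counterexample sketched above can be written out (the specific vectors here are just one illustrative choice):

```latex
% Counterexample to 1): two parallel vectors in R^2.
Let $v_1 = (1,0)$ and $v_2 = (2,0)$ in $V = \mathbb{R}^2$. Then
\[
  \operatorname{span}\{v_1, v_2\} = \{(t,0) : t \in \mathbb{R}\},
\]
but $\{v_1, v_2\}$ is not a basis for this subspace, since
$v_2 - 2v_1 = 0$ shows the set is linearly dependent.
% Even simpler: if $0 \in \{v_1,\dots,v_n\}$, the set is
% automatically dependent, so it cannot be a basis.
```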

2.) The derivative transformation d/dx: R[x] ---> R[x] has no eigenvalues or eigenvectors.

I believe that this is false, but I'm less confident on this one. I was thinking that if you took the derivative of a degree-0 polynomial, i.e. any constant, then the statement wouldn't hold? Any help on this one would be appreciated!
I don't understand what you mean by "wouldn't work". It is certainly true that if f(x) is a constant then df/dx = 0. That is, df/dx = 0f. What does that tell you about "eigenvalues" and "eigenvectors"?
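Spelling out the hint above as an eigenvalue equation:

```latex
% Any nonzero constant polynomial is an eigenvector with eigenvalue 0:
\[
  \frac{d}{dx}\, c = 0 = 0 \cdot c \qquad (c \neq 0),
\]
so $\lambda = 0$ is an eigenvalue of $\frac{d}{dx}$ on $\mathbb{R}[x]$,
and the claim in 2) is false.
% By contrast, no nonzero $\lambda$ works: if $\deg p = n \geq 1$,
% then $\deg p' = n - 1 < n = \deg(\lambda p)$, so $p' \neq \lambda p$.
```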