# Thread: Scalar multiplication of the identity operator

1. ## Scalar multiplication of the identity operator

The question I'm trying to answer says:

Suppose T is an element of L(V) such that every non-zero vector in V is an eigenvector of T. Prove that T is a scalar multiple of the identity operator.

The way I look at it: for every non-zero vector of V to be an eigenvector of T, T must be the identity operator itself, or some scalar multiple of it, since otherwise it could not act this way on every vector of V. I'm wondering how I would go about proving this, or whether I am on the wrong track completely.

Any help is greatly appreciated!

2. Originally Posted by falloutboy10
Let $\{v_1,\dots,v_n\}$ be a basis of the vector space, so $Tv_i=\lambda_iv_i$ for some scalars $\lambda_i$.
Now, first show that $\lambda_i=\lambda_j\,\,\forall\,i\,,\,j$ (show this first for $v_1,v_2$; a simple induction then does the rest).
Next, after we know that $Tv_i=\lambda v_i\,\,\forall i$, check that $T-\lambda I=0$, the zero operator.

Tonio
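As a concrete numerical sketch of the last step in Tonio's hint (the dimension and the value of $\lambda$ here are my own illustrative choices, not part of the thread): once every basis vector is known to have the same eigenvalue $\lambda$, building the matrix of $T$ column by column from $Te_i=\lambda e_i$ gives exactly $\lambda I$, so $T-\lambda I$ is the zero operator.

```python
import numpy as np

# Illustrative values (not from the thread): a 4-dimensional space
# and a common eigenvalue lambda = 3.
lam = 3.0
n = 4

# Build the matrix of T column by column from its action on the
# standard basis: the i-th column is T e_i = lam * e_i.
T = np.column_stack([lam * e for e in np.eye(n)])

# With every basis eigenvalue equal, T - lam*I is the zero operator,
# i.e. T is scalar multiplication by lam.
print(np.allclose(T - lam * np.eye(n), 0))  # True
```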

3. Originally Posted by falloutboy10
Suppose there were two distinct eigenvalues of T, $\lambda_1$ and $\lambda_2$, and that $v_1$ and $v_2$ are non-zero eigenvectors corresponding to $\lambda_1$ and $\lambda_2$, respectively. Then $T(v_1+ v_2)= T(v_1)+ T(v_2)= \lambda_1v_1+ \lambda_2 v_2$. But since every non-zero vector is an eigenvector of T, there must exist some $\lambda_3$ such that $T(v_1+ v_2)= \lambda_3(v_1+ v_2)= \lambda_3v_1+ \lambda_3v_2$. That gives $\lambda_1v_1+ \lambda_2v_2= \lambda_3v_1+ \lambda_3v_2$, and from that, $(\lambda_1-\lambda_3)v_1= (\lambda_3- \lambda_2)v_2$.

Since $v_1$ and $v_2$ are non-zero eigenvectors corresponding to distinct eigenvalues, they are linearly independent, so neither can be a multiple of the other. But that means we must have $\lambda_1- \lambda_3= 0$ and $\lambda_3-\lambda_2= 0$, leading to the conclusion that $\lambda_1= \lambda_2$, a contradiction. That is, if every non-zero vector is an eigenvector of T, then T has only one eigenvalue $\lambda$; and then $Tv= \lambda v$ for every vector $v$ (trivially for $v= 0$), so $T= \lambda I$.
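To see the contradiction concretely, here is a small numerical sketch (the specific operator is my own illustration, not part of the thread): for $T=\mathrm{diag}(1,2)$ on $\mathbb{R}^2$, the sum of eigenvectors belonging to the two distinct eigenvalues fails to be an eigenvector.

```python
import numpy as np

# Illustrative operator (not from the thread): T has two distinct
# eigenvalues, 1 and 2.
T = np.diag([1.0, 2.0])
v1 = np.array([1.0, 0.0])   # eigenvector for eigenvalue 1
v2 = np.array([0.0, 1.0])   # eigenvector for eigenvalue 2

w = v1 + v2                 # w  = (1, 1)
Tw = T @ w                  # Tw = (1, 2)

# Tw is not parallel to w, so no lambda_3 with T w = lambda_3 w exists:
# w is not an eigenvector, matching the contradiction in the proof.
parallel = bool(np.isclose(Tw[0] * w[1], Tw[1] * w[0]))
print(parallel)  # False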

4. Thank you both! I completely get it now