# Thread: A linear map and eigenvalue problem

1. ## A linear map and eigenvalue problem

Hi,

I'd be immensely grateful for any help with this problem:

Let $V$ be a finite-dimensional vector space, and let

$A : V \to V$ be a linear operator (a linear map) such that $AB=BA$, for every linear map $B : V \to V$.

Prove that there exists a scalar $\alpha$ such that $A=\alpha I$.

Oh, and there has been a hint provided:

Hint: show that $A$ has at least one eigenvalue and observe the corresponding eigenspace.

***

First, how could I show that $A$ has at least one eigenvalue? Of course, if $A$ has an eigenvalue $\lambda$, then there exists a vector $x$, $x \neq 0$, such that $Ax=\lambda x$; but how to show that there is such an eigenvalue in the first place?

And I would really appreciate if someone could show me how to proceed from that to the final solution.

Many thanks!

2. Originally Posted by gusztav
Hi,

I'd be immensely grateful for any help with this problem:

Let $V$ be a finite-dimensional vector space, and let

$A : V \to V$ be a linear operator (a linear map) such that $AB=BA$, for every linear map $B : V \to V$.

Prove that there exists a scalar $\alpha$ such that $A=\alpha I$.
If you pick a particular basis for $V$ then $A$ can be represented by a matrix $[A]$, and $B$ can be represented by a matrix $[B]$. We are told that $AB = BA$, so $[AB] = [BA] \implies [A][B] = [B][A]$. Therefore $A: V\to V$ is a linear transformation whose matrix commutes with all other matrices, and to complete the problem we need to show that $[A]$ is a scalar multiple of the identity matrix. Let $[A] = (a_{ij})$, and let $E_{ij}$ denote the matrix whose $ij$ entry is $1$ and all other entries are $0$. Notice that $I + E_{ij}$ is always invertible. Thus, since $[A]$ commutes with every matrix, $(I + E_{ij})[A] = [A](I + E_{ij})$. For $i \neq j$, comparing the $ij$-entries of the two sides gives $a_{ij} + a_{jj} = a_{ij} + a_{ii} \implies a_{ii}= a_{jj}$, i.e. the matrix $[A]$ has all its diagonal entries equal. Now consider the same matrix equation $(I + E_{ij})[A] = [A](I + E_{ij})$: comparing the $ii$-entries (still with $i \neq j$) gives $a_{ii} + a_{ji} = a_{ii} \implies a_{ji} = 0$, so every off-diagonal entry vanishes. Thus $[A]$ is a scalar multiple of the identity matrix.
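As a quick sanity check of the entry-by-entry argument above, here is a minimal plain-Python sketch (the example matrices are hypothetical, chosen just for illustration): a scalar matrix commutes with every $I + E_{ij}$, while a matrix with unequal diagonal entries fails to.

```python
# Sanity check: a matrix commutes with every I + E_ij
# exactly when it is a scalar multiple of the identity.

def matmul(X, Y):
    """Multiply two square matrices given as lists of lists."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def eye_plus_unit(i, j, n):
    """The invertible matrix I + E_ij used in the proof."""
    M = [[1 if r == c else 0 for c in range(n)] for r in range(n)]
    M[i][j] += 1
    return M

def commutes_with_all_units(A):
    """Check (I + E_ij) A == A (I + E_ij) for all i, j."""
    n = len(A)
    return all(matmul(eye_plus_unit(i, j, n), A) == matmul(A, eye_plus_unit(i, j, n))
               for i in range(n) for j in range(n))

scalar = [[3, 0, 0], [0, 3, 0], [0, 0, 3]]      # 3I: commutes with everything
nonscalar = [[1, 2, 0], [0, 1, 0], [0, 0, 5]]   # unequal diagonal: must fail

print(commutes_with_all_units(scalar))     # True
print(commutes_with_all_units(nonscalar))  # False
```

This only spot-checks one size and two matrices, of course; the proof above is what covers the general case.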

3. Alternatively, the hypothesis given tells you that $A$ commutes with all linear transformations $B$. That is, its matrix lies in the center of the ring of $n \times n$ matrices (and in particular in $Z(GL(n,\mathbb{F}))$), where $n = \dim(V)$. That center consists precisely of the scalar matrices. But I like TPH's method way better; it gets right down to it.

4. Originally Posted by ThePerfectHacker
If you pick a particular basis for $V$ then $A$ can be represented by a matrix $[A]$, and $B$ can be represented by a matrix $[B]$. We are told that $AB = BA$, so $[AB] = [BA] \implies [A][B] = [B][A]$. Therefore $A: V\to V$ is a linear transformation whose matrix commutes with all other matrices, and to complete the problem we need to show that $[A]$ is a scalar multiple of the identity matrix. Let $[A] = (a_{ij})$, and let $E_{ij}$ denote the matrix whose $ij$ entry is $1$ and all other entries are $0$. Notice that $I + E_{ij}$ is always invertible. Thus, since $[A]$ commutes with every matrix, $(I + E_{ij})[A] = [A](I + E_{ij})$. For $i \neq j$, comparing the $ij$-entries of the two sides gives $a_{ij} + a_{jj} = a_{ij} + a_{ii} \implies a_{ii}= a_{jj}$, i.e. the matrix $[A]$ has all its diagonal entries equal. Now consider the same matrix equation $(I + E_{ij})[A] = [A](I + E_{ij})$: comparing the $ii$-entries (still with $i \neq j$) gives $a_{ii} + a_{ji} = a_{ii} \implies a_{ji} = 0$, so every off-diagonal entry vanishes. Thus $[A]$ is a scalar multiple of the identity matrix.
Brilliant!

ThePerfectHacker, thank you SO much! This has been of great help!

5. Originally Posted by gusztav
Hi,

I'd be immensely grateful for any help with this problem:

Let $V$ be a finite-dimensional vector space, and let

$A : V \to V$ be a linear operator (a linear map) such that $AB=BA$, for every linear map $B : V \to V$.

Prove that there exists a scalar $\alpha$ such that $A=\alpha I$.

Oh, and there has been a hint provided:

Hint: show that $A$ has at least one eigenvalue and observe the corresponding eigenspace.
I knew TPH's way of proving it. But does anyone know a way of doing it along the lines of the hint?

6. Originally Posted by Isomorphism
I knew TPH's way of proving it. But does anyone know a way of doing it along the lines of the hint?
I tried using the hint, but the first obstacle was trying to prove that $A$ has at least one eigenvalue. How can we do that?

We are given a linear operator $A$, and everything we know about it is that $AB=BA$ for every linear operator $B:V \to V$.

This means that $A(Bx)=B(Ax), (\forall x \in V)(\forall B \in L(V))$.

Let's write $x' := Bx$; then we must find a $\lambda \in \mathbb{F}$ (the field) such that $Ax'=\lambda x'$.

If I'm not mistaken, if we manage to find a vector $x' \in V$, $x' \neq 0$, and a scalar $\lambda \in \mathbb{F}$ such that $Ax'=\lambda x'$, then the existence of at least one eigenvalue will have been proved.

But how to do that is a different kettle of fish...

7. Originally Posted by original problem
Let $V$ be a finite-dimensional vector space, and let

$A : V \to V$ be a linear operator (a linear map) such that $AB=BA$, for every linear map $B : V \to V$.

Prove that there exists a scalar $\alpha$ such that $A=\alpha I$.

Hint: show that $A$ has at least one eigenvalue and observe the corresponding eigenspace.
There was also a second hint:

Show that every eigenspace of an operator $S\in L(V)$ is invariant for every operator $T \in L(V)$ which commutes with $S$ (in other words, such that $ST=TS$).

But this can be shown like this:
let $S, T \in L(V)$ with $ST=TS$, and let $W$ be the eigenspace of $S$ corresponding to an eigenvalue $\lambda$. Let $w \in W$ be an arbitrarily chosen vector.
We have: $S(T(w))=T(S(w))=T(\lambda w)= \lambda T(w)$,
therefore $T(w) \in W$.
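This invariance is easy to spot-check numerically. A minimal plain-Python sketch, with hypothetical example matrices ($S$ diagonal and $T$ block-diagonal, so they commute): applying $T$ to an eigenvector of $S$ lands back in the same eigenspace.

```python
# Spot-check: if ST = TS and S w = lam * w, then T(w) stays in the
# lam-eigenspace of S, i.e. S(T w) = lam * (T w).

def matmul(X, Y):
    """Multiply two square matrices given as lists of lists."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matvec(M, v):
    """Apply a matrix to a vector."""
    return [sum(M[i][k] * v[k] for k in range(len(v))) for i in range(len(M))]

S = [[2, 0, 0], [0, 2, 0], [0, 0, 5]]   # eigenspace for lam = 2 is span{e1, e2}
T = [[1, 2, 0], [3, 4, 0], [0, 0, 7]]   # block-diagonal, hence ST = TS
assert matmul(S, T) == matmul(T, S)

lam, w = 2, [1, 0, 0]                   # S w = 2 w
Tw = matvec(T, w)                       # T(w) = [1, 3, 0]
print(matvec(S, Tw) == [lam * x for x in Tw])  # True: T(w) is still in W
```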

*******

To return to the original problem: it seems to me that if we can find the scalar $\alpha$ with $A=\alpha I$ using just one suitable operator $B$ satisfying $AB=BA$, then by the preceding statement the conclusion can be shown to hold for every $B \in L(V)$, if I'm not mistaken...

8. If you show that the eigenspace is the entire vector space, then you are done: $A-\lambda I$ is then the zero map, and so $A=\lambda I$. I'm not quite sure how you would show that about the eigenspace, though.

On another train of thought: the matrix of $A$ is the same under any change of basis. Does that not automatically make it a scalar multiple of the identity? Or are other matrices also preserved?

9. Proving that $A$ has an eigenvalue is quite easy:

let $\{v_1, \cdots , v_n \}$ be a basis for $V$ and define the linear map $B$ by $B(r_1v_1 + \cdots + r_nv_n)=r_1v_1,$ so that the image of $B$ is $\mathrm{span}\{v_1\}.$ Now $A(v_1)=AB(v_1)=BA(v_1) \in \mathrm{span}\{v_1\}.$

So there exists $\lambda$ such that $A(v_1)=\lambda v_1.$ (A similar choice of $B$ shows that in fact every $v_j$ is an eigenvector of $A.$)
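To see this construction concretely: in the standard basis, the map $B$ above is just the matrix with a single $1$ in the top-left corner. Here is a minimal plain-Python sketch (the matrix $A$ below is a hypothetical example that commutes with this particular $B$ without being scalar, which is enough to force $v_1$ to be an eigenvector):

```python
# The map B(r1 v1 + ... + rn vn) = r1 v1 is, in the standard basis,
# the matrix with a single 1 in the top-left corner.

def matmul(X, Y):
    """Multiply two square matrices given as lists of lists."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matvec(M, v):
    """Apply a matrix to a vector."""
    return [sum(M[i][k] * v[k] for k in range(len(v))) for i in range(len(M))]

n = 3
B = [[1 if (i, j) == (0, 0) else 0 for j in range(n)] for i in range(n)]

# A hypothetical A that commutes with this B (but not with everything):
# commuting with B alone already forces e1 to be an eigenvector of A.
A = [[4, 0, 0], [0, 2, 1], [0, 5, 3]]
assert matmul(A, B) == matmul(B, A)

e1 = [1, 0, 0]
print(matvec(A, e1))  # [4, 0, 0], i.e. A e1 = 4 e1, so e1 is an eigenvector
```

The full problem then needs $A$ to commute with the analogous projection onto each $v_j$ (and a little more), which is what pins $A$ down to a single scalar multiple of $I$.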