Math Help - A linear map and eigenvalue problem

  1. #1
    Junior Member gusztav's Avatar
    Joined
    Jan 2008
    Posts
    48
    Awards
    1

    A linear map and eigenvalue problem

    Hi,

    I'd be immensely grateful for any help with this problem:

    Let V be a finite-dimensional vector space, and let

    A : V \to V be a linear operator (a linear map) such that AB=BA, for every linear map B : V \to V.

    Prove that there exists a scalar \alpha such that A=\alpha I.


    Oh, and there has been a hint provided:

    Hint: show that A has at least one eigenvalue and observe the corresponding eigenspace.


    ***

    First, how could I show that A has at least one eigenvalue? Of course, if A has an eigenvalue \lambda, then there exists a vector x, x \neq 0, such that Ax=\lambda x; but how does one show that such an eigenvalue exists in the first place?

    And I would really appreciate if someone could show me how to proceed from that to the final solution.

    Many thanks!
    Follow Math Help Forum on Facebook and Google+

  2. #2
    Global Moderator

    Joined
    Nov 2005
    From
    New York City
    Posts
    10,616
    Thanks
    10
    Quote Originally Posted by gusztav View Post
    Hi,

    I'd be immensely grateful for any help with this problem:

    Let V be a finite-dimensional vector space, and let

    A : V \to V be a linear operator (a linear map) such that AB=BA, for every linear map B : V \to V.

    Prove that there exists a scalar \alpha such that A=\alpha I.
    If you pick a particular basis for V, then A can be represented by a matrix [A] and B by a matrix [B]. We are told that AB = BA, so [AB] = [BA] \implies [A][B] = [B][A]. Therefore A: V\to V is a linear transformation whose matrix commutes with all other matrices, and to complete your problem we need to show that [A] is a scalar multiple of the identity matrix.

    Let [A] = (a_{ij}), and define E_{ij} to be the matrix whose ij entry is 1 and whose other entries are all 0. Notice that I + E_{ij} is always invertible. Thus, if [A] commutes with every matrix, then (I + E_{ij})[A] = [A](I + E_{ij}). If i \not = j, the ij-entry of that matrix equation tells us that a_{ij} + a_{jj} = a_{ij} + a_{ii} \implies a_{ii} = a_{jj}, i.e. the matrix [A] has all its diagonal entries equal. Now consider the same matrix equation (I + E_{ij})[A] = [A](I + E_{ij}): the ii-entry tells us a_{ii} + a_{ji} = a_{ii} \implies a_{ji} = 0, so every off-diagonal entry vanishes. Thus [A] is a scalar multiple of the identity matrix.
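    Not part of the original argument, but here is a quick numerical sanity check of it (a Python/NumPy sketch; the example matrices and the helper name `commutes_with_all_units` are my own): a scalar matrix commutes with every I + E_{ij}, while a non-scalar matrix fails for some i, j.

    ```python
    import numpy as np

    def commutes_with_all_units(A, tol=1e-12):
        """Check whether A commutes with I + E_ij for every pair (i, j)."""
        n = A.shape[0]
        I = np.eye(n)
        for i in range(n):
            for j in range(n):
                E = np.zeros((n, n))
                E[i, j] = 1.0
                M = I + E
                if not np.allclose(M @ A, A @ M, atol=tol):
                    return False
        return True

    scalar = 3.0 * np.eye(3)               # a scalar multiple of the identity
    nonscalar = np.diag([1.0, 2.0, 3.0])   # diagonal, but not scalar

    print(commutes_with_all_units(scalar))     # True
    print(commutes_with_all_units(nonscalar))  # False
    ```

    For `nonscalar`, the test already fails at i = 0, j = 1: comparing the (0,1) entries of (I + E_{01})A and A(I + E_{01}) reproduces exactly the a_{ii} = a_{jj} constraint from the post.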

  3. #3
    Super Member Gamma's Avatar
    Joined
    Dec 2008
    From
    Iowa City, IA
    Posts
    517
    Alternatively, the hypothesis tells you that A commutes with all linear transformations B; that is, its matrix lies in the center of the ring of n \times n matrices (in particular in Z(GL(n,\mathbb{F}))), where n = \dim(V). The center consists precisely of the scalar matrices. But I like TPH's method way better; it gets right down to it.

  4. #4
    Junior Member gusztav's Avatar
    Joined
    Jan 2008
    Posts
    48
    Awards
    1
    Quote Originally Posted by ThePerfectHacker View Post
    If you pick a particular basis for V, then A can be represented by a matrix [A] and B by a matrix [B]. We are told that AB = BA, so [AB] = [BA] \implies [A][B] = [B][A]. Therefore A: V\to V is a linear transformation whose matrix commutes with all other matrices, and to complete your problem we need to show that [A] is a scalar multiple of the identity matrix. Let [A] = (a_{ij}), and define E_{ij} to be the matrix whose ij entry is 1 and whose other entries are all 0. Notice that I + E_{ij} is always invertible. Thus, if [A] commutes with every matrix, then (I + E_{ij})[A] = [A](I + E_{ij}). If i \not = j, the ij-entry of that matrix equation tells us that a_{ij} + a_{jj} = a_{ij} + a_{ii} \implies a_{ii} = a_{jj}, i.e. the matrix [A] has all its diagonal entries equal. The ii-entry tells us a_{ii} + a_{ji} = a_{ii} \implies a_{ji} = 0. Thus [A] is a scalar multiple of the identity matrix.
    Brilliant!

    ThePerfectHacker, thank you SO much! This has been of great help!

  5. #5
    Lord of certain Rings
    Isomorphism's Avatar
    Joined
    Dec 2007
    From
    IISc, Bangalore
    Posts
    1,465
    Thanks
    6
    Quote Originally Posted by gusztav View Post
    Hi,

    I'd be immensely grateful for any help with this problem:

    Let V be a finite-dimensional vector space, and let

    A : V \to V be a linear operator (a linear map) such that AB=BA, for every linear map B : V \to V.

    Prove that there exists a scalar \alpha such that A=\alpha I.


    Oh, and there has been a hint provided:

    Hint: show that A has at least one eigenvalue and observe the corresponding eigenspace.
    I knew TPH's way of proving it. But does anyone know a way of doing it along the lines of the hint?

  6. #6
    Junior Member gusztav's Avatar
    Joined
    Jan 2008
    Posts
    48
    Awards
    1
    Quote Originally Posted by Isomorphism View Post
    I knew TPH's way of proving it. But does anyone know a way of doing it along the lines of the hint?
    I tried using the hint, but the first obstacle was trying to prove that A has at least one eigenvalue. How can we do that?

    We are given a linear operator A, and everything we know about it, is that AB=BA, for every linear operator B:V \to V.

    This means that A(Bx)=B(Ax), (\forall x \in V)(\forall B \in L(V)).

    Write x' := Bx; then we must find a \lambda \in \mathbb{F} (the field) such that Ax'=\lambda x'.

    If I'm not mistaken, if we manage to find a nonzero vector x' \in V and a scalar \lambda \in \mathbb{F} such that Ax'=\lambda x', then the existence of at least one eigenvalue will have been proved.

    But how to do that is a different kettle of fish...

  7. #7
    Junior Member gusztav's Avatar
    Joined
    Jan 2008
    Posts
    48
    Awards
    1
    Quote Originally Posted by original problem View Post
    Let V be a finite-dimensional vector space, and let

    A : V \to V be a linear operator (a linear map) such that AB=BA, for every linear map B : V \to V.

    Prove that there exists a scalar \alpha such that A=\alpha I.

    Hint: show that A has at least one eigenvalue and observe the corresponding eigenspace.
    There was also a second hint:

    Show that every eigenspace of an operator S\in L(V) is invariant for every operator T \in L(V) which commutes with S (in other words, such that ST=TS).

    But this can be shown like this:
    let S, T \in L(V) with ST=TS, and let W be the eigenspace of S corresponding to an eigenvalue \lambda. Let w \in W be an arbitrarily chosen vector.
    We have: S(T(w))=T(S(w))=T(\lambda w)= \lambda T(w),
    therefore T(w) \in W.
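    A quick numerical illustration of this invariance (a Python/NumPy sketch; the particular matrices S and T are made-up examples, not from the problem): S has eigenvalue 2 with eigenspace spanned by the first two basis vectors, T is block-diagonal so it commutes with S, and T maps an eigenvector back into the same eigenspace.

    ```python
    import numpy as np

    # S has eigenvalue 2 with eigenspace W = span{e1, e2}.
    S = np.diag([2.0, 2.0, 5.0])

    # T is block-diagonal with the same block structure, so ST = TS.
    T = np.array([[1.0, 4.0, 0.0],
                  [7.0, 3.0, 0.0],
                  [0.0, 0.0, 9.0]])

    assert np.allclose(S @ T, T @ S)   # the commuting hypothesis

    w = np.array([1.0, -2.0, 0.0])     # an arbitrary vector in W
    assert np.allclose(S @ w, 2.0 * w) # w is an eigenvector for lambda = 2

    Tw = T @ w
    # T(w) is again an eigenvector for lambda = 2, i.e. T(W) is contained in W:
    print(np.allclose(S @ Tw, 2.0 * Tw))  # True
    ```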

    *******


    To return to the starting problem: it seems to me that if we can establish A=\alpha I using just one suitable operator B with AB=BA, then by the preceding statement the conclusion can be extended to every B \in L(V), if I'm not mistaken...

  8. #8
    MHF Contributor Swlabr's Avatar
    Joined
    May 2009
    Posts
    1,176
    If you show that the eigenspace is the entire vector space, then you are done: A-\lambda I is then the zero map, and so A=\lambda I. I'm not quite sure how you would show that about the eigenspace, though.

    On another train of thought, the matrix of A is the same under any change of basis - does that not automatically make it a scalar multiple of the identity? Or are other matrices also preserved?

  9. #9
    MHF Contributor

    Joined
    May 2008
    Posts
    2,295
    Thanks
    7
    Proving that A has an eigenvalue is quite easy:

    let \{v_1, \cdots , v_n \} be a basis for V and define the linear map B by B(r_1v_1 + \cdots + r_nv_n)=r_1v_1, so the image of B is \text{span} \{v_1 \}. Now A(v_1)=AB(v_1)=BA(v_1) \in \text{span} \{v_1 \}.

    So there exists \lambda such that A(v_1)=\lambda v_1. (The same argument, with B replaced by the projection onto v_j, shows that in fact every v_j is an eigenvector of A.)
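    The projection trick can be checked numerically too (a Python/NumPy sketch; the matrices A_scalar and A_other are my own examples): a scalar A commutes with the projection B onto v_1 and B A(v_1) = A(v_1) lands in span{v_1}, while a non-scalar A already fails to commute with this one B.

    ```python
    import numpy as np

    n = 3
    # B projects onto span{v1} in the standard basis: B(r1 v1 + ... + rn vn) = r1 v1.
    B = np.zeros((n, n))
    B[0, 0] = 1.0
    v1 = np.eye(n)[0]

    A_scalar = 4.0 * np.eye(n)                 # a scalar multiple of the identity
    A_other = np.array([[1.0, 2.0, 0.0],       # not scalar
                        [0.0, 1.0, 0.0],
                        [0.0, 0.0, 1.0]])

    # The scalar A commutes with B, so A(v1) = AB(v1) = BA(v1) lies in span{v1}:
    assert np.allclose(A_scalar @ B, B @ A_scalar)
    print(A_scalar @ (B @ v1))   # a multiple of v1 (eigenvalue 4)

    # A non-scalar matrix need not commute with this particular projection:
    print(np.allclose(A_other @ B, B @ A_other))  # False
    ```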

