
Math Help - Little exercise about eigenvalues

  1. #1 Newbie (Joined Mar 2007, Posts 24)

    Little exercise about eigenvalues

    Hi,

    I'm taking a course that uses a lot of linear algebra, and at one point it relies on the fact that, given two symmetric real matrices A and B, both positive (semi)definite, the product AB has only real and non-negative eigenvalues.

    I proved it, but my proof is too long; there must be a more obvious one.

    Here's my proof.

    Unless AB is the zero matrix (in which case there is nothing to prove), it is enough to show that every non-zero, possibly complex, eigenvalue of AB is in fact real and positive; since AB is real, any non-real eigenvalues and the corresponding eigenvectors would come in complex conjugate pairs.

    * is for conjugate, ' is for transpose.

    So let c be a non-zero (possibly complex) eigenvalue of AB, with corresponding eigenvector v:

    ABv=cv

    Let z=Bv. Then z is non-zero (otherwise cv=ABv would be zero, although c and v are not), and Az=cv; taking complex conjugates, Az*=c*v*.

    z*'Az is real and non-negative, because A is real symmetric and positive semidefinite.

    On the one hand z*'Az=cz*'v (using Az=cv); on the other hand z*'Az=(Az*)'z=c*v*'z (using A'=A).

    Now z*'v=v*'z=v*'Bv, which is real because B is real symmetric, and in fact strictly positive: for a positive semidefinite B, v*'Bv=0 would force Bv=0 (a standard property of positive semidefinite matrices), i.e. z=0, contradicting z being non-zero.

    So cv*'Bv=c*v*'Bv, which forces c=c*, i.e. c is real; and c=z*'Az/v*'Bv>=0, so c is in fact positive, since it was assumed non-zero.
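    (For anyone who wants to check numerically, here is a quick numpy sketch with randomly generated positive semidefinite A and B; it is only a sanity check, not part of the argument.)

        import numpy as np

        rng = np.random.default_rng(0)
        n = 5
        X = rng.standard_normal((n, n))
        Y = rng.standard_normal((n, n - 2))   # rank-deficient factor, so B is only semidefinite
        A = X @ X.T                           # symmetric positive (semi)definite
        B = Y @ Y.T                           # symmetric positive semidefinite, rank n-2

        eig = np.linalg.eigvals(A @ B)
        print(np.max(np.abs(eig.imag)))       # ~ 0: the eigenvalues of AB are (numerically) real
        print(eig.real.min())                 # >= -1e-10: and non-negative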

  2. #2 Opalg, MHF Contributor (Leeds, UK; Joined Aug 2007, Posts 4,041)
    Here's a proof that looks short and snappy. But it has to use a certain amount of machinery. I don't think you'll find a really elementary proof of this result – it certainly isn't obvious.

    Fact 1. For any two nxn matrices P and Q, PQ and QP always have the same spectrum (spectrum = set of eigenvalues). There's a short proof of that here.

    Fact 2. A positive semidefinite matrix A has a positive semidefinite square root A^{1/2}. Proof: A can be orthogonally diagonalised, say A = UDU' with U orthogonal and D diagonal with non-negative entries; take A^{1/2} = UD^{1/2}U', where D^{1/2} is the diagonal matrix whose entries are the square roots of those of D.

    If A and B are positive semidefinite then the spectrum of AB=A^{1/2}(A^{1/2}B) is the same as the spectrum of A^{1/2}BA^{1/2} (by Fact 1, with P=A^{1/2} and Q=A^{1/2}B). But A^{1/2}BA^{1/2} is positive semidefinite, so its spectrum is real and non-negative.
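    (Not part of the proof: a small numpy sketch of Facts 1 and 2 in action, with randomly generated positive semidefinite A and B; the square root is built exactly as in Fact 2, by diagonalising A.)

        import numpy as np

        rng = np.random.default_rng(1)
        n = 4
        X = rng.standard_normal((n, n)); A = X @ X.T
        Y = rng.standard_normal((n, n)); B = Y @ Y.T

        # Fact 2: a positive semidefinite square root via orthogonal diagonalisation A = U diag(w) U'.
        w, U = np.linalg.eigh(A)
        A_half = U @ np.diag(np.sqrt(np.clip(w, 0, None))) @ U.T

        # Fact 1 with P = A^{1/2}, Q = A^{1/2}B: PQ = AB and QP = A^{1/2}BA^{1/2} have the same spectrum.
        s1 = np.sort(np.linalg.eigvals(A @ B).real)
        s2 = np.linalg.eigvalsh(A_half @ B @ A_half)   # real, non-negative, sorted ascending
        print(np.allclose(s1, s2))                     # True (up to rounding)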

  3. #3 Newbie (Joined Mar 2007, Posts 24)
    Thanks!

    Thank you also for giving me this very useful result I didn't know (that PQ and QP have the same eigenvalues).

  4. #4 Newbie (Joined Mar 2007, Posts 24)
    Just another little question: if the spectrum of PQ is the same as the spectrum of QP (including algebraic and geometric multiplicities), does your proof also imply that AB can be diagonalized, since A^(1/2)BA^(1/2) can?

    This would make it even more powerful.

  5. #5 FernandoRevilla, MHF Contributor (Madrid, Spain; Joined Nov 2010, Posts 2,162)
    Quote Originally Posted by rargh
    Just another little question: if the spectrum of PQ is the same as the spectrum of QP (including algebraic and geometric multiplicities)

    That is not true, choose:

    P=\begin{bmatrix}1&0\\0&0\end{bmatrix}\quad Q=\begin{bmatrix}0&1\\0&0\end{bmatrix}
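    Writing out the two products makes the point explicit:

    PQ=\begin{bmatrix}0&1\\0&0\end{bmatrix}=Q,\qquad QP=\begin{bmatrix}0&0\\0&0\end{bmatrix}

    Both have spectrum {0,0}, but the geometric multiplicity of the eigenvalue 0 is 1 for PQ and 2 for QP.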


    Fernando Revilla

  6. #6 Newbie (Joined Mar 2007, Posts 24)
    OK, but in this counterexample by Professor Revilla, PQ=Q, which has only one eigenvalue, 0, with algebraic multiplicity 2. QP=0, which is diagonal and also has the eigenvalue 0 with algebraic multiplicity 2. However, in PQ the geometric multiplicity of 0 is 1, while in QP it is 2.

    But I'd like to point out that if PQ has at least one non-zero eigenvalue, then:

    let c be a nonzero eigenvalue of PQ and v a corresponding eigenvector.

    Then, as in Opalg's reference, we see that:

    PQv=cv, and since c is non-zero, Qv is not 0.

    Then Q(PQ)v=cQv=(QP)(Qv),

    therefore Qv is an eigenvector of QP with eigenvalue c.

    Similarly, if w is an eigenvector of QP with non-zero eigenvalue, then, Pw is an eigenvector for PQ with the same eigenvalue.


    Let v_1,...,v_n be a basis of the eigenspace E(c,PQ).

    Suppose Qv_1,...,Qv_n were linearly dependent, i.e. sum(a_i Qv_i)=0 with scalars a_i not all zero. Setting z=sum(a_i v_i), this would mean Qz=0; but z is a non-zero vector of E(c,PQ), so PQz=cz is non-zero, which is impossible if Qz=0. So the Qv_i are linearly independent vectors of E(c,QP), and we can state that:

    dim[E(c,QP)]>=dim[E(c,PQ)]

    We can similarly state, applying P to the eigenspaces of QP, that:

    dim[E(c,PQ)]>=dim[E(c,QP)]

    therefore dim[E(c,PQ)]=dim[E(c,QP)]

    This doesn't apply if c=0, in fact in the counterexample dim[Ker(PQ)]=1, dim[Ker(QP)]=2

    Now, using this result, we can state: if PQ is diagonalizable and rank(PQ)=rank(QP) (equivalently, dim[Ker(PQ)]=dim[Ker(QP)]), then QP is diagonalizable too.
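    (A quick numerical sanity check of this equality of dimensions, with randomly generated P and Q in numpy; the 1e-8 tolerance is just a choice for the sketch.)

        import numpy as np

        rng = np.random.default_rng(2)
        n = 5
        P = rng.standard_normal((n, n))
        Q = rng.standard_normal((n, n))
        Q[:, 0] = 0.0                          # make Q singular, so 0 is also an eigenvalue of PQ and QP

        PQ, QP = P @ Q, Q @ P
        for c in np.linalg.eigvals(PQ):
            if abs(c) > 1e-8:                  # only the non-zero eigenvalues
                dim_PQ = n - np.linalg.matrix_rank(PQ - c * np.eye(n), tol=1e-8)
                dim_QP = n - np.linalg.matrix_rank(QP - c * np.eye(n), tol=1e-8)
                print(c, dim_PQ, dim_QP)       # dim[E(c,PQ)] == dim[E(c,QP)] for every such c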
    Last edited by rargh; February 22nd 2011 at 03:16 AM. Reason: i made incorrect statements

  7. #7 Opalg, MHF Contributor (Leeds, UK; Joined Aug 2007, Posts 4,041)
    It is true that the eigenspaces of the nonzero eigenvalues have the same dimension in PQ as in QP. The trouble in general is that the null spaces of PQ and QP (the eigenspaces corresponding to the zero eigenvalue) need not have the same dimension, as FernandoRevilla's example shows.

    Quote Originally Posted by rargh
    if PQ is diagonalizable and it has at least one nonzero eigenvalue, then also QP is diagonalizable
    Unfortunately that is not true either, as you can see by taking FernandoRevilla's example and tacking on a nonzero eigenvalue:

    P = \begin{bmatrix}1&0&0\\ 0&0&0\\ 0&0&1\end{bmatrix}, \quad Q = \begin{bmatrix}0&1&0\\ 0&0&0\\ 0&0&1\end{bmatrix}.
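    (A quick numerical check of this example; note that, with P and Q as written here, it is the product QP that is diagonalisable and has the non-zero eigenvalue 1, while PQ = Q is not diagonalisable, which is exactly what the quoted implication would forbid once the roles of P and Q are interchanged.)

        import numpy as np

        P = np.array([[1., 0., 0.], [0., 0., 0.], [0., 0., 1.]])
        Q = np.array([[0., 1., 0.], [0., 0., 0.], [0., 0., 1.]])
        PQ, QP = P @ Q, Q @ P                  # PQ = Q,  QP = diag(0, 0, 1)

        print(np.linalg.eigvals(PQ), np.linalg.eigvals(QP))   # both spectra are {0, 0, 1}

        # geometric multiplicity of the eigenvalue 0 (its algebraic multiplicity is 2 in both):
        print(3 - np.linalg.matrix_rank(QP))   # 2 -> QP is diagonalisable (it is already diagonal)
        print(3 - np.linalg.matrix_rank(PQ))   # 1 -> PQ = Q is not diagonalisable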

    However, it does look as though in the particular case of the two matrices AB and A^{1/2}BA^{1/2}, it may be true that they have the same rank. At any rate, I can't find any counterexample to it and I have a feeling that it's true. If so, then it does follow that AB is diagonalisable.

  8. #8 Newbie (Joined Mar 2007, Posts 24)
    OK, thank you for all the useful corrections; I think I can now state things more precisely.

    So we have seen that the necessary and sufficient condition for QP to be diagonalizable, when we already know that PQ is, is that dim[Ker(PQ)]=dim[Ker(QP)] (which failed in the counterexamples), or equivalently, rank(PQ)=rank(QP).

    Proof: if PQ is diagonalizable, and V denotes the whole finite-dimensional vector space on which it acts, then, writing c_i for the distinct non-zero eigenvalues of PQ and Ker(PQ) for its null space,

    sum_i dim[E(c_i,PQ)] + dim[Ker(PQ)] = dim(V)

    This equation (the sum of the dimensions of all the eigenspaces equals dim(V)) is the necessary and sufficient condition for PQ to be diagonalizable.

    Since the eigenspaces for non-zero eigenvalues have the same dimension for PQ and QP, and dim[Ker(PQ)]=dim[Ker(QP)], the same equation is satisfied by QP, so QP is diagonalizable too.

    (The assumption that the two null spaces have the same dimension is independent of the assumption that PQ is diagonalizable.)

    Now, in our special case, let's check if they have the same rank.

    PQ=AB

    QP=A^(1/2)BA^(1/2)

    We can see that, since (AB)'=BA, rank(AB)=rank(BA).

    Then we know that rank(BA)=rank(BA^(1/2)), since Ker(A^(1/2))=Ker(A) and Image(A)=Image(A^(1/2)).

    Now we have to check that rank(A^(1/2)BA^(1/2))=rank(BA^(1/2)).

    Suppose BA^(1/2)v=w with w non-zero, but A^(1/2)BA^(1/2)v=0.

    Since A and B are positive semidefinite, this cannot happen: multiplying A^(1/2)BA^(1/2)v=0 on the left by v' gives (A^(1/2)v)'B(A^(1/2)v)=0, and for a positive semidefinite B this forces BA^(1/2)v=0, against the assumption. Therefore rank(A^(1/2)BA^(1/2))=rank(BA^(1/2))=...=rank(AB).
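    (One more numpy sketch as a sanity check, with random rank-deficient positive semidefinite A and B: the ranks of AB and A^(1/2)BA^(1/2) come out equal, and AB has a full set of independent eigenvectors, i.e. it is diagonalizable.)

        import numpy as np

        rng = np.random.default_rng(3)
        n = 6
        X = rng.standard_normal((n, 3)); A = X @ X.T   # positive semidefinite, rank 3
        Y = rng.standard_normal((n, 4)); B = Y @ Y.T   # positive semidefinite, rank 4

        w, U = np.linalg.eigh(A)
        A_half = U @ np.diag(np.sqrt(np.clip(w, 0, None))) @ U.T

        print(np.linalg.matrix_rank(A @ B),
              np.linalg.matrix_rank(A_half @ B @ A_half))   # equal ranks

        # AB is diagonalizable: its matrix of eigenvectors has full rank n.
        _, V = np.linalg.eig(A @ B)
        print(np.linalg.matrix_rank(V))                     # 6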

    I hope I finally got it right! Thank you for all your assistance and patience.
    Last edited by rargh; February 22nd 2011 at 10:14 AM.

  9. #9 Opalg, MHF Contributor (Leeds, UK; Joined Aug 2007, Posts 4,041)
    Yes, that looks good to me. The crucial point is that Ker(A^(1/2))=Ker(A) for a positive semidefinite matrix A.
