Positive semidefinite matrix

  1. #1
    Newbie
    Joined
    Feb 2010
    Posts
    4

    Positive semidefinite matrix

    Let A and B be square matrices with A positive semidefinite. Prove that A^2B = BA^2 \Longleftrightarrow AB = BA.

    The \Leftarrow is easy:

    A^2B = A(AB) = A(BA) = (AB)A = (BA)A = BA^2

    but I'm having problems with the \Rightarrow.

  2. #2
    MHF Contributor
    Opalg's Avatar
    Joined
    Aug 2007
    From
    Leeds, UK
    Posts
    4,041
    Thanks
    7
    Quote Originally Posted by NeTTuR
    Let A and B be square matrices with A positive semidefinite. Prove that A^2B = BA^2 \Longleftrightarrow AB = BA.

    The \Leftarrow is easy:

    A^2B = A(AB) = A(BA) = (AB)A = (BA)A = BA^2

    but I'm having problems with the \Rightarrow.
    Outline solution (see if you can fill in the details):

    Let \lambda_1,\ldots,\lambda_n be the eigenvalues of A, and let p(x) be a polynomial such that p(\lambda_i^2) = \lambda_i for 1\leqslant i\leqslant n. Then p(A^2) = A. But B obviously commutes with p(A^2). So B commutes with A.
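
    One explicit way to produce such a p (a standard choice; the outline does not commit to any particular construction) is Lagrange interpolation. Let \mu_1,\ldots,\mu_m be the distinct eigenvalues of A. Since A is positive semidefinite they are nonnegative, so their squares \mu_1^2,\ldots,\mu_m^2 are also distinct and the interpolation data is consistent. Then

    p(x) = \sum_{k=1}^{m} \mu_k \prod_{l\ne k} \frac{x - \mu_l^2}{\mu_k^2 - \mu_l^2}

    satisfies p(\mu_k^2) = \mu_k for each k, and hence p(\lambda_i^2) = \lambda_i for all i.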

  3. #3
    Newbie
    Joined
    Feb 2010
    Posts
    4
    I was thinking something like this:

    A=A^* \Rightarrow A = U D U^*, where D is a diagonal matrix with the eigenvalues of A on its diagonal and U is unitary.

    A^2B = BA^2 \Leftrightarrow UD^2U^*B=BUD^2U^* \Leftrightarrow D^2U^*BU = U^*BUD^2

    U^*BU = C \Rightarrow D^2C = CD^2

    D^2C =
\begin{pmatrix}
\lambda_1^2 & 0 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n^2
\end{pmatrix}
\begin{pmatrix}
c_{1,1} & c_{1,2} & \cdots & c_{1,n} \\
\vdots & \vdots & \ddots & \vdots \\
c_{n,1} & c_{n,2} & \cdots & c_{n,n}
\end{pmatrix}
= \begin{pmatrix}
\lambda_1^2 c_{1,1} & \lambda_1^2 c_{1,2} & \cdots & \lambda_1^2 c_{1,n} \\
\vdots & \vdots & \ddots & \vdots \\
\lambda_n^2 c_{n,1} & \lambda_n^2 c_{n,2} & \cdots & \lambda_n^2 c_{n,n}
\end{pmatrix}

    CD^2 =
\begin{pmatrix}
c_{1,1} & c_{1,2} & \cdots & c_{1,n} \\
\vdots & \vdots & \ddots & \vdots \\
c_{n,1} & c_{n,2} & \cdots & c_{n,n}
\end{pmatrix}
\begin{pmatrix}
\lambda_1^2 & 0 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n^2
\end{pmatrix}
= \begin{pmatrix}
\lambda_1^2 c_{1,1} & \lambda_2^2 c_{1,2} & \cdots & \lambda_n^2 c_{1,n} \\
\vdots & \vdots & \ddots & \vdots \\
\lambda_1^2 c_{n,1} & \lambda_2^2 c_{n,2} & \cdots & \lambda_n^2 c_{n,n}
\end{pmatrix}
    This gives \lambda_1^2 = \lambda_2^2 = \cdots = \lambda_n^2, and \lambda_i is an eigenvalue of A. If I take the square roots of these, then D and C will commute.
    So for a Hermitian matrix there will be a positive semidefinite matrix that commutes with B and whose square is A^2. Now because A, and therefore also A^2, is positive semidefinite, there exists a unique positive semidefinite matrix whose square is A^2, and this matrix is of course A.

    Does this look right?

  4. #4
    MHF Contributor
    Opalg's Avatar
    Joined
    Aug 2007
    From
    Leeds, UK
    Posts
    4,041
    Thanks
    7
    Quote Originally Posted by NeTTuR
    I was thinking something like this:

    A=A^* \Rightarrow A = U D U^*, where D is a diagonal matrix with the eigenvalues of A on its diagonal and U is unitary.

    A^2B = BA^2 \Leftrightarrow UD^2U^*B=BUD^2U^* \Leftrightarrow D^2U^*BU = U^*BUD^2

    U^*BU = C \Rightarrow D^2C = CD^2

    D^2C =
\begin{pmatrix}
\lambda_1^2 & 0 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n^2
\end{pmatrix}
\begin{pmatrix}
c_{1,1} & c_{1,2} & \cdots & c_{1,n} \\
\vdots & \vdots & \ddots & \vdots \\
c_{n,1} & c_{n,2} & \cdots & c_{n,n}
\end{pmatrix}
= \begin{pmatrix}
\lambda_1^2 c_{1,1} & \lambda_1^2 c_{1,2} & \cdots & \lambda_1^2 c_{1,n} \\
\vdots & \vdots & \ddots & \vdots \\
\lambda_n^2 c_{n,1} & \lambda_n^2 c_{n,2} & \cdots & \lambda_n^2 c_{n,n}
\end{pmatrix}

    CD^2 =
\begin{pmatrix}
c_{1,1} & c_{1,2} & \cdots & c_{1,n} \\
\vdots & \vdots & \ddots & \vdots \\
c_{n,1} & c_{n,2} & \cdots & c_{n,n}
\end{pmatrix}
\begin{pmatrix}
\lambda_1^2 & 0 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n^2
\end{pmatrix}
= \begin{pmatrix}
\lambda_1^2 c_{1,1} & \lambda_2^2 c_{1,2} & \cdots & \lambda_n^2 c_{1,n} \\
\vdots & \vdots & \ddots & \vdots \\
\lambda_1^2 c_{n,1} & \lambda_2^2 c_{n,2} & \cdots & \lambda_n^2 c_{n,n}
\end{pmatrix}
    Correct as far as here.
    This gives \lambda_1^2 = \lambda_2^2 = \cdots = \lambda_n^2, and \lambda_i is an eigenvalue of A. No. If all the eigenvalues of A were equal, then A would have to be a multiple of the identity matrix, and that need not be the case. What you can deduce from equations like \color{red}\lambda_i^2 c_{i,j} = \lambda_j^2 c_{i,j} (where \color{red}i\ne j) is that either \color{red}\lambda_i = \lambda_j or \color{red}c_{i,j} = 0.
    If I take the square roots of these, then D and C will commute.
    So for a Hermitian matrix there will be a positive semidefinite matrix that commutes with B and whose square is A^2. Now because A, and therefore also A^2, is positive semidefinite, there exists a unique positive semidefinite matrix whose square is A^2, and this matrix is of course A. Yes, it's true that a positive semidefinite matrix has a unique positive semidefinite square root.
    So if all the eigenvalues are different then all the off-diagonal elements of C must be zero. If some of them are repeated then C can have some nonzero off-diagonal elements.
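    For instance, if \lambda_1 = \lambda_2 then the equations \lambda_1^2 c_{1,2} = \lambda_2^2 c_{1,2} and \lambda_2^2 c_{2,1} = \lambda_1^2 c_{2,1} hold for any values of c_{1,2} and c_{2,1}, so those entries are unconstrained.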

    Notice that if p(x) is a polynomial then p(D^2) is a diagonal matrix whose diagonal elements are p(\lambda_1^2),\ldots,p(\lambda_n^2). If p(x) is as in my previous comment then it follows that p(D^2) = D. Since A = UDU^*, it follows that p(A^2) = A. I think you'll find that is a much simpler way to show that AB = BA, rather than trying to work with the matrix C.
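
    As a sanity check, this argument can also be verified numerically. Below is a minimal Python/NumPy sketch; the eigenvalues 0, 1, 2, 3 and the random orthogonal U are arbitrary choices for illustration, not anything fixed by the argument above.

    import numpy as np

    # Build a positive semidefinite A = Q diag(eigs) Q^T with known
    # nonnegative eigenvalues (0, 1, 2, 3 chosen arbitrarily).
    rng = np.random.default_rng(0)
    n = 4
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    eigs = np.array([0.0, 1.0, 2.0, 3.0])
    A = Q @ np.diag(eigs) @ Q.T

    # Interpolating polynomial p with p(lambda_i^2) = lambda_i.
    coeffs = np.polyfit(eigs**2, eigs, n - 1)  # highest power first

    # Evaluate p(A^2) by Horner's rule; it should reproduce A.
    A2 = A @ A
    pA2 = np.zeros_like(A)
    for c in coeffs:
        pA2 = pA2 @ A2 + c * np.eye(n)
    print(np.allclose(pA2, A))  # True: p(A^2) = A

    # A B sharing A's eigenbasis commutes with A^2 (and since the
    # eigenvalues of A^2 are distinct here, every B commuting with
    # A^2 has this form).  The result says B must then commute with A.
    B = Q @ np.diag(rng.standard_normal(n)) @ Q.T
    print(np.allclose(A2 @ B, B @ A2))  # True: B commutes with A^2
    print(np.allclose(A @ B, B @ A))    # True: hence B commutes with A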

  5. #5
    Newbie
    Joined
    Feb 2010
    Posts
    4
    Thank you so much for your time. I really appreciate it.

    I do not, however, fully understand your proof yet. I haven't seen the theorem you're using to be able to say that p(\lambda_i^2) = \lambda_i \Rightarrow p(A^2) = A, or that B should obviously commute with the polynomial expression. It's important because there is a follow-up question: "What can you say if A is only assumed to be Hermitian?". Is the extra assumption that A is positive semidefinite unnecessarily strong?

  6. #6
    MHF Contributor
    Opalg's Avatar
    Joined
    Aug 2007
    From
    Leeds, UK
    Posts
    4,041
    Thanks
    7
    Quote Originally Posted by NeTTuR
    Thank you so much for your time. I really appreciate it.

    I do not, however, fully understand your proof yet. I haven't seen the theorem you're using to be able to say that p(\lambda_i^2) = \lambda_i \Rightarrow p(A^2) = A, or that B should obviously commute with the polynomial expression.
    First fact: If p(x) = a_kx^k + \ldots + a_1x + a_0 is a polynomial, and A is an n\times n matrix, then p(A) denotes the matrix a_kA^k + \ldots + a_1A + a_0I, where I is the n\times n identity matrix. If B is a matrix that commutes with A, then B commutes with all the powers of A (easy proof by induction), and of course B commutes with the identity matrix. It follows that B commutes with p(A).
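    (For completeness, the induction step: if BA^k = A^kB then BA^{k+1} = (BA^k)A = (A^kB)A = A^k(BA) = A^k(AB) = A^{k+1}B.)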

    Second fact: If D is a diagonal matrix, with diagonal entries d_1,\ldots,d_n, then each power D^k of D is also diagonal, with entries d_1^k,\ldots,d_n^k (again, an easy proof by induction). It follows that p(D) is diagonal, with entries p(d_1),\ldots,p(d_n).

    I hope those facts will help to make sense of the claims that I made in previous comments.

    Quote Originally Posted by NeTTuR
    It's important because there is a follow-up question: "What can you say if A is only assumed to be Hermitian?". Is the extra assumption that A is positive semidefinite unnecessarily strong?
    In fact, the result fails if A is only assumed to be Hermitian. For an example, look at the 2\times 2 matrix A = \begin{bmatrix}0&1\\1&0\end{bmatrix}. This is Hermitian, and its square is the identity matrix. So any 2\times 2 matrix B will commute with A^2. But you should be able to find a matrix B that does not commute with A.
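    For instance, B = \begin{bmatrix}1&0\\0&0\end{bmatrix} works: AB = \begin{bmatrix}0&0\\1&0\end{bmatrix} while BA = \begin{bmatrix}0&1\\0&0\end{bmatrix}, so AB \ne BA.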

  7. #7
    Newbie
    Joined
    Feb 2010
    Posts
    4
    But where in the proof does the assumption that A is positive semidefinite come in?

    Never mind. You should never do math when you are tired. Of course a polynomial with p(\lambda_i^2) = \lambda_i is only guaranteed to exist if all the eigenvalues are nonnegative; otherwise the polynomial might have to take two different values at the same point \lambda_i^2. The eigenvalues in your example A = \begin{pmatrix}0 & 1 \\ 1 & 0 \end{pmatrix} are \lambda = \pm 1, which would require both p(1) = 1 and p(1) = -1, which is impossible.

    Thank you so much for your help.
