1. ## Positive semidefinite matrix

Let A and B be square matrices, with A positive semidefinite. Prove that $A^2B = BA^2 \Longleftrightarrow AB = BA$

The $\Leftarrow$ is easy:

$A^2B = A(AB) = A(BA) = (AB)A = (BA)A = BA^2$

but I'm having problems with the $\Rightarrow$.

2. Originally Posted by NeTTuR
Let A and B be square matrices, with A positive semidefinite. Prove that $A^2B = BA^2 \Longleftrightarrow AB = BA$

The $\Leftarrow$ is easy:

$A^2B = A(AB) = A(BA) = (AB)A = (BA)A = BA^2$

but I'm having problems with the $\Rightarrow$.
Outline solution (see if you can fill in the details):

Let $\lambda_1,\ldots,\lambda_n$ be the eigenvalues of A, and let p(x) be a polynomial such that $p(\lambda_i^2) = \lambda_i$ for $1\leqslant i\leqslant n$. Then $p(A^2) = A$. But B obviously commutes with $p(A^2)$. So B commutes with A.
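As a sanity check (not part of the outline above), the argument can be sketched numerically. In the snippet below, A is a random symmetric positive semidefinite matrix and B is my own illustrative choice, built as a polynomial in $A^2$ so that it commutes with $A^2$ by construction; p is found by ordinary polynomial interpolation through the points $(\lambda_i^2, \lambda_i)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random symmetric positive semidefinite matrix (illustrative choice).
n = 3
M = rng.standard_normal((n, n))
A = M @ M.T                               # eigenvalues are >= 0

# B commutes with A^2 by construction (it is a polynomial in A^2).
B = A @ A @ A @ A + 3 * np.eye(n)

# Interpolate p with p(lam_i^2) = lam_i; this is well defined because
# the eigenvalues are non-negative, so squaring them loses no information.
lam = np.linalg.eigvalsh(A)
coeffs = np.polyfit(lam**2, lam, deg=n - 1)   # highest-degree coefficient first

# Evaluate p at the matrix A^2.
A2 = A @ A
pA2 = sum(c * np.linalg.matrix_power(A2, k)
          for k, c in enumerate(coeffs[::-1]))

print(np.allclose(pA2, A))        # p(A^2) recovers A
print(np.allclose(A @ B, B @ A))  # so B commutes with A
```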

3. I was thinking something like this:

$A=A^* \Rightarrow A = U D U^*$, where D is a diagonal matrix with the eigenvalues of A on its diagonal and U is unitary.

$A^2B = BA^2 \Leftrightarrow UD^2U^*B=BUD^2U^* \Leftrightarrow D^2U^*BU = U^*BUD^2$

$U^*BU = C \Rightarrow D^2C = CD^2$

$D^2C =
\begin{pmatrix}
\lambda_1^2 & 0 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n^2
\end{pmatrix}
\begin{pmatrix}
c_{1,1} & c_{1,2} & \cdots & c_{1,n} \\
\vdots & \vdots & \ddots & \vdots \\
c_{n,1} & c_{n,2} & \cdots & c_{n,n}
\end{pmatrix}
$
$= \begin{pmatrix}
\lambda_1^2 c_{1,1} & \lambda_1^2 c_{1,2} & \cdots & \lambda_1^2c_{1,n} \\
\vdots & \vdots & \ddots & \vdots \\
\lambda_n^2 c_{n,1} & \lambda_n^2 c_{n,2} & \cdots & \lambda_n^2 c_{n,n}
\end{pmatrix}$

$CD^2 =
\begin{pmatrix}
c_{1,1} & c_{1,2} & \cdots & c_{1,n} \\
\vdots & \vdots & \ddots & \vdots \\
c_{n,1} & c_{n,2} & \cdots & c_{n,n}
\end{pmatrix}
\begin{pmatrix}
\lambda_1^2 & 0 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n^2
\end{pmatrix}
$
$= \begin{pmatrix}
\lambda_1^2 c_{1,1} & \lambda_2^2 c_{1,2} & \cdots & \lambda_n^2c_{1,n} \\
\vdots & \vdots & \ddots & \vdots \\
\lambda_1^2 c_{n,1} & \lambda_2^2 c_{n,2} & \cdots & \lambda_n^2 c_{n,n}
\end{pmatrix}$

This gives $\lambda_1^2 = \lambda_2^2 = \cdots = \lambda_n^2$, where $\lambda_i$ is an eigenvalue of A. If I take the square root of these, then $D$ and $C$ will commute.
So for a Hermitian matrix, there will be a positive semidefinite matrix that commutes with $B$ and whose square is $A^2$. Now, because $A$ and therefore also $A^2$ is positive semidefinite, there exists a unique positive semidefinite matrix whose square is $A^2$, and this matrix is of course A.

Does this look right?

4. Originally Posted by NeTTuR
I was thinking something like this:

$A=A^* \Rightarrow A = U D U^*$, where D is a diagonal matrix with the eigenvalues of A on its diagonal and U is unitary.

$A^2B = BA^2 \Leftrightarrow UD^2U^*B=BUD^2U^* \Leftrightarrow D^2U^*BU = U^*BUD^2$

$U^*BU = C \Rightarrow D^2C = CD^2$

$D^2C =
\begin{pmatrix}
\lambda_1^2 & 0 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n^2
\end{pmatrix}
\begin{pmatrix}
c_{1,1} & c_{1,2} & \cdots & c_{1,n} \\
\vdots & \vdots & \ddots & \vdots \\
c_{n,1} & c_{n,2} & \cdots & c_{n,n}
\end{pmatrix}
$
$= \begin{pmatrix}
\lambda_1^2 c_{1,1} & \lambda_1^2 c_{1,2} & \cdots & \lambda_1^2c_{1,n} \\
\vdots & \vdots & \ddots & \vdots \\
\lambda_n^2 c_{n,1} & \lambda_n^2 c_{n,2} & \cdots & \lambda_n^2 c_{n,n}
\end{pmatrix}$

$CD^2 =
\begin{pmatrix}
c_{1,1} & c_{1,2} & \cdots & c_{1,n} \\
\vdots & \vdots & \ddots & \vdots \\
c_{n,1} & c_{n,2} & \cdots & c_{n,n}
\end{pmatrix}
\begin{pmatrix}
\lambda_1^2 & 0 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n^2
\end{pmatrix}
$
$= \begin{pmatrix}
\lambda_1^2 c_{1,1} & \lambda_2^2 c_{1,2} & \cdots & \lambda_n^2c_{1,n} \\
\vdots & \vdots & \ddots & \vdots \\
\lambda_1^2 c_{n,1} & \lambda_2^2 c_{n,2} & \cdots & \lambda_n^2 c_{n,n}
\end{pmatrix}$
Correct as far as here.
This gives $\lambda_1^2 = \lambda_2^2 = \cdots = \lambda_n^2$ and $\lambda_i$ is an eigenvalue of A. No. If all the eigenvalues of A were equal, then A would have to be a multiple of the identity matrix, and that need not be the case. What you can deduce from equations like $\color{red}\lambda_i^2 c_{i,j} = \lambda_j^2 c_{i,j}$ (where $\color{red}i\ne j$) is that either $\color{red}\lambda_i^2 = \lambda_j^2$ (hence $\color{red}\lambda_i = \lambda_j$, since the eigenvalues are non-negative) or $\color{red}c_{i,j} = 0$.
If I take the square root of these, then $D$ and $C$ will commute.
So for a Hermitian matrix, there will be a positive semidefinite matrix that commutes with $B$ and whose square is $A^2$. Now, because $A$ and therefore also $A^2$ is positive semidefinite, there exists a unique positive semidefinite matrix whose square is $A^2$, and this matrix is of course A. Yes, it's true that a positive semidefinite matrix has a unique positive semidefinite square root.
So if all the eigenvalues are different then all the off-diagonal elements of C must be zero. If some of them are repeated then C can have some nonzero off-diagonal elements.

Notice that if p(x) is a polynomial then $p(D^2)$ is a diagonal matrix whose diagonal elements are $p(\lambda_1^2),\ldots,p(\lambda_n^2)$. If p(x) is as in my previous comment then it follows that $p(D^2) = D$. Since $A = UDU^*$, it follows that $p(A^2) = A$. I think you'll find that is a much simpler way to show that AB = BA, rather than trying to work with the matrix C.

5. Thank you so much for your time. I really appreciate it.

However, I do not yet fully understand your proof. I haven't seen the theorem you're using that lets you say that $p(\lambda_i^2) = \lambda_i \Rightarrow p(A^2) = A$, or why B should obviously commute with the polynomial expression. It's important because there is a follow-up question: "What can you say if A is only assumed to be Hermitian?". Is the extra assumption that A is positive semidefinite unnecessarily strong?

6. Originally Posted by NeTTuR
Thank you so much for your time. I really appreciate it.

However, I do not yet fully understand your proof. I haven't seen the theorem you're using that lets you say that $p(\lambda_i^2) = \lambda_i \Rightarrow p(A^2) = A$, or why B should obviously commute with the polynomial expression.
First fact: If $p(x) = a_kx^k + \ldots + a_1x + a_0$ is a polynomial, and A is an n×n matrix, then p(A) denotes the matrix $a_kA^k + \ldots + a_1A + a_0I$, where I is the n×n identity matrix. If B is a matrix that commutes with A, then B commutes with all the powers of A (easy proof by induction), and of course B commutes with the identity matrix. It follows that B commutes with p(A).

Second fact: If D is a diagonal matrix, with diagonal entries $d_1,\ldots,d_n$, then each power $D^k$ of D is also diagonal, with entries $d_1^k,\ldots,d_n^k$ (again, an easy proof by induction). It follows that p(D) is diagonal, with entries $p(d_1),\ldots,p(d_n)$.

I hope those facts will help to make sense of the claims that I made in previous comments.
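Both facts are easy to check numerically. The sketch below uses an illustrative polynomial $p(x) = 3x^2 + 2x + 7$ and matrices of my own choosing (not from the thread).

```python
import numpy as np

# Fact 1: if B commutes with A, then B commutes with p(A).
A = np.array([[2.0, 1.0], [1.0, 2.0]])
B = A @ A + 5 * np.eye(2)                 # commutes with A by construction
pA = 3 * A @ A + 2 * A + 7 * np.eye(2)    # p(x) = 3x^2 + 2x + 7
print(np.allclose(pA @ B, B @ pA))        # True

# Fact 2: p(D) is diagonal, with p applied to each diagonal entry.
d = np.array([1.0, 2.0, 3.0])
D = np.diag(d)
pD = 3 * D @ D + 2 * D + 7 * np.eye(3)
print(np.allclose(pD, np.diag(3 * d**2 + 2 * d + 7)))  # True
```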

Originally Posted by NeTTuR
It's important because there is a follow up question saying: "What can you say if A is only assumed to be Hermitian?". Is the extra assumption that A is positive semidefinite unnecessarily strong?
In fact, the result fails if A is only assumed to be Hermitian. For an example, look at the 2×2 matrix $A = \begin{bmatrix}0&1\\1&0\end{bmatrix}$. This is Hermitian, and its square is the identity matrix. So any 2×2 matrix B will commute with $A^2$. But you should be able to find a matrix B that does not commute with A.
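One concrete choice of B (my own, since the post leaves it as an exercise) that commutes with $A^2 = I$ but not with A:

```python
import numpy as np

A = np.array([[0.0, 1.0], [1.0, 0.0]])  # Hermitian, with A @ A = I
B = np.array([[1.0, 0.0], [0.0, 0.0]])  # one possible choice of B

print(np.allclose(A @ A @ B, B @ A @ A))  # True: A^2 = I commutes with any B
print(np.array_equal(A @ B, B @ A))       # False: AB != BA
```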

7. But where in the proof does the assumption that A is positive semidefinite come in?

Never mind. You should never do math when you are tired. Of course a polynomial with $p(\lambda_i^2) = \lambda_i$ is only guaranteed to exist if all the eigenvalues are non-negative. Otherwise the polynomial might have to take two different values at the same point $\lambda_i^2$. The eigenvalues in your example $A = \begin{pmatrix}0 & 1 \\ 1 & 0 \end{pmatrix}$ are $\lambda = \pm 1$, which would require both $p(1) = 1$ and $p(1) = -1$.

Thank you so much for your help.