# Thread: Proof of symmetric matrix

1. ## Proof of symmetric matrix

I am not sure whether a square matrix $A$ is symmetric ($A = A^T$) when we have the relation $APA^{-1} = A^TP(A^T)^{-1}$. Has anyone an idea how to prove this?

Thanks Ulrich

2. I think you are after something like this

Symmetric Matrices

3. Originally Posted by huldrich
I am not sure whether a square matrix $A$ is symmetric ($A = A^T$) when we have the relation $APA^{-1} = A^TP(A^T)^{-1}$. Has anyone an idea how to prove this?

Thanks Ulrich
Um, just replace $A$ by $A^T$...

4. Are you trying to show that

$APA^{-1}=A^{T}P(A^{T})^{-1}$

implies

$A=A^{T}$?

5. Originally Posted by Ackbeet
Are you trying to show that

$APA^{-1}=A^{T}P(A^{T})^{-1}$

implies

$A=A^{T}$?
Yes, but I think it is not possible to transform the first equation into the second, because the matrices are not assumed to commute. Maybe it should be done by supposing that the first equation also holds for

$A \ne {A^T}$

and then arriving at a contradiction. Or, equivalently, putting

$A = {A^T}B$, from which it follows that

$AP{A^{ - 1}} = {A^T}BP{B^{ - 1}}{({A^T})^{ - 1}}$

and by comparison

$BP{B^{ - 1}} = P$.

Does this imply that B is the identity matrix? If yes, why? It has been some time since I took courses in linear algebra...

6. Are there any conditions on P?

7. Originally Posted by Ackbeet
Are there any conditions on P?
Yes, there must be. For example, P cannot commute with both $A$ and $A^T$.

8. I found out that $BP{B^{ - 1}} = P$ does not necessarily imply that B is the identity matrix. For instance, if

$P=\left(
\begin{array}{ccc}
0 & 0 & 1 \\
1 & 0 & 0 \\
0 & 1 & 0
\end{array}
\right)$

and

$B=\left(
\begin{array}{ccc}
i & f & h \\
h & i & f \\
f & h & i
\end{array}
\right)$

then $BP{B^{-1}} = P$ holds; but such solutions only yield $A = 0$ when substituted into $A = {A^T}B$, so A must be symmetric.
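The 3-dimensional example can be checked numerically. The entries $i$, $f$, $h$ are left unspecified in the post, so the sketch below just picks arbitrary values that keep B invertible (a circulant matrix commutes with the cyclic shift, which is why $BP{B^{-1}} = P$ holds even though $B \ne I$):

```python
import numpy as np

# Cyclic permutation P and circulant B from the post, with
# arbitrary values substituted for the entries i, f, h.
P = np.array([[0, 0, 1],
              [1, 0, 0],
              [0, 1, 0]], dtype=float)

i_, f_, h_ = 2.0, 3.0, 5.0   # any values that keep B invertible
B = np.array([[i_, f_, h_],
              [h_, i_, f_],
              [f_, h_, i_]])

# Circulant matrices commute with the cyclic shift, so B P B^{-1} = P
# even though B is not the identity.
assert np.allclose(B @ P, P @ B)
assert np.allclose(B @ P @ np.linalg.inv(B), P)
```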

9. Counterexample:

$A=\begin{bmatrix}1 &2\\ 3 &4\end{bmatrix}.$ Clearly, $A\not=A^{T}.$

Let $P=\begin{bmatrix}-3 &-4\\ 1 &2\end{bmatrix}.$

Then, by direct computation, $APA^{-1}=A^{T}P(A^{T})^{-1},$ but $A\not=A^{T}.$

Also note that this $P$ does not commute with $A.$

For reference,

$A^{-1}=\begin{bmatrix}-2 &1\\ 3/2 &-1/2\end{bmatrix},$ and, of course,

$(A^{T})^{-1}=(A^{-1})^{T}.$

So I think this answers your hypothesis in the negative.
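For anyone who wants to double-check, the counterexample is easy to verify numerically:

```python
import numpy as np

# The counterexample from the previous post, checked by direct computation.
A = np.array([[1., 2.],
              [3., 4.]])
P = np.array([[-3., -4.],
              [ 1.,  2.]])

lhs = A @ P @ np.linalg.inv(A)
rhs = A.T @ P @ np.linalg.inv(A.T)

assert np.allclose(lhs, rhs)          # the relation holds ...
assert not np.allclose(A, A.T)        # ... yet A is not symmetric
assert not np.allclose(A @ P, P @ A)  # and P does not commute with A
```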

10. ## Re: Proof of symmetric matrix

Hi all,

I had some time recently and also found, as a "by-product", the solution to this problem. In fact the matrices

$A=\begin{bmatrix}1 &2\\ 3 &4\end{bmatrix}$ and

$P=\begin{bmatrix}-3 &-4\\ 1 &2\end{bmatrix}$

do not commute but are nevertheless special, because $BP = PB$ where $B=(A^{T})^{-1}A$ (this is also true for the 3-dimensional matrices I mentioned before).

From this it follows that $BP{B^{ - 1}} = P$ with B not necessarily being the identity matrix. So if one requires that P commutes with no such B other than the identity, then $APA^{-1}=A^{T}P(A^{T})^{-1}$ implies that A is symmetric.
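The commutation claim can be confirmed for the counterexample matrices from post 9:

```python
import numpy as np

# For the counterexample matrices, B = (A^T)^{-1} A commutes with P,
# which is exactly why B P B^{-1} = P can hold with B != I.
A = np.array([[1., 2.],
              [3., 4.]])
P = np.array([[-3., -4.],
              [ 1.,  2.]])

B = np.linalg.inv(A.T) @ A
assert not np.allclose(B, np.eye(2))   # B is not the identity
assert np.allclose(B @ P, P @ B)       # yet B commutes with P
assert np.allclose(B @ P @ np.linalg.inv(B), P)
```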

I also asked earlier how it is possible to show from $BP{B^{ - 1}} = P$ that B is the identity matrix. This can be done as follows:

Given P and Q, two arbitrary matrices, we want to determine another matrix B such that $BP {B^{ - 1}} = Q$ holds. By a standard theorem (I forget its name; it requires Q to be diagonalisable), there is a matrix E such that $EQ{E^{ - 1}} = D$, where E is formed from the eigenvectors of Q, say $E=M(EV Q)$, and D is the diagonal matrix of the eigenvalues of Q. This also yields

$E BP {B^{ - 1}}{E^{ - 1}} = (EB)P {(EB)^{ - 1}} = D$.

So EB is the matrix formed with the eigenvectors of P, that is $EB=M(EV P)$, which yields $B={E^{-1}}M(EV P)={M^{-1}}(EV Q)\,M(EV P)$.

This is why, if we have $P=Q$ as above, B must be the identity matrix.
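The eigenvector construction above can be sketched numerically. A caveat: NumPy stores eigenvectors as the *columns* of the matrix returned by `np.linalg.eig` (so $M = V D V^{-1}$, i.e. $V^{-1}MV = D$, the inverse of the post's convention), and the eigenpairs of P and Q have to be matched up in the same order. The matrices below are my own illustrative choices, with distinct real eigenvalues so that sorting works:

```python
import numpy as np

# P with distinct real eigenvalues; Q chosen similar to P,
# so a matrix B with B P B^{-1} = Q must exist.
P = np.array([[2., 1.],
              [0., 3.]])
S = np.array([[1., 1.],
              [0., 1.]])
Q = S @ P @ np.linalg.inv(S)

def eig_sorted(M):
    """Eigen-decomposition with eigenvalues sorted, so that the
    eigenpairs of P and Q line up in the same order."""
    w, V = np.linalg.eig(M)
    order = np.argsort(w.real)
    return w[order], V[:, order]

wp, Vp = eig_sorted(P)
wq, Vq = eig_sorted(Q)
assert np.allclose(wp, wq)   # similar matrices share eigenvalues

# In NumPy's column convention, the post's B = M(EV Q)^{-1} M(EV P)
# becomes B = Vq @ Vp^{-1}: B maps eigenvectors of P to eigenvectors
# of Q with the same eigenvalue.
B = Vq @ np.linalg.inv(Vp)
assert np.allclose(B @ P @ np.linalg.inv(B), Q)
```

Note that eigenvectors are only determined up to scaling (and ordering), so B is not unique; this freedom is what allowed the non-identity circulant B earlier in the thread, even with $P = Q$.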