# Math Help - Permutation matrix properties proof

1. ## Permutation matrix properties proof

The problem asks to establish the following properties of $n \times n$ permutation matrices, for all $\sigma, \tau \in \mathcal{S}_n$.

1. $P_\sigma = I$ iff $\sigma(k) = k$ for all $k$, where $I$ denotes the $n \times n$ identity matrix.

2. $P_\sigma P_\tau = P_{\sigma \circ \tau}$.

3. $(P_\sigma)^{-1} = P_{\sigma^{-1}}$.

4. $P_\sigma$ is an orthogonal matrix, that is, $(P_\sigma)^{-1} = (P_\sigma)^T$.
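For concreteness, these four properties can be sanity-checked numerically for small $n$. The sketch below (plain Python, standard library only) assumes the convention that column $j$ of $P_\sigma$ is $e_{\sigma(j)}$, so $P_\sigma$ moves row $k$ of $A$ to row $\sigma(k)$; the helper names are my own.

```python
# sanity-check properties 1-4 over all of S_4, using plain list-of-lists matrices
# convention (an assumption): column j of P_sigma is e_{sigma(j)}
from itertools import permutations

def perm_matrix(sigma):
    n = len(sigma)
    P = [[0] * n for _ in range(n)]
    for j in range(n):
        P[sigma[j]][j] = 1          # column j has its single 1 in row sigma(j)
    return P

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(A):
    return [list(row) for row in zip(*A)]

def compose(s, t):                  # (s o t)(k) = s(t(k))
    return tuple(s[t[k]] for k in range(len(t)))

def inverse(s):
    inv = [0] * len(s)
    for k, v in enumerate(s):
        inv[v] = k
    return tuple(inv)

n = 4
identity = tuple(range(n))
I = perm_matrix(identity)
for s in permutations(range(n)):
    # property 1: P_sigma = I iff sigma is the identity permutation
    assert (perm_matrix(s) == I) == (s == identity)
    # property 4: P_sigma^T P_sigma = I, i.e. P_sigma is orthogonal
    assert matmul(transpose(perm_matrix(s)), perm_matrix(s)) == I
    # property 3: P_sigma P_{sigma^{-1}} = I
    assert matmul(perm_matrix(s), perm_matrix(inverse(s))) == I
    for t in permutations(range(n)):
        # property 2: P_sigma P_tau = P_{sigma o tau}
        assert matmul(perm_matrix(s), perm_matrix(t)) == perm_matrix(compose(s, t))
print("all four properties hold for n =", n)
```

(Of course a finite check is not a proof, but it does catch convention mix-ups like the one discussed below.)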

Can someone please check whether my attempts below are correct proofs? Does the first one below qualify as a proof?

1. If $\sigma(k) = k$ for all $k$ then $P_\sigma A$ does not swap any rows of $A$ and therefore $P_\sigma A = A = I A$ which implies $P_\sigma = I$. If $P_\sigma = I$ then $P_\sigma A = I A = A$ meaning $\sigma(k) = k$ for all $k$.

2. $P_\sigma P_\tau A$ swaps $k$th row of $A$ with $\tau(k)$th row which is then swapped with $\sigma(\tau(k))$th row. So $k$th row is swapped with $(\sigma \circ \tau)(k)$th row which is what $P_{\sigma \circ \tau} A$ does. Therefore $P_\sigma P_\tau = P_{\sigma \circ \tau}$.

3. Property 2 says that $P_{\sigma}P_{\sigma^{-1}} = P_{\sigma \circ \sigma^{-1}}$. Because $(\sigma \circ \sigma^{-1})(k) = k$ for all $k$, property 1 says $P_{\sigma \circ \sigma^{-1}} = I$. Hence $P_{\sigma}P_{\sigma^{-1}} = I = P_{\sigma}(P_{\sigma})^{-1}$ and $(P_\sigma)^{-1} = P_{\sigma^{-1}}$.

4. If $[P_{\sigma}]_{ij} = 1$, then the $j$th row of $A$ will be the $i$th row of $P_{\sigma} A$. The inverse $P_{\sigma}^{-1}$ must copy the $i$th row back to the $j$th row, and therefore $[P_{\sigma}^{-1}]_{ji} = [P_{\sigma}]_{ij}$. Let all other entries of $(P_{\sigma})^{-1}$ be zero; since all other entries of $P_{\sigma}$ are also zero, we have $[P_{\sigma}^{-1}]_{ij} = [P_{\sigma}]_{ji} = [P_{\sigma}^T]_{ij}$.

2. permutation matrices do more than "swap rows". a more apt description would be "shuffle rows", or "permute rows".

so a better proof of (1) says: the k-th row of PσA = the k-th row of A, hence PσA = A, for all A (important!), because only then can you conclude Pσ = I (the multiplicative identity I of n×n matrices is unique).

(2) same problem. Pτ takes the k-th row of A to the τ(k)-th row of PτA, and then Pσ takes that to the (σ○τ)(k)-th row of PσPτA. since the rows aren't "swapped", we can't say that the τ(k)-th row of A is taken by Pτ to the k-th row of PτA.

(3) looks fine.

(4) this step is not justified: your argument only forces the other entries in the j-th row and the i-th column of Pσ^-1 to be 0. you need to argue row-by-row, or column-by-column, since Pσ has a 1 in EACH row and EACH column. that is, if [Pσ]_{ij} = 1, the j-th column of Pσ is ei, the i-th standard basis vector, so the transpose has the i-th basis vector in the j-th row.

clearly, when you multiply Pσ by its transpose, the only non-zero entries of the product will be on the diagonal, where i = j, because the (i,j)-th entry of Pσ(Pσ^T) is the inner product of the i-th and j-th rows of Pσ, each of which is a standard basis vector (which is what i think you meant). since these form an orthonormal set, the product is the identity matrix I, so the inverse of Pσ is its transpose.
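here's that inner-product computation as a tiny plain-Python sketch (the particular permutation is just an example of mine):

```python
# entry (i,j) of P P^T is the inner product of rows i and j of P;
# each row is a standard basis vector, so the product is the identity
sigma = (2, 0, 3, 1)                                  # arbitrary example permutation
n = len(sigma)
P = [[1 if sigma[j] == i else 0 for j in range(n)]    # column j is e_{sigma(j)}
     for i in range(n)]
dot = lambda u, v: sum(a * b for a, b in zip(u, v))
G = [[dot(P[i], P[j]) for j in range(n)] for i in range(n)]   # P P^T, entrywise
assert G == [[1 if i == j else 0 for j in range(n)] for i in range(n)]
```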

all of this assumes that the columns of Pσ are σ(ei) for each i. if the rows of Pσ are σ(ei), then PσPτ = Pτσ instead of (2), but all the other properties hold. sometimes permutation matrices are defined this way, because sometimes algebraists use τσ to mean do τ first, then σ (so they write (x)τσ).
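a quick sketch of that order reversal, assuming the other convention (row i of Pσ is e_{σ(i)}; the helper names are mine):

```python
# with row i of P_sigma equal to e_{sigma(i)}, multiplication reverses:
# P_sigma P_tau = P_{tau o sigma}
def row_perm_matrix(sigma):
    n = len(sigma)
    return [[1 if j == sigma[i] else 0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

s, t = (1, 2, 0), (0, 2, 1)                       # example permutations in S_3
t_after_s = tuple(t[s[k]] for k in range(3))      # (tau o sigma)(k) = tau(sigma(k))
assert matmul(row_perm_matrix(s), row_perm_matrix(t)) == row_perm_matrix(t_after_s)
```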

the point is that σ-->Pσ defines a monomorphism of Sn into GLn(F). the orthogonality of the Pσ is a nice bonus: geometrically it means all versions of F^n have the same geometric properties, so the labelling of the basis vectors doesn't matter that much (well, almost: odd permutations of Sn reverse orientation).

3. Thanks for such a detailed explanation. This has been one more useful lesson about proofs.