# Orthogonal complement

• May 11th 2011, 03:30 PM
qwerty1234
Orthogonal complement
Let V denote the real vector space M_(n,n) (R) of n by n matrices with real entries and let S denote the subspace of all n by n symmetric matrices in V (i.e. such that A=A transpose).

---- Determine the orthogonal complement of S in V. (Hint: use a basis of S)

Thanks..
• May 11th 2011, 05:18 PM
Drexel28
Quote:

Originally Posted by qwerty1234
Let V denote the real vector space M_(n,n) (R) of n by n matrices with real entries and let S denote the subspace of all n by n symmetric matrices in V (i.e. such that A=A transpose).

---- Determine the orthogonal complement of S in V. (Hint: use a basis of S)

Thanks..

The answer is the antisymmetric matrices. Here's a proof; don't peek until you've exhausted all other avenues:
Spoiler:

Let $\displaystyle \text{Sym}_n\left(\mathbb{R}\right)$ be the set of all symmetric $\displaystyle n\times n$ matrices and $\displaystyle \text{ASym}_n\left(\mathbb{R}\right)$ the set of all anti-symmetric matrices (that is, matrices $\displaystyle A$ with $\displaystyle A^\top=-A$). I claim that $\displaystyle \text{Sym}_n\left(\mathbb{R}\right)^\perp=\text{ASym}_n\left(\mathbb{R}\right)$. Indeed, assuming that you are imposing the inner product on $\displaystyle \text{Mat}_n\left(\mathbb{R}\right)$ by identifying it with $\displaystyle \mathbb{R}^{n^2}$, you can readily prove (if you're really desperate, a proof can be gleaned from here) that $\displaystyle \left\langle A,B\right\rangle=\text{tr}\left(AB^{\top}\right)$, from where it's immediate that $\displaystyle \text{ASym}_n\left(\mathbb{R}\right)\subseteq\text{Sym}_n\left(\mathbb{R}\right)^\perp$ since if
$\displaystyle A\in\text{ASym}_n\left(\mathbb{R}\right),B\in\text{Sym}_n\left(\mathbb{R}\right)$ then

$\displaystyle \left\langle A,B\right\rangle=\text{tr}\left(AB^{\top}\right)=\text{tr}\left(-AB\right)=-\text{tr}\left(AB\right)$ (using $\displaystyle \text{tr}(X)=\text{tr}\left(X^\top\right)$ and $\displaystyle A^\top=-A$),

but with equal validity

$\displaystyle \left\langle A,B\right\rangle=\left\langle B,A\right\rangle=\text{tr}\left(BA^\top\right)=\text{tr}\left(BA\right)=\text{tr}\left(AB\right)$ (using $\displaystyle B^\top=B$),

and so $\displaystyle \left\langle A,B\right\rangle=0$. Now, to finish the argument note that every matrix $\displaystyle M\in\text{Mat}_n\left(\mathbb{R}\right)$ may be written as

$\displaystyle \displaystyle M=\underbrace{\frac{M+M^\top}{2}}_{\text{symmetric}}+\underbrace{\frac{M-M^\top}{2}}_{\text{anti-symmetric}}$

and so $\displaystyle \text{Mat}_n\left(\mathbb{R}\right)=\text{Sym}_n\left(\mathbb{R}\right)+\text{ASym}_n\left(\mathbb{R}\right)$. Moreover, it's evident that $\displaystyle \text{Sym}_n\left(\mathbb{R}\right)\cap\text{ASym}_n\left(\mathbb{R}\right)=\{\bold{0}\}$ since if $\displaystyle M$ is in the intersection then $\displaystyle M=M^\top=-M$. Thus, $\displaystyle \text{Mat}_n\left(\mathbb{R}\right)=\text{Sym}_n\left(\mathbb{R}\right)\oplus\text{ASym}_n\left(\mathbb{R}\right)$ and so

$\displaystyle \text{codim}\left(\text{ASym}_n\left(\mathbb{R}\right)\right)=\dim\text{Sym}_n\left(\mathbb{R}\right)=\text{codim}\left(\text{Sym}_n\left(\mathbb{R}\right)^\perp\right)$

And so, recalling that $\displaystyle \text{ASym}_n\left(\mathbb{R}\right)\subseteq\text{Sym}_n\left(\mathbb{R}\right)^\perp$, you may conclude with a dimension argument.
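(Not part of the proof, but if you want to see the decomposition and the orthogonality in action, here is a quick numerical sketch in Python with NumPy; the names `inner`, `S`, `A` are just illustrative, and `inner` implements the Frobenius inner product $\displaystyle \text{tr}\left(AB^\top\right)$ discussed above.)

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))  # an arbitrary 4x4 real matrix

# split M into its symmetric and anti-symmetric parts
S = (M + M.T) / 2   # symmetric:      S.T == S
A = (M - M.T) / 2   # anti-symmetric: A.T == -A

def inner(X, Y):
    """Frobenius inner product <X, Y> = tr(X Y^T)."""
    return np.trace(X @ Y.T)

assert np.allclose(S + A, M)          # the decomposition recovers M
assert np.allclose(S.T, S)
assert np.allclose(A.T, -A)
assert np.isclose(inner(S, A), 0.0)   # the two parts are orthogonal
```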
• May 11th 2011, 11:33 PM
Deveno
well a basis for $\displaystyle M_n(\mathbb{R})$ is $\displaystyle \{E_{ij}, 1\leq i,j\leq n\}$ where each $\displaystyle E_{ij}$ has a 1 in the i,j-th position and 0's elsewhere.

a basis for S is $\displaystyle \{E_{ii}\} \cup \{E_{ij}+E_{ji}, i \neq j\}$. considering each of the $\displaystyle E_{ij}$ as standard basis elements in $\displaystyle \mathbb{R}^{n^2}$, we see any element of $\displaystyle S^{\perp}$ must be orthogonal to each and every one of the basis elements for S.

in particular, this means that the i,i-th entry of $\displaystyle T \in S^{\perp}$ must be 0 for all i, and that for i ≠ j, the i,j-th entry of T must be the negative of the j,i-th entry.

in short, $\displaystyle T^{t} = -T$, that is, T is anti-symmetric.

on the other hand, if T is anti-symmetric, the entries on its diagonal are 0, so $\displaystyle <E_{ii},T> = 0$ for all i, and its i,j-th entry satisfies $\displaystyle t_{ij} = -t_{ji}$, so $\displaystyle <E_{ij}+E_{ji},T> = 0$ as well. therefore, $\displaystyle T \in S^{\perp}$.
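(the basis check above is easy to verify numerically too; a small sketch in Python with NumPy, where `E` and `inner` are illustrative names for the standard basis matrices and the inner product $\displaystyle <X,Y> = \text{tr}\left(XY^{t}\right)$:)

```python
import numpy as np

n = 3

def E(i, j):
    """standard basis matrix E_ij: a 1 in position (i, j), 0 elsewhere."""
    B = np.zeros((n, n))
    B[i, j] = 1.0
    return B

def inner(X, Y):
    """inner product <X, Y> = tr(X Y^T)."""
    return np.trace(X @ Y.T)

# an arbitrary anti-symmetric T (check: T.T == -T)
T = np.array([[ 0.,  2., -1.],
              [-2.,  0.,  3.],
              [ 1., -3.,  0.]])

# T is orthogonal to every basis element of S:
for i in range(n):
    assert np.isclose(inner(E(i, i), T), 0.0)            # diagonal part
    for j in range(i + 1, n):
        assert np.isclose(inner(E(i, j) + E(j, i), T), 0.0)
```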
• May 12th 2011, 04:34 AM
qwerty1234
thanks so much, i understand better now!!