Thread: Orthogonal complement

  1. #1
    Junior Member
    Joined
    May 2011
    Posts
    28

    Orthogonal complement

Let V denote the real vector space M_(n,n)(R) of n by n matrices with real entries, and let S denote the subspace of all n by n symmetric matrices in V (i.e. matrices A such that A = A^T).

    ---- Determine the orthogonal complement of S in V. (Hint: use a basis of S)

    Thanks..

  2. #2
    MHF Contributor Drexel28's Avatar
    Joined
    Nov 2009
    From
    Berkeley, California
    Posts
    4,563
    Thanks
    22
    Quote Originally Posted by qwerty1234 View Post
Let V denote the real vector space M_(n,n)(R) of n by n matrices with real entries, and let S denote the subspace of all n by n symmetric matrices in V (i.e. matrices A such that A = A^T).

    ---- Determine the orthogonal complement of S in V. (Hint: use a basis of S)

    Thanks..
    The answer is the antisymmetric matrices. Here's a proof: don't peek until you've exhausted all other avenues:
    Spoiler:

Let $\displaystyle \text{Sym}_n\left(\mathbb{R}\right)$ be the set of all symmetric $\displaystyle n\times n$ matrices and $\displaystyle \text{ASym}_n\left(\mathbb{R}\right)$ the set of all anti-symmetric matrices (that is, matrices $\displaystyle A$ with $\displaystyle A^\top=-A$). I claim that $\displaystyle \text{Sym}_n\left(\mathbb{R}\right)^\perp=\text{ASym}_n\left(\mathbb{R}\right)$. Indeed, assuming that you are imposing the inner product on $\displaystyle \text{Mat}_n\left(\mathbb{R}\right)$ by identifying it with $\displaystyle \mathbb{R}^{n^2}$, you can readily prove (if you're really desperate a proof can be gleaned from here) that $\displaystyle \left\langle A,B\right\rangle=\text{tr}\left(AB^{\top}\right)$, from which it's immediate that $\displaystyle \text{ASym}_n\left(\mathbb{R}\right)\subseteq\text{Sym}_n\left(\mathbb{R}\right)^\perp$: if
$\displaystyle A\in\text{ASym}_n\left(\mathbb{R}\right),\;S\in\text{Sym}_n\left(\mathbb{R}\right)$ then

$\displaystyle \left\langle A,S\right\rangle=\text{tr}\left(AS^{\top}\right)=\text{tr}\left(AS\right)$

but with equal validity

$\displaystyle \left\langle A,S\right\rangle=\left\langle S,A\right\rangle=\text{tr}\left(SA^\top\right)=\text{tr}\left(-SA\right)=-\text{tr}\left(AS\right)$

and so $\displaystyle \left\langle A,S\right\rangle=0$. Now, to finish the argument note that every matrix $\displaystyle M\in\text{Mat}_n\left(\mathbb{R}\right)$ may be written as

    $\displaystyle \displaystyle M=\underbrace{\frac{M+M^\top}{2}}_{\text{symmetric }}+\underbrace{\frac{M-M^\top}{2}}_{\text{anti-symmetric}}$

and so $\displaystyle \text{Mat}_n\left(\mathbb{R}\right)=\text{Sym}_n\left(\mathbb{R}\right)+\text{ASym}_n\left(\mathbb{R}\right)$. That said, it's evident that $\displaystyle \text{Sym}_n\left(\mathbb{R}\right)\cap\text{ASym}_n\left(\mathbb{R}\right)=\{\bold{0}\}$, since if $\displaystyle M$ is in the intersection then $\displaystyle M=M^\top=-M$, so $\displaystyle M=\bold{0}$. Thus, $\displaystyle \text{Mat}_n\left(\mathbb{R}\right)=\text{Sym}_n\left(\mathbb{R}\right)\oplus\text{ASym}_n\left(\mathbb{R}\right)$ and so

$\displaystyle \text{codim}\,\text{ASym}_n\left(\mathbb{R}\right)=\dim\text{Sym}_n\left(\mathbb{R}\right)=\text{codim}\,\text{Sym}_n\left(\mathbb{R}\right)^\perp$

And so, recalling that $\displaystyle \text{ASym}_n\left(\mathbb{R}\right)\subseteq\text{Sym}_n\left(\mathbb{R}\right)^\perp$, you may conclude with a dimension argument that the two subspaces are equal.
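
If you want a numerical sanity check, here is a minimal NumPy sketch (the variable names are purely illustrative, not part of the problem) that verifies $\displaystyle \text{tr}\left(AS^\top\right)=0$ for a random symmetric $\displaystyle S$ and anti-symmetric $\displaystyle A$, checks the symmetric/anti-symmetric splitting of an arbitrary matrix, and confirms the dimension count:

Code:
import numpy as np

n = 4
rng = np.random.default_rng(0)

# build a random symmetric S and a random anti-symmetric A
B = rng.standard_normal((n, n))
S = (B + B.T) / 2            # S = S^T
A = (B - B.T) / 2            # A^T = -A

# the inner product <A, S> = tr(A S^T) should vanish
print(np.isclose(np.trace(A @ S.T), 0.0))    # True

# every matrix M splits into a symmetric and an anti-symmetric part
M = rng.standard_normal((n, n))
M_sym, M_asym = (M + M.T) / 2, (M - M.T) / 2
print(np.allclose(M, M_sym + M_asym))        # True

# dimension count: n(n+1)/2 + n(n-1)/2 = n^2
print(n * (n + 1) // 2 + n * (n - 1) // 2 == n * n)   # True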
    Last edited by Drexel28; May 11th 2011 at 08:37 PM.

  3. #3
    MHF Contributor

    Joined
    Mar 2011
    From
    Tejas
    Posts
    3,546
    Thanks
    842
    well a basis for $\displaystyle M_n(\mathbb{R})$ is $\displaystyle \{E_{ij}, 1\leq i,j\leq n\}$ where each $\displaystyle E_{ij}$ has a 1 in the i,j-th position and 0's elsewhere.

    a basis for S is $\displaystyle \{E_{ii}\} \cup \{E_{ij}+E_{ji}, i \neq j\}$. considering each of the $\displaystyle E_{ij}$ as standard basis elements in $\displaystyle \mathbb{R}^{n^2}$, we see any element of $\displaystyle S^{\perp}$ must be orthogonal to each and every one of the basis elements for S.

    in particular, this means that the i,i-th entry of $\displaystyle T \in S^{\perp}$ must be 0 for all i, and that for i ≠ j, the i,j-th entry of T must be the negative of the j,i-th entry.

    in short, $\displaystyle T^{t} = -T$, that is, T is anti-symmetric.

    on the other hand, if T is anti-symmetric, the entries on its diagonal are 0, so $\displaystyle \langle E_{ii},T\rangle = 0$, and the i,j-th entry of T satisfies $\displaystyle t_{ij} = -t_{ji}$ (the negative of the j,i-th entry), so $\displaystyle \langle E_{ij}+E_{ji},T\rangle = 0$. therefore, $\displaystyle T \in S^{\perp}$.
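
    for a quick numerical check of the same argument, here is a small NumPy sketch, assuming the inner product is computed as the entrywise dot product after flattening to $\displaystyle \mathbb{R}^{n^2}$ (which equals tr(XY^T); the helpers E and dot are just for illustration). it verifies that a random anti-symmetric T is orthogonal to every basis element $\displaystyle E_{ii}$ and $\displaystyle E_{ij}+E_{ji}$ of S:

    Code:
import numpy as np

n = 4
rng = np.random.default_rng(1)

def E(i, j, n):
    """matrix with a 1 in the (i, j) position and 0 elsewhere"""
    M = np.zeros((n, n))
    M[i, j] = 1.0
    return M

# a random anti-symmetric T
B = rng.standard_normal((n, n))
T = (B - B.T) / 2

# inner product via the identification with R^(n^2):
# entrywise dot product, which equals tr(X Y^T)
def dot(X, Y):
    return float(np.sum(X * Y))

# T is orthogonal to every basis element of S
checks  = [np.isclose(dot(E(i, i, n), T), 0.0) for i in range(n)]
checks += [np.isclose(dot(E(i, j, n) + E(j, i, n), T), 0.0)
           for i in range(n) for j in range(n) if i != j]
print(all(checks))    # True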

  4. #4
    Junior Member
    Joined
    May 2011
    Posts
    28
    thanks so much, i understand better now!!
