Math Help - Orthogonal matrix

  1. #1 tttcomrader

    Orthogonal matrix

    Find an orthogonal or unitary matrix P and a diagonal matrix D such that P^*AP=D.

    a) A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}
    b) A = \begin{pmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{pmatrix}

    My solution so far:

    From an example in the book, I know that I have to find the eigenvalues of A, then find a basis for each eigenspace. Use the Gram-Schmidt process to obtain an orthonormal basis of each eigenspace; the union of these bases makes up the columns of P.

    Now, for a), I have eigenvalues i and -i, but I have difficulty finding the eigenspaces. I have E_{i}=N(T-iI)= \{ \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} \in \mathbb {C}^2 : \begin{pmatrix} -i & -1 \\ 1 & -i \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \}

    But I have forgotten how to find a basis for that space; I know I have done this before, but I can't seem to do it now.

    Thanks.

  2. #2 Moo
    Hello,

    I'm not used to finding orthogonal bases, so I'll only be able to show you the eigenspaces.

    An eigenvalue \lambda satisfies:

    AX=\lambda X

    Let's do it for i.

    \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} i x_1 \\ i x_2 \end{pmatrix}

    Expanding the matrix product on the left side gives a system of equations:

    0 x_1-x_2=i x_1
    x_1+0 x_2=i x_2

    that is,

    i x_1=-x_2 (1)
    x_1=i x_2 (2)

    From (1), dividing both sides by i (and using 1/i = -i), we have

    x_1=-\frac{x_2}{i}=i x_2

    So the only condition is that x_1=i x_2

    So take x_2=1

    Then x_1=i

    So the eigenspace associated to i is generated by the vector E_i=\begin{pmatrix} i \\ 1 \end{pmatrix}


    Now, this will be exactly the same for -i.
    If I'm not wrong, you should get:

    E_{-i}=\begin{pmatrix} 1 \\ i \end{pmatrix}
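These two eigenvectors can be sanity-checked by multiplying them out; a small pure-Python sketch (the `matvec` helper is just illustrative, A is the matrix from part a):

```python
# A from part (a); check A v = lambda v for the two claimed eigenvectors.
A = [[0, -1],
     [1, 0]]

def matvec(M, v):
    """Multiply a 2x2 matrix by a length-2 vector."""
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

vi = [1j, 1]    # claimed eigenvector for eigenvalue  i
vmi = [1, 1j]   # claimed eigenvector for eigenvalue -i

assert matvec(A, vi) == [1j * z for z in vi]
assert matvec(A, vmi) == [-1j * z for z in vmi]
```

Both assertions pass: A(i,1) = (-1, i) = i(i,1) and A(1,i) = (-i, 1) = -i(1,i).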

  3. #3 TheEmptySet
    Quote Originally Posted by tttcomrader
    Find an orthogonal or unitary matrix P and a diagonal matrix D such that P^*AP=D.

    a) A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}
    b) A = \begin{pmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{pmatrix}

    My solution so far:

    From an example in the book, I know that I have to find the eigenvalues of A, then find a basis for each eigenspace. Use the Gram-Schmidt process to obtain an orthonormal basis of each eigenspace; the union of these bases makes up the columns of P.

    Now, for a), I have eigenvalues i and -i, but I have difficulty finding the eigenspaces. I have E_{i}=N(T-iI)= \{ \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} \in \mathbb {C}^2 : \begin{pmatrix} -i & -1 \\ 1 & -i \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \}

    But I have forgotten how to find a basis for that space; I know I have done this before, but I can't seem to do it now.

    Thanks.
    So we start with

    \begin{bmatrix} -\lambda & -1 \\ 1 & -\lambda \end{bmatrix}

    Start with \lambda=-i:

    \begin{bmatrix} i & -1 & 0 \\ 1 & i & 0 \end{bmatrix} \quad iR_1+R_2

    \begin{bmatrix} i & -1 & 0 \\ 0 & 0 & 0 \end{bmatrix} \quad -iR_1

    \begin{bmatrix} 1 & i & 0 \\ 0 & 0 & 0 \end{bmatrix}

    so x_2=t,\ x_1=-it

    v_1= \begin{bmatrix} -it \\ t \end{bmatrix} = t \begin{bmatrix} -i \\ 1 \end{bmatrix}

    Now start with \lambda=i:

    \begin{bmatrix} -i & -1 & 0 \\ 1 & -i & 0 \end{bmatrix} \quad -iR_1+R_2;\ iR_1

    \begin{bmatrix} 1 & -i & 0 \\ 0 & 0 & 0 \end{bmatrix}

    so x_2=s,\ x_1=is

    v_2= \begin{bmatrix} is \\ s \end{bmatrix} = s \begin{bmatrix} i \\ 1 \end{bmatrix}

    The magnitude of each is \sqrt{2}, so the matrix is

    P = \begin{bmatrix} \frac{-i}{\sqrt{2}} & \frac{i}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \end{bmatrix}
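This P can be checked numerically. A small pure-Python sketch (the `matmul`/`ctranspose` helpers are just illustrative) computing P^*AP, which should come out as diag(-i, i), the eigenvalues in the same order as the columns of P:

```python
import math

# Columns of P: normalized eigenvectors (-i,1)/sqrt(2) and (i,1)/sqrt(2)
s = 1 / math.sqrt(2)
P = [[-1j * s, 1j * s],
     [s,       s]]
A = [[0, -1],
     [1, 0]]

def matmul(X, Y):
    """2x2 complex matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def ctranspose(X):
    """Conjugate transpose P^* of a 2x2 matrix."""
    return [[X[j][i].conjugate() for j in range(2)] for i in range(2)]

D = matmul(ctranspose(P), matmul(A, P))
# D is diag(-i, i) up to floating-point rounding
```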

  4. #4 tttcomrader
    The only thing I don't understand is how to get from \begin{bmatrix} 1 & i & 0 \\ 0 & 0 & 0 \end{bmatrix} to the solution x_{1} = -is,\ x_{2} = s.

    And isn't the magnitude of  \begin{bmatrix} i \\ 1 \end{bmatrix} = \sqrt {i^2+1^2} = \sqrt {-1+1} = 0?
    Last edited by tttcomrader; April 12th 2008 at 08:32 AM.

  5. #5 TheEmptySet
    Quote Originally Posted by tttcomrader
    The only thing I don't understand is how to get from \begin{bmatrix} 1 & i & 0 \\ 0 & 0 & 0 \end{bmatrix} to the solution x_{1} = -is,\ x_{2} = s.

    And isn't the magnitude of \begin{bmatrix} i \\ 1 \end{bmatrix} = \sqrt {i^2+1^2} = \sqrt {-1+1} = 0?
    \begin{bmatrix} 1 & i & 0 \\ 0 & 0 & 0 \end{bmatrix}

    This is equivalent to the system

    x+iy=0

    0x+0y=0

    The second equation holds for every value of y, because 0 \cdot y = 0.

    So we parameterize the solution by letting

    y=s, \ s \in \mathbb{C}

    Now substitute this into the first equation:

    x+i(s)=0 \iff x=-is

    So all of the solutions to the system are (-is,s) = s(-i,1).

    We need to take the modulus of each entry for the magnitude, so we get

    |v_2|=\sqrt{|-i|^2+|1|^2}=\sqrt{1+1}=\sqrt{2}
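Python's complex arithmetic makes the distinction concrete (a minimal sketch; `abs` gives the modulus of a complex number):

```python
v = [1j, 1]

# Wrong: summing squares of the entries treats i like a real number,
# giving i^2 + 1^2 = -1 + 1 = 0, which is not a length.
wrong = sum(z ** 2 for z in v)

# Right: sum the squared moduli |i|^2 + |1|^2 = 1 + 1 = 2.
right = sum(abs(z) ** 2 for z in v)
norm = right ** 0.5   # sqrt(2), the true magnitude
```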

  6. #6 tttcomrader
    Thank you so so so much!!!

    I wonder why our professor didn't explain that.

    You just made my weekend much easier.

  7. #7 tttcomrader
    Arrr... I'm doing the second problem and I'm in trouble again.

    Consider A = \begin{pmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{pmatrix}

    I found the eigenvalues to be 1 and 4.

    Then I found the basis vectors (-2,1,1) for 1, and (1,1,1) for 4.

    So I have \frac{1}{\sqrt{6}}(-2,1,1), \ \frac{1}{\sqrt{3}}(1,1,1). But that's not enough to make P; did I miss a basis vector?

    Thank you.

  8. #8 Moo
    Hello,

    As the matrix is real symmetric, it is diagonalizable. This means that:
    - either there are 3 distinct eigenvalues,
    - or the geometric multiplicities of the eigenvalues sum to 3,
    - i.e. the eigenspaces together generate a subspace of dimension 3.

    So you've missed one ^^

  9. #9 TheEmptySet
    Quote Originally Posted by tttcomrader
    Arrr... I'm doing the second problem and I'm in trouble again.

    Consider A = \begin{pmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{pmatrix}

    I found the eigenvalues to be 1 and 4.

    Then I found the basis vectors (-2,1,1) for 1, and (1,1,1) for 4.

    So I have \frac{1}{\sqrt{6}}(-2,1,1), \ \frac{1}{\sqrt{3}}(1,1,1). But that's not enough to make P; did I miss a basis vector?

    Thank you.
    The basis for the eigenvalue \lambda=1 has dimension 2. Row reducing the augmented matrix for (A-I)X=0:

    \begin{bmatrix} 1 & 1 & 1 & 0 \\ 1 & 1 & 1 & 0 \\ 1 & 1 & 1 & 0 \end{bmatrix} \quad -R_1+R_2, \ -R_1+R_3

    \begin{bmatrix} 1 & 1 & 1 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix}

    so if we let z=s and y=t, then x=-t-s and we get the solution

    \begin{bmatrix} -t-s \\ t \\ s \end{bmatrix} = \begin{bmatrix} -t \\ t \\ 0 \end{bmatrix} + \begin{bmatrix} -s \\ 0 \\ s \end{bmatrix} = t \begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix} + s \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix}

    These are two basis vectors for the single eigenvalue \lambda=1. Note that they are linearly independent but not orthogonal to each other, so run Gram-Schmidt on them before normalizing.

    Good luck.
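One can verify directly which candidate vectors satisfy Av = 1v: (-1,1,0) and (-1,0,1) do, and (1,1,1) satisfies Av = 4v. A small pure-Python check (the `matvec` helper is just illustrative):

```python
A = [[2, 1, 1],
     [1, 2, 1],
     [1, 1, 2]]

def matvec(M, v):
    """Multiply a 3x3 matrix by a length-3 vector."""
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

# Vectors in the lambda = 1 eigenspace satisfy A v = 1 * v ...
for v in ([-1, 1, 0], [-1, 0, 1]):
    assert matvec(A, v) == v
# ... and (1,1,1) satisfies A v = 4 * v.
assert matvec(A, [1, 1, 1]) == [4, 4, 4]
```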

  10. #10 tttcomrader
    Arr... I don't know, but I just keep running into trouble in this problem...

    I have P = \begin{bmatrix} -\frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{3}} \\ 0 & -\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{3}} \\ \frac{1}{\sqrt{2}} & 0 & \frac{1}{\sqrt{3}} \end{bmatrix}

    So P^* = \begin{bmatrix} -\frac{1}{\sqrt{2}} & 0 & \frac{1}{\sqrt{2}} \\ -\frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} & 0 \\ \frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} \end{bmatrix}

    But when I compute P^* A P, it doesn't give me a diagonal matrix. Why!? Thank you.
    Last edited by tttcomrader; April 15th 2008 at 08:32 AM.
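The failure can be checked directly: the second column of this P is built from (-1,-1,0), which does not satisfy Av = v (compute A(-1,-1,0) = (-3,-3,-2)), and in any case the two \lambda=1 columns here are not orthogonal to each other, so P is not orthogonal. A small pure-Python sketch (the helper names `dot`, `normalize`, `matmul` are just illustrative), assuming the \lambda=1 eigenvectors (-1,1,0) and (-1,0,1) and running Gram-Schmidt on them before building P:

```python
import math

A = [[2, 1, 1],
     [1, 2, 1],
     [1, 1, 2]]

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def normalize(x):
    n = math.sqrt(dot(x, x))
    return [a / n for a in x]

# Eigenvectors from the row reduction: (-1,1,0), (-1,0,1) for lambda = 1,
# (1,1,1) for lambda = 4.  The first two are independent but not orthogonal.
v1, v2, v3 = [-1.0, 1.0, 0.0], [-1.0, 0.0, 1.0], [1.0, 1.0, 1.0]

# Gram-Schmidt on the lambda = 1 pair: keep u1, subtract v2's projection on u1.
u1 = normalize(v1)
c = dot(v2, u1)
u2 = normalize([a - c * b for a, b in zip(v2, u1)])
u3 = normalize(v3)  # already orthogonal to the whole lambda = 1 eigenspace

# P has the orthonormal vectors as columns; then P^T A P is diagonal.
P = [[u1[i], u2[i], u3[i]] for i in range(3)]
PT = [[P[j][i] for j in range(3)] for i in range(3)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

D = matmul(PT, matmul(A, P))
```

With this P, the diagonal entries of D come out as 1, 1, 4, matching the eigenvalue order of the columns.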
