# Math Help - Orthogonal matrix

1. ## Orthogonal matrix

Find an orthogonal or unitary matrix P and a diagonal matrix D such that $P^*AP=D$

a) $A =
\begin{pmatrix} 0 && -1 \\ 1 && 0 \end{pmatrix}
$
b) $A = \begin{pmatrix} 2 && 1 && 1 \\ 1 && 2 && 1 \\ 1 && 1 && 2 \end{pmatrix}$

My solution so far:

From an example in the book, I know that I have to find the eigenvalues of A, then find a basis for the eigenspaces. Use the Gram-Schmidt process to obtain the orthogonal basis, take the union of these bases, then they will make up the entries of P.

Now, for a), I have eigenvalues $i$ and $-i$, but I have difficulty trying to find the eigenspaces. I have $E_{i}=N(T-iI)= \{ \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} \in \mathbb {C}^2 : \begin{pmatrix} -i & -1 \\ 1 & -i \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \}$

But I forgot how to find a basis of that, I know I have done this before, but I just can't do it again now.

Thanks.

2. Hello,

I hate and am not used to finding orthogonal bases... I'll only be able to show you the eigenspaces.

An eigenvalue $\lambda$ with eigenvector $X$ satisfies:

$AX=\lambda X$

Let's do it for i.

$\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} i x_1 \\ i x_2 \end{pmatrix}$

If we calculate the matrix product on the left-hand side, we get a system of equations:

$0 x_1-x_2=i x_1$
$x_1+0 x_2=i x_2$

$i x_1=-x_2$ (1)
$x_1=i x_2$ (2)

From (1), dividing both sides by $i$, we have

$x_1=-\frac{x_2}{i}=i x_2$

So the only condition is that $x_1=i x_2$

So take $x_2=1$

Then $x_1=i$

A vector generating the eigenspace $E_i$ associated to $i$ is $\begin{pmatrix} i \\ 1 \end{pmatrix}$

Now, this will be exactly the same for $-i$.
If I'm not wrong, you should get that $E_{-i}$ is generated by:

$\begin{pmatrix} 1 \\ i \end{pmatrix}$
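These eigenpairs can be sanity-checked numerically with Python's built-in complex numbers (`1j` is $i$); a minimal sketch, where `mat_vec` is a helper of my own:

```python
# Check A v = lambda v for both eigenpairs of A = [[0, -1], [1, 0]].

def mat_vec(A, v):
    """Multiply a 2x2 matrix by a 2-vector (plain complex arithmetic)."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

A = [[0, -1], [1, 0]]

v_i = [1j, 1]                                   # eigenvector for lambda = i
assert mat_vec(A, v_i) == [1j * x for x in v_i]

v_minus_i = [1, 1j]                             # eigenvector for lambda = -i
assert mat_vec(A, v_minus_i) == [-1j * x for x in v_minus_i]
```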

3. Row reduce $A - \lambda I$ for each eigenvalue:

$\begin{bmatrix}
- \lambda & -1 \\
1 & -\lambda \\
\end{bmatrix}$

start with $\lambda=-i$

$\begin{bmatrix}
i & -1 & 0\\
1 & i & 0\\
\end{bmatrix} iR_1+R_2$

$\begin{bmatrix}
i & -1 & 0\\
0 & 0 & 0\\
\end{bmatrix} -iR_1$

$\begin{bmatrix}
1 & i & 0\\
0 & 0 & 0\\
\end{bmatrix}$

so $x_2=t,x_1=-it$

$v_1= \begin{bmatrix}
-it \\
t \\
\end{bmatrix}= t\begin{bmatrix}
-i \\
1 \\
\end{bmatrix}$

start with $\lambda=i$

$\begin{bmatrix}
-i & -1 & 0\\
1 & -i & 0\\
\end{bmatrix} -iR_1+R_2;iR_1$

$\begin{bmatrix}
1 & -i & 0\\
0 & 0 & 0\\
\end{bmatrix}$

so $x_2=s,x_1=is$

$v_2= \begin{bmatrix}
is \\
s \\
\end{bmatrix}= s\begin{bmatrix}
i \\
1 \\
\end{bmatrix}$

the magnitude of each is $\sqrt{2}$

so the matrix is

$\begin{bmatrix}
\frac{-i}{\sqrt{2}} & \frac{i}{\sqrt{2}} \\
\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \\
\end{bmatrix}$
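The diagonalization above can be verified end to end. A short Python check, using plain complex arithmetic (`mat_mul` and `conj_T` are my own small helpers, not from any library):

```python
import math

def mat_mul(X, Y):
    """Multiply two square matrices given as nested lists."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def conj_T(X):
    """Conjugate transpose P* of a matrix."""
    n = len(X)
    return [[complex(X[j][i]).conjugate() for j in range(n)] for i in range(n)]

A = [[0, -1], [1, 0]]
r = 1 / math.sqrt(2)
P = [[-1j * r, 1j * r],   # columns: (-i, 1)/sqrt(2) and (i, 1)/sqrt(2)
     [r, r]]

D = mat_mul(mat_mul(conj_T(P), A), P)

# D should be diag(-i, i), matching the eigenvalue order of the columns.
assert abs(D[0][0] + 1j) < 1e-12 and abs(D[1][1] - 1j) < 1e-12
assert abs(D[0][1]) < 1e-12 and abs(D[1][0]) < 1e-12
```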

4. The only thing I don't understand is how to get from $\begin{bmatrix} 1 & i & 0 \\ 0 & 0 & 0 \end{bmatrix}$ to finding the solution $x_{1} = -is, x_{2} = s$

And isn't the magnitude of $\begin{bmatrix} i \\ 1 \end{bmatrix} = \sqrt {i^2+1^2} = \sqrt {-1+1} = 0$?

5. Start from the reduced matrix

$\begin{bmatrix} 1 & i & 0 \\ 0 & 0 & 0 \end{bmatrix}$

this is equivalent to the system

$x+iy=0$

$0x+0y=0$

The second equation holds for every value of $y$, since anything times 0 is 0.

So we parameterize the solution by letting

$y=s, \ s \in \mathbb{C}$

Now we substitute this into the first equation:

$x+is=0 \iff x=-is$

so all of the solutions to the system are $(-is,s)=s(-i,1)$

we need to take the modulus of $i$ for the magnitude

so we get

$|v_2|=\sqrt{|i|^2+1^2}=\sqrt{1+1}=\sqrt{2}$
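This matches what Python computes, since `abs` on a complex number is exactly the modulus:

```python
import math

v = [1j, 1]  # the eigenvector (i, 1)
norm = math.sqrt(sum(abs(x) ** 2 for x in v))
assert abs(norm - math.sqrt(2)) < 1e-12
# Using i^2 = -1 instead of |i|^2 = 1 would wrongly give sqrt(-1 + 1) = 0.
```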

6. Thank you so so so much!!!

I wonder why our professor didn't explain that.

You just made my weekend much easier.

7. Arrr... I'm doing the second problem and I'm in trouble again.

Consider $A = \begin{pmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{pmatrix}$

I found the eigenvalues are 1 and 4.

Then I found the basis to be (-2,1,1) for 1, and (1,1,1) for 4.

So I have $\frac {1}{ \sqrt {6}}(-2,1,1), \frac {1}{ \sqrt {3}} (1,1,1)$ But that's not enough to make P, did I miss a basis?

Thank you.

8. Hello,

As the matrix is real symmetric, it's diagonalizable. This means that:
- either there are 3 distinct eigenvalues,
- or some eigenvalue is repeated and its eigenspace has dimension equal to its multiplicity;
- in every case, the eigenspaces together generate a space of dimension 3.
So you've missed one ^^

9. The eigenspace for the eigenvalue $\lambda=1$ has dimension 2. Row reduce $A - I$:

$\begin{bmatrix}
1 & 1 & 1 & 0 \\
1 & 1 & 1 & 0 \\
1 & 1 & 1 & 0
\end{bmatrix} -R_1+R_2, -R_1+R_3$

$\begin{bmatrix}
1 & 1 & 1 & 0 \\
0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0
\end{bmatrix}$

so if we let $z=s$ and $y=t$, the equation $x+y+z=0$ gives $x=-t-s$, and the solution is

$\begin{bmatrix}
-t-s \\
t \\
s
\end{bmatrix} = \begin{bmatrix}
-t \\
t \\
0
\end{bmatrix} + \begin{bmatrix}
-s \\
0 \\
s
\end{bmatrix} =$

$s \begin{bmatrix}
-1 \\
0 \\
1
\end{bmatrix} + t \begin{bmatrix}
-1 \\
1 \\
0
\end{bmatrix}
$

These are two vectors for the single eigenvalue $\lambda=1$. Note that they are not orthogonal to each other, so you still need to apply Gram-Schmidt to them before building P.
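A quick Python check confirms that $(-1,0,1)$ and $(-1,1,0)$ are both eigenvectors for $\lambda=1$ but are not orthogonal, and shows one Gram-Schmidt step (the helpers `mat_vec` and `dot` are my own):

```python
def mat_vec(A, v):
    """Multiply a square matrix by a vector."""
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(v))]

def dot(u, v):
    """Real dot product."""
    return sum(a * b for a, b in zip(u, v))

A = [[2, 1, 1], [1, 2, 1], [1, 1, 2]]
u = [-1, 0, 1]
w = [-1, 1, 0]

assert mat_vec(A, u) == u and mat_vec(A, w) == w  # both satisfy A v = 1*v
assert dot(u, w) == 1                             # but they are not orthogonal

# One Gram-Schmidt step: subtract from w its projection onto u.
w2 = [w[i] - dot(w, u) / dot(u, u) * u[i] for i in range(3)]
assert dot(u, w2) == 0                            # now orthogonal
```

Here `w2` comes out as $(1/2, 1, -1/2)$ up to sign, i.e. a multiple of $(-1, 2, -1)$, which normalizes to $\frac{1}{\sqrt6}(-1,2,-1)$.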

Good luck.

10. Arr... I don't know, but I just keep running into trouble in this problem...

I have $P = \begin{bmatrix} - \frac {1}{ \sqrt {2}} & - \frac {1}{ \sqrt {2}} & \frac {1}{ \sqrt {3}} \\ 0 & - \frac {1}{ \sqrt {2}} & \frac {1} { \sqrt {3}} \\ \frac {1} { \sqrt {2}} & 0 & \frac {1} { \sqrt {3}} \end{bmatrix}$

So $P^* = \begin{bmatrix} - \frac {1} { \sqrt {2}} & 0 & \frac {1}{ \sqrt {2}} \\ - \frac {1}{ \sqrt {2}} & - \frac {1}{ \sqrt {2}} & 0 \\ \frac {1} { \sqrt {3}} & \frac {1}{ \sqrt {3}} & \frac {1} { \sqrt {3}} \end{bmatrix}$

But when I do $P^* A P$, it doesn't give me a diagonal matrix, why why why!!!!????? thank you
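A numerical check of this P pinpoints the problem: its first two columns are not orthogonal, and the second column is not an eigenvector of A at all, so $P^*AP$ cannot come out diagonal. A minimal sketch (plain Python, `dot` is my own helper):

```python
import math

A = [[2, 1, 1], [1, 2, 1], [1, 1, 2]]
r2, r3 = 1 / math.sqrt(2), 1 / math.sqrt(3)
P = [[-r2, -r2, r3],
     [0.0, -r2, r3],
     [r2, 0.0, r3]]

col1 = [row[0] for row in P]
col2 = [row[1] for row in P]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# The first two columns are not orthogonal, so P is not unitary:
assert abs(dot(col1, col2) - 0.5) < 1e-12

# Worse, col2 is not an eigenvector of A: A*col2 has a nonzero
# third entry even though col2's third entry is zero.
Acol2 = [sum(A[i][j] * col2[j] for j in range(3)) for i in range(3)]
assert col2[2] == 0.0 and abs(Acol2[2]) > 1e-9
```

Replacing the second column with a unit eigenvector for $\lambda=1$ that is orthogonal to the first, e.g. $\frac{1}{\sqrt6}(-1,2,-1)$, makes $P^*AP$ diagonal.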