1. **Orthogonal matrix**

Find an orthogonal or unitary matrix P and a diagonal matrix D such that $\displaystyle P^*AP=D$

a) $\displaystyle A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$ b) $\displaystyle A = \begin{pmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{pmatrix}$

My solution so far:

From an example in the book, I know that I have to find the eigenvalues of A, then find a basis for each eigenspace, use the Gram-Schmidt process to obtain an orthonormal basis, and take the union of these bases; those vectors make up the columns of P.

Now, for a), I have eigenvalues i and -i, but I have difficulty trying to find the eigenspaces. I have $\displaystyle E_{i}=N(T-iI)= \{ \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} \in \mathbb {C}^2 : \begin{pmatrix} -i & -1 \\ 1 & -i \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \}$

But I forgot how to find a basis of that space; I know I have done this before, but I just can't do it again now.

Thanks.

2. Hello,

I dislike and am not used to finding orthogonal bases... I'll only be able to show you the eigenspaces.

An eigenvalue $\displaystyle \lambda$ with eigenvector $\displaystyle X$ satisfies:

$\displaystyle AX=\lambda X$

Let's do it for i.

$\displaystyle \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} i x_1 \\ i x_2 \end{pmatrix}$

If we calculate the matrix product on the left-hand side, we'll have a system of equations:

$\displaystyle 0 x_1-x_2=i x_1$
$\displaystyle x_1+0 x_2=i x_2$

$\displaystyle i x_1=-x_2$ (1)
$\displaystyle x_1=i x_2$ (2)

From (1), if we divide both sides by i, we have

$\displaystyle x_1=-\frac{x_2}{i}=i x_2$

So the only condition is that $\displaystyle x_1=i x_2$

So take $\displaystyle x_2=1$

Then $\displaystyle x_1=i$

The eigenspace associated to i is generated by a single vector: $\displaystyle E_i=\operatorname{span}\left\{\begin{pmatrix} i \\ 1 \end{pmatrix}\right\}$

Now, this will be exactly the same for -i.
If I'm not wrong, you should get:

$\displaystyle E_{-i}=\operatorname{span}\left\{\begin{pmatrix} 1 \\ i \end{pmatrix}\right\}$
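As a quick numerical sanity check (this uses NumPy, which the thread itself doesn't mention), one can verify that these vectors really satisfy $\displaystyle AX=\lambda X$:

```python
import numpy as np

# A from part a)
A = np.array([[0, -1],
              [1,  0]], dtype=complex)

v_i  = np.array([1j, 1])   # claimed generator of E_i
v_mi = np.array([1, 1j])   # claimed generator of E_{-i}

# A X should equal lambda X for each claimed eigenpair
print(np.allclose(A @ v_i,  1j * v_i))    # True
print(np.allclose(A @ v_mi, -1j * v_mi))  # True
```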

3.

$\displaystyle \begin{bmatrix} - \lambda & -1 \\ 1 & -\lambda \\ \end{bmatrix}$

start with $\displaystyle \lambda=-i$

$\displaystyle \begin{bmatrix} i & -1 & 0\\ 1 & i & 0\\ \end{bmatrix} iR_1+R_2$

$\displaystyle \begin{bmatrix} i & -1 & 0\\ 0 & 0 & 0\\ \end{bmatrix} -iR_1$

$\displaystyle \begin{bmatrix} 1 & i & 0\\ 0 & 0 & 0\\ \end{bmatrix}$

so $\displaystyle x_2=t,x_1=-it$

$\displaystyle v_1= \begin{bmatrix} -it \\ t \\ \end{bmatrix}= t\begin{bmatrix} -i \\ 1 \\ \end{bmatrix}$

start with $\displaystyle \lambda=i$

$\displaystyle \begin{bmatrix} -i & -1 & 0\\ 1 & -i & 0\\ \end{bmatrix} -iR_1+R_2;\ iR_1$

$\displaystyle \begin{bmatrix} 1 & -i & 0\\ 0 & 0 & 0\\ \end{bmatrix}$

so $\displaystyle x_2=s,x_1=is$

$\displaystyle v_2= \begin{bmatrix} is \\ s \\ \end{bmatrix}= s\begin{bmatrix} i \\ 1 \\ \end{bmatrix}$

the magnitude of each is $\displaystyle \sqrt{2}$

so the matrix is

$\displaystyle \begin{bmatrix} \frac{-i}{\sqrt{2}} & \frac{i}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \\ \end{bmatrix}$
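As a check (NumPy again, not part of the original working), one can confirm that this matrix is unitary and that $\displaystyle P^*AP$ comes out diagonal:

```python
import numpy as np

A = np.array([[0, -1],
              [1,  0]], dtype=complex)

# columns: normalized eigenvectors for lambda = -i and lambda = i
P = np.array([[-1j, 1j],
              [  1,  1]], dtype=complex) / np.sqrt(2)

print(np.allclose(P.conj().T @ P, np.eye(2)))  # True: P is unitary
D = P.conj().T @ A @ P                         # P* A P
print(np.round(D, 10))                         # diag(-i, i)
```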

4. The only thing I don't understand is how to get from $\displaystyle \begin{bmatrix} 1 & i & 0 \\ 0 & 0 & 0 \end{bmatrix}$ to finding the solution $\displaystyle x_{1} = -is, x_{2} = s$

And isn't the magnitude of $\displaystyle \begin{bmatrix} i \\ 1 \end{bmatrix} = \sqrt {i^2+1^2} = \sqrt {-1+1} = 0$?

5.
$\displaystyle \begin{bmatrix} 1 & i & 0 \\ 0 & 0 & 0 \end{bmatrix}$

this is equivalent to the system

$\displaystyle x+iy=0$

$\displaystyle 0x+0y=0$

the second equation is true for all values of y, because 0 times y is 0

so we parameterize the solution by letting

$\displaystyle y=s, s \in \mathbb{C}$

now we sub this into the first equation

$\displaystyle x+is=0 \iff x=-is$

so all of the solutions to the system are $\displaystyle (-is,\ s) = s(-i,\ 1)$

we need to take the modulus of $\displaystyle i$ when computing the magnitude

so we get

$\displaystyle |v_2|=\sqrt{|-i|^2+1^2}=\sqrt{1+1}=\sqrt{2}$
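(An aside not in the original reply:) the same computation in NumPy, whose `np.linalg.norm` applies the complex modulus entrywise exactly as above:

```python
import numpy as np

v_2 = np.array([1j, 1])

# sqrt(|i|^2 + |1|^2) = sqrt(1 + 1); norm takes the modulus entrywise
print(np.linalg.norm(v_2))              # 1.41421356...
# same value via the Hermitian inner product <v, v> = v* v
print(np.sqrt(np.vdot(v_2, v_2).real))  # 1.41421356...
```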

6. Thank you so so so much!!!

I wonder why our professor didn't explain that.

You just made my weekend much easier.

7. Arrr... I'm doing the second problem and I'm in trouble again.

Consider $\displaystyle A = \begin{pmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{pmatrix}$

I found the eigenvalues are 1 and 4.

Then I found the basis to be (-2,1,1) for 1, and (1,1,1) for 4.

So I have $\displaystyle \frac {1}{ \sqrt {6}}(-2,1,1), \frac {1}{ \sqrt {3}} (1,1,1)$ But that's not enough to make P, did I miss a basis?

Thank you.

8. Hello,

As the matrix is symmetric, it's diagonalizable. This means that one of the following holds:
- either there are 3 distinct eigenvalues,
- or the sum of the multiplicities of the eigenvalues is 3,
- or, equivalently, the eigenspaces together span a subspace of dimension 3.

So you've missed one ^^

9.
The eigenspace for the eigenvalue $\displaystyle \lambda=1$ has dimension 2:

$\displaystyle \begin{bmatrix} 1 & 1 & 1 & 0 \\ 1 & 1 & 1 & 0 \\ 1 & 1 & 1 & 0 \\ \end{bmatrix} -R_1+R_2,\ -R_1+R_3$

$\displaystyle \begin{bmatrix} 1 & 1 & 1 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ \end{bmatrix}$

so if we let z=s and y=t we get the solution

$\displaystyle \begin{bmatrix} -t-s \\ t \\ s \\ \end{bmatrix} = \begin{bmatrix} -t \\ t \\ 0 \\ \end{bmatrix} + \begin{bmatrix} -s \\ 0 \\ s \\ \end{bmatrix} =$

$\displaystyle s \begin{bmatrix} -1 \\ 0 \\ 1 \\ \end{bmatrix} + t \begin{bmatrix} -1 \\ 1 \\ 0 \\ \end{bmatrix}$

These are two vectors for the single eigenvalue $\displaystyle \lambda=1$.
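The eigen-data above can be cross-checked numerically (a NumPy aside, not part of the original reply): the eigenvalue 1 should appear twice, and $\displaystyle A - I$ should have a 2-dimensional null space:

```python
import numpy as np

A = np.array([[2, 1, 1],
              [1, 2, 1],
              [1, 1, 2]], dtype=float)

# symmetric matrix: eigvalsh returns real eigenvalues in ascending order
print(np.round(np.linalg.eigvalsh(A), 10))   # [1. 1. 4.]

# A - I has rank 1, so the lambda = 1 eigenspace has dimension 3 - 1 = 2
print(np.linalg.matrix_rank(A - np.eye(3)))  # 1
```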

Good luck.

10. Arr... I don't know, but I just keep running into trouble in this problem...

I have $\displaystyle P = \begin{bmatrix} - \frac {1}{ \sqrt {2}} & - \frac {1}{ \sqrt {2}} & \frac {1}{ \sqrt {3}} \\ 0 & - \frac {1}{ \sqrt {2}} & \frac {1} { \sqrt {3}} \\ \frac {1} { \sqrt {2}} & 0 & \frac {1} { \sqrt {3}} \end{bmatrix}$

So $\displaystyle P^* = \begin{bmatrix} - \frac {1} { \sqrt {2}} & 0 & \frac {1}{ \sqrt {2}} \\ - \frac {1}{ \sqrt {2}} & - \frac {1}{ \sqrt {2}} & 0 \\ \frac {1} { \sqrt {3}} & \frac {1}{ \sqrt {3}} & \frac {1} { \sqrt {3}} \end{bmatrix}$

But when I compute $\displaystyle P^* A P$, it doesn't give me a diagonal matrix. Why why why!!?? Thank you
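One way to see what is going wrong (a NumPy sketch, not from the thread): for the recipe $\displaystyle P^*AP=D$ to work, P must be orthogonal, i.e. $\displaystyle P^TP = I$ for a real P, and here it is not:

```python
import numpy as np

A = np.array([[2, 1, 1],
              [1, 2, 1],
              [1, 1, 2]], dtype=float)

s2, s3 = np.sqrt(2), np.sqrt(3)
# P exactly as posted above
P = np.array([[-1/s2, -1/s2, 1/s3],
              [  0.0, -1/s2, 1/s3],
              [ 1/s2,   0.0, 1/s3]])

print(np.allclose(P.T @ P, np.eye(3)))  # False: columns are not orthonormal
print(np.round(P.T @ A @ P, 3))         # hence not diagonal
```

Two separate things need fixing: the middle column, $\displaystyle \tfrac{1}{\sqrt2}(-1,-1,0)$, is not in the $\displaystyle \lambda=1$ eigenspace at all (its entries don't sum to 0), and even with a valid pair such as $\displaystyle (-1,0,1)$ and $\displaystyle (-1,1,0)$ a Gram-Schmidt step inside that eigenspace is still needed, since those two are not orthogonal to each other; the $\displaystyle \lambda=4$ vector $\displaystyle (1,1,1)$ is automatically orthogonal to both.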