Matrix Spectra with Troublesome Trigonometric Term

Hello!

I’m trying to find the spectra (eigenvalues) of matrix

$\displaystyle A = \left( \begin{array}{ccc} \cos(x) & -\sin(x) & 0 \\ \sin(x) & \cos(x) & 0 \\ 0 & 0 & 1 \end{array} \right)$

I start by generating its characteristic polynomial which yields:

$\displaystyle (\cos(x)-\lambda)(\cos(x)-\lambda)(1-\lambda)+\sin(x)\sin(x)(1-\lambda)=0$

$\displaystyle (1-\lambda)[(\cos(x)-\lambda)(\cos(x)-\lambda)+\sin(x)\sin(x)]=0$

$\displaystyle (1-\lambda)[\cos^2(x)-2\lambda\cos(x)+\lambda^2+\sin^2(x)]=0$

Applying the Pythagorean identity $\displaystyle \sin^2(x)+\cos^2(x)=1$:

$\displaystyle (1-\lambda)[-2\lambda \cos(x) + \lambda^2 + 1]=0$

$\displaystyle (1-\lambda)(\lambda^2 - 2\lambda\cos(x) + 1)=0$

At this point, I want to extract the roots, i.e. the eigenvalue solutions $\displaystyle \lambda$. One, from the left-hand factor, is clearly $\displaystyle \lambda = 1$. But that $\displaystyle -2\lambda\cos(x)$ term in the quadratic factor is stopping me from factorizing it any way I can work out. Since $\displaystyle A$ is orthogonal, I expect the other two roots to be a complex conjugate pair, but I'm not 100% sure of that – in any case, I can't see how to legally break that trig term into a complex one.
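(For what it's worth, a quick numerical check with NumPy, using an arbitrary sample angle of my own choosing, does support the conjugate-pair expectation: all three eigenvalues sit on the unit circle, one of them is $1$, and the other two are conjugates of each other.)

```python
import numpy as np

x = 0.7  # arbitrary sample angle
A = np.array([[np.cos(x), -np.sin(x), 0.0],
              [np.sin(x),  np.cos(x), 0.0],
              [0.0,        0.0,       1.0]])

ev = np.linalg.eigvals(A)

# A is orthogonal, so every eigenvalue has modulus 1
print(np.abs(ev))
# one eigenvalue is 1; the other two form a conjugate pair
print(np.sort_complex(ev))
```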

Any ideas?

Cheers!

Alexandicity