# Math Help - Unitary Matrix problem

1. ## Unitary Matrix problem

I've been struggling with the following for a while. I'd appreciate any direction.

If U is a unitary matrix and (U+iI) is hermitian, prove that U=-iI. (U is obviously square of order n).

Thanks,

SK

2. Originally Posted by skyking
If U is a unitary matrix and (U+iI) is hermitian, prove that U=-iI. (U is obviously square of order n).
Think in terms of eigenvalues and eigenvectors. A unitary matrix is diagonalisable, so if all its eigenvalues are $-i$ then U must be equal to $-iI$. Also, each eigenvalue $\lambda$ has absolute value 1.

If $U + iI$ is hermitian then $U^*-iI = U+iI$. Now suppose that $\lambda$ is an eigenvalue of $U$, with eigenvector x. Then $Ux = \lambda x$ and $U^*x = U^{-1}x = \lambda^{-1}x = \overline{\lambda}x$. Now over to you: show from those facts that $\lambda = -i$.
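The two facts in the hint can be sanity-checked numerically. This numpy sketch is my own addition (the QR construction of a random unitary matrix is a standard trick, not part of the thread): it verifies that every eigenvalue of a unitary matrix has absolute value 1, and that $U^*x=\overline{\lambda}x$ for an eigenpair.

```python
import numpy as np

rng = np.random.default_rng(0)
# Build a random unitary matrix via QR decomposition of a complex Gaussian matrix.
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U, _ = np.linalg.qr(A)

# Every eigenvalue of a unitary matrix has absolute value 1.
lams, vecs = np.linalg.eig(U)
print(np.allclose(np.abs(lams), 1.0))                 # True

# For an eigenpair (lambda, x): U* x = U^{-1} x = (1/lambda) x = conj(lambda) x.
x, lam = vecs[:, 0], lams[0]
print(np.allclose(U.conj().T @ x, np.conj(lam) * x))  # True
```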

3. Originally Posted by skyking
I've been struggling with the following for a while. I'd appreciate any direction.

If U is a unitary matrix and (U+iI) is hermitian, prove that U=-iI. (U is obviously square of order n).

Thanks,

SK
As an alternative to **Opalg's** solution you can notice three things:

1) The eigenvalues of $p(A)$ are $p(\lambda)$ for any polynomial $p$, thus the eigenvalues of $U+iI$ are $\lambda+i$ where $\lambda\in\sigma\left(U\right)$

2) Since $U+iI$ is Hermitian, its eigenvalues must be real, thus $\lambda+i\in\mathbb{R}$ for each $\lambda\in\sigma\left(U\right)$. Thus, $\lambda=r-i$ for some $r\in\mathbb{R}$

3) But, since $U$ is unitary we have for every $\lambda\in\sigma\left(U\right)$ that $1=|\lambda|=|r-i|=\sqrt{r^2+1}$ and thus $r=0$

Thus, all the eigenvalues of $U$ are $-i$.
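Step 3 can be checked symbolically. The sympy sketch below is my own addition (not from the thread): it solves $|r-i|=\sqrt{r^2+1}=1$ for real $r$ and confirms $r=0$ is the only solution, so $\lambda=-i$ is forced.

```python
import sympy as sp

r = sp.symbols('r', real=True)
lam = r - sp.I  # step 2: lambda = r - i for some real r

# Step 3: unitarity forces |lambda| = 1, i.e. sqrt(r^2 + 1) = 1.
modulus = sp.sqrt(sp.re(lam)**2 + sp.im(lam)**2)
sols = sp.solve(sp.Eq(modulus, 1), r)
print(sols)  # [0]  -> lambda = -i is the only possibility
```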

4. Originally Posted by Opalg
Think in terms of eigenvalues and eigenvectors. A unitary matrix is diagonalisable, so if all its eigenvalues are $-i$ then U must be equal to $-iI$. Also, each eigenvalue $\lambda$ has absolute value 1.

If $U + iI$ is hermitian then $U^*-iI = U+iI$. Now suppose that $\lambda$ is an eigenvalue of $U$, with eigenvector x. Then $Ux = \lambda x$ and $U^*x = U^{-1}x = \lambda^{-1}x = \overline{\lambda}x$. Now over to you: show from those facts that $\lambda = -i$.

Thanks for the idea. I think I understand the logic here. One thing that I'm missing is the jump from saying that if all the eigenvalues of U are -i then it is equal to -iI.
The only reasoning I can find is that since U is unitary there exists an orthonormal basis in which every single vector is an eigenvector; now if x is a vector it can be represented as a linear combination of those basis vectors, and therefore one can show that U is kI (where k is the only eigenvalue).

BUT, I'm not sure that this is legitimate.

Thanks,

SK

5. Originally Posted by Drexel28
As an alternative to **Opalg's** solution you can notice three things:

1) The eigenvalues of $p(A)$ are $p(\lambda)$ for any polynomial $p$, thus the eigenvalues of $U+iI$ are $\lambda+i$ where $\lambda\in\sigma\left(U\right)$

2) Since $U+iI$ is Hermitian, its eigenvalues must be real, thus $\lambda+i\in\mathbb{R}$ for each $\lambda\in\sigma\left(U\right)$. Thus, $\lambda=r-i$ for some $r\in\mathbb{R}$

3) But, since $U$ is unitary we have for every $\lambda\in\sigma\left(U\right)$ that $1=|\lambda|=|r-i|=\sqrt{r^2+1}$ and thus $r=0$

Thus, all the eigenvalues of $U$ are $-i$.
I'm not sure I understand all the symbology here in statement (1).

BTW, how do you insert math symbols here?

6. Originally Posted by skyking
Thanks for the idea. I think I understand the logic here. One thing that I'm missing is the jump from saying that if all the eigenvalues of U are -i then it is equal to -iI.
The only reasoning I can find is that since U is unitary there exists an orthonormal basis in which every single vector is an eigenvector; now if x is a vector it can be represented as a linear combination of those basis vectors, and therefore one can show that U is kI (where k is the only eigenvalue).

BUT, I'm not sure that this is legitimate.
Yes, that is a legitimate argument. Another way to see this result is to say that since there exists an orthonormal basis consisting of eigenvectors, the matrix of U with respect to that basis will be a diagonal matrix in which the diagonal elements are the corresponding eigenvalues. But if each of those eigenvalues is $-i$, then the matrix is $-iI$ (where I is the identity matrix) and hence $U = -iI$ (where I is the identity operator).

7. Originally Posted by skyking
I'm not sure I understand all the symbology here in statement (1).

BTW, how do you insert math symbols here?
The easy part of the theorem I'm referring to says that if $\lambda$ is an eigenvalue of $A$ and $p(z)=a_0+\cdots+a_nz^n$ is a polynomial, then $p(\lambda)$ is an eigenvalue of $p\left(A\right)=a_0I+a_1A+\cdots+a_nA^n$. This is easily verified, since if $x_{\lambda}$ is the associated eigenvector then clearly $A^jx_{\lambda}=A^{j-1}\left(Ax_{\lambda}\right)=A^{j-1}\left(\lambda x_{\lambda}\right)=\lambda A^{j-1}x_{\lambda}=\cdots=\lambda^j x_{\lambda}$ and thus

\begin{aligned} p\left(A\right)x_{\lambda} &= \left(\sum_{j=0}^{n}a_j A^j\right)x_{\lambda}\\ &= \sum_{j=0}^{n}\left(a_jA^jx_{\lambda}\right)\\ &=\sum_{j=0}^{n}\left(a_j \lambda^j x_{\lambda}\right)\\ &= \left(\sum_{j=0}^{n}a_j\lambda^j\right)x_{\lambda} \\ &= p(\lambda)x_{\lambda}\end{aligned}

And we used it by noticing that if $p(z)=z+i$ then $U+iI=p\left(U\right)$, so $\lambda$ being an eigenvalue of $U$ implies that $p\left(\lambda\right)=\lambda+i$ is an eigenvalue of $U+iI$.
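The polynomial spectral-mapping fact is easy to test numerically. This numpy sketch is my own addition (the particular polynomial $p(z)=z^2+3z+1$ is an arbitrary choice for illustration): it checks that each $p(\lambda)$ appears among the eigenvalues of $p(A)$.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
lams = np.linalg.eigvals(A)

# p(z) = z^2 + 3z + 1 applied to the matrix: p(A) = A^2 + 3A + I.
pA = A @ A + 3 * A + np.eye(4)
peigs = np.linalg.eigvals(pA)

# Each p(lambda) should appear among the eigenvalues of p(A).
pvals = lams**2 + 3 * lams + 1
print(all(np.min(np.abs(peigs - pv)) < 1e-8 for pv in pvals))  # True
```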

8. Originally Posted by skyking
One thing that I'm missing is the jump from saying that if all the eigenvalues of U are -i then it is equal to -iI.
Depending upon your knowledge this follows immediately since every unitary matrix is normal and thus we may appeal to the spectral theorem for normal matrices to conclude that $U=V\text{diag}(\lambda_1,\cdots,\lambda_n)V^{*}\qquad (1)$ where $V$ is unitary. Thus, noticing that we have proven that $\lambda_1=\cdots=\lambda_n=-i$, we see that $(1)$ implies that $U=V\text{diag}(-i,\cdots,-i)V^{*}=V(-iI)V^{*}= -iVV^{*}=-iI$
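The collapse of the spectral decomposition can be seen numerically as well. This numpy sketch is my own addition: for an arbitrary unitary $V$ (built via QR, my choice), conjugating $\text{diag}(-i,\dots,-i)$ by $V$ gives back $-iI$, exactly as in the computation above.

```python
import numpy as np

rng = np.random.default_rng(0)
# Any unitary V will do; here one is built via QR of a random complex matrix.
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
V, _ = np.linalg.qr(M)

# With every eigenvalue equal to -i, V diag(-i,...,-i) V* = -i V V* = -i I.
U = V @ np.diag([-1j, -1j, -1j]) @ V.conj().T
print(np.allclose(U, -1j * np.eye(3)))  # True
```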