# Thread: Eigenvector, linear transformation problem

1. ## Eigenvector, linear transformation problem

Let $\displaystyle T: \mathbb{R}^2 \to \mathbb{R}^2$ be defined by $\displaystyle T(x,y)=(3x+y, 2x+2y)$.
Show that there exists a basis $\displaystyle \bold {B} =\{ \alpha _1 , \alpha _2 \}$ of $\displaystyle \mathbb{R}^2$ formed by eigenvectors.
Find the matrix of $\displaystyle T$ with respect to the canonical basis and the matrix of $\displaystyle T$ with respect to the basis of eigenvectors $\displaystyle \bold {B}$.

My attempt: I don't know how to show the first part.
I think I've found the matrix of $\displaystyle T$ with respect to the canonical basis :
$\displaystyle T(1,0)=(3,2)$
$\displaystyle T(0,1)=(1,2)$. Hence the matrix is $\displaystyle \begin{bmatrix} 3 & 1 \\ 2 & 2 \end{bmatrix}$. Let's call it $\displaystyle T'$.
I don't know how to get it with respect to another basis, even though I realize I should know this, since it's very basic.
From memory, I think the eigenvalues of $\displaystyle T$ are the $\displaystyle \lambda$ satisfying $\displaystyle \det (\lambda I - T')=0 \Leftrightarrow \lambda=1 \text{ or } 4$. I guess that finding the eigenvalues of $\displaystyle T$ would help me find its eigenvectors... I'm lost! I'd appreciate a bit of help if you can. Thanks.
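As a sanity check on that characteristic-polynomial computation (not part of the original post; this sketch uses numpy, which the thread itself never mentions), the eigenvalues of the matrix found above can be computed numerically:

```python
import numpy as np

# Matrix of T in the canonical basis, from T(1,0)=(3,2) and T(0,1)=(1,2)
A = np.array([[3.0, 1.0],
              [2.0, 2.0]])

# Eigenvalues are the roots of det(lambda*I - A) = lambda^2 - 5*lambda + 4
eigenvalues = np.linalg.eigvals(A)
print(sorted(eigenvalues))  # the two eigenvalues, numerically 1 and 4
```

This agrees with solving $\lambda^2 - 5\lambda + 4 = (\lambda - 1)(\lambda - 4) = 0$ by hand.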

2. Originally Posted by arbolis
Hence the matrix is $\displaystyle \begin{bmatrix} 3 & 1 \\ 2 & 2 \end{bmatrix}$. Let's call it $\displaystyle T'$.
I don't know how to get it with respect to another basis, even though I realize I should know this, since it's very basic.
From memory, I think the eigenvalues of $\displaystyle T$ are the $\displaystyle \lambda$ satisfying $\displaystyle \det (\lambda I - T')=0 \Leftrightarrow \lambda=1 \text{ or } 4$. I guess that finding the eigenvalues of $\displaystyle T$ would help me find its eigenvectors... I'm lost! I'd appreciate a bit of help if you can. Thanks.
You established that $\displaystyle \lambda=1\text{ or }4$. Therefore, there is a non-trivial solution to (working with $\displaystyle \lambda=1$ first):
$\displaystyle \begin{bmatrix} 3 & 1 \\ 2 & 2 \end{bmatrix} \begin{bmatrix}x_1\\x_2 \end{bmatrix} = 1\cdot \begin{bmatrix}x_1\\x_2\end{bmatrix}$.
This corresponds to:
$\displaystyle \left\{ \begin{array}{c}3x_1 + 1x_2 = x_1 \\ 2x_1 + 2x_2 = x_2 \end{array} \right. \implies \left\{ \begin{array}{c} 2x_1 + x_2 = 0 \\ 2x_1 + x_2 = 0 \end{array} \right.$
The solution to this system is: $\displaystyle x_1 = t\text{ and }x_2 = -2t, t\in \mathbb{R}$.
The eigenvectors corresponding to the eigenvalue one are therefore: $\displaystyle \left\{ k \begin{bmatrix}1\\-2\end{bmatrix} : k \in \mathbb{R}^{\times} \right\}$
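To finish the problem numerically (this is a sketch in numpy, not part of the original reply; the eigenvector $(1,1)$ for $\lambda = 4$ comes from solving $(T' - 4I)v = 0$ the same way as above), one can check both eigenvectors and form the matrix of $T$ in the eigenvector basis:

```python
import numpy as np

# Matrix of T in the canonical basis
A = np.array([[3.0, 1.0],
              [2.0, 2.0]])

v1 = np.array([1.0, -2.0])  # eigenvector for lambda = 1, from the post above
v2 = np.array([1.0, 1.0])   # eigenvector for lambda = 4: (A - 4I)v = 0 gives -x1 + x2 = 0

# Verify A v = lambda v for both eigenvalues
assert np.allclose(A @ v1, 1.0 * v1)
assert np.allclose(A @ v2, 4.0 * v2)

# P has the eigenvectors as columns; P^{-1} A P is the matrix of T
# with respect to the eigenvector basis B = {v1, v2}
P = np.column_stack([v1, v2])
D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))  # diagonal matrix with entries 1 and 4
```

Since $\{v_1, v_2\}$ consists of eigenvectors for distinct eigenvalues, they are linearly independent and form a basis of $\mathbb{R}^2$, which settles the first part of the question as well.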

3. Thank you TPH,
by the way, what does $\displaystyle \mathbb{R}^{\times}$ mean?

4. Originally Posted by arbolis
Thank you TPH,
by the way, what does $\displaystyle \mathbb{R}^{\times}$ mean?
No, turns out ThePerfectHacker didn't mean what I thought at all!

5. Originally Posted by arbolis
Thank you TPH,
by the way, what does $\displaystyle \mathbb{R}^{\times}$ mean?
We define $\displaystyle \mathbb{R}^{\times} = \{ x\in \mathbb{R} | x\not = 0\}$.
The reason I did that is that I wanted to avoid the zero vector; usually when we look for eigenvectors we ignore the trivial one.

6. Originally Posted by ThePerfectHacker
We define $\displaystyle \mathbb{R}^{\times} = \{ x\in \mathbb{R} | x\not = 0\}$.
The reason I did that is that I wanted to avoid the zero vector; usually when we look for eigenvectors we ignore the trivial one.
Ah ok. I knew it as $\displaystyle \mathbb{R}^{*}$.