1. ## Linear Algebra

I found this problem on a 2008 midterm. I can do the induction part, but I am hopelessly stuck on the next part. Can anyone please help me solve it?

Thank You

2. ## Re: Linear Algebra

A vector $v_i$ is an eigenvector of $C_p$ with eigenvalue $\lambda_i$ if:

$C_pv_i = \lambda_i v_i$

So, calculate each side of the equation.
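As a concrete sanity check (hypothetical, since the thread never states the problem explicitly): assuming $C_p$ is the companion matrix of a polynomial $p$ and that $v_i = (1, \lambda_i, \lambda_i^2, \ldots)$ for each root $\lambda_i$, the eigenvector relation can be verified numerically. The cubic below is made up for illustration.

```python
# Hypothetical sketch: assume C_p is the companion matrix of
# p(t) = t^3 - 6t^2 + 11t - 6 = (t - 1)(t - 2)(t - 3),
# and that v_i = (1, lambda_i, lambda_i^2) -- both are assumptions,
# since the thread never states the original problem.

def mat_vec(A, v):
    """Multiply a matrix (list of rows) by a column vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

# Companion matrix: 1s on the superdiagonal, -a_0, -a_1, -a_2 in the last row.
C_p = [[0, 1, 0],
       [0, 0, 1],
       [6, -11, 6]]

for lam in (1, 2, 3):  # the roots of p are its eigenvalues
    v_i = [1, lam, lam ** 2]
    # C_p v_i = lambda_i v_i holds because p(lambda_i) = 0
    assert mat_vec(C_p, v_i) == [lam * x for x in v_i]
```

The last row of $C_p$ applied to $v_i$ produces $-a_0 - a_1\lambda_i - a_2\lambda_i^2$, which equals $\lambda_i^3$ precisely because $p(\lambda_i) = 0$.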

3. ## Re: Linear Algebra

I'm sorry, but I'm having difficulty with this part. Could you please explain how it works?

4. ## Re: Linear Algebra

What have you done? What do you get when you multiply the matrix $C_p$ and vector $v_i$? What do you get for the first one or two rows, at least?

5. ## Re: Linear Algebra

I do see how to show that the two expressions are equal, but what about the fact that $C_p$ can be represented in Jordan normal form?

6. ## Re: Linear Algebra

Indeed, now I do see that $C_p v_i = \lambda_i v_i$, thank you for that. But what about showing $C_p = PDP^{-1}$? How do I get $P^{-1}$?

7. ## Re: Linear Algebra

You have $C_p v_i = \lambda_i v_i$ so $C_p \begin{bmatrix}v_1 & v_2 & \cdots & v_n\end{bmatrix} = \begin{bmatrix}\lambda_1 v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n \end{bmatrix}$ simply by equating columns.

The LHS is $C_p P$. Show that the RHS is $PD$. (Simply multiply it out).

Now, you just need to show that $P$ is invertible.
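The column identity $C_p P = PD$ can be checked numerically. This continues the hypothetical example above ($C_p$ assumed to be the companion matrix of $(t-1)(t-2)(t-3)$, eigenvectors assumed to be $v_i = (1, \lambda_i, \lambda_i^2)$):

```python
# Hypothetical numeric check of C_p P = P D. The matrix and eigenvectors are
# assumptions carried over from the illustrative cubic, not from the thread.

def mat_mul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

C_p = [[0, 1, 0], [0, 0, 1], [6, -11, 6]]
lams = [1, 2, 3]
P = [[lam ** i for lam in lams] for i in range(3)]  # column j is v_j
D = [[lams[i] if i == j else 0 for j in range(3)] for i in range(3)]

assert mat_mul(C_p, P) == mat_mul(P, D)  # C_p P = P D, column by column
```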

8. ## Re: Linear Algebra

The thing is, $C_p$ is an $n \times n$ matrix and $P$ is a $1 \times n$ matrix. How can these be compatible, or is there a typo in the question?

9. ## Re: Linear Algebra

Also, how do I show that $P$ is indeed invertible?

Thank you

10. ## Re: Linear Algebra

No typo. $P$ is the matrix of column vectors $v_1, \ldots, v_n$, so it is an $n \times n$ matrix, as well.

There are many ways to show a matrix is invertible. Can you show these eigenvectors are linearly independent? The key is the fact that each eigenvector spans an eigenspace that is invariant under $C_p$. Assume there is a dependency and show that the "invariant" space goes to zero for at most one eigenvalue.
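One concrete route (under the same assumption that $v_i = (1, \lambda_i, \lambda_i^2, \ldots)$, which makes $P$ a Vandermonde matrix): $\det P = \prod_{i<j}(\lambda_j - \lambda_i)$, which is nonzero exactly when the eigenvalues are distinct. A small numeric sketch:

```python
# Hypothetical check: with the assumed eigenvectors v_i = (1, lambda_i, lambda_i^2),
# P is a Vandermonde matrix, and det(P) = prod over i<j of (lambda_j - lambda_i).
# The eigenvalues 1, 2, 3 are carried over from the illustrative cubic.
from itertools import combinations

def det3(M):
    """Cofactor expansion of a 3x3 determinant along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

lams = [1, 2, 3]
P = [[lam ** i for lam in lams] for i in range(3)]

vandermonde = 1
for i, j in combinations(range(3), 2):
    vandermonde *= lams[j] - lams[i]

assert det3(P) == vandermonde != 0  # distinct eigenvalues => P invertible
```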

11. ## Re: Linear Algebra

Can't I use the result of the induction in the first part to show that $P$ is invertible? I have no idea how the method you suggested works.

12. ## Re: Linear Algebra

Wait, I just tried $C_p P = PD$, but I can't seem to arrive at the same expression.

13. ## Re: Linear Algebra

> Can you show these eigenvectors are linearly independent?

Can you see if my proof is correct?

Suppose a linear combination of the eigenvectors equals zero:

$c_1 v_1 + \cdots + c_{k-1}v_{k-1} + c_k v_k = 0$

If we multiply this equation by the matrix $C_p$, we get:

$c_1 C_p v_1 + \cdots + c_{k-1} C_p v_{k-1} + c_k C_p v_k = 0$

Since each $v_i$ is an eigenvector with $C_p v_i = \lambda_i v_i$, this becomes:

$c_1\lambda_1 v_1 + \cdots + c_{k-1}\lambda_{k-1} v_{k-1} + c_k\lambda_k v_k = 0$

Now multiply the first equation by just $\lambda_k$:

$c_1\lambda_k v_1 + \cdots + c_{k-1}\lambda_k v_{k-1} + c_k\lambda_k v_k = 0$

Subtracting the two expressions, we get:

$c_1(\lambda_1 - \lambda_k) v_1 + \cdots + c_{k-1}(\lambda_{k-1} - \lambda_k) v_{k-1} = 0$

By the induction hypothesis, $v_1, \ldots, v_{k-1}$ are linearly independent, so this equality holds only if $c_i(\lambda_i - \lambda_k) = 0$ for each $i$. Since the question assumes the eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_k$ are distinct, each factor $\lambda_i - \lambda_k \ne 0$, and therefore $c_1 = c_2 = \cdots = c_{k-1} = 0$. With this established, we can return to the original equation:

$c_1 v_1 + \cdots + c_{k-1}v_{k-1} + c_k v_k = 0 \rightarrow c_k v_k = 0$ since $c_1 = c_2 = \cdots = c_{k-1} = 0$. Because the eigenvector $v_k \ne 0$, this forces $c_k = 0$, which shows the eigenvectors are linearly independent.

Is this proof sufficient to show linear independence? And even once I have established that the eigenvectors are linearly independent, how does that relate to the invertibility of the matrix $P$?

14. ## Re: Linear Algebra

Here are some basics about matrix multiplication. Suppose you have a matrix $A$ and two sets of vectors $x_1,x_2,\ldots, x_k$ and $b_1,b_2, \ldots, b_k$ with $A x_i = b_i$ for each $i=1,\ldots, k$. Let $X$ be the matrix whose columns are $x_1,x_2,\ldots, x_k$. Let $B$ be the matrix whose columns are $b_1,b_2,\ldots , b_k$. Then $AX = B$. I recommend trying that out in the $2 \times 2$ case for $A, X,$ and $B$. It follows almost trivially from how matrix multiplication works.
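The $2 \times 2$ case can be tried directly. The values below are made up for illustration:

```python
# A 2x2 instance of the column-stacking fact: if A x_i = b_i for each i, then
# A X = B, where X and B have the x_i and b_i as columns. All values here are
# arbitrary illustrative choices.
A = [[1, 2],
     [3, 4]]
x1, x2 = [1, 0], [1, 1]
b1 = [sum(a * x for a, x in zip(row, x1)) for row in A]  # A x1
b2 = [sum(a * x for a, x in zip(row, x2)) for row in A]  # A x2

X = [[x1[0], x2[0]], [x1[1], x2[1]]]  # columns x1, x2
B = [[b1[0], b2[0]], [b1[1], b2[1]]]  # columns b1, b2
AX = [[sum(A[i][k] * X[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]

assert AX == B  # stacking the x_i as columns stacks the b_i as columns
```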

So, $P$ is just a matrix whose columns are $v_1,v_2,\ldots, v_n$. So, the multiplication $C_pP$ will give $\begin{bmatrix}C_p v_1 & C_p v_2 & \cdots & C_p v_n\end{bmatrix}$. The $i$-th column is just the result of the matrix $C_p$ times the $i$-th column of $P$. That is how matrix multiplication is defined.

Now, consider the $i$-th column. It is $C_p v_i$. Because $v_i$ is an eigenvector, $C_p v_i = \lambda_i v_i$. So, we can rewrite the matrix:

$\begin{bmatrix} C_p v_1 & C_p v_2 & \cdots & C_p v_n\end{bmatrix} = \begin{bmatrix} \lambda_1 v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n\end{bmatrix}$

Now, consider $PD = \begin{bmatrix} v_1 & v_2 & \cdots & v_n\end{bmatrix}\begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n\end{bmatrix}$. Break it down column by column.

$P\begin{bmatrix}\lambda_1 \\ 0 \\ \vdots \\ 0\end{bmatrix} = \lambda_1 v_1 + 0\cdot v_2 + \cdots + 0\cdot v_n = \lambda_1v_1$

$P\begin{bmatrix} 0 \\ \vdots \\ \lambda_i \\ \vdots \\ 0\end{bmatrix} = 0\cdot v_1 + \cdots + \lambda_i v_i + \cdots + 0\cdot v_n = \lambda_i v_i$

So, $PD = \begin{bmatrix}\lambda_1v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n\end{bmatrix}$.

Hence, $C_p P = PD$ as desired.

15. ## Re: Linear Algebra

Oh, thank you! Also, can you check my proof of linear independence? I don't know how to proceed further from there.
