1. ## Linear Algebra

Going through the 2008 midterms, I found this problem. I can do the induction part, but on the next part I am hopelessly stuck. Can anyone please help me solve it?

Thank You

2. ## Re: Linear Algebra

A vector $\displaystyle v_i$ is an eigenvector of $\displaystyle C_p$ with eigenvalue $\displaystyle \lambda_i$ if:

$\displaystyle C_pv_i = \lambda_i v_i$

So, calculate each side of the equation.
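If you want to sanity-check the definition numerically, here is a quick Python/numpy sketch. The polynomial $\displaystyle p(x) = x^3 - 6x^2 + 11x - 6$ and the companion-matrix convention below are illustrative assumptions, not taken from your problem:

```python
import numpy as np

# Assumed companion matrix (one common convention: coefficients in the last
# row) of p(x) = x^3 - 6x^2 + 11x - 6, whose roots are 1, 2, 3.
C_p = np.array([[0.0,   1.0, 0.0],
                [0.0,   0.0, 1.0],
                [6.0, -11.0, 6.0]])

eigenvalues, eigenvectors = np.linalg.eig(C_p)

# Check the defining equation C_p v_i = lambda_i v_i for each pair.
for i in range(len(eigenvalues)):
    v_i = eigenvectors[:, i]
    assert np.allclose(C_p @ v_i, eigenvalues[i] * v_i)

print(np.sort(eigenvalues.real))  # the roots of p
```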

3. ## Re: Linear Algebra

I'm sorry, but I have difficulty doing this part. Could you kindly explain to me how it works?

4. ## Re: Linear Algebra

What have you done? What do you get when you multiply the matrix $C_p$ and vector $v_i$? What do you get for the first one or two rows, at least?

5. ## Re: Linear Algebra

I do see how to show that the two expressions are indeed equal, but what about the fact that $\displaystyle C_p$ can be represented in Jordan normal form?

6. ## Re: Linear Algebra

Indeed, now I do see that $\displaystyle C_p v_i = \lambda_i v_i$, thank you for that. But what about showing $\displaystyle C_p = PDP^{-1}$? How do I get $\displaystyle P^{-1}$?

7. ## Re: Linear Algebra

You have $\displaystyle C_p v_i = \lambda_i v_i$ so $\displaystyle C_p \begin{bmatrix}v_1 & v_2 & \cdots & v_n\end{bmatrix} = \begin{bmatrix}\lambda_1 v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n \end{bmatrix}$ simply by equating columns.

The LHS is $\displaystyle C_p P$. Show that the RHS is $\displaystyle PD$. (Simply multiply it out).

Now, you just need to show that $\displaystyle P$ is invertible.
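Here is a numerical sketch of the whole argument in Python/numpy (the polynomial $\displaystyle p(x) = x^3 - 6x^2 + 11x - 6$ and the companion-matrix convention are assumptions, just for illustration):

```python
import numpy as np

# Assumed companion matrix (coefficients in the last row) of
# p(x) = x^3 - 6x^2 + 11x - 6, which has distinct roots 1, 2, 3.
C_p = np.array([[0.0,   1.0, 0.0],
                [0.0,   0.0, 1.0],
                [6.0, -11.0, 6.0]])

eigenvalues, P = np.linalg.eig(C_p)  # columns of P are v_1, ..., v_n
D = np.diag(eigenvalues)

assert np.allclose(C_p @ P, P @ D)                 # C_p P = P D
assert abs(np.linalg.det(P)) > 1e-12               # P is invertible
assert np.allclose(C_p, P @ D @ np.linalg.inv(P))  # hence C_p = P D P^{-1}
```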

8. ## Re: Linear Algebra

The thing is, $\displaystyle C_p$ is an $\displaystyle n \times n$ matrix and $\displaystyle P$ is a $\displaystyle 1 \times n$ matrix; how can these be compatible? Or is there a typo in the question?

9. ## Re: Linear Algebra

Also, how do I show that $\displaystyle P$ is indeed invertible?

Thank you

10. ## Re: Linear Algebra

No typo. $\displaystyle P$ is the matrix of column vectors $\displaystyle v_1, \ldots, v_n$, so it is an $\displaystyle n \times n$ matrix, as well.

There are many ways to show a matrix is invertible. Can you show these eigenvectors are linearly independent? The key is the fact that each eigenvector spans an eigenspace that is invariant under $\displaystyle C_p$. Assume there is a dependency and show that the "invariant" space goes to zero for at most one eigenvalue.
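Numerically, linear independence of the columns is the same as $\displaystyle P$ having full rank, i.e. the same as $\displaystyle P$ being invertible. A quick sketch (the distinct eigenvalues $\displaystyle 1, 2, 3$ and the eigenvector form $\displaystyle (1, \lambda, \lambda^2)$ of a companion matrix in the last-row convention are assumed for illustration):

```python
import numpy as np

# Sketch: with assumed distinct eigenvalues 1, 2, 3, the vectors
# (1, lam, lam^2) are eigenvectors of a 3x3 companion matrix (last-row
# convention).  Stacking them as the columns of P, linear independence of
# the columns is equivalent to rank(P) = n, i.e. to P being invertible.
lams = [1.0, 2.0, 3.0]
P = np.column_stack([[1.0, lam, lam ** 2] for lam in lams])

assert np.linalg.matrix_rank(P) == 3    # columns are linearly independent
assert abs(np.linalg.det(P)) > 1e-12    # equivalently, P is invertible
```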

11. ## Re: Linear Algebra

Can't I use the result of the induction in the first part to show that $\displaystyle P$ is invertible? I have no idea how the method you suggested works.

12. ## Re: Linear Algebra

Wait, I just tried $\displaystyle C_p P = PD$; I just can't seem to arrive at the same expression.

13. ## Re: Linear Algebra

> Can you show these eigenvectors are linearly independent?

Can you see if my proof is correct?

Suppose a linear combination of the eigenvectors with weights $\displaystyle c_1, \ldots, c_k$ equals zero:

$\displaystyle c_1 v_1 + ... +c_{k-1}v_{k-1}+c_k v_k=0$

If we multiply the above expression by the matrix $\displaystyle C_p$, we get:

$\displaystyle c_1 C_p v_1 + ... +c_{k-1} C_p v_{k-1}+c_k C_p v_k=0$

We can also multiply each term of the first equation by its respective eigenvalue from $\displaystyle (\lambda_1, \lambda_2, \ldots, \lambda_k)$ and, since $\displaystyle C_p v_i = \lambda_i v_i$, equate with the second equation:

$\displaystyle c_1\lambda_1 v_1 + ... +c_{k-1}\lambda_{k-1}v_{k-1}+c_k \lambda_k v_k=0$

$\displaystyle c_1\lambda_1 v_1 + ... +c_{k-1}\lambda_{k-1} v_{k-1}+c_k\lambda_k v_k=c_1 C_p v_1 + ... +c_{k-1} C_p v_{k-1}+c_k C_p v_k=0$

Now multiply the first equation by just $\displaystyle \lambda_k$:

$\displaystyle c_1\lambda_k v_1 + ... +c_{k-1}\lambda_{k} v_{k-1}+c_k\lambda_k v_k=0$

Subtracting the two expressions, we get:

$\displaystyle c_1(\lambda_1 - \lambda_k) v_1 + ... +c_{k-1}(\lambda_{k-1} -\lambda_k) v_{k-1} =0$

By the expression above, the equality only holds if $\displaystyle c_1(\lambda_1 - \lambda_k) = 0$, etc. (assuming, say by induction on $\displaystyle k$, that $\displaystyle v_1, \ldots, v_{k-1}$ are already known to be linearly independent). Since the question assumes the eigenvalues $\displaystyle \lambda_1, \lambda_2, \ldots, \lambda_k$ are distinct, each factor $\displaystyle \lambda_i - \lambda_k \neq 0$, so for the equality to hold we must have $\displaystyle c_1 = c_2 = \cdots = c_{k-1} = 0$. With this established, we can go back to the original equation:

$\displaystyle c_1 v_1 + ... +c_{k-1}v_{k-1}+c_k v_k=0$ $\displaystyle \rightarrow c_kv_k=0$ since $\displaystyle c_1 = c_2=...=c_{k-1}=0$. An eigenvector is nonzero by definition, so $\displaystyle v_k \ne 0$, which forces $\displaystyle c_k = 0$ as well, hence showing the linear independence of the eigenvectors.

Is this proof sufficient to show that the eigenvectors are indeed linearly independent? And even once I have established that the eigenvectors are linearly independent, how does that relate to the invertibility of the matrix $\displaystyle P$?

14. ## Re: Linear Algebra

Here are some basics about matrix multiplication. Suppose you have a matrix $\displaystyle A$ and two sets of vectors $\displaystyle x_1,x_2,\ldots x_k$ and $\displaystyle b_1,b_2, \ldots, b_k$ with $\displaystyle A x_i = b_i$ for each $\displaystyle i=1,\ldots, k$. Let $\displaystyle X$ be the matrix whose columns are $\displaystyle x_1,x_2,\ldots, x_k$. Let $\displaystyle B$ be the matrix whose columns are $\displaystyle b_1,b_2,\ldots , b_k$. Then $\displaystyle AX = B$. I recommend trying that out in the 2x2 case for $\displaystyle A,X,$ and $\displaystyle B$. It follows almost trivially from how matrix multiplication works.
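For instance, in the 2x2 case (the sample numbers below are arbitrary, just to see the column-by-column behavior):

```python
import numpy as np

# Arbitrary 2x2 example of the fact above: if A x_i = b_i for each i, and
# X, B are the matrices whose columns are the x_i and b_i, then A X = B.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x1 = np.array([1.0, 0.0])
x2 = np.array([2.0, 5.0])

b1 = A @ x1
b2 = A @ x2

X = np.column_stack([x1, x2])  # columns x_1, x_2
B = np.column_stack([b1, b2])  # columns b_1, b_2

assert np.allclose(A @ X, B)   # A X = B, column by column
```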

So, $\displaystyle P$ is just a matrix whose columns are $\displaystyle v_1,v_2,\ldots, v_n$. So, the multiplication $\displaystyle C_pP$ will give $\displaystyle \begin{bmatrix}C_p v_1 & C_p v_2 & \cdots & C_p v_n\end{bmatrix}$. The $\displaystyle i$-th column is just the result of the matrix $\displaystyle C_p$ times the $\displaystyle i$-th column of $\displaystyle P$. That is how matrix multiplication is defined.

Now, consider the $\displaystyle i$-th column. It is $\displaystyle C_p v_i$. Because $\displaystyle v_i$ is an eigenvector, $\displaystyle C_p v_i = \lambda_i v_i$. So, we can rewrite the matrix:

$\displaystyle \begin{bmatrix} C_p v_1 & C_p v_2 & \cdots & C_p v_n\end{bmatrix} = \begin{bmatrix} \lambda_1 v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n\end{bmatrix}$

Now, consider $\displaystyle PD = \begin{bmatrix} v_1 & v_2 & \cdots & v_n\end{bmatrix}\begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n\end{bmatrix}$. Break it down column by column.

$\displaystyle P\begin{bmatrix}\lambda_1 \\ 0 \\ \vdots \\ 0\end{bmatrix} = \lambda_1 v_1 + 0\cdot v_2 + \cdots + 0\cdot v_n = \lambda_1v_1$

$\displaystyle P\begin{bmatrix} 0 \\ \vdots \\ \lambda_i \\ \vdots \\ 0\end{bmatrix} = 0\cdot v_1 + \cdots + \lambda_i v_i + \cdots + 0\cdot v_n = \lambda_i v_i$

So, $\displaystyle PD = \begin{bmatrix}\lambda_1v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n\end{bmatrix}$.

Hence, $\displaystyle C_p P = PD$ as desired.

15. ## Re: Linear Algebra

Ooh, thank you! Also, can you check my proof on linear independence? I don't know how to proceed further from there.
