Page 1 of 2
Results 1 to 15 of 23

Thread: Linear Algebra

  1. #1
    Junior Member
    Joined
    Nov 2013
    From
    Hong Kong
    Posts
    45

    Linear Algebra

I found this problem on the 2008 midterm. I can do the induction part; however, on the next part I am helplessly stuck. Can anyone please help me solve it?
[Attachment: Linear Algebra-midterms-2008.png (the problem statement)]

    Thank You
    Follow Math Help Forum on Facebook and Google+

  2. #2
    MHF Contributor
    Joined
    Nov 2010
    Posts
    3,389
    Thanks
    1347

    Re: Linear Algebra

    A vector $\displaystyle v_i$ is an eigenvector of $\displaystyle C_p$ with eigenvalue $\displaystyle \lambda_i$ if:

    $\displaystyle C_pv_i = \lambda_i v_i$

    So, calculate each side of the equation.
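As a concrete illustration of this check (the problem image is not shown in the thread, so everything concrete below is an assumption: a stand-in companion matrix for $\displaystyle p(x) = (x-1)(x-2)(x-3)$ in the coefficients-in-the-last-row convention, with the Vandermonde vectors $\displaystyle (1, \lambda, \lambda^2)$ as candidate eigenvectors):

```python
# Numerical check of the eigenpair condition C_p v_i = lambda_i v_i.
# Assumption (the problem image is not visible): C_p is the companion matrix
# of a monic polynomial with its coefficients in the last row, and the
# eigenvector for root lambda is the Vandermonde column (1, lambda, lambda^2, ...).

def companion(coeffs):
    """Companion matrix of x^n + c[n-1] x^(n-1) + ... + c[0] (coeffs low to high)."""
    n = len(coeffs)
    C = [[0.0] * n for _ in range(n)]
    for i in range(n - 1):
        C[i][i + 1] = 1.0            # shift structure above the last row
    C[n - 1] = [-c for c in coeffs]  # last row holds the negated coefficients
    return C

def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

# p(x) = x^3 - 6x^2 + 11x - 6 = (x-1)(x-2)(x-3), so the eigenvalues are 1, 2, 3.
C = companion([-6.0, 11.0, -6.0])

for lam in (1.0, 2.0, 3.0):
    v = [lam ** k for k in range(3)]          # Vandermonde eigenvector (1, lam, lam^2)
    lhs, rhs = matvec(C, v), [lam * x for x in v]
    assert all(abs(a - b) < 1e-9 for a, b in zip(lhs, rhs))
print("C_p v = lambda v holds for all three roots")
```

Computing each side by hand for one or two rows, as suggested above, is the same calculation the loop performs.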

  3. #3
    Junior Member
    Joined
    Nov 2013
    From
    Hong Kong
    Posts
    45

    Re: Linear Algebra

I'm sorry, but I'm having difficulty with this part. Could you please explain how it works?

  4. #4
    MHF Contributor

    Joined
    Apr 2005
    Posts
    19,769
    Thanks
    3027

    Re: Linear Algebra

What have you done? What do you get when you multiply the matrix $\displaystyle C_p$ by the vector $\displaystyle v_i$? What do you get for the first one or two rows, at least?

  5. #5
    Junior Member
    Joined
    Nov 2013
    From
    Hong Kong
    Posts
    45

    Re: Linear Algebra

I do see how to show that the two expressions are equal, but what about the fact that $\displaystyle C_p$ can be represented in Jordan normal form?

  6. #6
    Junior Member
    Joined
    Nov 2013
    From
    Hong Kong
    Posts
    45

    Re: Linear Algebra

Now I do see that $\displaystyle C_p v_i = \lambda_i v_i$, thank you for that. But what about showing $\displaystyle C_p = PDP^{-1}$? How do I get $\displaystyle P^{-1}$?

  7. #7
    MHF Contributor
    Joined
    Nov 2010
    Posts
    3,389
    Thanks
    1347

    Re: Linear Algebra

    You have $\displaystyle C_p v_i = \lambda_i v_i$ so $\displaystyle C_p \begin{bmatrix}v_1 & v_2 & \cdots & v_n\end{bmatrix} = \begin{bmatrix}\lambda_1 v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n \end{bmatrix}$ simply by equating columns.

    The LHS is $\displaystyle C_p P$. Show that the RHS is $\displaystyle PD$. (Simply multiply it out).

    Now, you just need to show that $\displaystyle P$ is invertible.
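A small numeric sketch of this column identity, again using a hypothetical companion matrix as a stand-in (the thread's actual $\displaystyle C_p$ is in the unseen image):

```python
# Stacking the eigenvectors v_1..v_n as the columns of P turns the n
# eigenpair equations C_p v_i = lambda_i v_i into the single matrix
# equation C_p P = P D.  Example matrix below is an assumption: the
# companion matrix of (x-1)(x-2)(x-3).

def matmul(A, B):
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

C = [[0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0],
     [6.0, -11.0, 6.0]]            # companion matrix of (x-1)(x-2)(x-3)
lams = [1.0, 2.0, 3.0]

# P has v_i = (1, lam_i, lam_i^2) as its i-th COLUMN; D is diagonal.
P = [[lam ** row for lam in lams] for row in range(3)]
D = [[lams[i] if i == j else 0.0 for j in range(3)] for i in range(3)]

assert matmul(C, P) == matmul(P, D)   # column-by-column equality
```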

  8. #8
    Junior Member
    Joined
    Nov 2013
    From
    Hong Kong
    Posts
    45

    Re: Linear Algebra

The thing is, $\displaystyle C_p$ is an $\displaystyle n \times n$ matrix and $\displaystyle P$ is a $\displaystyle 1 \times n$ matrix. How can these be compatible, or is there a typo in the question?

  9. #9
    Junior Member
    Joined
    Nov 2013
    From
    Hong Kong
    Posts
    45

    Re: Linear Algebra

Also, how do I show that $\displaystyle P$ is indeed invertible?

    Thank you

  10. #10
    MHF Contributor
    Joined
    Nov 2010
    Posts
    3,389
    Thanks
    1347

    Re: Linear Algebra

    No typo. $\displaystyle P$ is the matrix of column vectors $\displaystyle v_1, \ldots, v_n$, so it is an $\displaystyle n \times n$ matrix, as well.

    There are many ways to show a matrix is invertible. Can you show these eigenvectors are linearly independent? The key is the fact that each eigenvector spans an eigenspace that is invariant under $\displaystyle C_p$. Assume there is a dependency and show that the "invariant" space goes to zero for at most one eigenvalue.
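One concrete route to invertibility, under the assumption (not confirmed by the thread's unseen image) that the eigenvectors are Vandermonde columns $\displaystyle (1, \lambda_i, \ldots, \lambda_i^{n-1})$: $\displaystyle \det P$ is then the Vandermonde determinant $\displaystyle \prod_{i<j}(\lambda_j - \lambda_i)$, which is nonzero exactly when the eigenvalues are pairwise distinct. A sketch:

```python
# Vandermonde determinant: nonzero iff the lambdas are pairwise distinct,
# so P (with Vandermonde-column eigenvectors, an assumed convention) is
# invertible precisely in the distinct-eigenvalue case.  This is one
# concrete route; the invariant-subspace argument is another.
from itertools import combinations
from math import prod

def vandermonde_det(lams):
    return prod(b - a for a, b in combinations(lams, 2))

assert vandermonde_det([1.0, 2.0, 3.0]) != 0   # distinct roots -> invertible
assert vandermonde_det([1.0, 2.0, 2.0]) == 0   # repeated root  -> singular
```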

  11. #11
    Junior Member
    Joined
    Nov 2013
    From
    Hong Kong
    Posts
    45

    Re: Linear Algebra

Can't I use the result of the induction in the first part to show that $\displaystyle P$ is invertible? I have no idea how the method you suggested works.

  12. #12
    Junior Member
    Joined
    Nov 2013
    From
    Hong Kong
    Posts
    45

    Re: Linear Algebra

Wait, I just tried $\displaystyle C_p P = PD$, but I can't seem to arrive at the same expression.

  13. #13
    Junior Member
    Joined
    Nov 2013
    From
    Hong Kong
    Posts
    45

    Re: Linear Algebra

"Can you show these eigenvectors are linearly independent?"

Can you check whether my proof is correct?

Suppose a linear combination of the eigenvectors with weights $\displaystyle c_1, \ldots, c_k$ equals zero:

    $\displaystyle c_1 v_1 + ... +c_{k-1}v_{k-1}+c_k v_k=0$

Multiplying the above expression by the matrix $\displaystyle C_p$ gives:

    $\displaystyle c_1 C_p v_1 + ... +c_{k-1} C_p v_{k-1}+c_k C_p v_k=0$

Since each $\displaystyle v_i$ is an eigenvector with eigenvalue $\displaystyle \lambda_i$, we can substitute $\displaystyle C_p v_i = \lambda_i v_i$ into the second equation:

$\displaystyle c_1\lambda_1 v_1 + \cdots + c_{k-1}\lambda_{k-1} v_{k-1} + c_k\lambda_k v_k = c_1 C_p v_1 + \cdots + c_{k-1} C_p v_{k-1} + c_k C_p v_k = 0$

Now multiply the first equation by just $\displaystyle \lambda_k$:

    $\displaystyle c_1\lambda_k v_1 + ... +c_{k-1}\lambda_{k} v_{k-1}+c_k\lambda_k v_k=0$

Subtracting the two expressions, we get:

    $\displaystyle c_1(\lambda_1 - \lambda_k) v_1 + ... +c_{k-1}(\lambda_{k-1} -\lambda_k) v_{k-1} =0$

By the induction hypothesis, $\displaystyle v_1, \ldots, v_{k-1}$ are linearly independent, so the equality above holds only if $\displaystyle c_i(\lambda_i - \lambda_k) = 0$ for each $\displaystyle i$. Since the question assumes the eigenvalues $\displaystyle \lambda_1, \lambda_2, \ldots, \lambda_k$ are pairwise distinct, $\displaystyle \lambda_i - \lambda_k \ne 0$, and it follows that $\displaystyle c_1 = c_2 = \cdots = c_{k-1} = 0$. With this established, we can return to the original equation:

$\displaystyle c_1 v_1 + \cdots + c_{k-1}v_{k-1} + c_k v_k = 0$ $\displaystyle \rightarrow c_k v_k = 0$ since $\displaystyle c_1 = c_2 = \cdots = c_{k-1} = 0$. Because an eigenvector is nonzero by definition, $\displaystyle v_k \ne 0$, so $\displaystyle c_k = 0$ as well, which shows that the eigenvectors are linearly independent.

Is this proof sufficient to show linear independence? And even once I have established that the eigenvectors are linearly independent, how does that relate to the invertibility of the matrix $\displaystyle P$?
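The subtraction step in the proof above is exactly multiplication by $\displaystyle (C_p - \lambda_k I)$: it annihilates $\displaystyle v_k$ and scales every other $\displaystyle v_i$ by $\displaystyle (\lambda_i - \lambda_k)$. A numeric replay, using a hypothetical stand-in for the unseen $\displaystyle C_p$ (the companion matrix of $\displaystyle (x-1)(x-2)(x-3)$):

```python
# Replaying the proof's subtraction step as multiplication by (C_p - lam_k I).
# C and the eigenvectors below are an assumed example, not the thread's matrix.
C = [[0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0],
     [6.0, -11.0, 6.0]]
lams = [1.0, 2.0, 3.0]
vecs = [[lam ** k for k in range(3)] for lam in lams]

def shifted(A, mu, v):
    """Compute (A - mu*I) v."""
    return [sum(a * x for a, x in zip(row, v)) - mu * v[i]
            for i, row in enumerate(A)]

lam_k = lams[-1]                     # plays the role of lambda_k in the proof
assert shifted(C, lam_k, vecs[-1]) == [0.0, 0.0, 0.0]     # v_k is annihilated
for lam, v in zip(lams[:-1], vecs[:-1]):
    scaled = [(lam - lam_k) * x for x in v]
    assert shifted(C, lam_k, v) == scaled                 # others just rescale
```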

  14. #14
    MHF Contributor
    Joined
    Nov 2010
    Posts
    3,389
    Thanks
    1347

    Re: Linear Algebra

Here are some basics about matrix multiplication. Suppose you have a matrix $\displaystyle A$ and two sets of vectors $\displaystyle x_1, x_2, \ldots, x_k$ and $\displaystyle b_1, b_2, \ldots, b_k$ with $\displaystyle A x_i = b_i$ for each $\displaystyle i = 1, \ldots, k$. Let $\displaystyle X$ be the matrix whose columns are $\displaystyle x_1, x_2, \ldots, x_k$, and let $\displaystyle B$ be the matrix whose columns are $\displaystyle b_1, b_2, \ldots, b_k$. Then $\displaystyle AX = B$. I recommend trying that out in the $\displaystyle 2 \times 2$ case for $\displaystyle A, X,$ and $\displaystyle B$. It follows almost trivially from how matrix multiplication works.

    So, $\displaystyle P$ is just a matrix whose columns are $\displaystyle v_1,v_2,\ldots, v_n$. So, the multiplication $\displaystyle C_pP$ will give $\displaystyle \begin{bmatrix}C_p v_1 & C_p v_2 & \cdots & C_p v_n\end{bmatrix}$. The $\displaystyle i$-th column is just the result of the matrix $\displaystyle C_p$ times the $\displaystyle i$-th column of $\displaystyle P$. That is how matrix multiplication is defined.

    Now, consider the $\displaystyle i$-th column. It is $\displaystyle C_p v_i$. Because $\displaystyle v_i$ is an eigenvector, $\displaystyle C_p v_i = \lambda_i v_i$. So, we can rewrite the matrix:

    $\displaystyle \begin{bmatrix} C_p v_1 & C_p v_2 & \cdots & C_p v_n\end{bmatrix} = \begin{bmatrix} \lambda_1 v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n\end{bmatrix}$

    Now, consider $\displaystyle PD = \begin{bmatrix} v_1 & v_2 & \cdots & v_n\end{bmatrix}\begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n\end{bmatrix}$. Break it down column by column.

    $\displaystyle P\begin{bmatrix}\lambda_1 \\ 0 \\ \vdots \\ 0\end{bmatrix} = \lambda_1 v_1 + 0\cdot v_2 + \cdots + 0\cdot v_n = \lambda_1v_1$

$\displaystyle P\begin{bmatrix} 0 \\ \vdots \\ \lambda_i \\ \vdots \\ 0\end{bmatrix} = 0\cdot v_1 + \cdots + \lambda_i v_i + \cdots + 0\cdot v_n = \lambda_i v_i$

    So, $\displaystyle PD = \begin{bmatrix}\lambda_1v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n\end{bmatrix}$.

    Hence, $\displaystyle C_p P = PD$ as desired.
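And once $\displaystyle P$ is known to be invertible, multiplying $\displaystyle C_p P = PD$ on the right by $\displaystyle P^{-1}$ gives $\displaystyle C_p = PDP^{-1}$. A sketch that checks this end-to-end with a small Gauss-Jordan inverse (the example matrix is assumed, as before, since the thread's actual $\displaystyle C_p$ is in the unseen image):

```python
# End-to-end check that P D P^{-1} reconstructs C_p, using a hand-rolled
# Gauss-Jordan inverse.  C, lams, P, D below are an assumed example.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def inverse(A):
    """Gauss-Jordan inverse with partial pivoting."""
    n = len(A)
    M = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(A)]                  # augment with identity
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]           # bring pivot row up
        p = M[col][col]
        M[col] = [x / p for x in M[col]]              # scale pivot row to 1
        for r in range(n):
            if r != col and M[r][col] != 0.0:
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [row[n:] for row in M]                     # right half is A^{-1}

C = [[0.0, 1.0, 0.0], [0.0, 0.0, 1.0], [6.0, -11.0, 6.0]]
lams = [1.0, 2.0, 3.0]
P = [[lam ** row for lam in lams] for row in range(3)]
D = [[lams[i] if i == j else 0.0 for j in range(3)] for i in range(3)]

reconstructed = matmul(matmul(P, D), inverse(P))
assert all(abs(reconstructed[i][j] - C[i][j]) < 1e-9
           for i in range(3) for j in range(3))
```

In practice one rarely forms $\displaystyle P^{-1}$ explicitly; solving linear systems with $\displaystyle P$ is cheaper and more stable, but the explicit inverse makes the identity $\displaystyle C_p = PDP^{-1}$ easy to verify here.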

  15. #15
    Junior Member
    Joined
    Nov 2013
    From
    Hong Kong
    Posts
    45

    Re: Linear Algebra

Thank you! Also, can you check my proof of linear independence? I don't know how to proceed further from there.
