
Math Help - Linear Algebra

  1. #1 Newbie (Joined Nov 2013, jakarta, 17 posts)

    Linear Algebra

    This problem is from the 2008 midterms. I can do the induction part, but I am hopelessly stuck on the next part. Can anyone please help me solve it?
    [Attachment: Linear Algebra-midterms-2008.png]

    Thank You

  2. #2 MHF Contributor (Joined Nov 2010, 1,904 posts)

    Re: Linear Algebra

    A vector v_i is an eigenvector of C_p with eigenvalue \lambda_i if:

    C_p v_i = \lambda_i v_i

    So, calculate each side of the equation.
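    The attached problem image isn't reproduced here, so as a minimal numerical sketch, assume C_p is the companion matrix of a hypothetical example polynomial p(x) = x^3 - 6x^2 + 11x - 6 = (x-1)(x-2)(x-3), with candidate eigenvectors v_i = (1, \lambda_i, \lambda_i^2):

```python
import numpy as np

# Hypothetical example (the actual midterm polynomial isn't shown here):
# p(x) = x^3 - 6x^2 + 11x - 6 = (x - 1)(x - 2)(x - 3), eigenvalues 1, 2, 3.
# Companion matrix with the (negated) coefficients of p in the bottom row:
C_p = np.array([[0.0,   1.0, 0.0],
                [0.0,   0.0, 1.0],
                [6.0, -11.0, 6.0]])

# Candidate eigenvectors v_i = (1, lambda_i, lambda_i^2):
for lam in (1.0, 2.0, 3.0):
    v = np.array([1.0, lam, lam**2])
    assert np.allclose(C_p @ v, lam * v)   # checks C_p v_i = lambda_i v_i
```

    Computing both sides by hand for one eigenvalue is the same exercise: each row of C_p v just shifts the powers of \lambda up by one, and the last row uses p(\lambda) = 0.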

  3. #3 Newbie (Joined Nov 2013, jakarta, 17 posts)

    Re: Linear Algebra

    I'm sorry, but I'm having difficulty with this part. Could you kindly explain how it works?

  4. #4 MHF Contributor (Joined Apr 2005, 15,792 posts)

    Re: Linear Algebra

    What have you done so far? What do you get when you multiply the matrix C_p by the vector v_i? What do you get for at least the first one or two rows?

  5. #5 Newbie (Joined Nov 2013, jakarta, 17 posts)

    Re: Linear Algebra

    I do see how to show that the two expressions are equal, but what about the fact that C_p can be represented in Jordan normal form?

  6. #6 Newbie (Joined Nov 2013, jakarta, 17 posts)

    Re: Linear Algebra

    Indeed, now I see that C_p v_i = \lambda_i v_i, thank you. But what about showing C_p = PDP^{-1}? How do I get P^{-1}?

  7. #7 MHF Contributor (Joined Nov 2010, 1,904 posts)

    Re: Linear Algebra

    You have C_p v_i = \lambda_i v_i so C_p \begin{bmatrix}v_1 & v_2 & \cdots & v_n\end{bmatrix} = \begin{bmatrix}\lambda_1 v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n \end{bmatrix} simply by equating columns.

    The LHS is C_p P. Show that the RHS is PD. (Simply multiply it out).

    Now, you just need to show that P is invertible.
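    As a numerical sanity check (a sketch using the hypothetical cubic p(x) = (x-1)(x-2)(x-3), not the actual midterm polynomial), one can verify C_p P = PD and that P is invertible:

```python
import numpy as np

# Companion matrix of the hypothetical cubic p(x) = (x - 1)(x - 2)(x - 3):
C_p = np.array([[0.0,   1.0, 0.0],
                [0.0,   0.0, 1.0],
                [6.0, -11.0, 6.0]])
lams = np.array([1.0, 2.0, 3.0])               # its (distinct) eigenvalues

P = np.vander(lams, increasing=True).T         # columns v_i = (1, lam_i, lam_i^2)
D = np.diag(lams)

assert np.allclose(C_p @ P, P @ D)             # C_p P = P D, column by column
assert abs(np.linalg.det(P)) > 1e-12           # P is invertible
assert np.allclose(C_p, P @ D @ np.linalg.inv(P))   # hence C_p = P D P^{-1}
```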

  8. #8 Newbie (Joined Nov 2013, jakarta, 17 posts)

    Re: Linear Algebra

    The thing is, C_p is an n \times n matrix and P is a 1 \times n matrix; how can these be compatible? Or is there a typo in the question?

  9. #9 Newbie (Joined Nov 2013, jakarta, 17 posts)

    Re: Linear Algebra

    Also, how do I show that P is indeed invertible?

    Thank you

  10. #10 MHF Contributor (Joined Nov 2010, 1,904 posts)

    Re: Linear Algebra

    No typo. P is the matrix of column vectors v_1, \ldots, v_n, so it is an n \times n matrix, as well.

    There are many ways to show a matrix is invertible. Can you show these eigenvectors are linearly independent? The key is the fact that each eigenvector spans an eigenspace that is invariant under C_p. Assume there is a dependency and show that the "invariant" space goes to zero for at most one eigenvalue.
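    Concretely, if (as for the standard companion matrix) the eigenvectors have the form v_i = (1, \lambda_i, \ldots, \lambda_i^{n-1}), then P is a Vandermonde matrix, whose determinant is \prod_{i<j}(\lambda_j - \lambda_i); this is nonzero exactly when the eigenvalues are pairwise distinct. A sketch with example eigenvalues 1, 2, 3 (not the problem's actual values):

```python
import numpy as np
from itertools import combinations

lams = [1.0, 2.0, 3.0]   # example distinct eigenvalues
P = np.vander(np.array(lams), increasing=True).T   # columns (1, lam_i, lam_i^2)

# Vandermonde determinant: product of (lam_j - lam_i) over all i < j,
# nonzero exactly when the eigenvalues are pairwise distinct.
expected = 1.0
for li, lj in combinations(lams, 2):
    expected *= (lj - li)

assert np.isclose(np.linalg.det(P), expected)   # here 2.0 != 0, so P is invertible
```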

  11. #11 Newbie (Joined Nov 2013, jakarta, 17 posts)

    Re: Linear Algebra

    Can't I use the result of the induction in the first part to show that P is invertible? I have no idea how the method you suggested works.

  12. #12 Newbie (Joined Nov 2013, jakarta, 17 posts)

    Re: Linear Algebra

    Wait, I just tried C_p P = PD, but I can't seem to arrive at the same expression.

  13. #13 Newbie (Joined Nov 2013, jakarta, 17 posts)

    Re: Linear Algebra

    "Can you show these eigenvectors are linearly independent?"
    Can you check whether my proof is correct?

    Suppose a linear combination of the eigenvectors with weights c_i equals zero:

    c_1 v_1 + ... +c_{k-1}v_{k-1}+c_k v_k=0

    Multiplying this expression by the matrix C_p gives:

    c_1 C_p v_1 + ... +c_{k-1} C_p v_{k-1}+c_k C_p v_k=0

    Since each v_i is an eigenvector, C_p v_i = \lambda_i v_i, so the second equation becomes:

    c_1\lambda_1 v_1 + ... +c_{k-1}\lambda_{k-1} v_{k-1}+c_k\lambda_k v_k=0

    Now multiply the first equation by \lambda_k:

    c_1\lambda_k v_1 + ... +c_{k-1}\lambda_{k} v_{k-1}+c_k\lambda_k v_k=0

    Subtracting the two expressions, we get:

    c_1(\lambda_1 - \lambda_k) v_1 + ... +c_{k-1}(\lambda_{k-1} -\lambda_k) v_{k-1} =0

    If v_1, \ldots, v_{k-1} are linearly independent (which we may assume by induction on k), the equality above forces c_i(\lambda_i - \lambda_k) = 0 for each i. Since the eigenvalues \lambda_1, \lambda_2, \ldots, \lambda_k are assumed to be distinct, \lambda_i - \lambda_k \ne 0, and hence c_1 = c_2 = \cdots = c_{k-1} = 0. With this established, we can return to the original equation:

    c_1 v_1 + ... +c_{k-1}v_{k-1}+c_k v_k=0 \rightarrow c_k v_k=0. Since an eigenvector is by definition nonzero, v_k \ne 0, so c_k = 0 as well. All the coefficients vanish, which shows the eigenvectors are linearly independent.

    Is this proof sufficient to show linear independence? And even with this fact established, how does it relate to the invertibility of the matrix P?
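    As a numerical illustration of that last question (a sketch using example eigenvalues 1, 2, 3, not the actual problem's): linear independence of the columns of P means P c = 0 has only the trivial solution, which is the same as P having full rank, i.e. being invertible.

```python
import numpy as np

lams = np.array([1.0, 2.0, 3.0])                 # example distinct eigenvalues
P = np.vander(lams, increasing=True).T           # columns are the eigenvectors

# Columns independent  <=>  P c = 0 only for c = 0  <=>  full rank  <=>  invertible
assert np.linalg.matrix_rank(P) == len(lams)
c = np.linalg.solve(P, np.zeros(len(lams)))      # unique solution of P c = 0
assert np.allclose(c, np.zeros(len(lams)))
```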

  14. #14 MHF Contributor (Joined Nov 2010, 1,904 posts)

    Re: Linear Algebra

    Here are some basics about matrix multiplication. Suppose you have a matrix A and two sets of vectors x_1, x_2, \ldots, x_k and b_1, b_2, \ldots, b_k with A x_i = b_i for each i = 1, \ldots, k. Let X be the matrix whose columns are x_1, x_2, \ldots, x_k, and let B be the matrix whose columns are b_1, b_2, \ldots, b_k. Then AX = B. I recommend trying that out in the 2 \times 2 case for A, X, and B. It follows almost immediately from how matrix multiplication works.
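    That 2 \times 2 experiment can be sketched directly (with random example matrices, nothing from the original problem):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2))          # any 2x2 matrix
x1, x2 = rng.standard_normal(2), rng.standard_normal(2)
b1, b2 = A @ x1, A @ x2                  # A x_i = b_i for each i

X = np.column_stack([x1, x2])            # columns x_1, x_2
B = np.column_stack([b1, b2])            # columns b_1, b_2
assert np.allclose(A @ X, B)             # AX = B, column by column
```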

    So, P is just a matrix whose columns are v_1,v_2,\ldots, v_n. So, the multiplication C_pP will give \begin{bmatrix}C_p v_1 & C_p v_2 & \cdots & C_p v_n\end{bmatrix}. The i-th column is just the result of the matrix C_p times the i-th column of P. That is how matrix multiplication is defined.

    Now, consider the i-th column. It is C_p v_i. Because v_i is an eigenvector, C_p v_i = \lambda_i v_i. So, we can rewrite the matrix:

    \begin{bmatrix} C_p v_1 & C_p v_2 & \cdots & C_p v_n\end{bmatrix} = \begin{bmatrix} \lambda_1 v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n\end{bmatrix}

    Now, consider PD = \begin{bmatrix} v_1 & v_2 & \cdots & v_n\end{bmatrix}\begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n\end{bmatrix}. Break it down column by column.

    P\begin{bmatrix}\lambda_1 \\ 0 \\ \vdots \\ 0\end{bmatrix} = \lambda_1 v_1 + 0\cdot v_2 + \cdots + 0\cdot v_n = \lambda_1v_1

    P\begin{bmatrix} 0 \\ \vdots \\ \lambda_i \\ \vdots \\ 0\end{bmatrix} = 0\cdot v_1 + \cdots + \lambda_i v_i + \cdots + 0\cdot v_n = \lambda_i v_i

    So, PD = \begin{bmatrix}\lambda_1v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n\end{bmatrix}.

    Hence, C_p P = PD as desired.

  15. #15 Newbie (Joined Nov 2013, jakarta, 17 posts)

    Re: Linear Algebra

    Oh, thank you! Also, could you check my proof of linear independence? I don't know how to proceed further from there.
