
Math Help - Diagonalizable/Similar Matrices

  1. #1
    Member
    Joined
    May 2006
    Posts
    148
    Thanks
    1

    Diagonalizable/Similar Matrices

    I have a couple of Linear Algebra questions.

    1.) Given a Matrix:

    A = ([[5/6,1/3,0,-1/6],[1/3,11/42,3/14,4/21],[0,3/14,5/14,3/7],[-1/6,4/21,3/7,23/42]])

Where the numbers in each of the []'s form a row (the above matrix corresponding to a 4x4 matrix);

a.) Is the matrix A diagonalizable? If it is, diagonalize A. If it is not, explain why not. Is A invertible? If it is, find A^(-1). If it is not, explain why not.

b.) By looking at the above example, we are easily able to check that A^(2) = A and that A^(T) = A, even though A ≠ I. Consider the general case of an n x n matrix B with B ≠ I and B^(2) = B. Show that B is not invertible.

2.) Suppose A is the n x n matrix whose rows are all the same: A = ([[c_1, c_2, ..., c_n],[c_1, c_2, ..., c_n],...,[c_1, c_2, ..., c_n]]) (n identical rows). Also, assume that the entries in each row do not add up to 0. That is,

\sum_{i=1}^{n} c_i ≠ 0.

Find all of the eigenvalues, their (algebraic) multiplicities, as well as the dimensions of their eigenspaces.

  2. #2
    Member
    Joined
    May 2006
    Posts
    148
    Thanks
    1
    Now, for my work:

1.) I know that, in general, for k ≥ 1, A^(k) = PD^(k)P^(-1).

    A square matrix A is said to be diagonalizable if A is similar to a diagonal matrix, that is, if A = PDP^(-1) for some invertible matrix P and some diagonal matrix D.

An n x n matrix A is diagonalizable if and only if A has n linearly independent eigenvectors. In fact, A = PDP^(-1), with D a diagonal matrix, if and only if the columns of P are n linearly independent eigenvectors of A. In this case, the diagonal entries of D are the eigenvalues of A that correspond, respectively, to the eigenvectors in P.
In other words, A is diagonalizable if and only if there are enough eigenvectors to form a basis of R^(n). So, we want the geometric multiplicity of each eigenvalue to equal its algebraic multiplicity. So, how to diagonalize a matrix?

    1.) Find the eigenvectors of A;
    2.) Find n linearly independent eigenvectors of A;
    3.) Construct P from the vectors in step 2;
    4.) Construct D from the corresponding eigenvalues.
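For anyone who wants to check the arithmetic, here is a sketch of the four steps in Python with numpy (my addition, not from the thread; since A is symmetric, numpy's eigh conveniently returns an orthonormal set of eigenvectors):

```python
import numpy as np

# The matrix A from the problem, entered as floats.
A = np.array([[ 5/6,  1/3,   0,     -1/6 ],
              [ 1/3,  11/42, 3/14,   4/21],
              [ 0,    3/14,  5/14,   3/7 ],
              [-1/6,  4/21,  3/7,   23/42]])

# Steps 1-2: eigenvalues and a full set of linearly independent eigenvectors.
# A is symmetric, so eigh returns real eigenvalues and orthonormal eigenvectors.
eigvals, P = np.linalg.eigh(A)

# Steps 3-4: P has the eigenvectors as columns; D has the eigenvalues
# on the diagonal, in the matching order.
D = np.diag(eigvals)

print(np.sort(eigvals).round(6))     # eigenvalues are approximately [0, 0, 1, 1]
print(np.allclose(A, P @ D @ P.T))   # True: A = P D P^{-1} (here P^{-1} = P^T)
print(np.allclose(A @ A, A))         # True: A is idempotent, as part (b) notes
```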

    So I found the eigenvalues of the matrix A to be 1,1,0,0.

    And the eigenvectors to be:

[0, 2, {Vector[Row](1..4,[]), Vector[Row](1..4,[])}], [1, 2, {Vector[Row](1..4,[]), Vector[Row](1..4,[])}] (Maple notation), and I'm a little confused by the notation it gives back.

Anyway, the eigenvectors are (1, -5/2, 3/2, 0), (0, 1/2, -3/2, 1), (6, 3, 1, 0), (-5, -2, 0, 1); these go in as the columns of P, so P = [[1, 0, 6, -5], [(-5)/2, 1/2, 3, -2], [3/2, (-3)/2, 1, 0], [0, 1, 0, 1]].

D will be the eigenvalues down the diagonal, in the same order as the corresponding columns of P; the first two eigenvectors belong to eigenvalue 0 and the last two to eigenvalue 1, so D = [[0, 0, 0, 0], [0, 0, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]].

But order matters now; that is, the order of the eigenvalues has to match the order chosen for the columns of P, so I have to be careful. This is where I get stuck.

    Part b? No idea.

And for #2 I'm not sure where to start.
    Last edited by fifthrapiers; November 4th 2006 at 07:30 AM.

  3. #3
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4
    Quote Originally Posted by fifthrapiers View Post

b.) By looking at the above example, we are easily able to check that A^(2) = A and that A^(T) = A, even though A ≠ I. Consider the general case of an n x n matrix B with B ≠ I and B^(2) = B. Show that B is not invertible.
Let B\ne I be a square matrix such that:

B^2=B

Now suppose that B has an inverse B^{-1}; then:

B^2 B^{-1}=B B^{-1}

Regrouping the left-hand side gives B(BB^{-1})=B, while the right-hand side is BB^{-1}=I, so:

B=I

A contradiction; hence any matrix B\ne I that satisfies B^2=B cannot be invertible.
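A quick numeric illustration of this (the 2x2 projection below is my own made-up example, not from the thread): any idempotent matrix other than the identity is singular.

```python
import numpy as np

# B projects onto the x-axis: idempotent, but not the identity.
B = np.array([[1.0, 0.0],
              [0.0, 0.0]])

assert np.allclose(B @ B, B)      # B^2 = B
print(np.linalg.det(B))           # 0.0: B is singular, as the proof predicts
```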

    RonL

  4. #4
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4
    Quote Originally Posted by fifthrapiers View Post
    I have a couple of Linear Algebra questions.

    1.) Given a Matrix:

    A = ([[5/6,1/3,0,-1/6],[1/3,11/42,3/14,4/21],[0,3/14,5/14,3/7],[-1/6,4/21,3/7,23/42]])

    Where the numbers in each of the []'s is a row (the above matrix corresponding to a 4x4 matrix);

a.) Is the matrix A diagonalizable? If it is, diagonalize A. If it is not, explain why not. Is A invertible? If it is, find A^(-1). If it is not, explain why not.
You have shown that the eigenvalues of A are 1,\ 1,\ 0,\ 0, and found a
set of 4 (linearly independent) eigenvectors. So if B is the matrix whose
columns are these eigenvectors, we have:

B^{-1} A B is diagonal, with the eigenvalues down the diagonal.

But two of these values on the diagonal are zero, so this diagonal matrix is not
invertible, and so A is not invertible.

    RonL

  5. #5
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4
    Quote Originally Posted by fifthrapiers View Post
2.) Suppose A is the n x n matrix whose rows are all the same: A = ([[c_1, c_2, ..., c_n],[c_1, c_2, ..., c_n],...,[c_1, c_2, ..., c_n]]) (n identical rows). Also, assume that the entries in each row do not add up to 0. That is,

\sum_{i=1}^{n} c_i ≠ 0.

Find all of the eigenvalues, their (algebraic) multiplicities, as well as the dimensions of their eigenspaces.
Let's assume that A is a square matrix. Given the conditions defining it,
it is obvious that the rank of A is 1. The rank theorem tells us that the dimension
of the null-space of A is n-1.

Thus A has one non-zero eigenvalue, and 0 is an eigenvalue of multiplicity
n-1. An obvious eigenvector is (1, 1, ..., 1)', and its eigenvalue is
c = \sum_{i=1}^{n} c_i.

    So we have

    Code:
    e-value         multiplicity     Dim of E-space
     
       c                   1                    1
       0                  (n-1)               (n-1)
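The table can be checked numerically; a sketch in Python with numpy (the particular c_i values are made up for illustration):

```python
import numpy as np

c = np.array([2.0, -1.0, 3.0, 0.5])   # made-up row; sum(c) = 4.5 != 0
n = len(c)
A = np.tile(c, (n, 1))                # n identical rows, as in the problem

# One eigenvalue equals sum(c); the other n-1 eigenvalues are 0.
eigvals = np.sort(np.linalg.eigvals(A).real)
print(eigvals)   # approximately [0, 0, 0, 4.5]
```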

  6. #6
    Member
    Joined
    May 2006
    Posts
    148
    Thanks
    1
Ahh, I was making part (a) harder than it is. So if any of the eigenvalues on the diagonal is 0, the matrix is not invertible. Now it makes sense: the product of the eigenvalues is equal to the determinant, and therefore if the determinant is 0, the matrix cannot be invertible.
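To illustrate that the determinant equals the product of the eigenvalues (a made-up 2x2 example, not from the thread):

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # eigenvalues 1 and 3

eigvals = np.linalg.eigvals(M)
print(np.prod(eigvals).real)          # approximately 3.0
print(np.linalg.det(M))               # approximately 3.0, matching the product
```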

    Thanks, CaptainBlack.

  7. #7
    Member
    Joined
    May 2006
    Posts
    148
    Thanks
    1
    Quote Originally Posted by CaptainBlack View Post
Let's assume that A is a square matrix. Given the conditions defining it,
it is obvious that the rank of A is 1. The rank theorem tells us that the dimension
of the null-space of A is n-1.

Thus A has one non-zero eigenvalue, and 0 is an eigenvalue of multiplicity
n-1. An obvious eigenvector is (1, 1, ..., 1)', and its eigenvalue is
c = \sum_{i=1}^{n} c_i.

    So we have

    Code:
    e-value         multiplicity     Dim of E-space
     
       c                   1                    1
       0                  (n-1)               (n-1)
    We haven't learned about "rank of A" yet or "null-space". Is there a way to prove this without using the above?

    And, I am assuming Dim of E-space is the dimensions of the eigenspaces (geometric multiplicities, I think).

  8. #8
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4
Quote Originally Posted by fifthrapiers View Post
We haven't learned about "rank of A" yet or "null-space". Is there a way to prove this without using the above?
I'll have a think about another method; at the moment other methods look laborious, but I will see what I can find.

Quote Originally Posted by fifthrapiers View Post
And, I am assuming Dim of E-space is the dimensions of the eigenspaces (geometric multiplicities, I think).
The null space is the space of all vectors X for which AX=0, which is
clearly the same thing as the eigenspace of A associated with the
eigenvalue 0.

    RonL

  9. #9
    Member
    Joined
    May 2006
    Posts
    148
    Thanks
    1
Laborious works, as long as it's based on what I have learned. We have covered pretty much everything on determinants, eigenvectors, eigenvalues, the characteristic equation/polynomial, some material on Markov chains, and the properties of invertible (nonsingular) matrices.

  10. #10
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4
    Quote Originally Posted by fifthrapiers View Post

2.) Suppose A is the n x n matrix whose rows are all the same: A = ([[c_1, c_2, ..., c_n],[c_1, c_2, ..., c_n],...,[c_1, c_2, ..., c_n]]). Also, assume that the entries in each row do not add up to 0. That is,

\sum_{i=1}^{n} c_i ≠ 0.

Find all of the eigenvalues, their (algebraic) multiplicities, as well as the dimensions of their eigenspaces.
I think this still needs some polishing, but here it is anyway. The basic idea is the same as the previous solution for this problem: show the obvious eigenvalue has multiplicity 1, and that the only other eigenvalue is 0, which has multiplicity n-1.

For any X \in \mathbb{R}^n we find that AX=\kappa_X O for some \kappa_X \in \mathbb{R} (namely \kappa_X = c_1 x_1 + ... + c_n x_n), where O is the vector all of whose elements are 1; this can be checked by multiplying the expression out.

Now let's write X=\frac{\kappa_X}{c} O+Y; then, as AO=cO:

AX = \frac{c \kappa_X}{c} O+AY=\kappa_X O+AY

Comparing this with AX=\kappa_X O gives AY=0.

So we conclude that \mathbb{R}^n can be written as the (direct) sum of the space spanned by O and the null space of A (note that c \ne 0 ensures O is not itself in the null space), and so the null space of A has dimension n-1.

From this we conclude that the eigenspace corresponding to the eigenvalue c has dimension 1, and so the multiplicity of c is 1. Also, the eigenspace corresponding to the eigenvalue 0 is the null space of A, of dimension n-1, so the multiplicity of 0 is n-1.
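The decomposition X = (\kappa_X/c) O + Y can be sanity-checked numerically (random X and made-up c_i, my addition):

```python
import numpy as np

rng = np.random.default_rng(0)
c_vec = np.array([2.0, -1.0, 3.0, 0.5])   # made-up c_i with nonzero sum
n = len(c_vec)
A = np.tile(c_vec, (n, 1))                # each row of A equals c_vec
O = np.ones(n)                            # the all-ones vector
c = c_vec.sum()                           # c != 0 by assumption

X = rng.standard_normal(n)
kappa_X = c_vec @ X
assert np.allclose(A @ X, kappa_X * O)    # AX = kappa_X O

Y = X - (kappa_X / c) * O                 # the component outside span{O}
assert np.allclose(A @ Y, 0)              # Y lies in the null space of A
```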

    RonL
    Last edited by CaptainBlack; November 5th 2006 at 11:20 AM.

  11. #11
    Member
    Joined
    May 2006
    Posts
    148
    Thanks
    1
What does it mean for a matrix to have "rank 1"? Does it have something to do with the linearly independent rows/columns, or something to that effect?

  12. #12
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4
    Quote Originally Posted by fifthrapiers View Post
What does it mean for a matrix to have "rank 1"? Does it have something to do with the linearly independent rows/columns, or something to that effect?
It means that the number of linearly independent rows (equivalently, columns) is 1.

So for a square n x n matrix the rank can be anything from 0 (only for the zero
matrix) up to a maximum of n.
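numpy can compute this directly; a quick check on the identical-rows matrix from problem 2 (made-up c_i, my addition):

```python
import numpy as np

c = np.array([2.0, -1.0, 3.0, 0.5])
A = np.tile(c, (4, 1))                    # four identical rows

print(np.linalg.matrix_rank(A))           # 1: one linearly independent row
print(np.linalg.matrix_rank(np.eye(4)))   # 4: the maximum for a 4x4 matrix
```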

    RonL
