
Math Help - Eigenvalues of an Orthogonal Projection

  1. #1
    Junior Member
    Joined
    Nov 2010
    Posts
    33

    Eigenvalues of an Orthogonal Projection

    Let P be the map which takes each vector v in V to its orthogonal projection onto S, a subspace of an inner product space V.

    dimS = k, dimV = n

    Find the eigenvalues and eigenvectors of P. What are the algebraic and geometric multiplicities of each eigenvalue?

    Ok, so what I know is that P: S → S^⊥ (the orthogonal complement of S), and that dim S^⊥ = n - k. But how would I make this into a matrix where I can find the eigenvalues and eigenvectors?

    My guess would be to say P is the projection of a vector x in S onto S^⊥, which means P(x) = Σ_i <x, s_i> s_i, where the s_i are vectors in S^⊥. Then I would say that [x]_S = (<x, s_1>, <x, s_2>, ..., <x, s_n>), but then I don't know how to find the eigenvalues of a matrix that isn't square.
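    In case it helps to see the setup concretely, here is a small NumPy sketch (the dimensions n, k and the names Q, P are illustrative choices, not from the problem): build an orthonormal basis of a random k-dimensional subspace S of R^n, form the projection matrix, and look at its eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 2

# Columns of Q are an orthonormal basis of a random k-dim subspace S.
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))

# P sends x to sum_i <x, q_i> q_i, the orthogonal projection onto S.
# Note that P is n x n -- square -- even though its rank is only k.
P = Q @ Q.T

eigvals = np.sort(np.linalg.eigvalsh(P))
print(np.round(eigvals, 10))  # eigenvalue 0 (n-k times) and 1 (k times)
```

    Note that the matrix of P is built with respect to a basis of all of V, so it is always square, whatever dim S is.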

  2. #2
    MHF Contributor

    Joined
    Apr 2005
    Posts
    16,454
    Thanks
    1868
    You are overthinking this. If v already happens to lie in the given subspace S, then Pv = v. What eigenvalue does that correspond to? And, of course, the "multiplicity" is the dimension of that subspace. If v is orthogonal to that subspace, then Pv = 0. What eigenvalue does that correspond to? The "multiplicity" of that eigenvalue is the dimension of V minus the dimension of the given subspace.

    By the way, you "don't know how to find the eigenvalues of a matrix that isn't square" because non-square matrices don't have eigenvalues! λ is an eigenvalue of a matrix (or, more generally, a linear transformation) A if and only if Av = λv for some non-zero eigenvector v.
    In order for that to even make sense, v must be a member of both the "domain" and "range" spaces of A. That is, A must map some vector space V into itself, which means that the matrix representing A must be square.
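    A quick numerical check of the two cases above, in NumPy (the names and dimensions are illustrative; S is a random subspace):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 4, 2
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))  # orthonormal basis of S
P = Q @ Q.T                                       # projection onto S

# Case 1: v lies in S, so Pv = v and v is an eigenvector for λ = 1.
v_in = Q @ np.array([2.0, -1.0])
print(np.allclose(P @ v_in, v_in))   # True

# Case 2: v is orthogonal to S (any vector minus its projection),
# so Pv = 0 and v is an eigenvector for λ = 0.
w = rng.standard_normal(n)
v_perp = w - P @ w
print(np.allclose(P @ v_perp, 0))    # True
```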

  3. #3
    MHF Contributor

    Joined
    Mar 2011
    From
    Tejas
    Posts
    3,401
    Thanks
    762
    a matrix for P is square, it just isn't 1-1 (it has rank k, and k < n unless S is all of V). also, P maps any vector s in S to itself,

    so P^2(v) = P(P(v)) = P(v) (because P(v) is in S).

    so if λ is an eigenvalue of P and v a corresponding eigenvector, then P^2(v) = P(λv) = λP(v) = λ^2 v; but also P^2(v) = P(v) = λv, so λ^2 = λ. what values can λ have?

    how many 0-rows must a reduced matrix for P have? what must the other rows be (is the matrix for P symmetric)?

    you should be able to write down the characteristic polynomial for P, because you know dim(S). this gives you the algebraic multiplicities.

    is the reduced matrix for P diagonal? what does this tell you about the geometric multiplicities?
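    The idempotency argument above can be checked numerically. A NumPy sketch (dimensions are illustrative): since P^2 = P, every eigenvalue must satisfy λ^2 = λ.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 6, 3
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
P = Q @ Q.T  # orthogonal projection onto a k-dim subspace

# P^2 = P, so every eigenvalue satisfies λ^2 = λ.
print(np.allclose(P @ P, P))         # True: P is idempotent

lams = np.linalg.eigvalsh(P)
print(np.allclose(lams**2, lams))    # True: each λ is 0 or 1
```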

  4. #4
    Junior Member
    Joined
    Nov 2010
    Posts
    33
    So would the eigenvalues be -1, 0, and 1? But then I don't see how I can find the eigenvectors from this information, since I don't know the exact entries of P. Would the eigenvectors just be a set of orthonormal vectors?

    Also, is the multiplicity of each eigenvalue the same? I don't know how I would determine different multiplicities for each eigenvalue without knowing the exact entries of P. Would the algebraic multiplicity of each eigenvalue be n - k?

    Also, wouldn't the rows of P all be linearly independent, since a set of mutually orthogonal vectors is linearly independent? That would mean the reduced matrix of P has no 0-rows, with the eigenvalues on the diagonal of the other rows. This would make the geometric multiplicity k, right?
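    One way to test the rank claim numerically (NumPy, illustrative dimensions): the projection onto a k-dimensional subspace has rank exactly k, so when k < n its rows cannot all be linearly independent.

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 5, 2
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
P = Q @ Q.T  # projection onto a k-dim subspace of R^n

# rank(P) = k = dim S, so a row-reduced form of P has n - k zero rows.
print(np.linalg.matrix_rank(P))  # 2
```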

  5. #5
    MHF Contributor

    Joined
    Mar 2011
    From
    Tejas
    Posts
    3,401
    Thanks
    762
    is -1 a solution to λ^2 - λ = 0?

    the rows of P can't possibly ALL be linearly independent. P shrinks a vector space of dimension n down to its image, which has dimension k.

    convince yourself that any vector perpendicular to S is in the null space of P (why?). what could the eigenvalue of an eigenvector in the null space possibly be?

    S has dimension k. how can the rank of P be more than k? isn't S the entirety of the image of P?

    so how could P possibly have more than k linearly independent columns? assuming that P is row-reduced, what does that say about how many 0-rows you have?

    can you have more non-zero rows than linearly independent columns? is the column rank and the row rank ever different?

    the characteristic polynomial of P is obvious. the eigenvalues of P are obvious. once you write down the characteristic polynomial, you HAVE the algebraic multiplicities.

    the geometric multiplicity of an eigenvalue is the dimension of the eigenspace.

    what is another name for an eigenspace corresponding to an eigenvalue of 0?
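    Pulling the thread's conclusions together in a NumPy sketch (dimensions are illustrative): the characteristic polynomial is λ^(n-k) (λ - 1)^k, so the algebraic multiplicities are n - k for λ = 0 and k for λ = 1, and the 0-eigenspace is exactly the null space of P, namely S^⊥.

```python
import numpy as np

rng = np.random.default_rng(4)
n, k = 4, 2
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
P = Q @ Q.T

# Coefficients of det(λI - P); here λ^2 (λ - 1)^2 = λ^4 - 2λ^3 + λ^2.
print(np.round(np.poly(P), 10))

# Geometric multiplicity of λ = 0 is dim null(P) = n - k,
# matching its algebraic multiplicity: P is diagonalizable.
null_dim = n - np.linalg.matrix_rank(P)
print(null_dim)  # 2
```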

