
Positive Real Matrices

  1. #1
    Newbie
    Joined
    Jun 2009
    Posts
    3

    Positive Real Matrices

    In Michael Artin's "Algebra" textbook, Chapter 4 Section 3, he discusses positive real matrices: "They occur in applications and one of their most important properties is that they always have an eigenvector whose coordinates are positive. Instead of proving this, let us illustrate it in the case of two variables by examining the effect of multiplying by a positive $2 \times 2$ matrix $A$ on $\mathbb{R}^2$." He goes on to say that since the entries of $A$ are positive, left multiplication by $A$ carries the first quadrant $S$ into itself [since $e_1$ is carried to the first column of $A$ and $e_2$ to the second column], i.e. $S \supset AS \supset A^2S$ and so on. He continues: "Now the intersection of a nested set of sectors is either a sector or a half line. In our case, the intersection $Z = \bigcap_{r \geq 0} A^r S$ turns out to be a half line. This is intuitively plausible, and it can be shown in various ways."

    Is there a straightforward / direct way to prove this? I am struggling to come up with anything beyond the geometric intuition that, starting from the first quadrant, the sector keeps shrinking as you repeatedly multiply by a positive matrix...
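One way to make the shrinking concrete is a quick numerical sketch in Python (the matrix $A$ below is just a made-up example of a positive matrix, not anything from Artin). The sector $A^nS$ is spanned by the images of $e_1$ and $e_2$, so the script tracks those two images and prints the sector's opening angle, which starts at 90 degrees and drops toward 0:

```python
import math

# A hypothetical positive 2x2 matrix (any positive entries would do).
A = [[2.0, 1.0],
     [1.0, 3.0]]

def matvec(M, v):
    """Multiply a 2x2 matrix by a 2-vector."""
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def angle_between(u, v):
    """Angle in radians between two nonzero vectors in R^2."""
    dot = u[0] * v[0] + u[1] * v[1]
    nu = math.hypot(u[0], u[1])
    nv = math.hypot(v[0], v[1])
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

# The sector A^n S is spanned by A^n e1 and A^n e2.
u, v = [1.0, 0.0], [0.0, 1.0]
angles = []
for n in range(8):
    angles.append(angle_between(u, v))  # opening angle of A^n S
    u, v = matvec(A, u), matvec(A, v)

for n, a in enumerate(angles):
    print(n, math.degrees(a))
```

The opening angle shrinks by roughly a constant factor each step (asymptotically the ratio of the two eigenvalue magnitudes), which is the numerical face of the claim that the nested intersection is a half line rather than a sector.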

  2. #2
    MHF Contributor

    Joined
    May 2008
    Posts
    2,295
    Thanks
    7
    Quote Originally Posted by skeboy View Post
    In Michael Artin's "Algebra" textbook, Chapter 4 Section 3, he discusses positive real matrices: "They occur in applications and one of their most important properties is that they always have an eigenvector whose coordinates are positive. Instead of proving this, let us illustrate it in the case of two variables by examining the effect of multiplying by a positive $2 \times 2$ matrix $A$ on $\mathbb{R}^2$." He goes on to say that since the entries of $A$ are positive, left multiplication by $A$ carries the first quadrant $S$ into itself [since $e_1$ is carried to the first column of $A$ and $e_2$ to the second column], i.e. $S \supset AS \supset A^2S$ and so on. He continues: "Now the intersection of a nested set of sectors is either a sector or a half line. In our case, the intersection $Z = \bigcap_{r \geq 0} A^r S$ turns out to be a half line. This is intuitively plausible, and it can be shown in various ways."

    Is there a straightforward / direct way to prove this? I am struggling to come up with anything beyond the geometric intuition that, starting from the first quadrant, the sector keeps shrinking as you repeatedly multiply by a positive matrix...
    here's a proof instead of just a geometric illustration, and it works for $k \times k$ matrices, not only $2 \times 2$:

    let $A$ be a $k \times k$ positive real matrix and let $\lambda$ be an eigenvalue of $A$ with $|\lambda|$ as large as possible. clearly $|\lambda| > 0$, because otherwise all the eigenvalues of $A$ would be $0$ and so $A$ would be nilpotent, which is impossible since $A$ is positive. let $\bold{a}=\begin{bmatrix}a_1 & a_2 & \cdots & a_k \end{bmatrix}^T$ be an eigenvector of $A$ corresponding to $\lambda$, and define $|\bold{a}|=\begin{bmatrix}|a_1| & |a_2| & \cdots & |a_k| \end{bmatrix}^T.$ the claim is that $|\bold{a}|$ is also an eigenvector of $A$, which proves the result: then $|\lambda| |\bold{a}| = A|\bold{a}| > \bold{0}$ because $A$ is positive and $|\bold{a}|$ is non-negative and nonzero, so $|a_j| > 0$ for all $j$.

    notation: for any matrices $X, Y$ with the same dimensions we write $X > Y$ (resp. $X \geq Y$) if all the entries of $X - Y$ are positive (resp. non-negative).

    back to our problem: let $A|\bold{a}| = \bold{c}$ and $\bold{b} = \bold{c} - |\lambda| |\bold{a}|.$ we only need to prove that $\bold{b} = \bold{0}.$ it's easy to see that $\bold{b} \geq \bold{0}$: by the triangle inequality, $|\lambda| |a_i| = \left| \sum_j A_{ij} a_j \right| \leq \sum_j A_{ij} |a_j| = c_i.$ so if $\bold{b} \neq \bold{0},$ then $A\bold{b} > \bold{0}.$ therefore there exists a real number $r > 0$ such that $A\bold{b} > r\bold{c},$ because clearly $\bold{c} > \bold{0}.$ using the definition of $\bold{b}$ we get $\frac{1}{r+|\lambda|} A \bold{c} > \bold{c},$ and hence $\left( \frac{1}{r+|\lambda|} A \right)^n \bold{c} > \bold{c}$ for all positive integers $n$; call this (1). now if $\mu$ is an eigenvalue of $\frac{1}{r+|\lambda|} A,$ then $\mu(r+|\lambda|)$ is an eigenvalue of $A,$ so we must have $|\mu|(r+|\lambda|) \leq |\lambda|,$ and therefore $|\mu| < 1.$ since every eigenvalue of $\frac{1}{r+|\lambda|} A$ has absolute value less than $1,$ its powers tend to zero: $\lim_{n\to\infty} \left( \frac{1}{r+|\lambda|} A \right)^n = \bold{0}.$ but then (1) gives the contradiction $\bold{0} \geq \bold{c}. \ \Box$
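As a sanity check on the proof's conclusion, here is a small pure-Python power iteration on a made-up positive matrix (the matrix is just an example): repeated multiplication and renormalization converges to the dominant eigenvector, and its coordinates indeed come out positive.

```python
import math

# A hypothetical positive matrix; the proof says its dominant
# eigenvalue admits an eigenvector with all-positive coordinates.
A = [[2.0, 1.0],
     [1.0, 3.0]]

def matvec(M, v):
    """Matrix-vector product for a square matrix given as nested lists."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def normalize(v):
    """Rescale a vector to unit Euclidean length."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

# Power iteration: any start vector with a nonzero component along the
# dominant eigenvector is driven toward that eigenvector.
v = normalize([1.0, 1.0])
for _ in range(100):
    v = normalize(matvec(A, v))

# Rayleigh quotient approximates the dominant eigenvalue.
lam = sum(matvec(A, v)[i] * v[i] for i in range(2))
print("eigenvector ~", v, "eigenvalue ~", lam)
```

For this particular matrix the dominant eigenvalue is $(5+\sqrt{5})/2$ and the eigenvector direction is $(1, (1+\sqrt{5})/2)$, with both coordinates positive, as the proof predicts.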

  3. #3
    Newbie
    Joined
    Jun 2009
    Posts
    3
    Thanks! This seems like a good proof of the existence of an eigenvector in the first quadrant, but I'm not quite seeing how it proves that if you left-multiply the first quadrant by a positive real matrix an arbitrary number of times and intersect over all the resulting sectors, you end up with a half line instead of a sector.

    Also, if one were to try to offer a geometric demonstration (say for a $2 \times 2$ matrix acting on $\mathbb{R}^2$), $e_1$ and $e_2$ keep moving further "into" the first quadrant as you transform them repeatedly ($S \supset AS \supset A^2S$, etc.), but how do you argue that they don't each converge to a separate half line (leaving a sector), rather than to the same half line?
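On the worry that $e_1$ and $e_2$ might converge to two different half lines: numerically they collapse onto the same direction. A sketch with a made-up positive matrix (chosen only for illustration), renormalizing each iterate so we track directions rather than lengths:

```python
import math

# A hypothetical positive matrix, purely for illustration.
A = [[2.0, 1.0],
     [1.0, 3.0]]

def matvec(M, v):
    """Multiply a 2x2 matrix by a 2-vector."""
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def normalize(v):
    """Rescale to unit length so only the direction (half line) remains."""
    n = math.hypot(v[0], v[1])
    return [v[0] / n, v[1] / n]

u, w = [1.0, 0.0], [0.0, 1.0]   # e1 and e2, the edges of the quadrant
for _ in range(60):
    u, w = matvec(A, u), matvec(A, w)
    u, w = normalize(u), normalize(w)

gap = math.hypot(u[0] - w[0], u[1] - w[1])
print("direction of A^n e1:", u)
print("direction of A^n e2:", w)
print("distance between the two directions:", gap)
```

Both edge directions end up at the same unit vector (the dominant eigendirection whose existence the proof above establishes), so the two edges of the sector close onto a single half line rather than bracketing a residual sector.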
