Positive Real Matrices
In Michael Artin's "Algebra" textbook, Chapter 4 Section 3, he discusses positive real matrices. "They occur in applications and one of their most important properties is that they always have an eigenvector whose coordinates are positive. Instead of proving this, let us illustrate it in the case of two variables by examining the effect of multiplying by a positive 2x2 matrix A on R^2." He goes on to say that since the entries of this matrix A are positive, left multiplication by A carries the first quadrant S into itself [since e1 is carried to the first column of A and e2 to the second column], i.e. S ⊃ AS ⊃ A^2 S and so on. He continues "Now the intersection of a nested set of sectors is either a sector or a half line. In our case, the intersection Z = ⋂_{r ≥ 0} A^r S turns out to be a half line. This is intuitively plausible, and it can be shown in various ways."
Is there a straightforward, direct way to prove this? I am struggling to think of something other than my geometric intuition that, starting with the first quadrant, the sector keeps getting smaller as you repeatedly multiply by a positive matrix...
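For what it's worth, you can at least watch the collapse happen numerically. Here is a minimal sketch (the positive matrix A below is an arbitrary choice of mine, not one from Artin's text): the sector A^r S has edges A^r e1 and A^r e2, so its opening angle is just the angle between those two image vectors, and printing it for successive r shows it shrinking geometrically toward 0.

```python
import math

# An arbitrary positive 2x2 matrix (any choice of positive entries works)
A = [[2.0, 1.0],
     [1.0, 3.0]]

def apply(M, v):
    """Left-multiply the column vector v by the 2x2 matrix M."""
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def angle(v):
    """Angle of v measured from the positive x-axis."""
    return math.atan2(v[1], v[0])

# The edges of the sector A^r S are A^r e1 and A^r e2;
# the sector's opening angle is the angle between them.
u, w = [1.0, 0.0], [0.0, 1.0]   # e1 and e2: edges of S itself
for r in range(11):
    opening = abs(angle(w) - angle(u))
    print(f"r = {r:2d}  opening angle of A^r S = {opening:.10f}")
    u, w = apply(A, u), apply(A, w)
```

The decay ratio you observe is |λ2/λ1|, the ratio of the two eigenvalues of A, which already hints at why the limit is a half line along the dominant eigenvector rather than a sector.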
Thanks! This seems like a good proof of the existence of an eigenvector in the first quadrant, but I'm not quite seeing how it proves that if you left multiply the first quadrant by a positive real matrix an arbitrary number of times and intersect over all such sectors, you end up with a half line instead of a sector.
Also, if one were to try to offer a geometric demonstration (say in the 2x2 case acting on R^2), e1 and e2 keep moving further "into" the first quadrant as you transform them repeatedly (S ⊃ AS ⊃ A^2 S, etc.), but how do you argue that they don't each converge to a separate half line (leaving a sector) rather than to the same half line?
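One way to address exactly this worry numerically: normalize the two edge vectors after every application of A and watch them converge to the same unit vector, not two different ones. The sketch below (again with an arbitrarily chosen positive matrix, not Artin's) compares both limits with the dominant (Perron) eigenvector of A, which for this particular A is computable by hand.

```python
import math

# Same arbitrary positive matrix as in the question's setting
A = [[2.0, 1.0],
     [1.0, 3.0]]

def apply(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def normalize(v):
    n = math.hypot(v[0], v[1])
    return [v[0] / n, v[1] / n]

# Iterate A on both edges of the first quadrant, renormalizing each step
u, w = [1.0, 0.0], [0.0, 1.0]
for _ in range(30):
    u, w = normalize(apply(A, u)), normalize(apply(A, w))

# Both edges land on the same unit vector (power iteration from two starts)
print("A^30 e1 direction:", u)
print("A^30 e2 direction:", w)

# For this A the Perron eigenvalue is (5 + sqrt(5))/2 with eigenvector
# direction (1, (1 + sqrt(5))/2) -- both iterates converge to it
perron = normalize([1.0, (1.0 + math.sqrt(5)) / 2.0])
print("Perron eigenvector:", perron)
```

This is just power iteration run from two different starting vectors: as long as neither start is orthogonal to the dominant eigenvector (impossible here, since e1 and e2 have nonnegative entries and the Perron eigenvector is strictly positive), both sequences converge to the same direction, so the two edges cannot settle on distinct half lines.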