Let \Pi be the transition matrix of an irreducible Markov chain on a state space S, and let \alpha be a stationary distribution for \Pi.

Prove that if S is finite, then \alpha(x) > 0 for all x \in S.
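To get a feel for the statement before proving it, here's a quick numerical sanity check I ran (not a proof, just an illustration) on a small, made-up irreducible chain: its stationary distribution comes out strictly positive in every coordinate.

```python
import numpy as np

# A small irreducible transition matrix: every state can reach every other.
P = np.array([
    [0.0, 1.0, 0.0],
    [0.5, 0.0, 0.5],
    [0.3, 0.7, 0.0],
])

# The stationary distribution solves alpha @ P = alpha with sum(alpha) = 1.
# Stack the balance equations (P^T - I) alpha = 0 with the normalization
# row of ones, then solve the overdetermined system by least squares.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
alpha, *_ = np.linalg.lstsq(A, b, rcond=None)

print(alpha)               # the stationary distribution
print(np.all(alpha > 0))   # every entry is strictly positive
```

Of course this only checks one example; the actual proof should use irreducibility (every state is reachable from every other in finitely many steps) together with stationarity.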

I would appreciate some hints/useful definitions as I'm not really getting Markov chains (and am generally rubbish at probability!).

Thanks.