Let $\displaystyle \Pi$ be the transition matrix of an irreducible Markov chain on a state space $\displaystyle S$, and suppose $\displaystyle \alpha$ is a stationary distribution for $\displaystyle \Pi$.

Prove that if $\displaystyle S$ is finite, then $\displaystyle \alpha(x) > 0$ for all $\displaystyle x \in S$.
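
In case my reading of the statement matters: I'm taking stationary to mean $\displaystyle \alpha \Pi = \alpha$, i.e. $\displaystyle \alpha(x) = \sum_{y \in S} \alpha(y)\,\Pi(y,x)$ for every $\displaystyle x \in S$, and irreducible to mean that for all $\displaystyle x, y \in S$ there is some $\displaystyle n \geq 1$ with $\displaystyle \Pi^{n}(x,y) > 0$. As a sanity check (the $\displaystyle 2 \times 2$ chain below is just something I made up, not part of the exercise),

$$\Pi = \begin{pmatrix} 1/2 & 1/2 \\ 1/4 & 3/4 \end{pmatrix}, \qquad \alpha = \left( \tfrac{1}{3}, \tfrac{2}{3} \right)$$

satisfies $\displaystyle \alpha \Pi = \alpha$ with both entries strictly positive, which agrees with the claim, but I don't see how to turn that into a general argument.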

I would appreciate some hints/useful definitions as I'm not really getting Markov chains (and am generally rubbish at probability!).

Thanks.