Let $P$ be the transition matrix for an irreducible Markov chain on a state space $S$, and suppose $\pi$ is a stationary distribution for $P$.
Prove that if $S$ is finite, then $\pi(x) > 0$ for all $x \in S$.
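For intuition (not a proof), here is a quick numerical sanity check I tried in Python with NumPy: take a small irreducible transition matrix, extract the stationary distribution as the left eigenvector of $P$ for eigenvalue $1$, and observe that every entry is strictly positive. The particular matrix is just one I made up for illustration.

```python
import numpy as np

# A 3-state irreducible chain (every state can reach every other state).
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])

# A stationary distribution satisfies pi P = pi, i.e. pi is a left
# eigenvector of P (equivalently, an eigenvector of P^T) for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))       # eigenvalue closest to 1
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()                           # normalize to a probability vector

print(pi)                                    # [0.25 0.5  0.25]
print((pi > 0).all())                        # True: every entry is positive
```

For this chain the stationary distribution works out to $(1/4, 1/2, 1/4)$, all strictly positive, which is what the exercise asks to prove in general for finite irreducible chains.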
I would appreciate some hints or useful definitions, as I'm not really getting Markov chains (and am generally rubbish at probability!).
Thanks.