## Markov chain convergence

I have a simple Markov chain with transition matrix

    P = [ 0.9     0.1
          0.2333  0.7667 ]

When I repeatedly multiply it by itself, the powers P^n converge to

    [ 0.7  0.3
      0.7  0.3 ]
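For reference, here is a short NumPy sketch that reproduces the observation above (the matrix entries are taken directly from the question; the exponent 50 is an arbitrary "large" power):

```python
import numpy as np

# Transition matrix from the question
P = np.array([[0.9,    0.1],
              [0.2333, 0.7667]])

# Raise P to a high power; each row approaches the stationary distribution
P_limit = np.linalg.matrix_power(P, 50)
print(P_limit)
# Both rows are approximately [0.7, 0.3]
```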

Is there a simple measure of the rate at which this Markov chain converges to its steady-state distribution?

Thank you,