Help on two-dimensional Markov chains

Hello, my name is Koustubh.

I have a question regarding two-dimensional Markov chains.

Let's say that there are three states, viz. 1, 2, 3, and in the spatial domain there are three regions, say A, B, C.

Now I have transition probability matrices for the states and the regions separately. Transitions can occur between states 1, 2, and 3, which is represented by a 3x3 transition probability matrix.

Similarly, I have a separate 3x3 transition probability matrix for the three regions. Now, a transition can occur from a state in region A to another state in region B. How do you go about doing the calculation?
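For concreteness, here is a sketch of the setup in Python. The matrix entries below are placeholders, and the combination shown (a Kronecker product, giving a 9x9 matrix over the joint (region, state) pairs) is only valid under the assumption that state transitions and region transitions happen independently of each other; if they are coupled, a different construction would be needed.

```python
import numpy as np

# Transition probability matrix over the three states 1, 2, 3
# (rows sum to 1; the numbers are placeholders)
P_state = np.array([[0.7, 0.2, 0.1],
                    [0.3, 0.4, 0.3],
                    [0.2, 0.3, 0.5]])

# Transition probability matrix over the three regions A, B, C
P_region = np.array([[0.6, 0.3, 0.1],
                     [0.2, 0.5, 0.3],
                     [0.1, 0.2, 0.7]])

# Assuming state and region transitions are independent, the joint
# chain on the 9 (region, state) pairs has the Kronecker product as
# its transition matrix: entry ((r, s), (r', s')) equals
# P_region[r, r'] * P_state[s, s'].
P_joint = np.kron(P_region, P_state)

print(P_joint.shape)        # (9, 9)
print(P_joint.sum(axis=1))  # each row still sums to 1
```

Row ordering in `P_joint` follows the Kronecker convention: rows/columns are (A,1), (A,2), (A,3), (B,1), ..., (C,3).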

Thanks