It's an example from my textbook

Suppose that whether or not it rains today depends on the weather conditions over the last two days. Specifically, suppose that if it has rained for the past two days, then it will rain tomorrow with probability 0.7; if it rained today but not yesterday, then it will rain tomorrow with probability 0.5; if it rained yesterday but not today, then it will rain tomorrow with probability 0.4; if it has not rained in the past two days, then it will rain tomorrow with probability 0.2.

If we let the state at time n depend only on whether or not it is raining at time n, then the preceding model is not a Markov chain (explain why?). However, we can transform this model into a Markov chain by saying that the state at any time is determined by the weather conditions during both that day and the previous day. In other words, we can say that the process is in

state 0: if it rained both today and yesterday

state 1: if it rained today but not yesterday

state 2: if it rained yesterday but not today

state 3: if it did not rain either yesterday or today

The preceding would then represent a four-state Markov chain having a transition probability matrix (rows and columns indexed by states 0 through 3):

        0.7   0    0.3   0
P  =    0.5   0    0.5   0
         0   0.4    0   0.6
         0   0.2    0   0.8

I'm having a hard time understanding the different states and how the transition probabilities are calculated.
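One way I find it helps to see where each entry comes from is to enumerate the transitions mechanically. This is a sketch of my own (the code and names are not from the textbook), using the state encoding in the quote: each row of the matrix comes from asking, given today's state, which state the chain lands in tomorrow if it rains, and which if it doesn't. Note that "today" becomes tomorrow's "yesterday", which is why each row has only two nonzero entries.

```python
# P(rain tomorrow | current two-day state), from the problem statement.
# State 0: rain today & yesterday; 1: rain today only;
# state 2: rain yesterday only;   3: no rain either day.
rain_prob = {0: 0.7, 1: 0.5, 2: 0.4, 3: 0.2}

def next_state(state, rains_tomorrow):
    # Tomorrow's "yesterday" is today, so only whether it rained
    # today (states 0 and 1) carries over.
    rained_today = state in (0, 1)
    if rains_tomorrow:
        return 0 if rained_today else 1
    return 2 if rained_today else 3

# Build the 4x4 transition matrix row by row.
P = [[0.0] * 4 for _ in range(4)]
for s in range(4):
    p = rain_prob[s]
    P[s][next_state(s, True)] += p
    P[s][next_state(s, False)] += 1 - p

for row in P:
    print([round(x, 2) for x in row])
```

For example, from state 2 (rained yesterday but not today), rain tomorrow happens with probability 0.4 and produces "rained today, not yesterday" (state 1), while no rain produces state 3, so row 2 is (0, 0.4, 0, 0.6).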

By contrast, I had a very easy time understanding this one, which is one of my homework problems.

Three white and three black balls are distributed in two urns in such a way that each contains three balls. We say that the system is in state i,i=0,1,2,3, if the first urn contains i white balls. At each step, we draw one ball from each urn and place the ball drawn from the first urn into the second, and conversely with the ball from the second urn. Let X_n denote the state of the system after the nth step. Explain why {X_n,n=0,1,2,3,…} is a Markov chain and calculate its transition probability matrix.

Any help is appreciated.