Please somebody help!!!
So I don't feel so alone with Markov Chains?!
Hey Guys,
A quick question. I was wondering if you could help me, or help me try to understand, how to approach this question...
Let {Xt}t≥0 be a two-state Markov chain with state space S = {0, 1}, transition matrix:
and initial distribution
Define the new stochastic processes {Yt}t≥1 and {Zt}t≥1 as:
and,
a) What are the state spaces of these new stochastic processes?
I have tried using the transition matrix and plugging the values of S = {0, 1} into {Yt} and {Zt}, but I'm not sure if I'm on the right path. Any suggestions?
Thanks
The state space is the set of possible values that the random variable can take. In this example, if the two off-diagonal transition probabilities are both non-zero, then there is some chance of going from either state to the other state in S = {0, 1}.
Therefore:
Yt can be … when …,
or … when …,
or … when ….
Use the same reasoning to determine what states Zt can be in. Can it take the value 11?
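To see this reasoning concretely, here is a small sketch. The original definitions of Yt and Zt didn't come through in the post, so the definitions below (Yt = X(t-1) + Xt and Zt = 10·X(t-1) + Xt) are purely hypothetical illustrations; the point is the method: when every transition probability is non-zero, every pair of consecutive states in S × S can occur, so the state space of a derived process is just the image of its defining function over all such pairs.

```python
import itertools

# S is the state space of the underlying two-state chain (from the post).
S = {0, 1}

# HYPOTHETICAL definitions for illustration only -- the actual Yt and Zt
# in the original question may differ:
#   Y_t = X_{t-1} + X_t
#   Z_t = 10 * X_{t-1} + X_t
def y(x_prev, x_now):
    return x_prev + x_now

def z(x_prev, x_now):
    return 10 * x_prev + x_now

# If all transition probabilities are non-zero, every consecutive pair
# (x_prev, x_now) in S x S is reachable with positive probability.
pairs = list(itertools.product(S, repeat=2))

state_space_Y = sorted({y(a, b) for a, b in pairs})
state_space_Z = sorted({z(a, b) for a, b in pairs})

print(state_space_Y)  # [0, 1, 2]
print(state_space_Z)  # [0, 1, 10, 11]
```

Under these assumed definitions, the three cases for Yt correspond to the pairs summing to 0, 1, or 2, and Zt would indeed be able to take the value 11 (from the pair (1, 1)), which is the spirit of the hint above.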