Let $(X_n)_{n \ge 1}$ be a Bernoulli process with success parameter $p$. Prove or disprove that $(S_n)$ and $(Y_n)$ are Markov chains, where $S_n = S_{n-1} + X_n$ for $n \ge 1$ and $S_0 = 0$, and where $Y_n = X_{n-1} + X_n$ for $n \ge 1$ and $X_0 = 0$.
Does anyone have any idea how to go about these?
Let us consider $Y_n = X_{n-1} + X_n$ with $p = 1/2$, and suppose $Y_n = 1$. Then either $(X_{n-1}, X_n) = (1, 0)$ or $(X_{n-1}, X_n) = (0, 1)$, with 0.5 probability of each case. Given this, and knowing that $X_{n+1}$ is independent of the past, $P(Y_{n+1} = 2 \mid Y_n = 1) = P(X_n = 1 \mid Y_n = 1)\,P(X_{n+1} = 1) = \tfrac{1}{2} \cdot \tfrac{1}{2} = \tfrac{1}{4}$. However, if we also know that $Y_{n-1} = 2$, we know that $X_{n-1} = 1$, and hence $X_n = 0$. Taking this into account, we get that $Y_{n-1} = 2$ and $Y_n = 1$ imply $Y_{n+1} = X_{n+1} + X_{n+2} \le 1$. Therefore, $P(Y_{n+1} = 2 \mid Y_n = 1, Y_{n-1} = 2) = 0 \neq \tfrac{1}{4}$: the future depends on more than the present state, so it doesn't seem to be a Markov chain.
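The non-Markov claim can also be checked numerically. The sketch below (a Monte Carlo estimate, assuming the derived process is $Y_n = X_{n-1} + X_n$ with $p = 1/2$, one common version of this exercise) compares the transition probability into state 2 with and without extra knowledge of the past:

```python
import random

def markov_check(p=0.5, n_steps=200_000, seed=1):
    """Estimate P(Y_{n+1}=2 | Y_n=1) and P(Y_{n+1}=2 | Y_n=1, Y_{n-1}=2)
    for Y_n = X_{n-1} + X_n, where the X_i are i.i.d. Bernoulli(p), X_0 = 0.
    (The definition of Y_n and the choice p = 1/2 are assumptions.)"""
    rng = random.Random(seed)
    X = [0] + [int(rng.random() < p) for _ in range(n_steps)]
    Y = [X[i] + X[i + 1] for i in range(n_steps)]  # Y[i] plays the role of Y_{i+1}

    hits = total = hits_cond = total_cond = 0
    for n in range(1, len(Y) - 1):
        if Y[n] != 1:
            continue
        total += 1
        hits += (Y[n + 1] == 2)
        if Y[n - 1] == 2:              # condition on extra knowledge of the past
            total_cond += 1
            hits_cond += (Y[n + 1] == 2)
    return hits / total, hits_cond / total_cond

p_uncond, p_cond = markov_check()
print(p_uncond)  # close to 1/4
print(p_cond)    # exactly 0: Y_{n-1}=2 and Y_n=1 force X_{n+1}=0, so Y_{n+1} <= 1
```

If the chain were Markov, conditioning on $Y_{n-1}$ could not change the transition probability, yet the two estimates differ.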