# Markov Chains

• Apr 22nd 2010, 07:40 PM
Steamer
Markov Chains
Hi, I have been struggling with the following question:

Weekly demand for packets of a particular medication at a pharmacy is a discrete random variable with the following probability function:

| d        | 0   | 1   | 2   | 3   |
|----------|-----|-----|-----|-----|
| P(D = d) | 0.4 | 0.3 | 0.2 | 0.1 |

Demands for different weeks are independent. Each week, irrespective of the current number of packets already being held, one new packet is delivered. In addition to the regular delivery, special deliveries are made as needed when the demand exceeds the number of available packets. Due to a limited shelf life, packets are discarded if they have not been used within 7 weeks of the initial delivery. Packets with earlier delivery dates are used before packets with later delivery dates.

Let Xn be the number of packets held by the pharmacy immediately after the regular delivery in week n. As packets are not kept longer than 7 weeks, the possible values of Xn are 1, 2, ..., 7.

Question: Given that there is initially one packet held by the pharmacy immediately after the regular delivery in week 0, evaluate the probability that there are 2 packets in inventory immediately after the regular delivery in week 4.

I'm trying to solve this using the transition matrix, but I'm confused about how to carry out the process.
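One way to set this up, assuming my reading of the dynamics is right: writing D for the week's demand, the one-step update is Xn+1 = min(max(Xn − D, 0) + 1, 7). The max(·, 0) reflects that special deliveries cover any excess demand (stock never goes negative), the +1 is the regular weekly delivery, and the min(·, 7) reflects FIFO use plus the 7-week shelf life. Under that assumption, a minimal NumPy sketch builds the 7×7 transition matrix from this rule and takes four steps from state 1:

```python
import numpy as np

# Demand distribution: P(D = d) for d = 0, 1, 2, 3.
demand = {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1}

# States are 1..7 packets held immediately after the regular delivery.
# Assumed one-step dynamics (my reading of the problem):
#   X_{n+1} = min(max(X_n - D, 0) + 1, 7)
n_states = 7
P = np.zeros((n_states, n_states))
for i in range(1, n_states + 1):
    for d, p in demand.items():
        j = min(max(i - d, 0) + 1, n_states)
        P[i - 1, j - 1] += p  # several demands can lead to the same state

assert np.allclose(P.sum(axis=1), 1.0)  # each row is a distribution

# Start in state 1 (one packet after the week-0 delivery), then the
# week-4 distribution is the first row of P^4.
v4 = np.linalg.matrix_power(P, 4)[0]

print(round(v4[1], 4))  # P(X_4 = 2 | X_0 = 1)
```

Note the shelf-life cap never actually binds here: starting from one packet, at most five packets can be on hand by week 4, so the answer depends only on the max(·, 0) + 1 part of the assumed rule.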