At my local sandwich bar, I have noticed that it always takes at least two minutes to serve a customer and it can take much longer to fulfil an unusual order. The time in minutes taken to serve a customer may be modelled by a continuous random variable T with probability density function f(t) = 64/t^5, t ≥ 2.
(i) Show that the c.d.f. of the random variable T is given by
F(t) = 1 − 16/t^4, t ≥ 2.
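Part (i) is a direct integration: F(t) = ∫ from 2 to t of 64/s^5 ds = 1 − 16/t^4. Here is a quick numerical sanity check in Python (a sketch I've added, not part of the original question) that integrating the p.d.f. really does reproduce the claimed c.d.f.:

```python
# Model from the question: f(t) = 64/t^5 for t >= 2, with claimed
# c.d.f. F(t) = 1 - 16/t^4.  Check that the midpoint-rule integral
# of f from 2 to t agrees with F(t).

def f(t):
    return 64.0 / t**5

def F(t):
    return 1.0 - 16.0 / t**4

def integrate(g, a, b, n=100_000):
    """Composite midpoint rule for the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

for t in (3.0, 4.0, 10.0):
    print(t, integrate(f, 2.0, t), F(t))  # the two values agree closely
```

Note also that F(2) = 1 − 16/16 = 0, as it must, since no customer is served in under two minutes.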
(ii) According to the model, what proportion of customers take more than four minutes to serve? Find the probability that it takes between five and ten minutes to serve a customer.
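Both probabilities in (ii) fall straight out of the c.d.f.: P(T > 4) = 1 − F(4) = 16/4^4 = 1/16, and P(5 < T < 10) = F(10) − F(5). A short check (my own sketch, using the F(t) derived in part (i)):

```python
# Using the c.d.f. F(t) = 1 - 16/t^4 from part (i):
def F(t):
    return 1.0 - 16.0 / t**4

p_more_than_4 = 1.0 - F(4.0)   # survival probability P(T > 4)
p_5_to_10 = F(10.0) - F(5.0)   # P(5 < T < 10)

print(p_more_than_4)  # 0.0625, i.e. 1/16 of customers
print(p_5_to_10)      # ≈ 0.0240
```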
(iii) Use the p.d.f. f(t) to calculate the mean and variance of the time taken to serve a customer.
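For (iii), E[T] = ∫ from 2 to ∞ of t · 64/t^5 dt = 8/3 and E[T^2] = ∫ from 2 to ∞ of t^2 · 64/t^5 dt = 8, giving Var(T) = 8 − (8/3)^2 = 8/9. A numerical confirmation (my own sketch; the infinite upper limit is truncated at a large value, which is fine because the tails decay like t^-4 and t^-3):

```python
# E[T]  = ∫ t f(t) dt  = 64 ∫ t^-4 dt over [2, ∞) = 8/3
# E[T²] = ∫ t² f(t) dt = 64 ∫ t^-3 dt over [2, ∞) = 8
# so Var(T) = 8 - (8/3)² = 8/9.

def f(t):
    return 64.0 / t**5

def integrate(g, a, b, n=200_000):
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

mean = integrate(lambda t: t * f(t), 2.0, 2000.0)
second_moment = integrate(lambda t: t * t * f(t), 2.0, 2000.0)
variance = second_moment - mean**2

print(mean)      # ≈ 8/3 ≈ 2.667 minutes
print(variance)  # ≈ 8/9 ≈ 0.889 minutes²
```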
(iv) Use Formula (4.1) of Unit 1 to calculate the mean time taken to serve a customer.
Formula (4.1) = http://upload.wikimedia.org/m...826603637f.png (see below)
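The linked image for Formula (4.1) is broken, so I can't be sure which formula is meant. A standard mean formula of this kind for a non-negative random variable is the tail formula E[T] = ∫ from 0 to ∞ of P(T > t) dt; assuming that is Formula (4.1), here P(T > t) = 1 for t < 2 and 16/t^4 for t ≥ 2, so E[T] = 2 + 16/(3·8) = 8/3, matching part (iii). A sketch under that assumption:

```python
# ASSUMPTION: Formula (4.1) is taken to be the tail formula
# E[T] = ∫₀^∞ P(T > t) dt -- the linked image is broken, so this is
# a guess at which formula is meant.
# Here P(T > t) = 1 for t < 2 and 16/t^4 for t >= 2, so
# E[T] = 2 + ∫₂^∞ 16/t^4 dt = 2 + 2/3 = 8/3.

def survival(t):
    return 1.0 if t < 2.0 else 16.0 / t**4

def integrate(g, a, b, n=200_000):
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

mean = integrate(survival, 0.0, 2000.0)
print(mean)  # ≈ 8/3
```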
(v) Simulate the time taken to serve a customer, using the random number
u=0.5536, which is an observation from the uniform distribution U(0,1).
Give your answer in minutes and seconds, to the nearest second.
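Part (v) is the inversion (inverse-c.d.f.) method: set F(t) = u and solve for t, giving t = (16/(1 − u))^(1/4). A sketch of the calculation (my own code, using the F(t) from part (i)):

```python
# Inverse-c.d.f. (inversion) method: solve F(t) = u for t.
# F(t) = 1 - 16/t^4 = u  =>  t = (16 / (1 - u)) ** 0.25.

u = 0.5536  # the given U(0, 1) observation
t = (16.0 / (1.0 - u)) ** 0.25

minutes = int(t)
seconds = round((t - minutes) * 60)
print(t, minutes, seconds)  # t ≈ 2.447 minutes, i.e. 2 min 27 s
```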
Thanks so much Mr F, it has been a great help already.
Here is the question as a print screen.
These questions form part of a graded assessment. Thread closed.