I am not sure if this is the right section, let me know if I should post it somewhere else.
The problem I have is with calculating waiting time. I have a system with an arrival rate for each period i (AR(i)), a constant service rate (SR) and a number of servers for each period (S(i)). The number of uncompleted orders per period can then be described as: L(i) = AR(i) - SR*S(i) + L(i-1)
So far no problems. However, now I want to know the time it takes to complete the orders arriving in period i, i.e. the orders AR(i) - SR*S(i) that are not served in their own period.
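
To make the recursion concrete, here is a small Python sketch of how I compute the backlog per period (the flooring of the backlog at zero is an assumption on my part; any capacity left over in a period is simply lost):

    def backlog(AR, SR, S):
        """L(i) for each period: uncompleted orders at the end of period i."""
        L = []
        prev = 0  # L(0) = 0
        for ar, s in zip(AR, S):
            # assumption: the backlog cannot go below zero; unused capacity is lost
            prev = max(0, ar - SR * s + prev)
            L.append(prev)
        return L

    print(backlog(AR=[10, 10, 10], SR=1, S=[1, 1, 30]))  # [9, 18, 0]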

If the number of servers is constant this poses no problem: you simply divide the queue by the capacity per period. However, in my system the number of servers available varies considerably from period to period.

For example: in period i=1, the queue from the previous period is L(0) = 0, AR(1) = 10 and SR*S(1) = 1, so L(1) = 9. In period i=2 the same parameters apply, but now the queue from the previous period is L(1) = 9, so L(2) = 18. In period i=3, SR*S(3) becomes 30, which is enough to process the queue L(2) = 18 plus the new arrivals AR(3) = 10. The maximum waiting time for orders arriving in period 1 is therefore two periods, for orders arriving in period 2 it is one period, and arrivals in period 3 are processed within the same period. The system operates under a first-come, first-served policy.
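In case it helps, this is a small Python sketch of how I would simulate the example above under first-come, first-served; it reproduces the waiting times I described (the cohort bookkeeping and names are just my own illustration, not an established formula):

    from collections import deque

    def max_waiting_times(AR, SR, S):
        """For each arrival period, the number of full periods its last order
        waits before completion (0 = completed in the arrival period)."""
        queue = deque()   # cohorts as (arrival_period, remaining_orders), FCFS order
        finish = {}       # arrival_period -> period in which the cohort is fully completed
        for i, (ar, s) in enumerate(zip(AR, S), start=1):
            queue.append((i, ar))
            capacity = SR * s
            while queue and capacity > 0:
                arr, n = queue.popleft()
                served = min(n, capacity)
                capacity -= served
                if served == n:
                    finish[arr] = i                      # cohort fully completed in period i
                else:
                    queue.appendleft((arr, n - served))  # remainder keeps its place at the head
        # cohorts still queued after the last period are not reported
        return {arr: finish[arr] - arr for arr in finish}

    # The example above: AR = 10 in every period, SR = 1, S = (1, 1, 30)
    print(max_waiting_times(AR=[10, 10, 10], SR=1, S=[1, 1, 30]))
    # -> {1: 2, 2: 1, 3: 0}

What I am really after is a closed-form expression that gives these waiting times directly, rather than having to simulate period by period.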

Does anyone have an idea how to calculate the maximum waiting time? More specifically, how can this process be described with a formula?