Can anyone help with the question below? Thanks so much.

Consider a leaky bucket implementation that maintains a leak rate of p packets/sec (pps) over an outgoing link of capacity C whenever there are packets in the input buffer. Packets arrive in the input buffer at a constant rate x1 for the time interval (0, T], at a higher rate x2 for the interval (T, 2T], back at x1 during (2T, 3T], at x2 again during (3T, 4T], and so on, where x1 < p < x2 << C and T >> 1/x2. Assume that q is the average number of packets in the buffer in the steady state.
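So, as I understand it, the arrival rate simply alternates between the two values every T seconds:

x(t) = x1 for t in (2kT, (2k+1)T], and x(t) = x2 for t in ((2k+1)T, (2k+2)T], for k = 0, 1, 2, ...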

A. Obtain a recurrence relation for the delay suffered by a packet arriving at the input buffer after the steady state is reached (note, the delay is the time elapsed between when a packet arrives at the input buffer and when it is output from the buffer).
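For part A, my tentative thinking (not sure this is the intended answer) is a Lindley-style recurrence: let d_n be the delay of the n-th packet and D_n the interarrival time between packets n-1 and n (so D_n = 1/x1 or 1/x2 depending on the interval). If successive departures are spaced 1/p apart while the buffer is non-empty, then

d_n = max(0, d_{n-1} + 1/p - D_n).

Does that look right?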

B. Estimate the mean packet delay using the recurrence relation, for T = 100 sec and x1 = 10 pps, in the following cases and plot the results:

1. With p = 20 pps, vary x2 as 40, 50, 60, 70, 80, 90, 100 pps;
2. With p = 30 pps, vary x2 as 40, 50, 60, 70, 80, 90, 100 pps.

You may do the calculations for all packets arriving up to time 2T (i.e., 200 sec). Note that these packets may take longer than 2T to leak out of the buffer.
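For part B, here is a minimal Python sketch of how I'd estimate the mean delay numerically, assuming the Lindley-style recurrence above holds (the function name mean_delay and the loop structure are just my own scaffolding, not part of the problem):

import matplotlib.pyplot as plt

def mean_delay(x1, x2, p, T):
    # Mean delay over all packets arriving in (0, 2T], starting from an
    # empty buffer, using d_n = max(0, d_prev + 1/p - dt) where dt is the
    # interarrival time in effect when the packet arrives.
    delays = []
    d_prev = 0.0
    t = 0.0
    while True:
        rate = x1 if t < T else x2   # arrival rate switches at time T
        dt = 1.0 / rate
        t += dt
        if t > 2 * T:
            break
        d = max(0.0, d_prev + 1.0 / p - dt)
        delays.append(d)
        d_prev = d
    return sum(delays) / len(delays)

x2_values = [40, 50, 60, 70, 80, 90, 100]
for p in (20, 30):
    means = [mean_delay(x1=10, x2=x2, p=p, T=100) for x2 in x2_values]
    plt.plot(x2_values, means, marker='o', label=f'p = {p} pps')

plt.xlabel('x2 (pps)')
plt.ylabel('mean packet delay (sec)')
plt.legend()
plt.show()

During (0, T] the buffer stays essentially empty (x1 < p, so delays are 0), and during (T, 2T] each arrival adds 1/p - 1/x2 > 0 to the backlog, so I'd expect the mean delay to grow with x2. I'd appreciate a sanity check on whether this matches the intended recurrence.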