Hi, my first post here...

I have a problem that I don't have the mathematical skills to fully solve...

Definition:

In a packet network I need to monitor the delay variation for the packets.

I want to calculate whether 99% of the packets are within 0.3 ms, 3 ms and 10 ms respectively. In other words the answer is essentially boolean, but would preferably be presented as a percentage for each "limit".

Assumption (given in the specs...): the packet delays follow a time-invariant Gaussian distribution.

Given input:

I have some samples to calculate from... I know the longest delay for the best 50% of the packets, for the best 10%, and for the best 1%.

e.g.:

best 50%: 67 ns

best 10%: 43 ns

best 1%: 27 ns

My conclusion:

The distribution of the samples should fall under a bell curve (normal distribution / Gaussian distribution).

The mean of the samples can be approximated by the best-50% value, since for a Gaussian the median equals the mean.

standard dev (n-1): sqrt((best1%-best50%)^2 / 50)

Then I end up with a nice curve when plotted that seems plausible... but this is where my math skills fail: I can't judge if I'm right or just "luckily close"...
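For what it's worth, a standard way to back parameters out of percentiles of a Gaussian (not necessarily your formula above, just a cross-check) is via the inverse CDF: the p-th percentile satisfies x_p = mu + z_p * sigma, where z_p is the standard-normal inverse CDF at p. A quick Python sketch using only the standard library (`statistics.NormalDist`), with your example numbers plugged in:

```python
from statistics import NormalDist

# Given sample percentiles (ns): "best p%" = the p-th percentile of delay
p50, p10, p1 = 67.0, 43.0, 27.0

# For a Gaussian, the median equals the mean
mu = p50

# Percentile relation: x_p = mu + z_p * sigma, with z_p from the
# standard normal inverse CDF. Solving for sigma from two different
# percentiles gives a consistency check on the Gaussian assumption.
z01 = NormalDist().inv_cdf(0.01)   # roughly -2.326
z10 = NormalDist().inv_cdf(0.10)   # roughly -1.282

sigma_from_p1 = (p1 - mu) / z01
sigma_from_p10 = (p10 - mu) / z10

print(mu, sigma_from_p1, sigma_from_p10)
```

If the two sigma estimates roughly agree (here they come out around 17 ns and 19 ns), the Gaussian assumption looks plausible; if they diverge badly, it doesn't.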

Then, using the calculated std. dev., I calculate a portion of the area under the bell curve (= probability mass?) and work out what percentage of the samples should be within 30 ns, 300 ns and 1000 ns spans respectively...

(some of them come out negative here, which of course is impossible... to be honest there should also be some kind of "minimum delay bias" due to the speed of light etc., which should shift everything to the right... but I think that fact can be neglected)
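The "portion of the area under the bell curve" step can also be done directly with the CDF; here is a sketch, under two assumptions I'm making explicit: the fitted parameters below (mu = 67 ns, sigma = 17.2 ns) are just example values, and "within an s ns span" is read as the delay landing in an interval of width s centred on the mean:

```python
from statistics import NormalDist

# Example fitted parameters (ns) -- an assumption; substitute whatever
# mean/std.dev. estimates you actually computed.
mu, sigma = 67.0, 17.2
delay = NormalDist(mu, sigma)

# Interpretation (an assumption): "within an s ns span" = delay inside
# an interval of width s centred on the mean. The probability mass in
# [mu - s/2, mu + s/2] is CDF(mu + s/2) - CDF(mu - s/2).
spans = (30.0, 300.0, 1000.0)
masses = [delay.cdf(mu + s / 2) - delay.cdf(mu - s / 2) for s in spans]

for s, m in zip(spans, masses):
    print(f"span {s:6.0f} ns: {100 * m:.2f}% of packets")
```

Note that for wide spans the lower edge of the interval goes below zero delay, which is exactly the "impossible negative values" you mention: a true Gaussian always puts some mass below any finite bound, so a shifted or truncated model would be the cleaner fix if that mass ever matters.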

I'm pretty confident that I'm on track (or at least close...I've given this problem about a week of thought by now...)

sorry for being lazy but I'm stronger in programming than in math...

didn't take university math either... =O

thanks in advance !