This is what I'm trying to calculate. There is an event that has a 1/1890 chance of occurring. Over, say, 20,000 trials, we'd expect this event to occur approximately 10 times. But it's unlikely to occur exactly 10 times, and what I'm trying to calculate is how far away from 10 the actual count could plausibly be.

I.e. the answer will look something like this:

In 20,000 trials there is a 68% chance the event will occur 10 ± 2 times,

or a 95% chance the event will occur 10 ± 3 times,

or a 99.7% chance the event will occur 10 ± 4 times.

I'm just fudging these numbers because I don't know how to calculate them, but I know it has something to do with standard deviations away from the mean number of occurrences (which comes from that 1/1890 probability).
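To make it concrete, here's the kind of calculation I think I'm after, sketched in Python under the assumption that a binomial model fits (n independent trials, probability p each) and that the normal 68/95/99.7 rule is a reasonable approximation; I'm not sure that's the right model, so treat it as a guess:

```python
import math

n = 20_000      # number of trials
p = 1 / 1890    # per-trial probability of the event

mean = n * p                      # expected count, about 10.58
std = math.sqrt(n * p * (1 - p))  # binomial standard deviation, about 3.25

# Normal-approximation intervals (the 68/95/99.7 rule)
for k, pct in [(1, 68), (2, 95), (3, 99.7)]:
    low, high = mean - k * std, mean + k * std
    print(f"~{pct}% chance of between {low:.1f} and {high:.1f} occurrences")
```

So if this is right, the "± 2" in my made-up example would really be roughly ± one standard deviation, i.e. about ± 3.3.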

Help on this is very much appreciated, thanks. Please let me know if I haven't explained this clearly; I don't really remember the terminology very well.