Suppose X_1, X_2, ..., X_100 are independent random variables with common mean μ and variance σ². Let X̄ be their average. What is the probability that

|X̄ - μ| ≥ 0.25?

I can tell that this has something to do with either the weak law of large numbers or the central limit theorem, but something isn't clicking. Thanks!
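(For anyone comparing the two approaches later: since the problem only gives the mean and variance, the exact probability can't be computed; you can get a Chebyshev-style upper bound, or a CLT normal approximation, both in terms of σ. The sketch below assumes σ = 1 purely for illustration.)

```python
import math

n = 100
eps = 0.25
sigma = 1.0  # assumed value for illustration; the problem leaves sigma unspecified

# The sample mean X-bar has variance sigma^2 / n.
var_mean = sigma**2 / n

# Chebyshev's inequality: P(|X-bar - mu| >= eps) <= Var(X-bar) / eps^2.
cheby_bound = var_mean / eps**2

# CLT approximation: X-bar is approximately Normal(mu, sigma^2 / n), so
# P(|X-bar - mu| >= eps) ~= 2 * (1 - Phi(eps / (sigma / sqrt(n)))),
# where Phi is the standard normal CDF, computed here via math.erf.
z = eps / (sigma / math.sqrt(n))
phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))
clt_approx = 2 * (1 - phi)

print(f"Chebyshev upper bound: {cheby_bound:.4f}")
print(f"CLT approximation:    {clt_approx:.4f}")
```

With σ = 1 the Chebyshev bound is 0.16, while the CLT approximation is much smaller (about 0.012), which illustrates how loose Chebyshev can be.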