Taken from an article by Nassim Taleb here.

THE WORLD QUESTION CENTER 2006 — Page 17

In it, he highlighted this:

I've enjoyed giving math students the following quiz (to be answered intuitively, on the spot). In a Gaussian world, the probability of exceeding one standard deviation is ~16%. What are the odds of exceeding it under a distribution of fatter tails (with the same mean and variance)?

So guys, what's the answer?
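For anyone who wants to check numerically rather than intuitively: here is a quick sketch comparing the Gaussian exceedance probability with that of a Student-t distribution rescaled to unit variance. The choice of a Student-t with ν = 4 degrees of freedom is my own illustration; the quiz does not name a specific fat-tailed distribution. The counterintuitive point is that a fatter-tailed distribution with the same variance puts more mass near the center and in the far tails, and less in the "shoulder" around one sigma, so the probability of exceeding 1σ actually drops below 16%.

```python
# Compare P(X > 1 sigma) for a Gaussian vs. a fat-tailed distribution
# with the same mean (0) and variance (1). The Student-t with nu = 4 is
# used here purely as an example of a fatter-tailed distribution.
from math import sqrt
from scipy.stats import norm, t

nu = 4                         # degrees of freedom of the Student-t
# A standard Student-t has variance nu / (nu - 2); rescale to unit variance.
scale = sqrt((nu - 2) / nu)

p_gauss = norm.sf(1)           # P(X > 1) for the standard Gaussian
p_t = t.sf(1 / scale, df=nu)   # P(scale * T > 1) = P(T > 1 / scale)

print(f"Gaussian exceedance:     {p_gauss:.4f}")  # ~0.1587
print(f"Student-t(4) exceedance: {p_t:.4f}")      # ~0.1151
```

So with this particular fat-tailed choice, the one-sigma exceedance probability falls from roughly 16% to about 11.5%, even though the variance is identical; the extra tail mass shows up much further out than one sigma.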