Here's my answer about the last part.
Originally Posted by Media_Man
There are natural probability measures on $\{0,1\}^{\mathbb{N}}$ (with the product $\sigma$-algebra), namely the distributions that consist in picking each number independently with some probability $p$. This gives random infinite subsets of $\mathbb{N}$ (that almost surely have asymptotic density $p$). Moreover, every natural number plays the same role (the measure is shift-invariant), so that these measures appear to be analogs of a "uniform measure". Choosing $p = \tfrac12$ would additionally give a symmetry between the random subset and its complement (they would be distributed alike), but I don't need this assumption in the following.
In other words, let $p \in (0,1)$ and let $X_1, X_2, \ldots$ be independent random variables distributed according to $\mathbb{P}(X_n = 1) = p$ and $\mathbb{P}(X_n = 0) = 1 - p$. The random subset would be $A = \{n \in \mathbb{N} : X_n = 1\}$. And the question is:
What is the probability that the sum $\sum_{n \in A} \frac{1}{n}$ converges?
The answer is..., wait for it,... zero. Disappointingly. But not surprisingly if you know Kolmogorov's 0-1 law, which tells us that the answer has to be either 0 or 1: convergence of the series is unaffected by changing finitely many of the $X_n$, so it is a tail event.
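As a quick sanity check on the setup before the proofs (a sketch only: the truncation level $N$ and the fixed seed are my own choices, not part of the argument):

```python
import random

def random_subset(N, p, seed=0):
    """Sample X_1,...,X_N i.i.d. Bernoulli(p) and return A = {n : X_n = 1}."""
    rng = random.Random(seed)
    return {n for n in range(1, N + 1) if rng.random() < p}

N, p = 100_000, 0.5
A = random_subset(N, p)
# The empirical density |A|/N should be close to p, as claimed above.
print(len(A) / N)
```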
I've come up with various proofs but no fully elementary one. One possibility is to use the Paley-Zygmund inequality (this is an easy one, don't get scared by the name!) for the partial sums $S_N = \sum_{n=1}^{N} \frac{X_n}{n}$ to get that $\mathbb{P}\big(S_N \ge \tfrac12 \mathbb{E}S_N\big) \ge \tfrac14 \frac{(\mathbb{E}S_N)^2}{\mathbb{E}[S_N^2]}$, which stays bounded away from $0$ when taking a limit; given Kolmogorov's 0-1 law, this leads to the conclusion since $\mathbb{E}S_N = p \sum_{n=1}^{N} \frac{1}{n} \to \infty$.
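For completeness, here is one way the Paley-Zygmund step can be carried out in detail (my reconstruction, with the arbitrary threshold $\theta = \tfrac12$; write $S_N = \sum_{n=1}^{N} X_n/n$):

```latex
% Paley--Zygmund: for Z >= 0 and 0 <= \theta <= 1,
%   P(Z >= \theta E[Z]) >= (1-\theta)^2 (E[Z])^2 / E[Z^2].
\[
  \mathbb{E}S_N = p\sum_{n=1}^{N}\frac{1}{n} \xrightarrow[N\to\infty]{} \infty,
  \qquad
  \operatorname{Var}S_N = p(1-p)\sum_{n=1}^{N}\frac{1}{n^2} \le p(1-p)\,\frac{\pi^2}{6},
\]
so $(\mathbb{E}S_N)^2/\mathbb{E}[S_N^2] \to 1$, and Paley--Zygmund with $\theta=\tfrac12$ gives
\[
  \mathbb{P}\!\left(S_N \ge \tfrac12\,\mathbb{E}S_N\right)
  \ \ge\ \tfrac14\,\frac{(\mathbb{E}S_N)^2}{\mathbb{E}[S_N^2]}
  \ \xrightarrow[N\to\infty]{}\ \tfrac14 .
\]
% By reverse Fatou, P(S_N >= (1/2) E[S_N] infinitely often) >= 1/4; on that event
% the series diverges because E[S_N] -> infinity, so the divergence probability is
% positive, hence equal to 1 by the 0-1 law.
```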
Another possibility (without Kolmogorov's 0-1 law) is to write $\sum_{n \in A} \frac{1}{n} = \sum_{n \ge 1} \frac{X_n}{n}$. We notice that $\sum_{n=1}^{N} \frac{X_n}{n} \sim p \sum_{n=1}^{N} \frac{1}{n}$ almost surely because of the law of large numbers (not quite: in fact there is a subtlety, since the law of large numbers applies to unweighted averages, but never mind, it can be made to work). Then, writing $T_n = X_1 + \cdots + X_n$, summation by parts gives
$$\sum_{n=1}^{N} \frac{X_n}{n} = \frac{T_N}{N} + \sum_{n=1}^{N-1} \frac{T_n}{n(n+1)},$$
and almost surely $\frac{T_n}{n(n+1)} \sim \frac{p}{n+1}$ (since $\frac{T_n}{n} \to p$), so that the right-hand side sum diverges almost surely as $N \to \infty$.
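The second argument can be watched numerically (again a sketch; the truncation levels and the seed are my own choices): the partial sums $\sum_{n \le N} X_n/n$ should track $p \sum_{n \le N} \frac1n \approx p \log N$.

```python
import math
import random

def weighted_partial_sum(N, p, seed=0):
    """One sample of sum_{n<=N} X_n / n for X_n i.i.d. Bernoulli(p)."""
    rng = random.Random(seed)
    return sum(1.0 / n for n in range(1, N + 1) if rng.random() < p)

p = 0.5
for N in (10**3, 10**4, 10**5):
    s = weighted_partial_sum(N, p)
    # Both columns should grow together, roughly like p*log(N) -> infinity.
    print(N, round(s, 3), round(p * math.log(N), 3))
```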
As a conclusion: for almost-every subset $A$ of $\mathbb{N}$ (according to the previous measures), $\sum_{n \in A} \frac{1}{n} = +\infty$. The second proof (almost) only uses the fact that a random set has a positive asymptotic density.
About this "frontier" thing, here is something you might like (but maybe you know it already): for any $k \ge 1$, the series
$$\sum_{n} \frac{1}{n \,\log n \,\log\log n \cdots \log^{(k)} n}$$
(with $k$ successively nested logs, summing over $n$ large enough for all the logs to be positive) diverges, while for any $k \ge 1$ and any $\varepsilon > 0$, the series
$$\sum_{n} \frac{1}{n \,\log n \cdots \log^{(k-1)} n \,\bigl(\log^{(k)} n\bigr)^{1+\varepsilon}}$$
(only the last function is raised to the power $1+\varepsilon$) converges. For a given $\varepsilon$, the higher $k$ is, the slower the converging series decays. This gives an intuition (not a theorem) of how fast a decreasing sequence should at least decay to make a convergent series.
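The cleanest way I know to see this dividing line is the integral test, via the chain rule (writing $\log^{(k)}$ for the $k$-fold iterated logarithm):

```latex
\[
  \frac{d}{dx}\,\log^{(k+1)}(x)
  = \frac{1}{x \,\log x \,\log\log x \cdots \log^{(k)}(x)} ,
\]
% an antiderivative that is unbounded as x -> infinity, so the first series diverges;
\[
  \frac{d}{dx}\left[-\frac{1}{\varepsilon}\,\bigl(\log^{(k)}(x)\bigr)^{-\varepsilon}\right]
  = \frac{1}{x \,\log x \cdots \log^{(k-1)}(x)\,\bigl(\log^{(k)}(x)\bigr)^{1+\varepsilon}} ,
\]
% an antiderivative that stays bounded, so the second series converges.
```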