Almost sure convergence & convergence in probability
"Almost sure convergence" always implies "convergence in probability", but the converse is NOT true.
Thus, there exists a sequence of random variables Y_n such that Y_n->0 in probability, but Y_n does not converge to 0 almost surely.
I think this is possible if the Y's are independent, but still I can't think of a concrete example. What is an example of this happening?
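(For what it's worth, here is a simulation of one standard candidate along the independence idea: take independent Y_n ~ Bernoulli(1/n). Then P(|Y_n| > eps) = 1/n -> 0, so Y_n -> 0 in probability; but the events {Y_n = 1} are independent with sum of probabilities sum 1/n = infinity, so by the second Borel-Cantelli lemma Y_n = 1 infinitely often almost surely, hence no a.s. convergence to 0. This is just a sketch to illustrate the behavior, not a proof.)

```python
import random

random.seed(0)

# Independent Y_n ~ Bernoulli(1/n): Y_n = 1 with probability 1/n, else 0.
# In probability: P(Y_n != 0) = 1/n -> 0.
# Not a.s.: sum_n P(Y_n = 1) = sum_n 1/n diverges, and the Y_n are
# independent, so Y_n = 1 infinitely often along almost every sample path
# (second Borel-Cantelli lemma).

N = 100_000
path = [1 if random.random() < 1.0 / n else 0 for n in range(1, N + 1)]

# Hits become sparse (each individual hit probability 1/n -> 0) ...
print("hits in the second half of the path:",
      [n for n, y in enumerate(path, start=1) if y == 1 and n > N // 2])

# ... yet they never stop: the total hit count grows like the harmonic
# sum, roughly log(N), so the path keeps returning to 1.
print("total hits up to N:", sum(path))
```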
Any help is appreciated! :)
[note: also under discussion in talk stats forum]