Let X have df F. Suppose (x_n) is a decreasing sequence with x_n → a; show that F(x_n) → F(a) as n → ∞. I thought this might be convergence in probability, but I don't know how to prove it. Can anyone help?
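Despite the guess above, this is really a statement about the right-continuity of F rather than convergence in probability: F is a deterministic function evaluated along a deterministic sequence. A minimal sketch of the standard argument, assuming the usual convention F(x) = P(X ≤ x):

```latex
% Sketch: right-continuity of F via continuity from above.
% Assumes F(x) = P(X <= x) and x_n decreasing with x_n -> a.
Let $A_n = \{X \le x_n\}$. Since $x_n \downarrow a$, the events are nested,
$A_1 \supseteq A_2 \supseteq \cdots$, and
\[
  \bigcap_{n=1}^{\infty} A_n = \{X \le a\},
\]
because $X \le x_n$ for every $n$ if and only if $X \le \inf_n x_n = a$.
Continuity from above of the probability measure $P$ then gives
\[
  F(x_n) = P(A_n) \;\longrightarrow\; P\Bigl(\bigcap_{n} A_n\Bigr) = P(X \le a) = F(a).
\]
```

Note that this uses nothing about X beyond the definition of a df; the monotonicity of the sequence is exactly what lets you apply continuity from above.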
Let X_1, X_2, ... be iid Bernoulli(p). Set S_n = X_1 + X_2 + ... + X_n, and let T be the first n for which S_n = 4. Calculate P(X_1 = 1 | T = 5), i.e., the probability that X_1 = 1 conditional on T = 5. My answer is 3/4, but I don't have the answer key; is that right? Thanks.
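For what it's worth, 3/4 does check out under the setup above: the event {T = 5} forces X_5 = 1 (the fourth success must occur exactly at trial 5) together with exactly three successes among X_1, ..., X_4. A sketch of the computation:

```latex
% Sanity check of the 3/4 answer.
% {T = 5} = {X_5 = 1, and exactly three successes among X_1..X_4}.
\[
  P(T = 5) = \binom{4}{3} p^{3} (1-p) \cdot p = 4\, p^{4} (1-p).
\]
Fixing $X_1 = 1$ leaves exactly two successes to place among $X_2, X_3, X_4$:
\[
  P(X_1 = 1,\; T = 5) = p \cdot \binom{3}{2} p^{2} (1-p) \cdot p = 3\, p^{4} (1-p),
\]
so
\[
  P(X_1 = 1 \mid T = 5) = \frac{3\, p^{4} (1-p)}{4\, p^{4} (1-p)} = \frac{3}{4},
\]
for every $p \in (0,1)$.
```

Intuitively: given T = 5, the three successes among the first four trials are exchangeable over the four positions, so each position is a success with probability 3/4. And a small Monte Carlo sketch to double-check numerically; the function name `estimate`, the default p = 0.5, and the trial count are illustrative choices, not part of the original problem:

```python
import random

# Monte Carlo check of P(X1 = 1 | T = 5) for iid Bernoulli(p) trials,
# where T is the first n with X1 + ... + Xn = 4.
def estimate(p=0.5, trials=200_000):
    hits = 0          # runs where the event {T = 5} occurred
    first_is_one = 0  # of those, runs where X1 = 1
    for _ in range(trials):
        xs, s = [], 0
        # Simulate until the 4th success; its index is T.
        while s < 4:
            x = 1 if random.random() < p else 0
            xs.append(x)
            s += x
        if len(xs) == 5:           # condition on {T = 5}
            hits += 1
            first_is_one += xs[0]  # count {X1 = 1}
    return first_is_one / hits

print(estimate())  # should print roughly 0.75 for any p in (0, 1)
```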