fundamental in probability & convergence with probability 1

I am trying to understand a proof that if $\displaystyle \{X_n\}$ is fundamental in probability, then there exists a subsequence and a random variable $\displaystyle X$ such that the subsequence converges with probability 1 to $\displaystyle X$.

The proof begins with the construction, by indices, of a subsequence where $\displaystyle n_1 = 1$ and $\displaystyle n_{i+1}$ is the smallest index such that $\displaystyle n_{i+1} > n_i$ and $\displaystyle P(\{ \vert X_m - X_n \vert \geq 2^{-i}\}) < 2^{-i}$ for all $\displaystyle m, n \geq n_{i+1}$. In particular, taking $\displaystyle m = n_{k+1}$ and $\displaystyle n = n_k$, this implies $\displaystyle \sum_{k=1}^{\infty} P(\{\vert X_{n_{k+1}} - X_{n_k}\vert \geq 2^{-k}\}) < \sum_{k=1}^{\infty} 2^{-k}< \infty$.
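To convince myself that such indices can actually be chosen, I tried a toy example (my own construction, not part of the proof): suppose $\displaystyle X_n = X + Z_n$ with the $\displaystyle Z_n \sim N(0, 1/n)$ independent. Then Chebyshev's inequality gives $\displaystyle P(\vert X_m - X_n \vert \geq \varepsilon) \leq (1/m + 1/n)/\varepsilon^2$, and the smallest admissible $\displaystyle n_{i+1}$ can be computed from that bound:

```python
# Sketch for a hypothetical sequence X_n = X + Z_n, Z_n ~ N(0, 1/n):
# Chebyshev gives P(|X_m - X_n| >= eps) <= (1/m + 1/n) / eps**2 for all m, n.
# Pick n_{i+1} as the smallest index > n_i making that bound < 2**-i
# simultaneously for all m, n >= n_{i+1}.

def pick_subsequence(num_terms):
    indices = [1]                      # n_1 = 1, as in the proof
    for i in range(1, num_terms):
        eps = 2.0 ** (-i)
        n = indices[-1] + 1
        # The Chebyshev bound is worst when m = n = n_{i+1},
        # where it equals (2 / n) / eps**2.
        while (2.0 / n) / eps**2 >= eps:
            n += 1
        indices.append(n)
    return indices

print(pick_subsequence(6))  # [1, 17, 129, 1025, 8193, 65537]
```

The point is only that the bound `(1/m + 1/n)/eps**2` is decreasing in both $m$ and $n$, so checking the smallest pair $m = n = n_{i+1}$ suffices.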

The next line reads: Hence, $\displaystyle \sum_1^{\infty} \vert X_{n_{k+1}} - X_{n_k} \vert < \infty$ with probability 1.

**Can someone explain that step to me please?**

The proof goes on to use that last line to construct the set $\displaystyle N$ of sample points $\displaystyle \omega$ for which the sum diverges, and then defines the limit random variable $\displaystyle X$ to be $0$ for $\displaystyle \omega \in N$, and $\displaystyle X_{n_1} + \sum_1^{\infty}(X_{n_{k+1}} - X_{n_k})$ for $\displaystyle \omega \notin N$.

I could also use a little explanation of why this random variable is defined this way: the subsequence constructed in the proof is supposed to converge with probability 1 to it, but I don't understand what's going on well enough to see why.
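My current guess (my own reconstruction, not a line from the proof) is that the definition comes from a telescoping sum:

```latex
% Partial sums of the series telescope back to the subsequence:
X_{n_1} + \sum_{k=1}^{j}\bigl(X_{n_{k+1}} - X_{n_k}\bigr) = X_{n_{j+1}}.
% So off N, where the series converges absolutely, the subsequence
% X_{n_j} converges pointwise to
% X = X_{n_1} + \sum_{k=1}^{\infty}\bigl(X_{n_{k+1}} - X_{n_k}\bigr),
% while setting X = 0 on the null set N keeps X defined everywhere.
```

Is that the right way to read it?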

**Also, I've been looking at course notes from the past couple of weeks that I missed due to illness, and I see a statement (without proof, though I think I can come up with one) that convergence with probability 1 to $X$ is equivalent to the sequence being fundamental in probability. Wouldn't this trivially supersede the above proof? That is, if the sequence is fundamental in probability, then this statement says that the entire sequence (which is trivially a subsequence of itself) converges with probability 1?**