I'm stuck on proving this. I understand that we can set up the sequence of nested intervals by starting with an element a of a nonempty set A that has an upper bound b, and creating the interval I_1 = [a, b]. Then create the interval I_2, nested in I_1, by letting:

I_2 = [a, (a+b)/2] if (a+b)/2 is an upper bound of A

or

I_2 = [(a+b)/2, b] if (a+b)/2 is not an upper bound of A
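For concreteness, here is a rough numerical sketch of this interval-halving construction (Python). The example set A = {x >= 0 : x^2 < 2} is my own choice just to have something to bisect toward; its least upper bound is sqrt(2). This is only an illustration of the construction, not part of the proof:

```python
# Numerical sketch of the interval-halving construction.
# Hypothetical example: A = {x >= 0 : x^2 < 2}, whose least
# upper bound is sqrt(2).

def is_upper_bound(m):
    # m is an upper bound of A = {x >= 0 : x^2 < 2} iff m^2 >= 2.
    return m * m >= 2

a, b = 1.0, 2.0          # I_1 = [a, b]: a is in A, b is an upper bound
for _ in range(50):      # each step halves the interval length
    mid = (a + b) / 2
    if is_upper_bound(mid):
        b = mid          # keep the left half [a, mid]
    else:
        a = mid          # keep the right half [mid, b]

print(b)                 # approaches sqrt(2) as the intervals shrink
```

The invariant is exactly the one in the construction above: the right endpoint b is always an upper bound of A, and the left endpoint a never is, so the single point in the intersection must be the least upper bound.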

We continue this interval-halving process to generate our sequence of nested intervals. The typical argument then states that since lim (1/2)^n = 0 as n -> infinity, the intersection of all the I_i, i = 1, 2, ..., contains exactly one point. Finally, with a few additional arguments, we can show this point must be the least upper bound. My question is: how do we argue that lim (1/2)^n = 0 as n -> infinity? Put another way: for every e > 0, there exists an N such that (1/2)^n < e whenever n >= N. I don't see how this can be proved without the Monotone Convergence Theorem, which I only know how to prove using the LUB property.
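Just as a numerical illustration of the epsilon-N statement (this is not a proof, and the particular value of e is arbitrary), repeated halving does drive (1/2)^n below any given e:

```python
# Numerical illustration (not a proof): for a given e > 0, find the
# first n with (1/2)^n < e by repeated halving.
e = 1e-6
n, power = 0, 1.0        # power holds (1/2)^n
while power >= e:
    power /= 2
    n += 1

print(n, power)          # from this n onward, (1/2)^n < e
```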

Thanks for your help