I'm trying to show that the Nested Interval Theorem does NOT hold if the "first" interval in the sequence is unbounded.
The sequence with $I_n = [n, \infty)$, for example, is nested ($I_{n+1} \subseteq I_n$) but has $\bigcap_{n=1}^{\infty} I_n = \varnothing$.
The only argument that makes sense to me is this: the left endpoints $a_n = n$ form an increasing sequence with no upper bound, so no real number $x$ can belong to every $I_n$ — once $n > x$, we have $x \notin [n, \infty)$. (Note that no single $I_n$ is ever empty; it is only the intersection of all of them that is empty.)
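A short sketch of how the argument might be written up, assuming the counterexample is $I_n = [n, \infty)$:

```latex
Let $I_n = [n, \infty)$ for $n \in \mathbb{N}$. The intervals are nested,
since $n \le n + 1$ gives $I_{n+1} \subseteq I_n$, and each $I_n$ is
nonempty and closed, but unbounded. Suppose $x \in \bigcap_{n=1}^{\infty} I_n$.
Then $x \ge n$ for every $n \in \mathbb{N}$, contradicting the Archimedean
property of $\mathbb{R}$ (there is always some $n > x$). Hence
\[
  \bigcap_{n=1}^{\infty} I_n = \varnothing ,
\]
so the conclusion of the Nested Interval Theorem fails once the boundedness
hypothesis is dropped.
```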
Any suggestions or tips? Thanks.