Let f : D → R and let x0 be a cluster point of D. Prove that f has a limit at x0 if, for each epsilon > 0, there is a neighborhood Q of x0 such that for all x, y in Q ∩ D with x ≠ x0 and y ≠ x0, we have |f(x) − f(y)| < epsilon.
I have an exam coming up in a week and I don't even know where to start with such weird problems.
I found a proof of this statement on the web (since we don't have a solution manual), but it is very long and complicated, using many things that we have not covered in this course. I was hoping someone could help me find a simpler proof, so that if such a problem appeared on my exam I might actually have a chance; there is no way I could come up with the proof I found. Thanks!
Let $f : D \to \mathbb{R}$ be a function and let $x_0$ be a cluster point of $D$. Prove that $\lim_{x \to x_0} f(x)$ exists if, for each $\varepsilon > 0$, there is a neighborhood $Q$ of $x_0$ such that $x, y \in Q \cap D$, $x \neq x_0$, $y \neq x_0$ implies that $|f(x) - f(y)| < \varepsilon$.
Proof: Let $(x_n)$ be a sequence in $D \setminus \{x_0\}$ that converges to $x_0$. (We know such a sequence exists because $x_0$ is a cluster point of $D$.) We want to prove that the sequence $(f(x_n))$ converges.
We know that, given a neighborhood $Q$ of $x_0$, all but finitely many $x_n$ are contained in $Q$; that is, there is an $N$ such that $n \geq N$ implies $x_n \in Q$. So for $m, n \geq N$ we know that $x_m, x_n \in Q \cap D$ with $x_m \neq x_0$ and $x_n \neq x_0$, and therefore $|f(x_m) - f(x_n)| < \varepsilon$.
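The key step can be displayed as a single chain of implications (same notation as in the proof, with $N$ chosen for the neighborhood $Q$ supplied by the hypothesis for the given $\varepsilon$):

```latex
m, n \geq N
\;\Longrightarrow\; x_m, x_n \in Q \cap D,\quad x_m \neq x_0,\; x_n \neq x_0
\;\Longrightarrow\; |f(x_m) - f(x_n)| < \varepsilon.
```

This is exactly the Cauchy condition for the sequence $(f(x_n))$.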
But this means that $(f(x_n))$ is Cauchy, and therefore converges because $\mathbb{R}$ is complete. In other words, $\lim_{n \to \infty} f(x_n)$ exists. Finally, the limit does not depend on the choice of sequence: if $(y_n)$ is another sequence in $D \setminus \{x_0\}$ converging to $x_0$, interleave the two into $x_1, y_1, x_2, y_2, \dots$; this sequence still converges to $x_0$, so by the same argument its image under $f$ is Cauchy and converges, which forces $\lim f(x_n) = \lim f(y_n)$. By the sequential criterion for limits, $\lim_{x \to x_0} f(x)$ exists. $\blacksquare$