1. ## Proving limit exists

Well, the problem reads: Suppose $\displaystyle f: (a, b) \rightarrow \mathbb{R}$ and $\displaystyle \lim_{n\to\infty}f(x_n)$ exists for every sequence $\displaystyle (x_n)_{n=1}^{\infty}$ in $\displaystyle (a, b)$ such that $\displaystyle \lim_{n\to\infty}x_n = a$. Prove that $\displaystyle \lim_{x\to a^+}f(x)$ exists.

I'm going to assume that $\displaystyle \lim_{x\to a^+}f(x)$ does not exist and prove it by contradiction. We still have $\displaystyle \lim_{n\to\infty}x_n = a$. But since the sequence $\displaystyle (x_n)$ lies in the interval $\displaystyle (a, b)$, the limit $\displaystyle \lim_{n\to\infty}x_n = a$ can't exist, because we assumed that the right-hand limit $\displaystyle \lim_{x\to a^+}f(x)$ does not exist, so the limit as anything approaches $a$ in the interval $\displaystyle (a, b)$ can't exist. But that is a contradiction, since we are given $\displaystyle \lim_{n\to\infty}x_n = a$.

I'm not sure if this is right because it seems a little short. Can someone let me know if this is the right approach? Thanks, Chad.

Edit: Maybe I have to use Bolzano–Weierstrass and say that $\displaystyle (x_n)$ has a convergent subsequence that tends to $a$ as $n\to\infty$; not really sure that makes a difference, though, since we already have a sequence that converges to $a$.

2. Originally Posted by eXist
Well, the problem reads: Suppose $\displaystyle f: (a, b) \rightarrow \mathbb{R}$ and $\displaystyle \lim_{n\to\infty}f(x_n)$ exists for every sequence $\displaystyle (x_n)_{n=1}^{\infty}$ in $\displaystyle (a, b)$ such that $\displaystyle \lim_{n\to\infty}x_n = a$. Prove that $\displaystyle \lim_{x\to a^+}f(x)$ exists.

I'm going to assume that $\displaystyle \lim_{x\to a^+}f(x)$ does not exist and prove it by contradiction. We still have $\displaystyle \lim_{n\to\infty}x_n = a$. But since the sequence $\displaystyle (x_n)$ lies in the interval $\displaystyle (a, b)$, the limit $\displaystyle \lim_{n\to\infty}x_n = a$ can't exist, because we assumed that the right-hand limit $\displaystyle \lim_{x\to a^+}f(x)$ does not exist, so the limit as anything approaches $a$ in the interval $\displaystyle (a, b)$ can't exist. But that is a contradiction, since we are given $\displaystyle \lim_{n\to\infty}x_n = a$.

I'm not sure if this is right because it seems a little short. Can someone let me know if this is the right approach? Thanks, Chad.

Edit: Maybe I have to use Bolzano–Weierstrass and say that $\displaystyle (x_n)$ has a convergent subsequence that tends to $a$ as $n\to\infty$; not really sure that makes a difference, though, since we already have a sequence that converges to $a$.
First note that the limit $\displaystyle c=\lim_{n\to\infty}f(x_n)$ is the same for every sequence $\displaystyle x_n \rightarrow a$ (this in fact follows from the hypothesis, by interleaving any two such sequences into one). Now assume that $\displaystyle \lim_{x\rightarrow a^+ } f(x)$ doesn't exist (or isn't $c$); in particular, there would exist an $\displaystyle \epsilon >0$ such that for every $\displaystyle \delta >0$ (say $\displaystyle \delta =1/n$) there is an $\displaystyle x_{\delta } \in (a,b)$ with $\displaystyle x_{\delta }-a < \delta$ and $\displaystyle \vert f(x_{\delta })-c \vert \geq \epsilon$.

Edit: Your argument fails because you assume that, since $\displaystyle \lim_{x\rightarrow a^+ } f(x)$ doesn't exist, the limit $\displaystyle \lim_{n\to\infty}f(x_n)$ doesn't exist for any sequence converging to $a$.
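To spell out the contradiction step above, here is a sketch of how the $\displaystyle \delta = 1/n$ choice finishes the proof (my own write-up of the argument, not necessarily the intended solution):

```latex
% Suppose \lim_{x\to a^+} f(x) = c fails. Negating the one-sided
% epsilon-delta definition of the limit gives:
\exists\, \epsilon > 0 \ \forall n \in \mathbb{N} \ \exists\, x_n \in (a,b):
\quad 0 < x_n - a < \tfrac{1}{n}
\quad\text{and}\quad \lvert f(x_n) - c \rvert \geq \epsilon .
% The points x_n chosen this way satisfy x_n \to a, so by hypothesis
% f(x_n) \to c; this contradicts |f(x_n) - c| \geq \epsilon for all n.
```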

3. So if I assume that the limit $\displaystyle \lim_{x\to a^+}f(x)$ does not exist, that changes the fact that every sequence converges to $a$? I might be misunderstanding you.

I understand that $a$ must be a cluster point, since there are sequences in the interval $\displaystyle (a, b)$ that converge to it, and therefore the limit at $a$ from the right should exist, since the terms of each such sequence are all larger than $a$, being in the interval $\displaystyle (a, b)$.

I see why this is true, I'm just struggling to write it down I guess.
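One part that may help you write it down is why the sequential limits all share one value $c$, which the argument needs before any contradiction can be derived. A sketch of the standard interleaving trick (assuming only the hypothesis as stated):

```latex
% Step 1 (common value): let x_n \to a and y_n \to a be any two
% sequences in (a,b). Interleave them into one sequence:
z_{2n-1} = x_n, \qquad z_{2n} = y_n, \qquad\text{so } z_n \to a .
% By hypothesis \lim_n f(z_n) exists; (f(x_n)) and (f(y_n)) are both
% subsequences of (f(z_n)), so they converge to the same value:
\lim_{n\to\infty} f(x_n) \;=\; \lim_{n\to\infty} f(z_n) \;=\; \lim_{n\to\infty} f(y_n) \;=:\; c .
```

With this common $c$ in hand, the $\epsilon$–$\delta$ contradiction in reply 2 completes the proof.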