1. ## Convergent sequence

I am having a lot of trouble with this problem. I do not know the best way to go about proving it.

Suppose the sequence $\displaystyle a_n > 0$ and let $\displaystyle b_n = a_n + \frac{1}{a_n}$.
Assume $\displaystyle a_n \ge 1$ for all $n$, and that $\displaystyle b_n$ converges. Prove that $\displaystyle a_n$ converges.

If it is assumed that $\displaystyle b_n$ converges but only that $\displaystyle a_n > 0$, it does not follow that $\displaystyle a_n$ converges. Find an example.

2. Originally Posted by Rozaline
I am having a lot of trouble with this problem. I do not know the best way to go about proving it.

Suppose the sequence $\displaystyle a_n > 0$ and let $\displaystyle b_n = a_n + \frac{1}{a_n}$.
Assume $\displaystyle a_n \ge 1$ for all $n$, and that $\displaystyle b_n$ converges. Prove that $\displaystyle a_n$ converges.

If it is assumed that $\displaystyle b_n$ converges but only that $\displaystyle a_n > 0$, it does not follow that $\displaystyle a_n$ converges. Find an example.
If $\displaystyle b_n = a_n + \tfrac{1}{a_n}$ then $\displaystyle a_n^2 - b_na_n + 1 = 0$. Use the quadratic formula to solve that equation: $\displaystyle a_n = \tfrac12\bigl(b_n\pm\sqrt{b_n^2-4}\bigr)$. That should tell you enough about $\displaystyle a_n$ to find solutions for both parts of the problem.
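To make the hint concrete, here is a sketch of how the quadratic-formula observation can finish both parts (it uses AM–GM and the continuity of the square root; the alternating counterexample in part 2 is just one choice among many):

```latex
% Part 1: the two roots of x^2 - b_n x + 1 = 0 are a_n and 1/a_n
% (their product is 1 and their sum is b_n). Since a_n >= 1 forces
% a_n >= 1/a_n, a_n must be the LARGER root:
\[
  a_n = \tfrac12\bigl(b_n + \sqrt{b_n^2 - 4}\bigr).
\]
% By AM-GM, b_n = a_n + 1/a_n >= 2, so if b_n -> b then b >= 2, and
% f(t) = (t + sqrt(t^2 - 4))/2 is continuous on [2, infinity). Hence
\[
  a_n = f(b_n) \longrightarrow f(b) = \tfrac12\bigl(b + \sqrt{b^2 - 4}\bigr).
\]
% Part 2: with only a_n > 0 assumed, take the alternating sequence
\[
  a_n = \begin{cases} 2, & n \text{ even},\\[2pt] \tfrac12, & n \text{ odd}. \end{cases}
\]
% Then b_n = 2 + 1/2 = 5/2 for every n, so (b_n) is constant and
% converges, while (a_n) oscillates between 2 and 1/2 and diverges.
```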