1. Convergent sequence

I am having a lot of trouble attempting this problem. I do not know what is the best way to go about proving it.

Suppose the sequence satisfies $a_n > 0$ and $b_n = a_n + \frac{1}{a_n}$.

Assume $a_n \ge 1$ for all $n$, and that $b_n$ converges. Prove that $a_n$ converges.

If it is assumed that $b_n$ converges but only that $a_n > 0$, it does not follow that $a_n$ converges. Find an example.

2. Originally Posted by Rozaline
I am having a lot of trouble attempting this problem. I do not know what is the best way to go about proving it.

Suppose the sequence satisfies $a_n > 0$ and $b_n = a_n + \frac{1}{a_n}$.

Assume $a_n \ge 1$ for all $n$, and that $b_n$ converges. Prove that $a_n$ converges.

If it is assumed that $b_n$ converges but only that $a_n > 0$, it does not follow that $a_n$ converges. Find an example.
If $b_n = a_n + \tfrac{1}{a_n}$ then $a_n^2 - b_na_n + 1 = 0$. Use the quadratic formula to solve that equation: $a_n = \tfrac12\bigl(b_n\pm\sqrt{b_n^2-4}\bigr)$. Note that the two roots have product $1$, so when $a_n \ge 1$ the relevant root is the larger one, $a_n = \tfrac12\bigl(b_n+\sqrt{b_n^2-4}\bigr)$; since $b_n \ge 2$ whenever $a_n \ge 1$, continuity of $x \mapsto \tfrac12\bigl(x+\sqrt{x^2-4}\bigr)$ on $[2,\infty)$ then does the work. That should tell you enough about $a_n$ to find solutions for both parts of the problem.
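For the second part, one concrete sequence that works (an illustrative choice of mine, not the only possibility) is to let $a_n$ alternate between $2$ and $\tfrac12$:

```latex
% Counterexample sketch: a_n > 0 and b_n converges, yet a_n diverges.
a_n = \begin{cases} 2, & n \text{ even},\\[2pt] \tfrac12, & n \text{ odd}. \end{cases}
% Since 2 + 1/2 = 1/2 + 2 = 5/2, the sequence b_n is constant:
b_n = a_n + \frac{1}{a_n} = \frac{5}{2} \quad \text{for every } n,
% so b_n converges, while a_n oscillates between 2 and 1/2 and has no limit.
```

Any pair $c$ and $\tfrac1c$ with $c \neq 1$ works the same way, since $c + \tfrac1c = \tfrac1c + c$.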