My question might sound a bit strange, but it is important to me that someone answers it thoroughly...
Let's say we have the following sequence >>
a(n) := n / (n^2 - 4)
If I calculate the lim n --> infinity directly, I would get 1 as the result.
But if I divide both the numerator and the denominator by n, the sequence changes to a(n) := 1 / [n - (4/n)], and then the lim n --> infinity is 0. Now the sequence does not converge to 1 but is a null sequence.
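As a quick numeric sanity check (my own sketch, not part of the original question), one can evaluate a(n) = n / (n^2 - 4) for increasingly large n and compare it with the simplified form 1 / (n - 4/n):

```python
# Numeric check of the sequence a(n) = n / (n^2 - 4)
def a(n):
    return n / (n**2 - 4)

# Simplified form after dividing numerator and denominator by n
def a_simplified(n):
    return 1 / (n - 4 / n)

for n in [10, 100, 1000, 10**6]:
    # Both forms give the same value, and it shrinks toward 0
    print(n, a(n), a_simplified(n))
```

The two forms agree for every n (dividing top and bottom by the same nonzero quantity does not change the value), and the printed values visibly approach 0, not 1.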
Why is convergence to 0 correct and convergence to 1 false? I was told the correct answer is 0, but why? Why not 1? Must I always try to divide and simplify first?
2 >> And is there a clear way to simplify before taking the lim? For example, must every variable in the numerator be of power 1, or something like that?
I hope my question is clear and that you can help me. Thank you.