Hi everyone! Today we were asked to explain why, if a series $\displaystyle \sum_{n=1}^{\infty}a_n$ (with $\displaystyle a_n\neq 0$, so the reciprocals make sense) converges, then the series $\displaystyle \sum_{n=1}^{\infty}\frac{1}{a_n}$ must diverge. My take on the explanation: there is a theorem saying that if a series converges, then its sequence of terms $\displaystyle a_n$ must converge to $\displaystyle 0$. So if $\displaystyle \sum_{n=1}^{\infty}a_n$ converges, then $\displaystyle \lim_{n\to\infty}a_n=0$.

I've always thought of a quantity that tends to $\displaystyle 0$ as a quotient of two other functions (assuming the sequence can be represented by a continuous function whose domain is all real numbers rather than just the natural numbers). That turns our limit into $\displaystyle \lim_{n\to\infty}a_n=\lim_{x\to\infty}\frac{f(x)}{g(x)}=0$, and for this limit to hold, $\displaystyle g(x)$ must eventually outgrow $\displaystyle f(x)$ (just having $\displaystyle g(x)>f(x)$ isn't enough; $\displaystyle g$ has to dominate $\displaystyle f$ in the limit). Taking the reciprocal, the limit becomes $\displaystyle \lim_{x\to\infty}\frac{g(x)}{f(x)}$, and since $\displaystyle g(x)$ dominates $\displaystyle f(x)$, it should follow that $\displaystyle \lim_{x\to\infty}\frac{g(x)}{f(x)}=\infty$. If this is right, then $\displaystyle \sum_{n=1}^{\infty}\frac{1}{a_n}$ diverges by the divergence test, since its terms don't tend to $\displaystyle 0$.

What do you guys think of this explanation? It's very intuition-based and doesn't get quite as formal as the traditional proofs I've seen. Is there a more formal proof of this? Thank you guys in advance!
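For reference, here is a sketch of what I imagine the more direct, formal version looks like, skipping the $\displaystyle f(x)/g(x)$ detour entirely (I'm assuming $\displaystyle a_n\neq 0$ for every $n$ so the reciprocals are defined):

If $\displaystyle \sum_{n=1}^{\infty}a_n$ converges, then $\displaystyle \lim_{n\to\infty}a_n=0$ by the $n$-th term theorem. Since $\displaystyle a_n\to 0$ with $\displaystyle a_n\neq 0$, we get $\displaystyle \lim_{n\to\infty}\left|\frac{1}{a_n}\right|=\infty$. In particular $\displaystyle \lim_{n\to\infty}\frac{1}{a_n}\neq 0$ (the limit either fails to exist or is infinite), so by the divergence test $\displaystyle \sum_{n=1}^{\infty}\frac{1}{a_n}$ diverges.

Is this roughly the standard argument, or am I missing a subtlety?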