# Thread: Divergence from the reciprocal of the sequence input.

1. ## Divergence from the reciprocal of the sequence input.

Hi everyone! Today we were asked to explain why a series $\displaystyle \sum_{n=1}^{\infty}a_n$ that converges becomes divergent when it is turned into $\displaystyle \sum_{n=1}^{\infty}\frac{1}{a_n}$. My take on the explanation is this: there is a theorem that if a series converges, then the sequence $\displaystyle a_n$ must converge to $\displaystyle 0$. So if $\displaystyle \sum_{n=1}^{\infty}a_n$ converges, then $\displaystyle \lim_{n\to\infty}a_n=0$.

I've always thought that for something to go to $0$ as $\displaystyle n\to\infty$, the sequence $\displaystyle a_n$ should be expressible as a quotient of two functions (assuming the sequence can be represented by a continuous function whose domain is all real numbers rather than just the natural numbers). That turns the previous limit into $\displaystyle \lim_{n\to\infty}a_n=\lim_{x\to\infty}\frac{f(x)}{g(x)}=0$, and for this limit to hold, $\displaystyle g(x)$ must eventually dominate $\displaystyle f(x)$. Taking the reciprocal, the limit becomes $\displaystyle \lim_{x\to\infty}\frac{g(x)}{f(x)}$, and since $\displaystyle g(x)$ dominates $\displaystyle f(x)$, we get $\displaystyle \lim_{x\to\infty}\frac{g(x)}{f(x)}=\infty$. If this is true, then $\displaystyle \sum_{n=1}^{\infty}\frac{1}{a_n}$ is divergent by the divergence test.

What do you guys think of this explanation? It's very logic-based and not quite as formal as a traditional math proof. Is there a more formal proof of this? Thank you guys in advance!
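As a sanity check, the behavior is easy to see numerically. This is just a sketch using the hypothetical example $a_n = 1/n^2$ (a convergent $p$-series); the helper `partial_sum` is my own, not from the thread:

```python
# Hypothetical example: a_n = 1/n^2, whose series converges to pi^2/6.
# The terms a_n shrink toward 0, so the reciprocals 1/a_n = n^2 grow
# without bound and their partial sums blow up.

def partial_sum(terms, N):
    """Sum the first N terms of the sequence given by terms(n)."""
    return sum(terms(n) for n in range(1, N + 1))

a = lambda n: 1.0 / n**2       # terms of a convergent series
recip = lambda n: 1.0 / a(n)   # reciprocal terms, equal to n^2

for N in (10, 100, 1000):
    print(N, partial_sum(a, N), partial_sum(recip, N))

# The partial sums of a_n level off near pi^2/6 (about 1.6449), while
# the partial sums of 1/a_n keep growing, consistent with divergence.
```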

2. ## Re: Divergence from the reciprocal of the sequence input.

You're on the right track. I'd just say that in order for a series to converge, the terms need to vanish (i.e. tend to $0$). That means the reciprocals of those terms must become very large. Adding up infinitely many large numbers (in fact, any terms that don't go to $0$) gives you something infinite, and so the reciprocal series must be divergent.
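That intuition can be tightened into a short formal proof. Here is a sketch, assuming $a_n \neq 0$ for all $n$ so that the reciprocals are defined:

```latex
\textbf{Claim.} If $\sum_{n=1}^{\infty} a_n$ converges and $a_n \neq 0$ for all $n$,
then $\sum_{n=1}^{\infty} \frac{1}{a_n}$ diverges.

\textbf{Proof.} Since $\sum a_n$ converges, the $n$th-term test gives
$\lim_{n\to\infty} a_n = 0$. Fix any $M > 0$. Taking $\varepsilon = 1/M$ in the
definition of this limit, there is an $N$ such that $|a_n| < 1/M$ for all
$n \geq N$, and hence $\left|\tfrac{1}{a_n}\right| > M$ for all $n \geq N$.
So $\tfrac{1}{a_n}$ does not tend to $0$ (in fact $|1/a_n| \to \infty$), and by
the divergence ($n$th-term) test, $\sum \frac{1}{a_n}$ diverges. $\blacksquare$
```

Note that this works directly with the sequence, so there's no need to assume $a_n$ extends to a continuous function $f(x)/g(x)$.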

3. ## Re: Divergence from the reciprocal of the sequence input.

Yeah, I was thinking of just doing that. But I wanted to know whether there is a more formal proof of this. I first tried proving it with the epsilon-delta limit definition, but I realized the limit of the reciprocals is not finite, so I couldn't pick an epsilon to define a delta. Thanks, dude, for confirming my answer. But I'm still open for business in search of a formal proof of this.

4. ## Re: Divergence from the reciprocal of the sequence input.

Isn't the fact that the general term doesn't go to zero a formal proof of your statement? I think it is, but I'm not a mathematician.

5. ## Re: Divergence from the reciprocal of the sequence input.

Well, all I did was put together a few of the very basic theorems about series. In that sense, maybe it is formal, but the way I arranged it, I don't really think it is.