1. Power Series Proof

Problem:

If a_v > 0 and $\displaystyle \sum_{v=0}^{\infty} a_v$ converges, then prove that $\displaystyle \lim_{x \rightarrow 1^-} \sum_{v=0}^{\infty} a_v x^v = \sum_{v=0}^{\infty} a_v$.

Attempt:

Since $\displaystyle \sum a_v$ converges, we can say that $\displaystyle \sum a_v x^v$ converges for x = 1. This implies that it converges uniformly for all |x| < 1.

Let $\displaystyle f(x) = \sum a_v x^v$ for |x| < 1. Since the convergence is uniform, we know that f is continuous in its interval of convergence. Since the power series converges as well for x = 1, we can also say that f(x) is "left-hand continuous" at x = 1. That is, for all positive epsilon there exists a positive delta such that $\displaystyle |f(x) - f(1)| < \epsilon$ whenever $\displaystyle |x - 1| < \delta$, if we only take x approaching 1 from the left.

Note that $\displaystyle |f(x) - f(1)| = |\sum a_v x^v - \sum a_v | < \epsilon$ whenever $\displaystyle |x-1| < \delta$.

This is precisely the statement: $\displaystyle \lim_{x \rightarrow 1^-} \sum_{v=0}^{\infty} a_v x^v = \sum_{v=0}^{\infty} a_v$. QED.

The proof seems bullet-proof to me, but what bugs me is that I didn't use the fact that the a_v are positive, so there must be something wrong with the proof...
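As a numerical sanity check of the limit being claimed (an illustration only, not a proof), here is a sketch with the illustrative choice $\displaystyle a_v = 2^{-v}$, for which $\displaystyle \sum a_v = 2$ and the partial sums of $\displaystyle \sum a_v x^v$ can be compared against that value as x approaches 1 from the left:

```python
# Numerical sanity check of  lim_{x -> 1^-} sum a_v x^v = sum a_v
# for the illustrative choice a_v = 2^{-v}, so that sum a_v = 2.
# This only illustrates the statement; it proves nothing.

def f(x, terms=200):
    """Truncated power series: sum_{v=0}^{terms-1} (x/2)^v."""
    return sum((x / 2.0) ** v for v in range(terms))

total = 2.0  # sum_{v=0}^infty 2^{-v}
for x in (0.9, 0.99, 0.999):
    # The gap |f(x) - 2| shrinks as x -> 1^-.
    print(x, f(x), abs(f(x) - total))
```

The 200-term truncation is an arbitrary choice; the neglected tail is far below floating-point precision for x near 1.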

2. Originally Posted by JG89
The proof seems bullet-proof to me, but what bugs me is that I didn't use the fact that the a_v are positive, so there must be something wrong with the proof...
Actually, $\displaystyle a_v > 0$ is not necessary.

3. So my proof is okay?

4. Let $x \ge 0$. For fixed $x \in [0,1]$, the factors $x^v$ satisfy $0 \le x^v \le 1$ and decrease monotonically in $v$. Since $\displaystyle \sum a_v$ converges, Abel's test gives that $\displaystyle \sum a_v x^v$ converges uniformly on [0,1].
So f is continuous on [0,1].
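The uniform convergence claimed here can be probed numerically. As an illustration, take $\displaystyle a_v = (-1)^v/(v+1)$, which converges but is not positive (supporting the point that positivity is not needed); the exact sum is $\displaystyle \ln(1+x)/x$ for $x > 0$, and the alternating-series bound makes the tail at most $1/(N+1)$ uniformly in x. A sketch (the grid and N are arbitrary choices):

```python
import math

# Probe uniform convergence of sum a_v x^v on [0, 1] for
# a_v = (-1)^v / (v + 1).  Here f(x) = ln(1+x)/x for x > 0,
# and the alternating-series estimate bounds the tail by
# x^N/(N+1) <= 1/(N+1), uniformly in x.  Illustration only.

def partial(x, N):
    """s_N(x) = sum_{v=0}^{N-1} (-1)^v x^v / (v+1)."""
    return sum((-1) ** v * x ** v / (v + 1) for v in range(N))

def exact(x):
    """Closed form of the full series on (0, 1]; value 1 at x = 0."""
    return math.log(1.0 + x) / x if x > 0 else 1.0

N = 100
grid = [i / 1000.0 for i in range(1, 1001)]  # sample points in (0, 1]
worst = max(abs(exact(x) - partial(x, N)) for x in grid)
print(worst, 1.0 / (N + 1))  # worst-case tail vs. the uniform bound
```

The worst-case error over the whole grid stays below the single bound 1/(N+1), which is what "uniform" means here: one N works for every x at once.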

5. Originally Posted by JG89
So my proof is okay?
Maybe not. How do you prove that f is left-continuous at x = 1?

6. .....

7. Part of its interval of convergence is [0,1]. Since the series converges for every x satisfying $\displaystyle 0 \le x \le 1$, the sum is continuous on that interval.

8. Originally Posted by JG89
Part of its interval of convergence is [0,1]. Since the series converges for every x satisfying $\displaystyle 0 \le x \le 1$, the sum is continuous on that interval.
Just by checking where the series converges, you can only say it is continuous on (-1,1). If you want to prove it is left-continuous at x = 1, you must use uniform convergence.

9. Originally Posted by ynj
Let $x \ge 0$. For fixed $x \in [0,1]$, the factors $x^v$ satisfy $0 \le x^v \le 1$ and decrease monotonically in $v$. Since $\displaystyle \sum a_v$ converges, Abel's test gives that $\displaystyle \sum a_v x^v$ converges uniformly on [0,1].
So f is continuous on [0,1].
You say yourself at the end of this quote that f is continuous on [0,1]. So for every positive epsilon there exists a positive delta such that $\displaystyle |f(x) - f(1)| = |\sum a_v x^v - \sum a_v| < \epsilon$ whenever $\displaystyle |x-1| < \delta$. This holds in particular for x approaching 1 from the left, which is the same statement as $\displaystyle \lim_{x \rightarrow 1^-} \sum a_v x^v = \sum a_v$.

10. Is this information incorrect? :

Power series - Wikipedia, the free encyclopedia

Under the title "Radius of Convergence"

"For |x - c| = r, we cannot make any general statement on whether the series converges or diverges. However, for the case of real variables, Abel's theorem states that the sum of the series is continuous at x if the series converges at x."

11. Originally Posted by JG89
Is this information incorrect? :
"For |x - c| = r, we cannot make any general statement on whether the series converges or diverges. However, for the case of real variables, Abel's theorem states that the sum of the series is continuous at x if the series converges at x."
Well, if you use Abel's theorem, then you are right.
But the problem here is essentially to prove Abel's theorem, so you cannot use it.
My proof above is a proof of Abel's theorem.
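For reference, a sketch of the standard summation-by-parts route (assuming, after replacing $a_0$ by $a_0 - s$, that $\displaystyle s := \sum a_v = 0$, which loses no generality): with partial sums $\displaystyle s_n = \sum_{v=0}^{n} a_v \rightarrow 0$, Abel summation gives, for $0 \le x < 1$,

$\displaystyle \sum_{v=0}^{\infty} a_v x^v = (1-x) \sum_{v=0}^{\infty} s_v x^v.$

Given $\epsilon > 0$, pick m with $|s_v| < \epsilon$ for all $v \ge m$. Then

$\displaystyle |f(x)| \le (1-x) \sum_{v=0}^{m-1} |s_v| + (1-x)\, \epsilon \sum_{v=m}^{\infty} x^v \le (1-x)\, C_m + \epsilon,$

where $\displaystyle C_m = \sum_{v=0}^{m-1} |s_v|$ is a fixed constant. Letting $x \rightarrow 1^-$ gives $\displaystyle \limsup_{x \rightarrow 1^-} |f(x)| \le \epsilon$, and since $\epsilon$ was arbitrary, $f(x) \rightarrow 0 = s$.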

12. Ah, I see what you mean.