# limit function

• Jun 15th 2009, 11:40 AM
Juancd08
limit function
Examine the sequence of functions defined by

f_n(x) = 2nx/n^p + x^2 for different values of p > 0.

a) Show that the sequence above converges to its limit function uniformly on the interval I = [0, infinity) iff p > 2.

b) Show that when p > 4, the M-Test can be applied to show that g = SUM(f_n) converges uniformly on the interval I, but that it fails for other values of p.

c) Show that SUM(f_n) converges uniformly on any interval I = [0, a] whenever p > 2.
• Jun 16th 2009, 07:10 AM
Showcase_22
I would just like to point out that I've never done a question like this before, so here's my attempt:

$\displaystyle f_n(x)=\frac{2nx}{n^p}+x^2=2n^{1-p}x+x^2$

Suppose $\displaystyle p>2$.

Need to find $\displaystyle \delta>0$ such that $\displaystyle |x-y|<\delta \Rightarrow |f(x)-f(y)|<\epsilon$.

Let $\displaystyle x=\frac{\delta}{2}$ and $\displaystyle y=\frac{\delta}{3}$ and fix $\displaystyle \delta=\min \{ 1, \epsilon \}$

$\displaystyle |2n^{1-p}x+x^2-2n^{1-p}y-y^2|=|2n^{1-p}(x-y)+(x^2-y^2)| \leq |2n^{1-p}(x-y)|+|x^2-y^2|$

But:

$\displaystyle =|2n^{1-p}(x-y)|+|(x+y)(x-y)|< |2n^{1-p}|\delta+|x+y|\delta=|2n^{1-p}|\delta+\left| \frac{5\delta}{6} \right|\delta$

But since $\displaystyle 0<\delta \leq 1 \Rightarrow \frac{5 \delta^2}{6} \leq \frac{5 \delta}{6}$.

Hence we have $\displaystyle |f(x)-f(y)| < |2n^{1-p}| \delta+\frac{5 \delta}{6}=\delta \left( |2n^{1-p}|+\frac{5}{6} \right)$

Hence define $\displaystyle \epsilon=\delta \left( |2n^{1-p}|+\frac{5}{6} \right)$

EDIT: I would really like to point out that I'm not sure if this is right. I haven't really encountered uniform continuity before!! (Worried)

(I also can't see where I've used the property that p>2....)
• Jun 27th 2009, 05:27 AM
xalk
Quote:

Originally Posted by Showcase_22
I would just like to point out that I've never done a question like this before, so here's my attempt: [...]

(I also can't see where I've used the property that p>2....)

Showcase, we have two kinds of convergence:

1) Pointwise convergence, and

2) Uniform convergence.

For pointwise convergence the definition is:

Given an ε>0 and an x belonging to the interval of definition of the sequence of functions {$\displaystyle f_{n}(x)$}, there exists a natural number k such that:

for all n: if $\displaystyle n\geq k$, then $\displaystyle |f_{n}(x) -f(x)|<\epsilon$.

For uniform convergence the definition is:

Given an ε>0, there exists a k such that:

for all n and for all x belonging to the interval of definition of {$\displaystyle f_{n}(x)$}: if $\displaystyle n\geq k$ and $\displaystyle x\in I$ (I = the interval of definition), then
$\displaystyle |f_{n}(x) -f(x)|<\epsilon$.

Note the difference.

Both definitions can be found in any analysis book under the title:

Function spaces.

And in our case, I think, by applying the above definitions:

1) The sequence of functions converges pointwise on the interval [0, infinity) to the function f(x) = x^2, and

2) converges uniformly on any interval (0, λ), where λ is any number greater than 0.
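A quick numerical sketch of the distinction, under Showcase_22's reading $\displaystyle f_n(x)=2n^{1-p}x+x^2$ (the values of p, λ and the grid resolution here are arbitrary choices): on a bounded interval the worst-case error $\displaystyle \sup_{[0,\lambda]}|f_n(x)-x^2|=2n^{1-p}\lambda$ shrinks to 0 as n grows, which is exactly uniform convergence.

```python
# Sketch: uniform convergence of f_n(x) = 2*n^(1-p)*x + x^2 to x^2 on [0, lam],
# measured by the sup of the error over a grid (Showcase_22's reading of f_n).
def f(n, x, p):
    return 2 * n ** (1 - p) * x + x ** 2

def sup_error(n, p, lam, grid=10_000):
    # approximate sup over [0, lam] of |f_n(x) - x^2|
    best = 0.0
    for i in range(grid + 1):
        x = lam * i / grid
        best = max(best, abs(f(n, x, p) - x * x))
    return best

p, lam = 3.0, 5.0
for n in (1, 10, 100, 1000):
    print(n, sup_error(n, p, lam))  # shrinks like 2*lam/n^(p-1)
```

The same sup taken over all of [0, infinity) is infinite for every n under this reading, which is why the choice of interval matters.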
• Jun 28th 2009, 05:34 AM
xalk
Quote:

Originally Posted by Showcase_22

(I also can't see where i'm used the property that p>2....)

Let p>2 $\displaystyle \Rightarrow p-1>1 \Rightarrow (p-1)\ln n> \ln n$ (for $\displaystyle n\geq 2$) $\displaystyle \Leftrightarrow n^{p-1} > n$ (since ln is a strictly increasing function) $\displaystyle \Leftrightarrow \frac{1}{n^{p-1}}< \frac{1}{n}$, or

$\displaystyle \frac{n}{n^p}< \frac{1}{n}$

You're going to need that important inequality somewhere along the proof of pointwise or uniform convergence.
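The chain of implications can be spot-checked numerically (a small sketch; p = 2.5 is an arbitrary value above 2):

```python
# Spot-check: for p > 2 we have n/n^p < 1/n for all n >= 2
p = 2.5  # any fixed p > 2
for n in range(2, 1000):
    assert n / n ** p < 1 / n
print("n/n^p < 1/n holds for n = 2 .. 999")
```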
• Jul 2nd 2009, 01:34 PM
halbard
You have to interpret the question thus: $\displaystyle f_n(x)=\frac{2nx}{n^p+x^2}$ for $\displaystyle x\geq0$. Then there are three cases to consider:

1. If $\displaystyle 0<p<1$ then there is no pointwise convergence as $\displaystyle n\to\infty$.

2. If $\displaystyle p=1$ then $\displaystyle f_n(x)\to 2x$ as $\displaystyle n\to\infty$.

3. If $\displaystyle p>1$ then $\displaystyle f_n(x)\to 0$ as $\displaystyle n\to\infty$.

In the case $\displaystyle p=1$ we have $\displaystyle |f_n(x)-2x|=\frac{2x^3}{n+x^2}$ and this can be made arbitrarily large for any given $\displaystyle n$, so no uniform convergence here.
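A numerical sketch of the p = 1 case (the choices of n and the sample points are arbitrary): for any fixed n, the error $\displaystyle \frac{2x^3}{n+x^2}$ grows roughly like 2x for large x, so its sup over [0, infinity) is infinite.

```python
# For p = 1 the error |f_n(x) - 2x| = 2x^3/(n + x^2) is unbounded in x
# for every fixed n, so convergence cannot be uniform on [0, infinity)
def err(n, x):
    return 2 * x ** 3 / (n + x ** 2)

n = 100
for x in (10.0, 100.0, 1000.0):
    print(x, err(n, x))  # grows without bound as x increases
```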

When $\displaystyle p>1$ we have $\displaystyle |f_n(x)-0|=\frac{2nx}{n^p+x^2}$. A bit of calculus shows that this is maximised when $\displaystyle x=n^{p/2}$, and the supremum is $\displaystyle n^{1-(p/2)}$. Therefore uniform convergence occurs iff $\displaystyle 1-(p/2)<0$, i.e. iff $\displaystyle p>2$.
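The calculus step can be sanity-checked numerically (a sketch; the grid resolution and the sample values of n and p are arbitrary):

```python
# Check that sup_x 2nx/(n^p + x^2) equals n^(1 - p/2), attained at x = n^(p/2)
def f(n, x, p):
    return 2 * n * x / (n ** p + x ** 2)

p = 3.0
for n in (2, 5, 10, 50):
    x_star = n ** (p / 2)            # predicted maximiser
    predicted = n ** (1 - p / 2)     # predicted supremum
    grid_max = max(f(n, x_star * i / 1000, p) for i in range(1, 3001))
    assert abs(f(n, x_star, p) - predicted) < 1e-12
    assert grid_max <= predicted + 1e-12
print("sup f_n = n^(1 - p/2), which tends to 0 iff p > 2")
```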

We now know that $\displaystyle |f_n(x)|\leq n^{1-(p/2)}$ for $\displaystyle p>2$, and we know that $\displaystyle \sum_{n=1}^\infty n^{1-(p/2)}$ converges iff $\displaystyle 1-(p/2)<-1$, i.e. iff $\displaystyle p>4$. This gives uniform convergence by the $\displaystyle M$-test when $\displaystyle p>4$.
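The dividing line p = 4 shows up clearly in partial sums (a sketch; the truncation point 10^5 is an arbitrary choice):

```python
# Partial sums of sum_n n^(1 - p/2): bounded when p > 4 (exponent < -1),
# but growing like log N in the borderline case p = 4 (harmonic series)
def partial(p, N):
    return sum(n ** (1 - p / 2) for n in range(1, N + 1))

print(partial(5.0, 10**5))  # p > 4: stays below zeta(3/2) ~ 2.612
print(partial(4.0, 10**5))  # p = 4: grows like log N, already past 11
```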

To show non-uniform convergence for other values of $\displaystyle p$: for $\displaystyle p\leq2$ the series does not even converge pointwise (for fixed $\displaystyle x>0$ the terms behave like $\displaystyle 2xn^{1-p}$), so it suffices to treat $\displaystyle 2<p\leq4$, and there the limiting case $\displaystyle p=4$ implies all the others, since $\displaystyle n^p\leq n^4$ makes each $\displaystyle f_n$, and hence each tail sum, at least as large as in that case.

So consider the function $\displaystyle f_n(x)=\frac{2nx}{n^4+x^2}$. Considered as a function of $\displaystyle n$ this is maximised when $\displaystyle n=\surd x$ and is a decreasing function of $\displaystyle n$ for $\displaystyle n>\surd x$.

Choosing the integer $\displaystyle m$ so that $\displaystyle m\leq\surd x<m+1$, the integral test implies $\displaystyle \sum_{n=m+1}^\infty f_n(x)>\int_{m+1}^\infty f_t(x) dt \geq\int_{1+\surd x}^\infty \frac{2tx}{t^4+x^2}dt$

$\displaystyle =\int_{1+\frac2{\surd x}+\frac1x}^\infty \frac1{u^2+1}du=\frac\pi2-\tan^{-1}\left(1+\frac2{\surd x}+\frac1x\right)$ using the substitution $\displaystyle t^2=xu$.

Thus as $\displaystyle x\to\infty$, the sum exceeds $\displaystyle \frac\pi2-\tan^{-1}1=\frac\pi4$. However, the functions $\displaystyle f_n(x)$ all tend to $\displaystyle 0$ as $\displaystyle x\to\infty$, and if the sum were uniformly convergent then its limit would be $\displaystyle 0$ as well.
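This lower bound is visible numerically (a sketch for p = 4; the truncation N is chosen large enough that the neglected tail of the series is negligible):

```python
import math

# g(x) = sum_n 2nx/(n^4 + x^2) stays close to pi/2 for large x, even though
# every individual term f_n(x) tends to 0 as x -> infinity
def g(x, N=200_000):
    return sum(2 * n * x / (n ** 4 + x ** 2) for n in range(1, N + 1))

for x in (1e4, 1e6):
    s = g(x)
    print(x, s)               # close to pi/2 = 1.5708...
    assert s > math.pi / 4    # the lower bound derived above
```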

As for the last part, the maximum of $\displaystyle f_n(x)$ occurs outside the interval $\displaystyle 0\leq x\leq a$ once $\displaystyle n^{p/2}>a$ (assuming $\displaystyle p>2$), so $\displaystyle f_n(x)$ is increasing on the interval. Thus for an appropriate integer $\displaystyle m$ we have

$\displaystyle \sum_{n=m}^\infty f_n(x)\leq\sum_{n=m}^\infty f_n(a)=\sum_{n=m}^\infty\frac{2na}{n^p+a^2} <\sum_{n=m}^\infty\frac{2na}{n^p}<\infty$ if $\displaystyle p>2$, so the $\displaystyle M$-test guarantees uniform convergence.
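A numerical sketch of this last bound (the values p = 3 and a = 10, and the grid, are arbitrary choices; m = 5 is the first n with $\displaystyle n^{p/2}>a$):

```python
# On [0, a], once n^(p/2) > a the maximiser lies outside the interval and
# f_n is increasing there, so sup f_n = f_n(a) <= 2a/n^(p-1), a summable bound
def f(n, x, p):
    return 2 * n * x / (n ** p + x ** 2)

p, a = 3.0, 10.0
m = 5  # first n with n^(3/2) > 10
for n in range(m, 50):
    grid_max = max(f(n, a * i / 2000, p) for i in range(2001))
    assert grid_max <= f(n, a, p) + 1e-12       # sup on [0, a] attained at x = a
    assert f(n, a, p) <= 2 * a / n ** (p - 1)   # M-test dominating term
tail = sum(2 * a / n ** (p - 1) for n in range(m, 10**6))
print("dominating tail sum ~", tail)
```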