Here is a counterexample for you to think about: $f_n(x) = x^n$ on $[0,1)$, which converges pointwise to $0$.
Note that $\sup_{x \in [0,1)} |f_n(x)| = 1$ for all $n$.
I've been trying, with no luck so far, to get my head around the difference between pointwise and uniform convergence.
The definition which I am using for uniform convergence is:

We say $f_n$ converges to $f$ uniformly on a set $E$ if
$$|f_n(x) - f(x)| \le M_n \quad \text{for all } x \in E,$$
where $M_n$ are some constants such that $M_n \to 0$ as $n \to \infty$.
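As a sanity check of the definition, here is a simple case where it does hold (my own illustrative example): take $f_n(x) = x/n$ on $E = [0,1]$, with limit $f = 0$. Then

$$|f_n(x) - f(x)| = \frac{x}{n} \le \frac{1}{n} =: M_n \quad \text{for all } x \in [0,1],$$

and $M_n = 1/n \to 0$ as $n \to \infty$, so $f_n \to 0$ uniformly on $[0,1]$.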
I understand this definition, but what I am having trouble with is seeing how any convergent sequence of functions could fail to satisfy it!
Assume $f_n \to f$ pointwise on $E$, but not uniformly.
Now we fix $n$ and calculate $|f_n(x) - f(x)|$ for each possible choice of $x \in E$. There must be a value of $x$ which gives the greatest value (i.e. for which $f_n(x)$ is furthest away from $f(x)$). So take this greatest value and denote it $M_n$, for each choice of $n$. Then for each $x \in E$ we have certainly satisfied $|f_n(x) - f(x)| \le M_n$.
Moreover, this sequence $(M_n)$ must converge to $0$, as we have by assumption that $f_n(x) \to f(x)$ for each $x$. So $f_n$ converges to $f$ uniformly on the set $E$, contradicting our assumption that it converged, but not uniformly!
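The quantities in this argument can be checked numerically. Here is a quick Python sketch (my own illustration, using the classic example $f_n(x) = x^n$ on $[0,1)$ with pointwise limit $0$ — an assumption on my part, but any such example would do): the pointwise values $|f_n(x)|$ shrink for each fixed $x$, yet the "greatest value" $M_n$ does not.

```python
def M(n, grid_size=10_000):
    """Approximate sup over x in [0,1) of |x**n - 0| on a finite grid."""
    return max((k / grid_size) ** n for k in range(grid_size))

for n in (1, 5, 25, 125):
    # For any fixed x < 1, x**n becomes tiny as n grows (pointwise convergence) ...
    pointwise_at_half = 0.5 ** n
    # ... yet the supremum over all x in [0,1) stays close to 1 for every n,
    # so M_n does not tend to 0 here.
    print(n, pointwise_at_half, M(n))
```

(The grid approximation slightly underestimates the true supremum, which is $1$ for every $n$, but it is already enough to see that $M_n$ does not shrink.)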
Clearly there must be something very wrong with the above argument but I can't see what it is.
Thanks for any help or explanation!