1. Real analytic function

Let $\displaystyle f:(-1,1) \rightarrow \mathbb{R}$ be $\displaystyle C^\infty$. Prove that $\displaystyle f$ is real analytic in some neighborhood of 0 if and only if there are a nonempty interval $\displaystyle (-\sigma, \sigma)$ and a constant $\displaystyle M$ such that $\displaystyle |(d/dx)^k f(x)| \leq M^k\, k!$ for all $\displaystyle x \in (-\sigma, \sigma)$ and all $\displaystyle k \in \{1,2,\dots\}$.

2. A function is "real analytic" in a region if and only if, at each point in that region, its Taylor series exists and converges to the value of the function in some neighborhood of the point.

Look at the error formula for Taylor polynomials. If you can show that the error goes to 0 as $\displaystyle n$ goes to infinity, you are done.
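As a concrete (hypothetical) illustration of this hint, here is a small numerical sketch in Python for $\displaystyle f(x)=e^x$: the error between $\displaystyle f$ and its $\displaystyle n$-th Taylor polynomial at 0 shrinks to 0 at a fixed point, which is exactly the convergence the hint is after.

```python
import math

# Sketch: for f(x) = exp(x), the Taylor remainder at 0 goes to 0 at a
# fixed point x, so the Taylor polynomials converge to f(x) there.

def taylor_poly_exp(x, n):
    """n-th Taylor polynomial of exp at 0, evaluated at x."""
    return sum(x**k / math.factorial(k) for k in range(n + 1))

x = 0.9
errors = [abs(math.exp(x) - taylor_poly_exp(x, n)) for n in range(1, 15)]

# The error decreases monotonically and rapidly toward 0 as n grows.
assert all(errors[i + 1] <= errors[i] for i in range(len(errors) - 1))
assert errors[-1] < 1e-9
```

Of course this only checks one point of one function; the point of the exercise is to control the remainder uniformly using the derivative bounds.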

3. I see that if the error term goes to 0, the Taylor expansion will converge. But how do you show that the Taylor expansion converges to the value of the function?

4. Originally Posted by Kat-M
I see that if the error term goes to 0, the Taylor expansion will converge. But how do you show that the Taylor expansion converges to the value of the function?
No, the "error" is the difference between the $\displaystyle n$-th Taylor polynomial and the given function. If the error goes to 0, the Taylor polynomials converge to the function.

5. Originally Posted by Kat-M
Let $\displaystyle f:(-1,1) \rightarrow \mathbb{R}$ be $\displaystyle C^\infty$. Prove that $\displaystyle f$ is real analytic in some neighborhood of 0 if and only if there are a nonempty interval $\displaystyle (-\sigma, \sigma)$ and a constant $\displaystyle M$ such that $\displaystyle |(d/dx)^k f(x)| \leq M^k\, k!$ for all $\displaystyle x \in (-\sigma, \sigma)$ and all $\displaystyle k \in \{1,2,\dots\}$.
HallsofIvy's hint gives the "if" part. Let's consider the "only if", i.e. show that a real analytic function satisfies the given bound.

The classical proof uses Cauchy's inequality for complex analytic functions; it's rather simple but you definitely can't think about it if you've not studied complex analysis. Let me sketch a proof that "stays in the real line".

I write the proof for $\displaystyle x=0$; it adapts directly to other points. We have $\displaystyle f(x)=\sum_{n=0}^\infty a_n x^n$ for $\displaystyle |x|<r$, where $\displaystyle a_n=\frac{f^{(n)}(0)}{n!}$. Choose $\displaystyle 0<\alpha<r$. We have, for all $\displaystyle k\geq 0$, $\displaystyle |a_k| \alpha^k\leq \sum_{n=0}^\infty |a_n|\alpha^n=C_\alpha<\infty$ (the power series converges absolutely inside the interval of convergence). Enlarging $\displaystyle C_\alpha$ if necessary, we may assume $\displaystyle C_\alpha\geq 1$, so that $\displaystyle C_\alpha\leq C_\alpha^k$ for $\displaystyle k\geq 1$. Thus, for $\displaystyle k\geq 1$, $\displaystyle |f^{(k)}(0)|\leq C_\alpha (1/\alpha)^k k! \leq M^k k!$ where $\displaystyle M=C_\alpha/\alpha$.
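The estimate above can be sanity-checked numerically. The sketch below takes $\displaystyle f(x)=e^x$ (so $\displaystyle a_n=1/n!$ and $\displaystyle f^{(k)}(0)=1$) and verifies the chain of inequalities with the names $\displaystyle \alpha$, $\displaystyle C_\alpha$, $\displaystyle M$ from the post; the specific value $\displaystyle \alpha=0.5$ is an arbitrary choice.

```python
import math

# Check the bound |f^{(k)}(0)| <= C_alpha * (1/alpha)^k * k! <= M^k * k!
# for f(x) = exp(x), whose Taylor coefficients are a_n = 1/n!.
alpha = 0.5
C_alpha = sum(alpha**n / math.factorial(n) for n in range(50))  # ≈ exp(alpha)
M = max(C_alpha, 1.0) / alpha  # enlarge C_alpha to >= 1 so C_alpha <= C_alpha^k

for k in range(1, 10):
    deriv_k = 1.0  # f^{(k)}(0) = 1 for exp
    assert deriv_k <= C_alpha * (1 / alpha) ** k * math.factorial(k)
    assert deriv_k <= M**k * math.factorial(k)
```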

6. I understand the case $\displaystyle x=0$, but how do you show that $\displaystyle |f^{(k)}(x)| \leq M^k\, k!$ for other $\displaystyle x$?

7. Originally Posted by Kat-M
I understand the case $\displaystyle x=0$, but how do you show that $\displaystyle |f^{(k)}(x)| \leq M^k\, k!$ for other $\displaystyle x$?
Just do the same for the series expansion at $\displaystyle x$ with $\displaystyle |x|<r$: $\displaystyle f(y)=\sum_{n=0}^\infty \frac{f^{(n)}(x)}{n!}(y-x)^n$ for $\displaystyle y$ close enough to $\displaystyle x$ (at least for $\displaystyle y$ in an interval centered at $\displaystyle x$ contained in $\displaystyle (-r,r)$).

8. Originally Posted by Laurent
Just do the same for the series expansion at $\displaystyle x$ with $\displaystyle |x|<r$: $\displaystyle f(y)=\sum_{n=0}^\infty \frac{f^{(n)}(x)}{n!}(y-x)^n$ for $\displaystyle y$ close enough to $\displaystyle x$ (at least for $\displaystyle y$ in an interval centered at $\displaystyle x$ contained in $\displaystyle (-r,r)$).
I have one problem: the constant $\displaystyle M$ is not really constant. Every time $\displaystyle x$ changes, $\displaystyle M$ is different, depending on the size of the interval centered at $\displaystyle x$ contained in $\displaystyle (-r,r)$. How do I pick the biggest $\displaystyle M$?

9. Originally Posted by Kat-M
I have one problem: the constant $\displaystyle M$ is not really constant. Every time $\displaystyle x$ changes, $\displaystyle M$ is different, depending on the size of the interval centered at $\displaystyle x$ contained in $\displaystyle (-r,r)$. How do I pick the biggest $\displaystyle M$?
I had an answer ready for that one, but it turns out it doesn't work... Since I can't fix it, there remains the "Cauchy inequality" method. Do you know analytic continuation?

If so, you can make sense of

$\displaystyle \frac{f^{(n)}(x)}{n!}=\frac{1}{2\pi\alpha^n}\int_0^{2\pi}f(x+\alpha e^{i\theta})e^{-in\theta}\,d\theta$,

for all $\displaystyle x,\alpha$ such that $\displaystyle f$ is analytic in $\displaystyle [x-\alpha,x+\alpha]$ (hence on $\displaystyle D(x,\alpha)$ in $\displaystyle \mathbb{C}$ after continuation).

The proof is just term-by-term integration of the series $\displaystyle f(x+\alpha e^{i\theta})e^{-in\theta}=\sum_{k=0}^\infty\frac{f^{(k)}(x)}{k!}e^{i(k-n)\theta}\alpha^k$: only the $\displaystyle k=n$ term has nonzero integral, and it contributes $\displaystyle 2\pi\,\frac{f^{(n)}(x)}{n!}\alpha^n$.
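This coefficient formula is easy to check numerically for an entire function such as $\displaystyle f(x)=e^x$, where $\displaystyle f^{(n)}(x)=e^x$ for every $\displaystyle n$. The sketch below discretizes the integral with a simple Riemann sum over the circle; the function name `coeff` and the choice $\displaystyle x=0.3$, $\displaystyle \alpha=0.5$ are illustrative.

```python
import cmath
import math

def coeff(f, x, n, alpha, steps=2000):
    """Approximate f^{(n)}(x)/n! via (1/(2*pi*alpha^n)) * integral over
    [0, 2*pi] of f(x + alpha*e^{i*theta}) * e^{-i*n*theta} d(theta)."""
    h = 2 * math.pi / steps
    s = sum(f(x + alpha * cmath.exp(1j * k * h)) * cmath.exp(-1j * n * k * h)
            for k in range(steps))
    return (s * h / (2 * math.pi * alpha**n)).real

x, alpha = 0.3, 0.5
for n in range(5):
    exact = math.exp(x) / math.factorial(n)  # f^{(n)}(x)/n! for f = exp
    assert abs(coeff(cmath.exp, x, n, alpha) - exact) < 1e-8
```

Because the integrand is smooth and periodic, the Riemann sum converges extremely fast, so a few thousand sample points already match the exact coefficients to high precision.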

Then the inequality is straightforward for all $\displaystyle |x|<r$: letting $\displaystyle M=\frac{1}{\alpha}\max \{|f(y)| : |y|\leq r+\alpha\}$ (enlarging the max to at least 1 if necessary) for some $\displaystyle r$ such that $\displaystyle f$ is analytic on $\displaystyle [-r-\alpha,r+\alpha]$ (choosing $\displaystyle \alpha$ small enough).

There may be a way to fix the previous method, but it's a bit late here...