1. Infinitely differentiable function

Prove that the following function is infinitely differentiable:

$\displaystyle f: \mathbb{R}\rightarrow \mathbb{R},\quad x \mapsto \begin{cases} \exp\left(\frac{-1}{x^2}\right) & \text{if } x \ne 0\\ 0 & \text{if } x = 0\end{cases}$

Thank you!

2. Ha! I just mentioned this to someone. It's evident that $\displaystyle f$ is differentiable everywhere except possibly at zero. For zero we merely note that $\displaystyle f'(x)=\frac{2}{x^3}e^{\frac{-1}{x^2}}$ for $\displaystyle x\ne0$ and by induction $\displaystyle f^{(n)}(x)=p\left(\frac{1}{x}\right)e^{\frac{-1}{x^2}}$ for some polynomial $\displaystyle p$. See if you can conclude from there.
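As a quick numerical illustration of the hint above (a sketch, not part of the proof; the helper `f` is simply the function from the thread), the difference quotient at 0 collapses extremely fast:

```python
import math

def f(x):
    # the thread's function: exp(-1/x^2) for x != 0, and 0 at x = 0
    return math.exp(-1.0 / (x * x)) if x != 0 else 0.0

# (f(h) - f(0)) / h for shrinking h: the exponential crushes every power of 1/h
for h in (0.5, 0.2, 0.1):
    print(h, f(h) / h)
```

Already at $h = 0.1$ the quotient is of order $10^{-43}$, the numerical shadow of $f'(0)=0$.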

3. It is important to define exactly the concepts of 'derivative' and 'differentiable function'. The derivative of a function $\displaystyle y(x)$ at $\displaystyle x=x_{0}$ is the limit [if the limit exists]...

$\displaystyle \displaystyle y^{'}(x_{0}) = \lim_{h \rightarrow 0} \frac{y(x_{0} + h)-y(x_{0})}{h}$ (1)

A function $\displaystyle y(x)$ is differentiable in $\displaystyle x=x_{0}$ if we can write...

$\displaystyle \displaystyle \delta y= y(x_{0} + h)-y(x_{0}) = p\ h + h\ \varepsilon(h)$ (2)

... where $\displaystyle p$ is independent of $\displaystyle h$ and $\displaystyle \displaystyle \lim_{h \rightarrow 0} \varepsilon (h)=0$. Now we consider the function...

$\displaystyle y(x)=\left\{\begin{array}{ll} e^{-\frac{1}{x^{2}}} ,\,\,x \ne 0\\{}\\0 ,\,\, x=0\end{array}\right.$ (3)

... at $\displaystyle x_{0}=0$. On the basis of (1) the derivative at $\displaystyle x_{0}=0$ exists and is $\displaystyle y^{'} (0)=0$, but the function isn't differentiable at $\displaystyle x_{0}=0$ because we are not able to write a relation like (2)...

Kind regards

$\displaystyle \chi$ $\displaystyle \sigma$

4. Originally Posted by chisigma
It is important to define exactly the concepts of 'derivative' and 'differentiable function'. The derivative of a function $\displaystyle y(x)$ at $\displaystyle x=x_{0}$ is the limit [if the limit exists]...

$\displaystyle \displaystyle y^{'}(x_{0}) = \lim_{h \rightarrow 0} \frac{y(x_{0} + h)-y(x_{0})}{h}$ (1)

A function $\displaystyle y(x)$ is differentiable in $\displaystyle x=x_{0}$ if we can write...

$\displaystyle \displaystyle \delta y= y(x_{0} + h)-y(x_{0}) = p\ h + h\ \varepsilon(h)$ (2)

... where $\displaystyle p$ is independent of $\displaystyle h$ and $\displaystyle \displaystyle \lim_{h \rightarrow 0} \varepsilon (h)=0$. Now we consider the function...

$\displaystyle y(x)=\left\{\begin{array}{ll} e^{-\frac{1}{x^{2}}} ,\,\,x \ne 0\\{}\\0 ,\,\, x=0\end{array}\right.$ (3)

... at $\displaystyle x_{0}=0$. On the basis of (1) the derivative at $\displaystyle x_{0}=0$ exists and is $\displaystyle y^{'} (0)=0$, but the function isn't differentiable at $\displaystyle x_{0}=0$ because we are not able to write a relation like (2)...

Kind regards

$\displaystyle \chi$ $\displaystyle \sigma$
Spoiler:

Another way of putting differentiability, as opposed to the one you gave, is that $\displaystyle f'$ exists at $\displaystyle x_0$ if

$\displaystyle \displaystyle\lim_{x\to x_0}\frac{f(x)-f(x_0)}{x-x_0}$

exists. Now, for our $\displaystyle f$ evidently it is differentiable for $\displaystyle x\ne 0$ and at $\displaystyle x=0$ we see that

$\displaystyle \displaystyle \lim_{x\to0}\frac{f(x)-f(0)}{x-0}=\lim_{x\to0}\frac{e^{\frac{-1}{x^2}}-0}{x}=\lim_{x\to0}\frac{1}{x}e^{\frac{-1}{x^2}}=0$
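The last equality is the crux; one way to justify it (spelled out here, since the post leaves it implicit) is the substitution $t = 1/x$:

```latex
\lim_{x\to 0}\frac{1}{x}\,e^{-1/x^{2}}
  = \lim_{|t|\to\infty} t\,e^{-t^{2}} = 0 ,
\qquad\text{since } e^{t^{2}} \ge \tfrac{t^{4}}{2}
  \;\Rightarrow\; \bigl|t\,e^{-t^{2}}\bigr| \le \tfrac{2}{|t|^{3}} \to 0 .
```

(The bound $e^{u} \ge u^{2}/2$ with $u = t^{2}$ comes straight from the exponential series.)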

And thus $\displaystyle f'(0)$ exists and equals zero. Thus,

$\displaystyle f'(x)=\begin{cases}\frac{2}{x^3}e^{\frac{-1}{x^2}} & \mbox{if}\quad x\ne0\\ 0 & \mbox{if}\quad x=0\end{cases}$

So, it's clear that $\displaystyle f''$ exists for $\displaystyle x\ne0$ and we see that

$\displaystyle \displaystyle \lim_{x\to0}\frac{f'(x)-f'(0)}{x-0}=\lim_{x\to0}\frac{\frac{2}{x^3}e^{\frac{-1}{x^2}}-0}{x}=\lim_{x\to0}\frac{2}{x^4}e^{\frac{-1}{x^2}}=0$

and thus $\displaystyle f''(0)$ exists and equals zero. So,

$\displaystyle f''(x)=\begin{cases}e^{\frac{-1}{x^2}}\left(\frac{4}{x^6}-\frac{6}{x^4}\right) & \mbox{if}\quad x\ne0\\ 0 & \mbox{if}\quad x=0\end{cases}$
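These two closed forms can be sanity-checked numerically (a sketch; the sample point and step size are arbitrary choices, not from the thread). Note both need the factor $e^{-1/x^2}$, with the minus sign:

```python
import math

def f(x):
    # exp(-1/x^2) for x != 0, and 0 at x = 0
    return math.exp(-1.0 / (x * x)) if x != 0 else 0.0

def d1(x):  # closed form for f'(x), x != 0
    return (2.0 / x**3) * math.exp(-1.0 / (x * x))

def d2(x):  # closed form for f''(x), x != 0 (note the e^{-1/x^2})
    return math.exp(-1.0 / (x * x)) * (4.0 / x**6 - 6.0 / x**4)

# compare against first and second central differences at a sample point
x, h = 0.7, 1e-5
central1 = (f(x + h) - f(x - h)) / (2 * h)
central2 = (f(x + h) - 2 * f(x) + f(x - h)) / (h * h)
print(abs(central1 - d1(x)), abs(central2 - d2(x)))  # both tiny
```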

From here we can proceed by induction. Namely, assume that $\displaystyle f^{(n)}(x)$ exists for all $\displaystyle x$ and has the form

$\displaystyle f^{(n)}(x)=\begin{cases}p\left(\frac{1}{x}\right)e^{\frac{-1}{x^2}} & \mbox{if}\quad x\ne 0\\ 0 & \mbox{if}\quad x=0\end{cases}$

And thus showing that $\displaystyle f^{(n+1)}(0)$ exists (since clearly $\displaystyle f$ is $\displaystyle n+1$ times differentiable for $\displaystyle x\ne0$) reduces to showing that

$\displaystyle \displaystyle \lim_{x\to0}\frac{f^{(n)}(x)-f^{(n)}(0)}{x-0}=\lim_{x\to0}\frac{p\left(\frac{1}{x}\right)e^{\frac{-1}{x^2}}-0}{x}=\lim_{x\to0}\frac{1}{x}p\left(\frac{1}{x}\right)e^{\frac{-1}{x^2}}$

exists and, in fact equals zero. Thus it follows by induction that $\displaystyle f$ has derivatives of all orders.
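The vanishing of that last limit is the one analytic fact the induction leans on; with the substitution $t = 1/x$ it reads, for any polynomial $p$,

```latex
\lim_{x\to 0}\frac{1}{x}\,p\!\left(\frac{1}{x}\right)e^{-1/x^{2}}
  = \lim_{|t|\to\infty} t\,p(t)\,e^{-t^{2}} = 0 ,
```

because $t\,p(t)$ is again a polynomial and $e^{t^{2}}$ eventually dominates every polynomial.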

Remark: This shows that being $\displaystyle C^{\infty}$ does not imply real analyticity, since this $\displaystyle f$ is infinitely differentiable at $\displaystyle 0$ but for any open interval $\displaystyle I$ around $\displaystyle 0$ we have that $\displaystyle \displaystyle \sum_{n=0}^{\infty}\frac{f^{(n)}(0)x^n}{n!}=\sum_{n=0}^{\infty}\frac{0\cdot x^n}{n!}=0$. Thus, the Taylor series at $\displaystyle 0$ of $\displaystyle f$ does not agree with $\displaystyle f$ on any open interval around $\displaystyle 0$.
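The remark can be made concrete with a two-line check (a sketch; the evaluation point $x = 0.5$ is an arbitrary choice): the Taylor series at $0$ sums to $0$ everywhere, yet $f$ is strictly positive off $0$.

```python
import math

def f(x):
    # exp(-1/x^2) for x != 0, and 0 at x = 0
    return math.exp(-1.0 / (x * x)) if x != 0 else 0.0

# Every Taylor coefficient f^(n)(0)/n! is 0, so the Taylor series at 0
# is the zero function -- but f itself is not:
taylor_at = lambda x: 0.0
x = 0.5
print(f(x), taylor_at(x))  # e^{-4} ≈ 0.0183 vs 0.0
```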

5. Originally Posted by Drexel28
Spoiler:

Another way of putting differentiability, as opposed to the one you gave, is that $\displaystyle f'$ exists at $\displaystyle x_0$ if

$\displaystyle \displaystyle\lim_{x\to x_0}\frac{f(x)-f(x_0)}{x-x_0}$

exists. Now, for our $\displaystyle f$ evidently it is differentiable for $\displaystyle x\ne 0$ and at $\displaystyle x=0$ we see that

$\displaystyle \displaystyle \lim_{x\to0}\frac{f(x)-f(0)}{x-0}=\lim_{x\to0}\frac{e^{\frac{-1}{x^2}}-0}{x}=\lim_{x\to0}\frac{1}{x}e^{\frac{-1}{x^2}}=0$

And thus $\displaystyle f'(0)$ exists and equals zero. Thus,

$\displaystyle f'(x)=\begin{cases}\frac{2}{x^3}e^{\frac{-1}{x^2}} & \mbox{if}\quad x\ne0\\ 0 & \mbox{if}\quad x=0\end{cases}$

So, it's clear that $\displaystyle f''$ exists for $\displaystyle x\ne0$ and we see that

$\displaystyle \displaystyle \lim_{x\to0}\frac{f'(x)-f'(0)}{x-0}=\lim_{x\to0}\frac{\frac{2}{x^3}e^{\frac{-1}{x^2}}-0}{x}=\lim_{x\to0}\frac{2}{x^4}e^{\frac{-1}{x^2}}=0$

and thus $\displaystyle f''(0)$ exists and equals zero. So,

$\displaystyle f''(x)=\begin{cases}e^{\frac{-1}{x^2}}\left(\frac{4}{x^6}-\frac{6}{x^4}\right) & \mbox{if}\quad x\ne0\\ 0 & \mbox{if}\quad x=0\end{cases}$

From here we can proceed by induction. Namely, assume that $\displaystyle f^{(n)}(x)$ exists for all $\displaystyle x$ and has the form

$\displaystyle f^{(n)}(x)=\begin{cases}p\left(\frac{1}{x}\right)e^{\frac{-1}{x^2}} & \mbox{if}\quad x\ne 0\\ 0 & \mbox{if}\quad x=0\end{cases}$

And thus showing that $\displaystyle f^{(n+1)}(0)$ exists (since clearly $\displaystyle f$ is $\displaystyle n+1$ times differentiable for $\displaystyle x\ne0$) reduces to showing that

$\displaystyle \displaystyle \lim_{x\to0}\frac{f^{(n)}(x)-f^{(n)}(0)}{x-0}=\lim_{x\to0}\frac{p\left(\frac{1}{x}\right)e^{\frac{-1}{x^2}}-0}{x}=\lim_{x\to0}\frac{1}{x}p\left(\frac{1}{x}\right)e^{\frac{-1}{x^2}}$

exists and, in fact equals zero. Thus it follows by induction that $\displaystyle f$ has derivatives of all orders.

Remark: This shows that being $\displaystyle C^{\infty}$ does not imply real analyticity, since this $\displaystyle f$ is infinitely differentiable at $\displaystyle 0$ but for any open interval $\displaystyle I$ around $\displaystyle 0$ we have that $\displaystyle \displaystyle \sum_{n=0}^{\infty}\frac{f^{(n)}(0)x^n}{n!}=\sum_{n=0}^{\infty}\frac{0\cdot x^n}{n!}=0$. Thus, the Taylor series at $\displaystyle 0$ of $\displaystyle f$ does not agree with $\displaystyle f$ on any open interval around $\displaystyle 0$.

I am very grateful for your full and very clear proof. I really appreciate it. Thank you.

6. Are you sure?
Quoted from wiki: "In calculus (a branch of mathematics), a differentiable function is a function whose derivative exists at each point in its domain."
See Differentiable function - Wikipedia, the free encyclopedia

7. Originally Posted by xxp9
Are you sure?
Quoted from wiki: "In calculus (a branch of mathematics), a differentiable function is a function whose derivative exists at each point in its domain."
See Differentiable function - Wikipedia, the free encyclopedia
Sure of what? The function is differentiable at every point of its domain.

8. I agree with you, but chisigma said it is not differentiable at 0. See chisigma's post above.

9. Originally Posted by xxp9
Are you sure?
Quoted from wiki: "In calculus (a branch of mathematics), a differentiable function is a function whose derivative exists at each point in its domain."
See Differentiable function - Wikipedia, the free encyclopedia
In my opinion the 'key' of the question is that the fact that the derivative of a function $\displaystyle f(x)$ exists at $\displaystyle x=x_{0}$ doesn't mean that $\displaystyle f(x)$ is differentiable at $\displaystyle x=x_{0}$. Wiki, as other people, may have a different opinion, but in that case they have to justify the fact that a function like...

$\displaystyle y(x)=\left\{\begin{array}{ll} e^{-\frac{1}{x^{2}}} ,\,\,x \ne 0\\{}\\0 ,\,\, x=0\end{array}\right.$ (1)

... has derivatives of all orders at $\displaystyle x=0$ and yet its Taylor expansion around $\displaystyle x=0$ doesn't exist. In fact such a Taylor expansion exists only if $\displaystyle f(x)$ is differentiable at $\displaystyle x=0$, and that is not the situation...

Kind regards

$\displaystyle \chi$ $\displaystyle \sigma$

10. A confirmation of what I posted is in...

http://www.math.jhu.edu/~fspinu/405/...20notation.pdf

2.4 The following statements are true…

a) f is differentiable at a
b) there exists a real number $\displaystyle k$ such that $\displaystyle f(x)= f(a) + k\ (x-a) + o(x-a)$

Kind regards

$\displaystyle \chi$ $\displaystyle \sigma$

11. A function is said to be analytic in a domain if its Taylor series converges to the function in that domain.
So you're mixing up two concepts: differentiable and analytic.

12. Originally Posted by xxp9
A function is said to be analytic in a domain if its Taylor series converges to the function in that domain.
So you're mixing up two concepts: differentiable and analytic.
How? I haven't the slightest idea what either of you are talking about.

To say that $\displaystyle f$ is differentiable on $\displaystyle \mathbb{R}$ is to say that $\displaystyle \displaystyle \lim_{x\to x_0}\frac{f(x)-f(x_0)}{x-x_0}$ exists for all $\displaystyle x_0\in\mathbb{R}$. Thus, one can show that this is true for this function. Moreover, it is infinitely differentiable if it has derivatives of all orders, i.e. if $\displaystyle \displaystyle \lim_{x\to x_0}\frac{f^{(n)}(x)-f^{(n)}(x_0)}{x-x_0}$ exists for all $\displaystyle n\in\mathbb{N}$ and all $\displaystyle x_0\in\mathbb{R}$. I showed that. Only after I proved it was differentiable did I make a REMARK about analyticity.

13. Originally Posted by chisigma
In my opinion the 'key' of the question is that the fact that the derivative of a function $\displaystyle f(x)$ exists at $\displaystyle x=x_{0}$ doesn't mean that $\displaystyle f(x)$ is differentiable at $\displaystyle x=x_{0}$. Wiki, as other people, may have a different opinion, but in that case they have to justify the fact that a function like...

$\displaystyle y(x)=\left\{\begin{array}{ll} e^{-\frac{1}{x^{2}}} ,\,\,x \ne 0\\{}\\0 ,\,\, x=0\end{array}\right.$ (1)

... has derivatives of all orders at $\displaystyle x=0$ and yet its Taylor expansion around $\displaystyle x=0$ doesn't exist. In fact such a Taylor expansion exists only if $\displaystyle f(x)$ is differentiable at $\displaystyle x=0$, and that is not the situation...

Kind regards

$\displaystyle \chi$ $\displaystyle \sigma$
chisigma, I totally fail to understand what you are getting at here. For a start, the statements "the derivative of the function $\displaystyle f(x)$ exists at $\displaystyle x=x_{0}$" and "$\displaystyle f(x)$ is differentiable at $\displaystyle x=x_{0}$" mean exactly the same thing. Secondly, the function f defined by

$\displaystyle f(x) = \begin{cases}e^{-1/x^2}&(x\ne0) \\ 0&(x=0)\end{cases}$

is infinitely differentiable (equivalently, it has derivatives of all orders) at every point $\displaystyle x_0$ including $\displaystyle x_0=0$. At $\displaystyle x=0$, every derivative is equal to 0. That does not mean that the Taylor expansion around $\displaystyle x=0$ doesn't exist. The Taylor expansion around $\displaystyle x=0$ does exist, but it is identically zero.

14. Originally Posted by Opalg
chisigma, I totally fail to understand what you are getting at here. For a start, the statements "the derivative of the function $\displaystyle f(x)$ exists at $\displaystyle x=x_{0}$" and "$\displaystyle f(x)$ is differentiable at $\displaystyle x=x_{0}$" mean exactly the same thing. Secondly, the function f defined by

$\displaystyle f(x) = \begin{cases}e^{-1/x^2}&(x\ne0) \\ 0&(x=0)\end{cases}$

is infinitely differentiable (equivalently, it has derivatives of all orders) at every point $\displaystyle x_0$ including $\displaystyle x_0=0$. At $\displaystyle x=0$, every derivative is equal to 0. That does not mean that the Taylor expansion around $\displaystyle x=0$ doesn't exist. The Taylor expansion around $\displaystyle x=0$ does exist, but it is identically zero.
Honestly, for me it is a little hard to understand how it is possible that this function...

$\displaystyle y(x)=\left\{\begin{array}{ll} e^{-\frac{1}{x^{2}}} ,\,\,x \ne 0\\{}\\0 ,\,\, x=0\end{array}\right.$ (1)

... which has all its derivatives equal to zero at $\displaystyle x=0$, can be computed 'somewhere around' $\displaystyle x=0$ with the Taylor expansion...

$\displaystyle \displaystyle y(x)= \sum_{n=0}^{\infty} \frac{y^{(n)} (0)}{n!}\ x^{n}$ (2)

If all the $\displaystyle y^{(n)} (0)$ are equal to 0, it is evident that (2) gives 0 no matter what $\displaystyle x$ is... but $\displaystyle y(x)$ is not the 'null function'... It seems to me that the situation is 'a little uncomfortable'...

Kind regards

$\displaystyle \chi$ $\displaystyle \sigma$

15. Originally Posted by chisigma
Honestly, for me it is a little hard to understand how it is possible that this function...

$\displaystyle y(x)=\left\{\begin{array}{ll} e^{-\frac{1}{x^{2}}} ,\,\,x \ne 0\\{}\\0 ,\,\, x=0\end{array}\right.$ (1)

... which has all its derivatives equal to zero at $\displaystyle x=0$, can be computed 'somewhere around' $\displaystyle x=0$ with the Taylor expansion...

$\displaystyle \displaystyle y(x)= \sum_{n=0}^{\infty} \frac{y^{(n)} (0)}{n!}\ x^{n}$ (2)

If all the $\displaystyle y^{(n)} (0)$ are equal to 0, it is evident that (2) gives 0 no matter what $\displaystyle x$ is... but $\displaystyle y(x)$ is not the 'null function'... It seems to me that the situation is 'a little uncomfortable'...

Kind regards

$\displaystyle \chi$ $\displaystyle \sigma$
But this is the WHOLE point! This is an example of an infinitely differentiable function $\displaystyle f$, and a point $\displaystyle x_0$ such that the Taylor series of $\displaystyle f$ at $\displaystyle x_0$ converges...but not to $\displaystyle f$!

This is why real analyticity is NOT just a fancy word for $\displaystyle C^{\infty}$.
