Hi guys,
I'm a little bit confused over how to apply the Mean Value Theorem to this particular problem.
Suppose f is a function such that f'(x)=1/x for all x>0. Prove that if f(1)=0, then f(xy)=f(x)+f(y) for all x,y >0.
Thanks in advance.
For $\displaystyle y>0$ define $\displaystyle F_y(x) = f(xy)-f(x)-f(y)$ for $\displaystyle x>0$.
Now $\displaystyle F_y$ is differentiable on $\displaystyle (0,\infty)$, and by the chain rule $\displaystyle F'_y(x) = \frac{y}{xy} - \frac{1}{x} = 0$ for all $\displaystyle x>0$.
Thus, by the corollary of the Mean Value Theorem (a function with zero derivative on an interval is constant there), $\displaystyle F_y$ is constant on $\displaystyle (0,\infty)$, so $\displaystyle F_y(x) = k$ for some $\displaystyle k\in \mathbb{R}$.
But $\displaystyle F_y(1) = 0$ and so $\displaystyle F_y(x) = 0$.
Thus, $\displaystyle f(xy) = f(x)+f(y)$.
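As a quick sanity check: the natural logarithm satisfies the hypotheses ($\displaystyle \ln'(x)=1/x$ and $\displaystyle \ln 1 = 0$), so the identity should hold for it numerically. A minimal sketch:

```python
import math

# f = ln satisfies f'(x) = 1/x for x > 0 and f(1) = 0,
# so the proved identity f(xy) = f(x) + f(y) should hold for it.
for x, y in [(2.0, 3.0), (0.5, 8.0), (1.0, 7.3)]:
    lhs = math.log(x * y)
    rhs = math.log(x) + math.log(y)
    assert math.isclose(lhs, rhs), (x, y, lhs, rhs)

print("identity verified for sample points")
```

Of course this only spot-checks sample points; the argument above is what proves it for all $\displaystyle x, y > 0$.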