Math Help - Proof using Mean Value Theorem

1. Proof using Mean Value Theorem

Hi guys,

I'm a little bit confused over how to apply the Mean Value Theorem to this particular problem.

Suppose f is a function such that f'(x)=1/x for all x>0. Prove that if f(1)=0, then f(xy)=f(x)+f(y) for all x,y >0.

2. Originally Posted by Hweengee
Suppose f is a function such that f'(x)=1/x for all x>0. Prove that if f(1)=0, then f(xy)=f(x)+f(y) for all x,y >0.
For $y>0$ define $F_y(x) = f(xy)-f(x)-f(y)$ for $x>0$.
Now $F_y$ is differentiable on $(0,\infty)$ and $F'_y(x) = \frac{y}{yx} - \frac{1}{x} = 0$.
Thus, $F_y$ is constant on $(0,\infty)$, so $F_y(x) = k$ for some $k\in \mathbb{R}$. (This is where the Mean Value Theorem comes in: a function whose derivative vanishes on an interval is constant there.)
But $F_y(1) = 0$ and so $F_y(x) = 0$.
Thus, $f(xy) = f(x)+f(y)$.
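For intuition only (not part of the proof): the hypotheses $f'(x)=1/x$, $f(1)=0$ are satisfied by $f(x)=\ln x$, so the identity can be sanity-checked numerically. A minimal sketch:

```python
import math

# f(x) = ln x satisfies f'(x) = 1/x and f(1) = 0,
# so it should satisfy f(xy) = f(x) + f(y) for all x, y > 0.
for x, y in [(2.0, 3.0), (0.5, 7.0), (1.0, 10.0)]:
    lhs = math.log(x * y)
    rhs = math.log(x) + math.log(y)
    # Allow a tiny tolerance for floating-point rounding.
    assert abs(lhs - rhs) < 1e-12, (x, y, lhs, rhs)
print("f(xy) = f(x) + f(y) holds for all tested pairs")
```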

3. " is konstant on so for some . "

Do you mean F'_y(x) = 0 --> F_y(x) = k for some k in R? Sorry but I'm still a little confused.

4. Originally Posted by Hweengee
Do you mean F'_y(x) = 0 --> F_y(x) = k for some k in R? Sorry but I'm still a little confused.
Sorry, that was a typo: it should read $F_y$, not $F'_y$. I have fixed it now. Does it make sense?

5. yes i get it now. thank you very much.