# Proof using Mean Value Theorem

• September 18th 2008, 07:01 AM
Hweengee
Proof using Mean Value Theorem
Hi guys,

I'm a little bit confused over how to apply the Mean Value Theorem to this particular problem.

Suppose f is a function such that f'(x)=1/x for all x>0. Prove that if f(1)=0, then f(xy)=f(x)+f(y) for all x,y >0.

• September 18th 2008, 08:17 AM
ThePerfectHacker
Quote:

Originally Posted by Hweengee
Suppose f is a function such that f'(x)=1/x for all x>0. Prove that if f(1)=0, then f(xy)=f(x)+f(y) for all x,y >0.

For $y>0$ define $F_y(x) = f(xy)-f(x)-f(y)$ for $x>0$.
Now $F_y$ is differentiable on $(0,\infty)$ and $F'_y(x) = \frac{y}{xy} - \frac{1}{x} = 0$.
Thus, $F_y$ is constant on $(0,\infty)$ (by the Mean Value Theorem, a function whose derivative vanishes on an interval is constant there), so $F_y(x) = k$ for some $k\in \mathbb{R}$.
But $F_y(1) = f(y) - f(1) - f(y) = -f(1) = 0$, and so $F_y(x) = 0$ for all $x > 0$.
Thus, $f(xy) = f(x)+f(y)$.
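As a quick sanity check on the argument: the only function with $f'(x)=1/x$ on $(0,\infty)$ and $f(1)=0$ is the natural logarithm, so the identity can be spot-checked numerically. The sketch below (my own illustration, not part of the proof) takes `f = math.log` and verifies $f(xy)=f(x)+f(y)$ at a few sample points.

```python
import math

# The unique antiderivative of 1/x on (0, inf) with f(1) = 0 is ln(x).
def f(x):
    return math.log(x)

# Spot-check f(xy) = f(x) + f(y) at a few positive points.
for x, y in [(2.0, 3.0), (0.5, 7.0), (1.0, 10.0)]:
    assert abs(f(x * y) - (f(x) + f(y))) < 1e-9

# And f(1) = 0, as the hypothesis requires.
assert abs(f(1.0)) < 1e-12
```

Of course the numeric check only illustrates the identity; the proof above establishes it for all $x, y > 0$.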
• September 18th 2008, 09:00 AM
Hweengee
Do you mean F'_y(x) = 0 --> F_y(x) = k for some k in R? Sorry but I'm still a little confused.
• September 18th 2008, 09:10 AM
ThePerfectHacker
Quote:

Originally Posted by Hweengee
Do you mean F'_y(x) = 0 --> F_y(x) = k for some k in R? Sorry but I'm still a little confused.

Sorry. I fixed it now. Does it make better sense?
• September 19th 2008, 04:16 AM
Hweengee
Yes, I get it now. Thank you very much.