Hi guys,
I'm a little confused about how to apply the Mean Value Theorem to this particular problem.
Suppose f is a function such that f'(x) = 1/x for all x > 0. Prove that if f(1) = 0, then f(xy) = f(x) + f(y) for all x, y > 0.
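Here is the shape I think the argument is supposed to take, assuming the standard corollary of the MVT (a function with zero derivative on an interval is constant there) — this is my own sketch, so please check it:

```latex
% Fix y > 0 and define g(x) = f(xy) - f(x) - f(y) for x > 0. By the chain rule,
g'(x) = y\,f'(xy) - f'(x) = \frac{y}{xy} - \frac{1}{x} = \frac{1}{x} - \frac{1}{x} = 0.
% Since g'(x) = 0 for all x > 0, the MVT forces g to be constant on (0,\infty):
% for any a, b > 0 there is c between them with g(b) - g(a) = g'(c)(b - a) = 0.
% Evaluating the constant at x = 1 and using f(1) = 0:
g(x) = g(1) = f(y) - f(1) - f(y) = -f(1) = 0,
% hence f(xy) = f(x) + f(y) for all x, y > 0.
```

The MVT enters only through that corollary: if g' vanishes everywhere on the interval, then g(b) = g(a) for every pair of points, so g is constant.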
Thanks in advance.