Hi guys, I'm a little confused about how to apply the Mean Value Theorem to this particular problem. Suppose f is a function such that f'(x) = 1/x for all x > 0. Prove that if f(1) = 0, then f(xy) = f(x) + f(y) for all x, y > 0. Thanks in advance.
Originally Posted by Hweengee Suppose f is a function such that f'(x)=1/x for all x>0. Prove that if f(1)=0, then f(xy)=f(x)+f(y) for all x,y >0. For y > 0 define F_y(x) = f(xy) - f(x) - f(y) for x > 0. Now F_y is differentiable on this interval and F_y'(x) = y f'(xy) - f'(x) = y·(1/(xy)) - 1/x = 0. Thus, F_y is constant on (0, ∞), so F_y(x) = k for some k in R. But F_y(1) = f(y) - f(1) - f(y) = -f(1) = 0, and so k = 0. Thus, f(xy) = f(x) + f(y).
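(Not part of the proof, just a sanity check: the conditions f'(x) = 1/x and f(1) = 0 force f = ln, so the identity should hold numerically for ln. A quick Python check on a few sample points:)

```python
import math

# With f'(x) = 1/x for x > 0 and f(1) = 0, the function is f = ln.
# Check that ln(xy) = ln(x) + ln(y) on a few sample points x, y > 0.
for x, y in [(2.0, 3.0), (0.5, 7.0), (1.0, 10.0), (4.5, 0.25)]:
    lhs = math.log(x * y)
    rhs = math.log(x) + math.log(y)
    assert math.isclose(lhs, rhs), (x, y, lhs, rhs)

print("ln(xy) = ln(x) + ln(y) holds on all sample points")
```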
Last edited by ThePerfectHacker; September 18th 2008 at 10:10 AM.
"F_y is constant on (0, ∞), so F_y(x) = k for some k." Do you mean F'_y(x) = 0 --> F_y(x) = k for some k in R? Sorry, but I'm still a little confused.
Originally Posted by Hweengee Do you mean F'_y(x) = 0 --> F_y(x) = k for some k in R? Sorry but I'm still a little confused. Sorry. I fixed it now. Does it make better sense?
Yes, I get it now. Thank you very much.