Can anyone help with the following question:
If lim f'(x) = 0 as x goes to infinity, then lim f(x)/x = 0 as x goes to infinity. You either have to prove this or give a counterexample.
My attempt so far:
I think it's true. I've said that for any e > 0 and large enough x, |f'(x)| < e, and taking e = 1 we have |f'(x)| < 1. I'm thinking that since the gradient of the function is now bounded, the function has to eventually lie below the line y = x (it may go above and cross it, but then eventually it drops below and stays below). Then f(x) < x, and since the gradient of f is shallower than the gradient of y = x, I think the ratio f(x)/x must tend to 0 as x goes to infinity.
I know this reasoning is really waffling and not a proof, so can anyone help me formalise it?
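Edit: trying to make the e-part of my argument precise, this is as far as I can get using the Mean Value Theorem (I'm not sure the last step is fully rigorous, so a check would be appreciated):

```latex
Fix $\varepsilon > 0$. Since $f'(x) \to 0$ as $x \to \infty$, there is an
$M$ such that $|f'(x)| < \varepsilon$ for all $x > M$. By the Mean Value
Theorem, for each $x > M$ there is a $c \in (M, x)$ with
\[
  f(x) - f(M) = f'(c)\,(x - M),
\]
so
\[
  \left|\frac{f(x)}{x}\right|
  \le \frac{|f(M)|}{x} + |f'(c)|\,\frac{x - M}{x}
  \le \frac{|f(M)|}{x} + \varepsilon .
\]
Letting $x \to \infty$, the first term vanishes, which gives
$\limsup_{x \to \infty} |f(x)/x| \le \varepsilon$. Since $\varepsilon > 0$
was arbitrary, the limit is $0$.
\]
```

Does taking limsup and then letting $\varepsilon \to 0$ like this actually finish the proof, or is there a gap?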