It is given that f(x) is monotonically increasing and that f(g(x)) is monotonically increasing.

Does it follow that g(x) is monotonically increasing?
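A quick numeric spot-check of the contrapositive, a sketch with hypothetical sample functions (assuming "increasing" means strict): if f is strictly increasing and g decreases somewhere, then f(g(x)) also decreases there, so f(g(x)) could not be monotonically increasing.

```python
# Hypothetical examples: f strictly increasing, g decreasing
f = lambda x: x**3   # strictly increasing on all reals
g = lambda x: -x     # decreasing, as in the attempted proof below

for a, b in [(-2.0, 0.5), (0.0, 1.0), (1.0, 3.0)]:
    assert a < b and g(a) > g(b)   # g reverses the order of a, b
    assert f(g(a)) > f(g(b))       # f preserves order, so f(g(x)) reverses it too
```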

I tried to solve it this way:

I think that it does, so I want to disprove the claim that g(x) is not monotonically increasing:

Suppose that g(x) is not monotonically increasing. (Careful: the negation of "increasing" is not "decreasing"; it only means there exist some a < b with g(a) >= g(b).)

So pick a < b with g(a) >= g(b).

Since f(x) is monotonically increasing: x < y implies f(x) < f(y).

Since f(g(x)) is monotonically increasing and a < b: f(g(a)) < f(g(b)).

now what??
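One way the contradiction can be finished, a sketch assuming "monotonically increasing" means strict ($x < y \Rightarrow f(x) < f(y)$):

```latex
\begin{align*}
&\text{Pick } a < b \text{ with } g(b) \le g(a). \\
&g(b) \le g(a) \;\Rightarrow\; f(g(b)) \le f(g(a)) \quad (f \text{ increasing; equality iff } g(a) = g(b)). \\
&\text{But } a < b \;\Rightarrow\; f(g(a)) < f(g(b)) \quad (f \circ g \text{ increasing}). \\
&\text{So } f(g(a)) < f(g(b)) \le f(g(a)), \text{ a contradiction. Hence } g \text{ is increasing.}
\end{align*}
```

Note: if "increasing" is taken in the weak sense ($x \le y \Rightarrow f(x) \le f(y)$), the conclusion can fail: a constant f is weakly increasing and f(g(x)) is weakly increasing for any g whatsoever.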