Originally Posted by **theowne**:

> Given a function h = g(f(x)), where f(x) = (x+1)^2 and g(x) = 1/x, determine the intervals on which the function increases.

I found the derivative and got -2(x+1)^-3, which seems right. What I'm not sure about is how to determine the intervals. My initial thought is to find where the derivative equals zero, then substitute values on either side to check whether the derivative, and therefore the slope of the original function, is positive or negative. But that would mean solving:

0 = -2(x+1)^-3

And I'm not sure how to solve that for x in this form. Can someone give me a nudge in the right direction?
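The sign-test idea described above can still be applied here, with one adjustment (an assumption worth checking, not part of the original post): -2(x+1)^-3 is never zero, but it is *undefined* at x = -1, and a point where the derivative is undefined splits the domain the same way a zero would. A minimal numerical sketch of the sign test:

```python
# Sketch of the sign-test approach (assumes the split point is
# x = -1, where h'(x) = -2*(x+1)**(-3) is undefined rather than zero).

def h_prime(x):
    """Derivative of h(x) = 1/(x+1)^2."""
    return -2 * (x + 1) ** -3

# Sample one test point on each side of x = -1.
left = h_prime(-2)   # x < -1
right = h_prime(0)   # x > -1

print(left > 0, right > 0)  # → True False
```

Since the derivative is positive to the left of x = -1 and negative to the right, h increases on (-inf, -1) and decreases on (-1, inf).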