Two problems: you changed -1(x^2 + 1)^-2 to (-x^2 - 1)^-2, which is not correct (the -1 is a factor multiplying the whole expression, so you can't move it inside the squared base), and when you plugged in x = 1, you multiplied by -1 first and then squared, which again is not correct.
The correct answer is:

dy/dx = -2x / (x^2 + 1)^2, so at x = 1, dy/dx = -2/4 = -1/2.
- Hollywood
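If it helps, here is a quick numeric sanity check (plain Python, not from the thread): the closed-form derivative -2x/(x^2 + 1)^2 from the chain rule agrees with a finite-difference estimate, and at x = 1 it evaluates to -1/2, with no division by zero.

```python
# Sanity check for the derivative of y = 1/(x^2 + 1)

def y(x):
    return 1 / (x**2 + 1)

def dydx(x):
    # Closed form from the chain rule: -1*(x^2+1)^-2 * 2x = -2x/(x^2+1)^2
    return -2 * x / (x**2 + 1)**2

h = 1e-6
x0 = 1.0
# Central-difference approximation of dy/dx at x0
numeric = (y(x0 + h) - y(x0 - h)) / (2 * h)

print(dydx(x0))                          # -0.5
print(abs(numeric - dydx(x0)) < 1e-8)    # True
```

The point the check makes: the -1 stays outside as a multiplying factor, so nothing in the denominator ever becomes zero.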
I seem to have a problem with basic differentiation:
Question: y = 1/(x^2 + 1)
At point x = 1, y = 1/2
Work:
(x^2 + 1)^-1
-1(x^2 + 1)^-2 (2x)
(-x^2 - 1)^-2 (2x)
(-(1)^2 - 1)^-2 (2(1))
However, 1 - 1 = 0, and raising that to the -2 power means I'm dividing 2 by zero? Where did I go wrong? Thanks.