Here is the problem I'm working on:
A ladder 20ft long leans against a house. If the foot of the ladder is moving away from the house at the rate of 2ft/s, find how fast the slope of the ladder is decreasing when the foot of the ladder is 12ft from the house.
Here is what I have so far:
- when x = 12
- By the theorem of Pythagoras:
Any tips on where I went wrong?
- The book, however, came up with
Thanks in advance!