I solved it after staring at the question for 10 minutes (after I posted it).
Two straight roads intersect at right angles. Two men, A and B, are 100 km from the intersection, one on each road. They drive towards the intersection at 30 km/h and 40 km/h respectively. Find the distance of each driver from the intersection as a function of t, the time in hours for which they are driving. Hence, find their distance apart, d(t), at any time. For what value of t is their distance apart least?
How exactly do I do this question? I don't seem to get it.
I presume it suddenly occurred to you to use the Pythagorean theorem? If t is the time, in hours, after the initial point, then the distances of A and B from the intersection are 100 - 30t and 100 - 40t. The straight-line distance between them is d(t) = sqrt((100 - 30t)^2 + (100 - 40t)^2).
That will be minimized when d(t)^2 = (100 - 30t)^2 + (100 - 40t)^2 is minimized, so you don't have to worry about differentiating the square root.
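As a quick sanity check of the calculus above, here's a short Python sketch (the function name d_squared is just my own label): setting the derivative of d(t)^2 to zero, -60(100 - 30t) - 80(100 - 40t) = 0, gives 5000t - 14000 = 0, so t = 2.8 hours.

```python
def d_squared(t):
    # Squared distance apart: (100 - 30t)^2 + (100 - 40t)^2
    return (100 - 30 * t) ** 2 + (100 - 40 * t) ** 2

# Critical point from d/dt [d(t)^2] = -60(100 - 30t) - 80(100 - 40t) = 0
t_min = 14000 / 5000  # = 2.8 hours

print(t_min)                    # 2.8
print(d_squared(t_min) ** 0.5)  # 20.0 km apart at t = 2.8
```

At t = 2.8, A is 100 - 84 = 16 km from the intersection and B is 100 - 112 = -12 km (i.e., 12 km past it), so the minimum separation is sqrt(16^2 + 12^2) = 20 km.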