A truck gets 10 mpg (miles per gallon) traveling along an interstate highway at 50 mph, and this mileage is reduced by 0.15 mpg for each mile per hour increase above 50 mph.
If the truck driver is paid $30/hour and diesel fuel costs P = $3/gal, which speed v between 50 and 70 mph will minimize the cost of a trip along the highway?
Given that x > 50, the gas mileage is 10 - 0.15(x - 50) = 10 - 0.15x + 7.5 = 17.5 - 0.15x mpg. With fuel at $3/gal, that costs 3/(17.5 - 0.15x) dollars per mile, so the fuel cost for a trip of, say, 70 miles will be 210/(17.5 - 0.15x) dollars. (The optimal speed does not depend on the trip length, since both cost terms scale with it.) It will take 70/x hours to drive 70 miles at x miles per hour, so the truck driver will have to be paid 30(70/x) = 2100/x dollars. The problem, then, is to minimize 210/(17.5 - 0.15x) + 2100/x. You can factor out the 210 and just ignore it. What value of x will minimize 1/(17.5 - 0.15x) + 10/x?
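A quick numerical check of this cost function (an editorial sketch, not from the original thread; it assumes the corrected mileage 17.5 - 0.15x mpg, a 70-mile trip, $3/gal fuel, and $30/hr wages):

```python
def trip_cost(x, miles=70.0, fuel_price=3.0, wage=30.0):
    """Total cost in dollars of driving `miles` at x mph (50 <= x <= 70)."""
    mpg = 17.5 - 0.15 * x           # fuel economy at speed x, per the problem
    fuel_cost = fuel_price * miles / mpg
    driver_cost = wage * miles / x  # miles/x hours at $wage per hour
    return fuel_cost + driver_cost

# Scan the allowed interval on a fine grid to locate the minimum.
best = min((x / 1000.0 for x in range(50000, 70001)), key=trip_cost)
print(round(best, 1))  # optimal speed, roughly 64.2 mph
```

The minimum lands strictly inside the interval, so the calculus approach below (set the derivative to zero) will find it rather than an endpoint.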
Originally Posted by jazz20
Take the derivative with respect to x and set it equal to 0.
I don't understand how you are getting the 70/x, but other than that I understand.
Originally Posted by HallsofIvy
speed= distance/time (miles per hour) so time = distance/speed.
You are right! Time = distance/speed, so it really is 70/x: at x mph, the 70-mile trip takes 70/x hours, not x/70. (Crying)
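To close the loop, the derivative step described above can be carried out numerically. A sketch, assuming the corrected cost C(x) = 210/(17.5 - 0.15x) + 2100/x dollars (70-mile trip, $3/gal fuel, $30/hr wages):

```python
# C(x) = 210/(17.5 - 0.15x) + 2100/x, so
# C'(x) = 210 * 0.15/(17.5 - 0.15x)**2 - 2100/x**2.

def dC(x):
    """Derivative of the trip cost with respect to speed x."""
    return 31.5 / (17.5 - 0.15 * x) ** 2 - 2100 / x ** 2

# C' is negative at 50 and positive at 70, so bisect for the root.
lo, hi = 50.0, 70.0
for _ in range(60):
    mid = (lo + hi) / 2
    if dC(mid) < 0:
        lo = mid
    else:
        hi = mid
print(round(lo, 2))  # critical speed, about 64.23 mph
```

Since C' changes sign from negative to positive there, this critical point is the minimizer, so the cheapest speed is about 64.2 mph.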