Originally Posted by **free_to_fly**

Here's a problem that I came across the other day. I can't seem to find the answer, though at first glance the question looks relatively simple.

There is a 2-mile stretch of road. I drive the first mile at an average speed of 30 mph. Find the speed I must travel over the remaining mile for my overall average speed to be 60 mph.

At first I thought, since time = distance/speed, that I could write 2/(30 + x) = 2/60, but that yields the incorrect answer (I suspect because averaging speeds isn't the same as adding them and dividing). Does it have something to do with the harmonic mean? Can someone help?
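Average speed is total distance divided by total time, so the right first step is to budget the *time*, not to combine the speeds directly. A minimal numeric sketch of that time budget (assuming the figures as stated: a 2-mile road, first mile at 30 mph, target average of 60 mph) hints at why the equation above misleads:

```python
# Time budget for the trip, using time = distance / speed throughout.
total_distance = 2.0      # miles
target_average = 60.0     # mph, desired overall average
first_leg_speed = 30.0    # mph, actual speed over the first mile

# Total time the whole trip may take to hit the target average:
time_allowed = total_distance / target_average   # hours

# Time already spent driving the first mile:
time_used = 1.0 / first_leg_speed                # hours

print(time_allowed)              # hours allowed for both miles
print(time_used)                 # hours consumed by mile one
print(time_allowed - time_used)  # hours remaining for mile two
```

Comparing `time_allowed` with `time_used` shows how much (if any) time is left for the second mile, which is the quantity the unknown speed x must satisfy.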