If the average speed of an aeroplane had been 120 kmph less, it would have taken half an hour longer to fly 1000 km. Find the speed of the plane.
Let V be the velocity and T the time.
Since V= Distance/Time, Time= Distance/Velocity.
The time it would take to fly 1000 km is T= 1000/V. If the speed had been 120 kmph less, the time would have been 1/2 hour more: T+ 1/2= 1000/(V- 120). That is the same as T= -1/2+ 1000/(V- 120), and since T= 1000/V, we get 1000/V= -1/2+ 1000/(V- 120). Multiply through by V(V- 120) and simplify, and you will have a quadratic equation, V^2- 120V- 240000= 0, to solve for V.
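If it helps, here is a quick numerical check of that last step: a short Python sketch that applies the quadratic formula to V^2 - 120V - 240000 = 0 (the coefficients come from multiplying the equation above by V(V - 120)) and keeps the positive root, since a speed must be positive and greater than 120.

```python
import math

# Solving 1000/V = -1/2 + 1000/(V - 120):
# multiplying through by V*(V - 120) and simplifying gives
#     V**2 - 120*V - 240000 = 0
a, b, c = 1.0, -120.0, -240000.0

disc = b * b - 4 * a * c
v = (-b + math.sqrt(disc)) / (2 * a)  # positive root only

print(f"speed of the plane: about {v:.2f} kmph")

# sanity check: the slower trip should take exactly half an hour longer
print(1000 / (v - 120) - 1000 / v)  # should be 0.5
```

The discriminant here is 974400, which is not a perfect square, so the answer comes out as a decimal rather than a whole number.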