Law of Cosines
Navigation. Airport B is 300 mi from airport A at a bearing of N 50° E. A pilot wishing to fly from A to B mistakenly flies due east at 200 mi/h for 30 minutes before noticing his error.
a. How far is the pilot from his destination at the time he notices the error?
b. What bearing should he head his plane in order to arrive at airport B?
In 30 minutes at 200 mi/h the pilot covers 100 mi due east. The angle at A between his easterly track and the true bearing to B is 90° - 50° = 40°, so by the Law of Cosines:

a^2 = b^2 + c^2 - 2bc(cos A)
    = 100^2 + 300^2 - 2(100)(300)(cos 40°)
    = 10000 + 90000 - 60000(0.76604)
    = 54037.3
a = sqrt(54037.3)
  ≈ 232.46 miles
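As a quick sanity check, here is a short Python sketch of the same Law of Cosines calculation (variable names are my own, not from the problem):

```python
import math

# Part a: pilot flew 100 mi due east; the angle at A between his track
# and the true bearing to B is 90° - 50° = 40°.
b, c = 100, 300            # miles: leg flown, and distance A to B
A = math.radians(40)       # angle at airport A
a = math.sqrt(b**2 + c**2 - 2*b*c*math.cos(A))
print(round(a, 2))         # ≈ 232.46 miles
```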
From there you can use the Law of Sines for part b: sin C / 300 = sin 40° / 232.46 gives sin C ≈ 0.8295. Since AB = 300 is the longest side, C is the obtuse solution, C ≈ 124°. Measured from due west (the direction back to A) at the pilot's position, that puts B at a bearing of about N 34° E.
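The sine-rule step for part b can be sketched the same way (a sketch using the part-a figures; the bearing conversion assumes airport A lies due west of the pilot, which follows from his easterly track):

```python
import math

# Part b: Law of Sines at the pilot's position C, then convert the
# interior angle to a compass bearing toward B.
a = 232.46             # miles from C to B (from part a)
A = math.radians(40)   # angle at airport A
sinC = 300 * math.sin(A) / a          # sin C / c = sin A / a, c = AB = 300
C = 180 - math.degrees(math.asin(sinC))  # obtuse: AB is the longest side
# From C, airport A lies due west (bearing 270°); rotate clockwise by C.
bearing = (270 + C) % 360
print(round(bearing))  # ≈ 34, i.e. about N 34° E
```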
Hope it helps!