An airplane flying at an altitude of 6 miles passes directly over a radar antenna. When the airplane is 10 miles away (s = 10), the radar detects that the distance s is changing at a rate of 270 miles per hour. What is the speed of the airplane?

I've tried setting up an equation using A = pi r^2, the Pythagorean theorem, and other formulas.
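For reference, here's roughly what my Pythagorean attempt looked like (I'm calling the horizontal distance x, which isn't named in the problem):

```latex
% Altitude is a constant 6 mi; s is the radar-to-plane distance.
% My attempted setup, relating the horizontal distance x to s:
s^2 = x^2 + 6^2
% At the given instant, s = 10, so x^2 = 100 - 36 = 64 and x = 8.
```

I'm not sure if this is even the right relationship to start from, or what to do with the 270 mph once I have it.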

I don't understand how to set it up or how to solve it. I'm completely stumped. Please help!