Here is the context of the problem:

Calculate the size of the iron meteor that would have its initial velocity reduced by only 10% by drag in Earth's atmosphere.

I was able to prove the first part: "Write down a differential equation for the deceleration of the meteor, du/dt."

du/dt = (3 C_d)/(4 R_m ρ_m) · ρ_0 e^(−z/h) · u²
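For anyone checking that first part: this form follows from the quadratic drag law, assuming the convention F_drag = C_d ρ_air A u² (with the 1/2 convention the prefactor would be 3/8 instead of 3/4), a spherical meteor, and an exponential atmosphere ρ_air(z) = ρ_0 e^(−z/h):

```latex
\frac{du}{dt} = \frac{F_{\text{drag}}}{m}
= \frac{C_d\,\rho_0 e^{-z/h}\,\pi R_m^2\,u^2}{\tfrac{4}{3}\pi R_m^3\,\rho_m}
= \frac{3 C_d}{4 R_m \rho_m}\,\rho_0\, e^{-z/h}\, u^2 .
```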

However, I have no idea about the next parts:

You should find an equation of the form du/dt = ..., where the stuff on the RHS depends only on the density of air (which depends on height in the atmosphere z), the internal density of the meteor, the radius of the meteor, the meteor's velocity, and dimensionless constants. You can assume (check, if you like) that the extra acceleration provided by gravity over this distance is negligible, since the meteor's velocity is already terrifyingly large.

Convert your equation to a linear, first-order differential equation. That is, you should have an equation which reads something like dy/dz = f(z)y, where f(z) is some function of z. (Hint: begin by noting that dt = dz/v, and substitute y = v².)
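A sketch of the substitution the hint suggests (taking z as height, so dz/dt = −u for a falling meteor; the signs depend on your convention):

```latex
\frac{du}{dt} = \frac{du}{dz}\frac{dz}{dt} = -u\,\frac{du}{dz},
\qquad
y = u^2 \;\Rightarrow\; \frac{dy}{dz} = 2u\,\frac{du}{dz},
```

so the drag equation du/dt = −(3C_d/(4R_m ρ_m)) ρ_0 e^(−z/h) u² becomes

```latex
\frac{dy}{dz} = \frac{3 C_d \rho_0}{2 R_m \rho_m}\, e^{-z/h}\, y ,
```

which is linear, first-order, and of the promised form dy/dz = f(z)y.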

Solve this equation, imposing boundary conditions as appropriate. (You will probably end up using an integrating factor.)
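One route, sketched here: with f(z) = (3C_d ρ_0 / (2R_m ρ_m)) e^(−z/h), the equation dy/dz = f(z)y is also separable, and the integrating-factor route gives the same answer. Imposing y → u_∞² as z → ∞ (the meteor enters the top of the atmosphere at its initial velocity u_∞):

```latex
\int \frac{dy}{y} = \int f(z)\,dz
\;\Rightarrow\;
y(z) = u_\infty^2 \exp\!\left(-\frac{3 C_d \rho_0 h}{2 R_m \rho_m}\, e^{-z/h}\right),
```

which correctly tends to u_∞² at high altitude and decays as the meteor descends.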

Hence, calculate the critical radius and mass of the meteorite such that the velocity at z = 0 is 0.9 times the initial velocity.
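A numerical sanity check, under parameter values that are my assumptions rather than part of the problem (C_d = 1 in the F = C_d ρ_air A u² convention, ρ_0 = 1.225 kg/m³, scale height h = 8 km, ρ_m = 7900 kg/m³ for iron). Solving dy/dz = (3C_d ρ_0 / (2R_m ρ_m)) e^(−z/h) y with y = u² and requiring u(0) = 0.9 u_∞ gives R_m = 3C_d ρ_0 h / (4 ρ_m ln(10/9)); the sketch below evaluates that and then checks it against a direct RK4 integration of the ODE:

```python
# Sketch: critical radius of an iron meteor that loses only 10% of its
# speed to atmospheric drag. All parameter values are assumptions.
import math

C_d = 1.0        # drag coefficient (assumed; F = C_d * rho_air * A * u^2 convention)
rho_0 = 1.225    # kg/m^3, sea-level air density (assumed)
h = 8000.0       # m, atmospheric scale height (assumed)
rho_m = 7900.0   # kg/m^3, density of iron (assumed)

# Analytic result: y(0)/y_inf = exp(-2Kh) = 0.9^2,
# with K = 3*C_d*rho_0 / (4*R*rho_m), so 2Kh = 2*ln(10/9).
R = 3.0 * C_d * rho_0 * h / (4.0 * rho_m * math.log(10.0 / 9.0))
mass = (4.0 / 3.0) * math.pi * R**3 * rho_m
print(f"critical radius R ~ {R:.2f} m, mass ~ {mass:.3e} kg")

# Numerical check: integrate dy/dz = 2K e^(-z/h) y downward from high
# altitude with RK4 and confirm u(0)/u_inf is close to 0.9.
K = 3.0 * C_d * rho_0 / (4.0 * R * rho_m)
f = lambda z, y: 2.0 * K * math.exp(-z / h) * y
z, y, dz = 200e3, 1.0, -10.0   # start at 200 km with y = u_inf^2 = 1
while z > 0:
    k1 = f(z, y)
    k2 = f(z + dz / 2, y + dz * k1 / 2)
    k3 = f(z + dz / 2, y + dz * k2 / 2)
    k4 = f(z + dz, y + dz * k3)
    y += dz * (k1 + 2 * k2 + 2 * k3 + k4) / 6
    z += dz
print(f"u(0)/u_inf ~ {math.sqrt(y):.4f}")
```

With these assumed numbers the critical radius comes out on the order of meters (tens of thousands of tonnes of iron), which is why small meteors burn up or decelerate completely while large ones barely notice the atmosphere.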

Please help!