Hi
I have a boundary value problem to solve.
It is a third-order ODE that must first be reduced to a system of first-order ODEs.

2\frac{d^3f}{dn^3} + f\frac{d^2f}{dn^2} = 0

Where:
f(0) = 0
\frac{df}{dn}(0) = 0
\frac{df}{dn}(\infty) = 1
I defined the variables as follows:

x1 = f
x2 = \dot{f}
x3 = \ddot{f}

This gives me the first-order equations:

\dot{x1} = x2
\dot{x2} = x3
\dot{x3} = -\frac{x1x3}{2}
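
For reference, written with the state vector x = [x1; x2; x3], the right-hand side of this system could be coded as something like the following (the name odefun and the anonymous-function form are just my own choice):

Code:
odefun = @(n, x) [x(2); x(3); -x(1)*x(3)/2];   % returns [x1'; x2'; x3']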

I have been told that \infty can be approximated by n = 10.
I also have to find the value of \frac{d^2f}{dn^2}(0), which should lie in the range [0.1, 0.4], such that \frac{df}{dn}(10) = 1.

So the idea is to guess a value for x3(0), since this is \frac{d^2f}{dn^2}(0), integrate the system out to n = 10, and then check whether x2(10) = \frac{df}{dn}(10) equals 1 or not.

So...
I used Euler's method in MATLAB to iterate out to n = 10; for example, the update for x3 is

Code:
x3(i+1) = x3(i) - (x1(i)*x3(i)/2)*h;   % Euler step using x3' = -x1*x3/2
which gives me the values of x3, i.e. \dot{x2}.
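
To spell out what I mean, the whole Euler loop would look something like this (the step size h, the number of steps N, and the starting guess for \ddot{f}(0) are values I have just picked):

Code:
h = 0.01;                      % step size
N = round(10/h);               % number of steps out to n = 10
x1 = zeros(1,N+1);  x2 = zeros(1,N+1);  x3 = zeros(1,N+1);
x1(1) = 0;                     % f(0) = 0
x2(1) = 0;                     % f'(0) = 0
x3(1) = 0.3;                   % guessed f''(0), somewhere in [0.1, 0.4]
for i = 1:N
    x1(i+1) = x1(i) + x2(i)*h;
    x2(i+1) = x2(i) + x3(i)*h;
    x3(i+1) = x3(i) - (x1(i)*x3(i)/2)*h;
end
x2(end)                        % should come out as 1 if the guess was right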

I have no idea how I'm supposed to continue past this point. I get the idea of using the bisection method to home in on the correct initial condition so that \frac{df}{dn}(10) = 1, since the first guess will of course be wrong, but I'm stumped on how to actually set that up (my attempt at a sketch is below).
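
What I'm imagining is something like the following sketch, where shoot is a little helper I've made up that runs the Euler loop above for a given guess of \frac{d^2f}{dn^2}(0) and returns \frac{df}{dn}(10); the bracket [0.1, 0.4] is from the problem and the tolerance is just a guess:

Code:
a = 0.1;  b = 0.4;             % bracket for f''(0) given in the problem
tol = 1e-6;
while (b - a) > tol
    c = (a + b)/2;
    if (shoot(c) - 1)*(shoot(a) - 1) < 0
        b = c;                 % the correct f''(0) lies in [a, c]
    else
        a = c;                 % the correct f''(0) lies in [c, b]
    end
end
fpp0 = (a + b)/2               % converged estimate of f''(0)

function fp10 = shoot(fpp0)
    % made-up helper: Euler-integrate the system out to n = 10,
    % starting from f(0) = 0, f'(0) = 0, f''(0) = fpp0, and return f'(10)
    h = 0.01;  N = round(10/h);
    x1 = 0;  x2 = 0;  x3 = fpp0;
    for i = 1:N
        x1n = x1 + x2*h;
        x2n = x2 + x3*h;
        x3n = x3 - (x1*x3/2)*h;
        x1 = x1n;  x2 = x2n;  x3 = x3n;
    end
    fp10 = x2;
end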

Any help gratefully appreciated,
Matt