Hi everyone,

I seem to be stuck on solving the following inequality:

$\displaystyle -(l-1)i+\frac{(l-l^2)\sigma^2}{2}\overset{!}{>}0$

Well, my solution is: $\displaystyle i\overset{!}{<}\frac{-l\sigma^2}{2}$
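
For concreteness, here is a throwaway Python check of that claimed bound (the names `sigma2`, `lhs`, and `eps` are just ad-hoc names for this sketch); it tests the original inequality slightly below and slightly above $\displaystyle i=\frac{-l\sigma^2}{2}$ for each $\displaystyle l$:

```python
# Throwaway spot-check; sigma2, lhs and eps are ad-hoc names for this sketch.
sigma2 = 0.04
eps = 1e-6

def lhs(l, i):
    """Left-hand side of -(l-1)*i + (l - l**2)*sigma2/2 > 0."""
    return -(l - 1) * i + (l - l**2) * sigma2 / 2

for l in (2, -1, -2):
    bound = -l * sigma2 / 2  # the claimed cutoff for i
    print(f"l={l:>2}: bound={bound:+.2f}, "
          f"holds below bound: {lhs(l, bound - eps) > 0}, "
          f"holds above bound: {lhs(l, bound + eps) > 0}")
```

It reports that the inequality holds below the bound only for $\displaystyle l=2$; for $\displaystyle l=-1$ and $\displaystyle l=-2$ it holds above the bound instead.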

BUT, when I plug in $\displaystyle l=2, l=-1, l=-2$ with $\displaystyle \sigma^2=0.04$, I get a correct solution only for $\displaystyle l=2$.

That is:

$\displaystyle -(2-1)i+\frac{(2-2^2)\cdot 0.04}{2}=-i-0.04\overset{!}{>}0$

and therefore: $\displaystyle i\overset{!}{<}-0.04$, which corresponds to $\displaystyle i\overset{!}{<}\frac{-l\sigma^2}{2}$

However, for $\displaystyle l=-2$ I should get: $\displaystyle i\overset{!}{<}\frac{-(-2)\sigma^2}{2}=\sigma^2=0.04$

And here is what I actually get when I plug $\displaystyle l=-2$ into the first inequality:

$\displaystyle -(-2-1)i+\frac{(-2-(-2)^2)\sigma^2}{2}\overset{!}{>}0$
$\displaystyle 3i-\frac{6\sigma^2}{2}=3i-3\sigma^2\overset{!}{>}0$

Then my solution is $\displaystyle i\overset{!}{>}0.04$

which is not the same as the $\displaystyle i\overset{!}{<}0.04$ predicted above.
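
A symbolic sanity check (just a sketch; it assumes SymPy is available) agrees with the direct calculation for $\displaystyle l=-2$:

```python
# Symbolic check of the l = -2 case (assumes SymPy is installed).
from sympy import Rational, solve_univariate_inequality, symbols

i = symbols('i', real=True)
l = -2
sigma2 = Rational(4, 100)  # 0.04 as an exact rational

expr = -(l - 1) * i + (l - l**2) * sigma2 / 2  # simplifies to 3*i - 3*sigma2
print(solve_univariate_inequality(expr > 0, i))  # i > 1/25, i.e. i > 0.04
```

So the expanded version really does force $\displaystyle i\overset{!}{>}0.04$.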

What's going on here?

Thank you for your help.