Hi everyone,

I seem to be stuck on the following inequality:

-(l-1)i+\frac{(l-l^2)\sigma^2}{2}\overset{!}{>}0

Well, my solution is: i\overset{!}{<}\frac{-l\sigma^2}{2}
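
(For reference, here is how I got there: I moved the \sigma^2 term to the other side and then divided by (l-1), so maybe the slip is already in one of these steps?)

\frac{(l-l^2)\sigma^2}{2}=\frac{-l(l-1)\sigma^2}{2}\overset{!}{>}(l-1)i

i\overset{!}{<}\frac{-l\sigma^2}{2}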

BUT when I plug in l=2, l=-1, and l=-2 with \sigma^2=0.04, I only get a correct solution for l=2.

That is:

-(2-1)i+\frac{(2-2^2)0.04}{2}=-i-0.04\overset{!}{>}0

and therefore: i\overset{!}{<}-0.04, which corresponds to i\overset{!}{<}\frac{-l\sigma^2}{2} for l=2.

However, for l=-2 I should get: i\overset{!}{<}\frac{-(-2)\sigma^2}{2}=\sigma^2=0.04

And here is what I actually get when I plug l=-2 into the first inequality:

-(-2-1)i+\frac{(-2-(-2)^2)\sigma^2}{2}\overset{!}{>}0
3i-\frac{6\sigma^2}{2}=3i-3\sigma^2\overset{!}{>}0

Then my solution is i\overset{!}{>}0.04

which is not the same as the i\overset{!}{<}0.04 I expected from my formula.
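
To rule out a simple arithmetic slip, I also ran a quick sanity check in sympy (just a rough sketch, with \sigma^2=0.04 hard-coded as above), and it shows the same thing: only l=2 agrees with my formula.

from sympy import Rational, solve, symbols

i = symbols('i', real=True)
sigma2 = Rational(4, 100)  # sigma^2 = 0.04

for l in (2, -1, -2):
    # left-hand side of the original inequality, with this l and sigma^2 plugged in
    lhs = -(l - 1)*i + (l - l**2)*sigma2/2
    print(l, solve(lhs > 0, i), "  vs. my formula: i <", -l*sigma2/2)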

What's going on here?

Thank you for your help.