Finding roots of a cubic polynomial based on the discriminant (in C#)
OK, so I'm writing an ASP.NET MVC (in C#) web application that solves for the roots of quadratic, cubic and quartic polynomials.
Now, I've done the quadratic portion, since the math for that equation is so simple: I base it on conditionals related to the discriminant, then plug values into the quadratic formula accordingly.
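For context, the quadratic approach I'm describing looks roughly like this (class and method names here are just illustrative, not my actual app's code):

```csharp
using System;

// Sketch of a discriminant-based quadratic solver.
public static class Quadratic
{
    // Solves a*x^2 + b*x + c = 0 and returns the real roots
    // (an empty array when the discriminant is negative).
    public static double[] RealRoots(double a, double b, double c)
    {
        double disc = b * b - 4 * a * c;
        if (disc > 0)
        {
            double sqrtD = Math.Sqrt(disc);
            return new[] { (-b + sqrtD) / (2 * a), (-b - sqrtD) / (2 * a) };
        }
        if (disc == 0)
        {
            return new[] { -b / (2 * a) }; // one real root, multiplicity 2
        }
        return Array.Empty<double>();      // two complex conjugate roots
    }
}
```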
Now, cubics are a little different.
I have found the discriminant via this Wikipedia article and set up a conditional based on the discriminant being equal to, greater than, or less than 0.
So, according to the article, if the discriminant:
< 0: one real root and two complex conjugate roots
= 0: one real root with multiplicity 2, plus one simple real root
> 0: three distinct real roots
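The discriminant itself, per the formula in that article (Δ = 18abcd − 4b³d + b²c² − 4ac³ − 27a²d² for a·x³ + b·x² + c·x + d), is straightforward to compute; a minimal sketch:

```csharp
using System;

public static class Cubic
{
    // Discriminant of a*x^3 + b*x^2 + c*x + d, per the Wikipedia formula:
    // Delta = 18abcd - 4b^3 d + b^2 c^2 - 4ac^3 - 27a^2 d^2
    public static double Discriminant(double a, double b, double c, double d)
    {
        return 18 * a * b * c * d
             - 4 * b * b * b * d
             + b * b * c * c
             - 4 * a * c * c * c
             - 27 * a * a * d * d;
    }
}
```

For example, x³ − 6x² + 11x − 6 = (x − 1)(x − 2)(x − 3) has three distinct real roots, so its discriminant comes out positive.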
So, now that I have found that out, I need to decide what to calculate in my program. The part of the article that follows the section I just described kind of confuses me.
It looks like it says that if the discriminant is > 0 (and therefore there are three distinct real roots), I should use this formula:
(2b³ − 9abc + 27a²d)² − 4(b² − 3ac)³ = −27a²Δ
However, that doesn't quite make sense, since it isn't solved for an x value at all.
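As far as I can tell, that expression is an identity relating two helper quantities (often written Δ₁ = 2b³ − 9abc + 27a²d and Δ₀ = b² − 3ac) to the discriminant Δ, rather than a formula for a root. A quick numeric check of that reading (my own sketch, not code from the article):

```csharp
using System;

public static class CubicIdentity
{
    // Checks the identity (2b^3 - 9abc + 27a^2 d)^2 - 4(b^2 - 3ac)^3 = -27 a^2 * Delta,
    // where Delta is the cubic discriminant. It relates the helper quantities
    // Delta1 and Delta0 to the discriminant; it is not itself a root formula.
    public static bool Holds(double a, double b, double c, double d)
    {
        double delta1 = 2 * b * b * b - 9 * a * b * c + 27 * a * a * d;
        double delta0 = b * b - 3 * a * c;
        double lhs = delta1 * delta1 - 4 * delta0 * delta0 * delta0;

        double disc = 18 * a * b * c * d - 4 * b * b * b * d + b * b * c * c
                    - 4 * a * c * c * c - 27 * a * a * d * d;
        double rhs = -27 * a * a * disc;

        // Compare with a relative tolerance to allow for floating-point rounding.
        return Math.Abs(lhs - rhs) <= 1e-9 * Math.Max(1, Math.Abs(rhs));
    }
}
```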
Does anyone know how to compute the real roots based on whether the discriminant is equal to, greater than, or less than 0?
I need some help! :)