# Math Help - Complicated Algebra

1. ## Complicated Algebra

Hi

The problem I'm struggling with is:

If a and b are real numbers such that a + b is not equal to 0 and b is not equal to 0:

(a) Show that if there is only one real root of the equation x^2 + ax + b = 0 where 0 < x < 1, then b(a + b + 1) < 0.

(Hint: Let f(x) = x^2 + ax + b.)

(b) Hence, or otherwise, prove that the equation 1/(x+2) + a/(x+1) + b/x = 0 has two distinct real roots, only one of which is positive.

Could someone please please explain to me how to solve this problem?

2. Originally Posted by xwrathbringerx
Hi

The problem I'm struggling with is:

If a and b are real numbers such that a + b is not equal to 0 and b is not equal to 0:

(a) Show that if there is only one real root of the equation x^2 + ax + b = 0 where 0 < x < 1 then b(a + b + 1) < 0.

Mr F says: The inequality is false. e.g. It fails for a = -1 and b = 1/4: then x^2 - x + 1/4 = (x - 1/2)^2 has exactly one real root, x = 1/2, which lies in (0, 1), yet b(a + b + 1) = 1/16 > 0.

(b) Hence, or otherwise, prove that the equation 1/(x+2) + a/(x+1) + b/x = 0 has two distinct real roots, only one of which is positive.

Could someone please please explain to me how to solve this problem?
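Mr F's counterexample can be checked numerically; a minimal Python sketch:

```python
# Check of the counterexample a = -1, b = 1/4:
# x^2 - x + 1/4 = (x - 1/2)^2 has exactly one real root, x = 1/2,
# which lies in (0, 1), yet b(a + b + 1) is positive.
a, b = -1.0, 0.25

disc = a * a - 4 * b      # discriminant of x^2 + ax + b
root = -a / 2             # the repeated root (disc == 0)

print(disc)               # 0.0  -> a single (double) real root
print(root)               # 0.5, which lies in (0, 1)
print(b * (a + b + 1))    # 0.0625 > 0, so the claimed inequality fails
```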

3. Ummm so the question is wrong and the inequality cannot be proven?

4. Originally Posted by xwrathbringerx
Hi

The problem I'm struggling with is:

If a and b are real numbers such that a + b is not equal to 0 and b is not equal to 0:

(a) Show that if there is only one real root of the equation x^2 + ax + b = 0 where 0 < x < 1, then b(a + b + 1) < 0.

(Hint: Let f(x) = x^2 + ax + b.)

(b) Hence, or otherwise, prove that the equation 1/(x+2) + a/(x+1) + b/x = 0 has two distinct real roots, only one of which is positive.

Could someone please please explain to me how to solve this problem?
Assume part (a) has been patched up, so that it is valid to conclude that b(a + b + 1) < 0. Then:

b) Multiplying through by x(x+1)(x+2) gives a quadratic. Examining its coefficients and using the condition b(a + b + 1) < 0 shows that they change sign exactly once, so by Descartes' rule of signs there is exactly one positive root.

Now put u = -x and do the same for the new polynomial in u: its coefficients also change sign exactly once, so by Descartes' rule of signs the quadratic in u has exactly one positive root, i.e. the original quadratic in x has exactly one negative root.
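As a concrete sanity check of this sign-change argument, here is a short Python sketch (the sample values a = 1, b = -1/2 are my own choice, picked to satisfy b(a + b + 1) < 0):

```python
import math

# Clearing denominators in 1/(x+2) + a/(x+1) + b/x = 0 by multiplying
# through by x(x+1)(x+2) gives the quadratic
#   (1 + a + b) x^2 + (1 + 2a + 3b) x + 2b = 0.
a, b = 1.0, -0.5          # sample pair with b(a + b + 1) = -0.75 < 0

A = 1 + a + b             # x^2 coefficient
B = 1 + 2 * a + 3 * b     # x coefficient
C = 2 * b                 # constant term

# b(a + b + 1) < 0 forces A and C to have opposite signs, so the sign
# sequence (A, B, C) changes exactly once whatever the sign of B:
# exactly one positive root.  Likewise (A, -B, C) changes sign exactly
# once: exactly one negative root.
disc = B * B - 4 * A * C
r1 = (-B + math.sqrt(disc)) / (2 * A)
r2 = (-B - math.sqrt(disc)) / (2 * A)

print(sorted([r1, r2]))   # one negative root, one positive root
```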

CB

5. Supposing question (a) were corrected, how would I actually prove it?

6. Originally Posted by xwrathbringerx
Hi

The problem I'm struggling with is:

If a and b are real numbers such that a + b is not equal to 0 and b is not equal to 0:

(a) Show that if there is only one real root of the equation x^2 + ax + b = 0 where 0 < x < 1, then b(a + b + 1) < 0.

(Hint: Let f(x) = x^2 + ax + b.)
Assume $a$ and $b$ are real numbers such that $a+b$ is not equal to $-1$ and $b$ is not equal to $0$.

Suppose $x^2+ax+b=0$ has two distinct roots and only one of these is in $(0,1)$ .

Let $f(x)=x^2+ax+b$. Since $f$ is continuous and has two distinct roots, only one of which is in $(0,1)$, the conditions $b \ne 0$ and $a+b \ne -1$ ensure that $f(x)$ has exactly one zero in $[0,1]$ and that it is not at an endpoint of the interval. Thus $f(0)$ and $f(1)$ have opposite signs, and so:

$f(0)f(1)=b(1+a+b)<0$
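A quick numeric illustration of this identity (the sample a = -2, b = 3/4 is my own choice: x^2 - 2x + 3/4 = (x - 1/2)(x - 3/2) has distinct roots 1/2 and 3/2, exactly one of them in (0, 1)):

```python
# Sample quadratic with two distinct roots, exactly one in (0, 1).
a, b = -2.0, 0.75

def f(x):
    return x * x + a * x + b

# f changes sign across (0, 1) because exactly one root lies inside,
# so f(0) * f(1) = b * (1 + a + b) must be negative.
print(f(0.0) * f(1.0))      # -0.1875
print(b * (1 + a + b))      # -0.1875, the same value
```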

CB