Let the polynomial

f(x) = x^n + a_1 x^{n-1} + ... + a_{n-1} x + 1

with non-negative real coefficients a_1, ..., a_{n-1} have n real roots.

Prove that f(2) >= 3^n.

RonL
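Not a proof, but the claim is easy to sanity-check numerically. A quick sketch (the example root sets below are my own choices, picked so that the roots are negative and their absolute values multiply to 1, which makes the constant term 1):

```python
from math import isclose, prod

def f_at(x, roots):
    """Evaluate the monic polynomial with the given real roots at x."""
    return prod(x - r for r in roots)

# Hypothetical example root sets: all negative, with |roots| multiplying
# to 1, so each f is monic with constant term 1 and non-negative coefficients.
examples = [
    (-1.0,),
    (-0.5, -2.0),
    (-1/3, -0.5, -6.0),
    (-0.25, -0.25, -4.0, -4.0),
]

for roots in examples:
    n = len(roots)
    assert isclose(prod(-r for r in roots), 1.0)  # constant term is 1
    value = f_at(2, roots)
    print(f"n={n}: f(2) = {value:.4f}, 3^n = {3**n}, bound holds: {value >= 3**n}")
```

Every case satisfies the bound, with equality only when every root is -1.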


- Mar 12th 2007, 08:56 AM #1

- Joined
- Nov 2005
- From
- someplace
- Posts
- 14,972
- Thanks
- 5

- Mar 12th 2007, 12:15 PM #2

- Mar 12th 2007, 12:46 PM #3


- Mar 12th 2007, 01:07 PM #4

- Mar 12th 2007, 05:00 PM #5

- Joined
- Mar 2007
- Posts
- 7

The simplest polynomial is

x^2

so the lowest-degree example of your form would be:

p(x) = x^2 + 1

Setting x = 1 gives p(1) = 1^2 + 1 = 2, and setting x = 2:

p(2) = 2^2 + 1 = 5

This may be wrong.

P.S. How do you post a graph? (I have a problem about polynomials, but I don't know how to attach a graph to ask the question.)

- Mar 12th 2007, 09:54 PM #6


- Mar 12th 2007, 11:13 PM #7


- Mar 13th 2007, 11:00 AM #8
POSSIBLE OUTLINE OF PROOF

We can also write

f(x) = (x-r_1)(x-r_2)...(x-r_n)

(because f is monic) where r_1, r_2, ..., r_n are the n real roots.

Step 1: show that each of the r_i must be <= 0.

Proof by contradiction: For convenience, arrange the subscripts so that r_1 >= r_2 >=...>= r_n. By assumption, r_1 > 0.

If r_1 > r_2, let x be such that r_1 > x > r_2. Then exactly one of (x-r_1), (x-r_2),... (x-r_n) is < 0, so f(x) < 0, contradiction.

If r_1 = r_2, we have a problem. We could keep going down the list of roots until we find a number that is less than an odd number of the roots, but what if there isn't any such number? That is, what if r_1 = r_2, r_3 = r_4, and so on, with n even? In this case I don't know what to do -- perhaps you could make some kind of combinatorial argument using the binomial theorem to show that if all the coefficients are positive, then all the roots must be negative.

Step 2. Let's make a notational switch and define R_j = ABS (r_j) [absolute value]. Then f(2) = (2+R_1)(2+R_2)...(2+R_n), where all the R's are positive. Since the constant term of f is 1, it follows that

R_1R_2...R_n = 1.

I believe that for any set of R_j's with this property, it must be true that f(2) >= 3^n.

For example, (2 + 1/3)(2 + 1/2)(2 + 6) = 280/6 > 3^3.

Unfortunately this is the hard part of the proof.

[I apologize if my notation is horrific, but I'm not used to doing math on the Internet.]
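The hard claim in Step 2 can at least be stress-tested numerically. A sketch (random trials, not a proof): draw positive R_j's, force their product to be exactly 1, and check that (2 + R_1)(2 + R_2)...(2 + R_n) >= 3^n.

```python
import random
from math import prod

def product_bound_holds(Rs, tol=1e-9):
    """Check (2 + R_1)...(2 + R_n) >= 3^n, up to a small float tolerance."""
    n = len(Rs)
    return prod(2 + R for R in Rs) >= 3**n - tol

random.seed(0)
for _ in range(1000):
    n = random.randint(1, 6)
    Rs = [random.uniform(0.01, 10.0) for _ in range(n - 1)]
    # math.prod of an empty list is 1, so this last entry forces the product to 1.
    Rs.append(1.0 / prod(Rs))
    assert product_bound_holds(Rs), (n, Rs)

print("no counterexample found in 1000 random trials")
```

This only gathers evidence, of course; the inequality itself still needs the argument given later in the thread.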

- Mar 13th 2007, 11:50 AM #9


- Mar 13th 2007, 06:11 PM #10

- Apr 21st 2007, 08:26 PM #11

- Apr 21st 2007, 09:57 PM #12
Prove inductively:

First, show it is true for the first term: n = 1

f(2) = 2^(1) + 1 = 3 >= 3^(1) = 3

Now, assume it is true for some value: n = k

f(2) = 2^k + a_1*2^(k-1) + ... + a_{k-1}*2 + 1 >= 3^k

Show that it is true for the next term: n = k + 1

f(2) = 2^(k+1) + a_1*2^k + ... + a_{k}*2 + 1 >= 3^(k+1)

We have:

2^(k+1) + a_1*2^k + ... + a_{k}*2 + 1

= 2*[2^k + a_1*2^(k-1) + ... + a_{k-1}*2 + a_{k}] + 1

It just occurred to me that I can go no further until I know how to use the fact that there are n real roots. I know that is important to the proof (because it limits the values of a_1 to a_{n-1}), so I need to figure out how that affects the problem before I can go on.
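The regrouping above is pure algebra and easy to spot-check. A minimal sketch with made-up coefficients a_1 to a_k (the values are arbitrary; nothing here uses the real-roots condition yet):

```python
# Hypothetical coefficients; any non-negative values illustrate the identity.
a = [0.5, 3.0, 1.25, 0.0]  # a_1, a_2, a_3, a_4, so k = 4 (a[i] is a_{i+1})
k = len(a)

# Left side: 2^(k+1) + a_1*2^k + ... + a_k*2 + 1
lhs = 2**(k + 1) + sum(a[i] * 2**(k - i) for i in range(k)) + 1

# Right side: 2*[2^k + a_1*2^(k-1) + ... + a_{k-1}*2 + a_k] + 1
rhs = 2 * (2**k + sum(a[i] * 2**(k - 1 - i) for i in range(k))) + 1

assert lhs == rhs
print(lhs, rhs)  # prints 70.0 70.0 for these sample coefficients
```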

- Apr 21st 2007, 10:09 PM #13
I'll take an approach similar to slobone:

Let f(x) = (x + r_1)(x + r_2)*...*(x + r_{n-1})(x + r_n)

Where r_1*r_2*...*r_n = 1

Now, I'll try proving it inductively:

First, show it is true for the first term: n = 1, which means r_1 = 1

f(2) = (2 + 1) = 3 >= 3^1 = 3

Now, assume it is true for some term: n = k

f(2) = (2 + r_1)(2 + r_2)*...*(2 + r_k) >= 3^k

Where r_1*r_2*...*r_k = 1

Show that it works for the next term: n = k + 1

f(2) = (2 + r_1)(2 + r_2)*...*(2 + r_k)(2 + r_{k+1}) >= 3^(k+1)

Recall that r_1*r_2*...*r_k = 1, but r_1*r_2*...*r_k*r_{k+1} MUST also equal 1, so r_{k+1} MUST equal 1.

Therefore, we have:

(2 + r_1)(2 + r_2)*...*(2 + r_k)(2 + 1) >= 3^k*(2 + 1) = 3*3^k = 3^(k+1)

Q.E.D.

--------------------------------------------------------------------------------------------------------------------------------

Notes:

The coefficient of x in each factor MUST be 1 (or the coefficients MUST multiply to equal one, but the way I set up this proof allowed me to avoid this) because the original polynomial form of f(x) begins with 1*x^n, so the expansion of the factored form of f(x) must begin the same way. Note also that r_1 through r_n account for the coefficients a_1 through a_{n-1} of the remaining powers of x in the expanded form.

In the factored form of f(x), the product of the roots, r_1*r_2*...*r_n, MUST equal 1 because the constant term of the polynomial form of f(x) was 1, and so expanding the factored form of f(x) MUST also result in a constant term of 1.

I LOVE INDUCTIVE PROOFS!

- Apr 22nd 2007, 02:00 AM #14


As f(x) >= 1 > 0 for all x >= 0 (every term is non-negative and the constant term is 1), all the roots are negative. Let these be -b_1, -b_2, ..., -b_n, with all the b's > 0.

Then:

f(x) = (x + b_1)(x + b_2)...(x + b_n)

Also, as the constant term is 1, we know that b_1 b_2 ... b_n = 1.

Now 2 + b_k = 1 + 1 + b_k >= 3 cuberoot(1*1*b_k) by the Arithmetic-Geometric Mean inequality, so (2 + b_k) >= 3 cuberoot(b_k).

So for x = 2:

f(2) = (2 + b_1)(2 + b_2)...(2 + b_n) >= 3^n cuberoot(b_1 b_2 ... b_n) = 3^n

RonL
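Both inequalities in this argument check out numerically as well. A quick sketch (the random b's and the sample root list are my own test values):

```python
import random
from math import prod

random.seed(1)

# Step 1: AM-GM on (1, 1, b): (1 + 1 + b)/3 >= (1*1*b)^(1/3),
# i.e. 2 + b >= 3 * b^(1/3) for every b > 0.
for _ in range(1000):
    b = random.uniform(1e-3, 1e3)
    assert 2 + b >= 3 * b ** (1 / 3) - 1e-9

# Step 2: multiply the per-root bounds over b's whose product is 1:
# f(2) = prod(2 + b_k) >= 3^n * cuberoot(b_1 * ... * b_n) = 3^n
bs = [0.5, 2.0, 0.25, 4.0, 1.0]  # product is 1
n = len(bs)
f2 = prod(2 + b for b in bs)
print(f2, ">=", 3**n, "is", f2 >= 3**n)  # prints 405.0 >= 243 is True
```

Equality in the final bound occurs exactly when every b_k = 1, i.e. f(x) = (x + 1)^n.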

- Apr 22nd 2007, 08:32 AM #15