- Nov 20th 2011, 08:30 AM, tareksha: Prove if n is even and a > 0 then a + 1 is not a factor of a^n + 1
Can somebody prove this:

if n is even and a > 0, then a + 1 is not a factor of a^n + 1.

I tried contradiction but I cannot solve it.

- Nov 20th 2011, 08:59 AM, alexmahone: Re: Math proof help please
- Nov 20th 2011, 09:09 AM, alexmahone: Re: Math proof help please
In post #2, I proved that $\displaystyle a+1$ is not a factor of $\displaystyle a^2+1$.

However, when $\displaystyle a=-3$, we get that $-2$ is not a factor of $10$, which is false. Could someone explain this anomaly?

- Nov 20th 2011, 10:10 AM, Deveno: Re: Math proof help please
a is assumed > 0, so using a = -3 as a counter-example is ineffective.

also: the statement to be proven is false as stated (consider a = 1).

- Nov 20th 2011, 10:32 AM, Opalg: Re: Math proof help please
- Nov 20th 2011, 10:27 PM, takatok: Re: Math proof help please
Opalg is perfectly correct, except it's only true for a > 1. But to flesh it out a little more:

$\displaystyle x^n-y^n= (x-y)\left(\sum_{i=0}^{n-1}x^{n-1-i}y^i\right)$

If n is even, then substituting x = a, y = -1 gives $\displaystyle x^n - y^n = a^n - (-1)^n = a^n-1$,

so $\displaystyle x - y = a - (-1) = a+1$ is a factor of $\displaystyle a^n-1$.
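A quick numerical sanity check of that divisibility claim (a sketch in Python; the ranges of a and n are small values chosen purely for illustration):

```python
# Check: for even n, (a + 1) divides a^n - 1.
for a in range(2, 30):
    for n in range(2, 16, 2):  # even exponents only
        assert (a**n - 1) % (a + 1) == 0, (a, n)

# For odd n it generally fails: a = 2, n = 3 gives 2^3 - 1 = 7,
# which is not divisible by a + 1 = 3.
assert (2**3 - 1) % (2 + 1) != 0
```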

Let's assume a = 1. Then $\displaystyle a^n+1 = 2$ for all n, so a + 1 = 2 is a factor.

For a > 1, a + 1 >= 3.

Since a + 1 is a factor of $\displaystyle a^n-1$, we have

$\displaystyle a^n-1 \equiv 0 \pmod{a+1}$

and since $\displaystyle (a^n+1) - (a^n-1) = 2$,

$\displaystyle a^n+1 \equiv 2 \pmod{a+1}$

Since a + 1 >= 3, the remainder 2 is nonzero, so a + 1 can't possibly be a factor of $\displaystyle a^n+1$.
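That congruence can be confirmed numerically for small values (an illustrative Python sketch, not part of the proof):

```python
# For even n and a > 1, a^n + 1 leaves remainder exactly 2 on division
# by a + 1; since a + 1 >= 3 > 2, the remainder is nonzero and a + 1
# is never a factor.
for a in range(2, 30):
    for n in range(2, 16, 2):
        assert (a**n + 1) % (a + 1) == 2, (a, n)
        assert (a**n + 1) % (a + 1) != 0
```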

So I cannot prove your statement for a > 0, but I can for a > 1.

- Nov 21st 2011, 05:00 PM, bourbaki87: Re: Math proof help please
Isn't AlexMahone correct? I came to the same conclusion; let me expand on it a little.

Consider the polynomial P(a) = a^n + 1

a + 1 divides P(a) as a polynomial if and only if P(-1) = 0, by the factor theorem (a consequence of the remainder theorem).

Suppose a + 1 is a factor. Then P(-1) = 0. This is contradiction number one, since a > 0. Let us continue the argument nonetheless: P(-1) = (-1)^n + 1, and since n is even, (-1)^n = 1, thus P(-1) = 2 ≠ 0, a contradiction. Thus a + 1 cannot be a factor.

- Nov 21st 2011, 06:12 PM, takatok: Re: Math proof help please
I don't think anyone was disputing anything he said; in fact, I believe what I just posted was a slightly longer way of saying what you just said. I was just giving a proof for a > 1, though empirical evidence shows the statement cannot be true for a = 1: 1^n is 1 for any n, so a^n + 1 = 2 for all n. Now a + 1 = 2 when a = 1, and 2 is a factor of 2, so the statement is quite clearly false for a = 1.
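The a = 1 counterexample is easy to confirm directly (a small illustrative Python check):

```python
# When a = 1: a^n + 1 = 2 for every n, and a + 1 = 2 divides it,
# so the claimed statement fails at a = 1 regardless of n.
a = 1
for n in range(2, 20, 2):
    assert a**n + 1 == 2
    assert (a**n + 1) % (a + 1) == 0
```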

- Nov 22nd 2011, 01:59 AM, Opalg: Re: Math proof help please
The fallacy here is that you need to distinguish between *algebraic* factors and *numerical* factors. It is true that the linear polynomial $\displaystyle a+1$ can never be a factor of the polynomial $\displaystyle a^{2n}+1.$ But that does not stop the number $\displaystyle a+1$ being a factor of the number $\displaystyle a^{2n}+1$ (for a given numerical value of $\displaystyle a$).

To take a simple example, $\displaystyle a+1$ is not a factor of the polynomial $\displaystyle a^2+1.$ But if you put $\displaystyle a=1$ then you find that $\displaystyle a+1\ (=2)$ is a factor of $\displaystyle a^2+1$ (which is also equal to 2).
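The two notions can be contrasted in code. Here the polynomial-factor test uses the remainder theorem (the remainder of a^n + 1 on division by a + 1 equals the value at a = -1), while the numerical test is plain integer divisibility; the helper name is my own choice for illustration:

```python
def poly_remainder_at_minus_one(n):
    """Remainder of the polynomial a^n + 1 on division by a + 1,
    by the remainder theorem: it equals P(-1) = (-1)^n + 1."""
    return (-1)**n + 1

# Algebraically, a + 1 is never a factor of a^2 + 1: remainder is 2, not 0.
assert poly_remainder_at_minus_one(2) == 2

# Numerically, at a = 1 the number a + 1 = 2 DOES divide a^2 + 1 = 2.
a = 1
assert (a**2 + 1) % (a + 1) == 0
```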

The original post by tareksha does not explicitly state whether this problem is about algebraic or numerical factors. But the condition a > 0 (which should actually be a > 1) in the statement of the problem implicitly indicates that it is about numerical factors. At least, that is how I interpreted it.

- Nov 22nd 2011, 02:05 AM, alexmahone: Re: Math proof help please
- Nov 22nd 2011, 02:15 AM, bourbaki87: Re: Math proof help please
This makes sense, since showing a + 1 is not an algebraic factor requires neither a > 0 nor n even, and that redundant information felt uncomfortable. I hadn't considered numerical factors; that's a good distinction you've made.

- Nov 22nd 2011, 02:47 AM, Opalg: Re: Math proof help please
- Nov 22nd 2011, 03:18 AM, bourbaki87: Re: Math proof help please
Is the following logic flawed?

Let P(x) = x^n + 1 and Q(x) = x + 1. If Q(a) is a numerical factor of P(a), then P(a)/Q(a) = k for some integer k. Rearranging, P(a) = k Q(a), thus P(a) - k Q(a) = 0. Let R(a) denote this polynomial, thus:

R(a) = P(a) - k Q(a) = 0

R(a) = (a^n + 1) - k(a + 1) = 0. If this equation holds for some a and k, then we have found a numerical factor of P(a).

R has an algebraic factor (a - 1) when R(1) = 0, by the factor theorem: R(1) = (1^n + 1) - k(1 + 1) = 2 - 2k. Let k = 1; then 2 - 2 = 0. Therefore at a = 1, Q(1) = 2 is a numerical factor of P(1) = 2.

In order for (a + 1) to be an algebraic factor of R, we require R(-1) = 0. Here is my question/issue: this argument does not get off the ground, since a = -1 and a > 0 is a contradiction. Therefore, am I correct in thinking you do not need to know n is even? Is this sufficient to show a + 1 does not generate a numerical factor?
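The two evaluations of R discussed above can be checked directly (a Python sketch; the even exponent n = 4 is chosen for illustration):

```python
def R(a, n, k):
    """R(a) = (a^n + 1) - k(a + 1)."""
    return (a**n + 1) - k * (a + 1)

# R(1) = 2 - 2k, so k = 1 gives R(1) = 0: at a = 1, a + 1 = 2 divides 1^n + 1 = 2.
assert R(1, 4, 1) == 0

# R(-1) = ((-1)^n + 1) - k*0 = 2 when n is even, for any k.
for k in range(-5, 6):
    assert R(-1, 4, k) == 2
```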

Allowing a to be negative, R(-1) = ((-1)^n + 1) - k(0) = (-1)^n + 1 = 2; here we needed to know n is even.

- Nov 22nd 2011, 07:35 AM, takatok: Re: Math proof help please
Since $\displaystyle a^3+1 = (a+1)(a^2-a+1)$, it is pretty important that n is even in order to prove that a + 1 is not a factor. Since a + 1 is an algebraic factor in this (odd) case, it will always be a numerical factor as well.

It's also important that a > 1 and not a > 0.

Since a^n = 1 for every n when a = 1, the value a^n + 1 = 2 does not depend on n. So it is sufficient to prove that a + 1 is an algebraic factor for any one n to conclude that a + 1 = 2 is a numerical factor for all n when a = 1.

Since we just showed above that a + 1 is an algebraic factor when n = 3, a + 1 will be a numerical factor for all n when a = 1.

- Nov 26th 2011, 02:44 AM, bourbaki87: Re: Math proof help please
Hi Opalg, you say it easily leads to a contradiction; does the following prove a contradiction?

We know a + 1 is a factor of a^n - 1 (for n even). We can write a^n + 1 = (a^n - 1) + 2, and let P(a) = (a^n - 1) + 2. Since a + 1 divides the first term, a + 1 is a factor of P(a) if and only if a + 1 is also a factor of 2. But a + 1 cannot be a factor of 2 given a > 1, since then a + 1 > 2; the second term leaves a remainder, hence a + 1 does not divide P(a) = a^n + 1.

I know it seems like I'm beating a dead horse, but I just need this cleared up for my sanity. Thanks.