Can somebody prove this:
if n is even and a > 0 then a + 1 is not a factor of a^n + 1
I tried contradiction but I cannot solve it.
Opalg is perfectly correct, except it's only true for a > 1. But to flesh it out a little more:
If n is even, then with X = a, Y = -1 we always have X^n - Y^n = a^n - (-1)^n = a^n - 1,
and X - Y = a - (-1) = a + 1 is a factor of X^n - Y^n, i.e. of a^n - 1.
Let's assume a = 1. Then a^n + 1 = 2 for all n. So a + 1 = 2 is a factor.
For a > 1, a + 1 >= 3.
Since a + 1 is a factor of a^n - 1, and a^n + 1 = (a^n - 1) + 2, a + 1 would have to be a factor of 2.
Since a + 1 >= 3 it can't possibly be a factor of 2.
So I cannot prove your statement for a > 0, but I can for a > 1.
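A quick numerical sanity check of this, as a rough Python sketch (the ranges are arbitrary, just for illustration):

```python
# For even n, a ≡ -1 (mod a + 1), so a^n + 1 ≡ 2 (mod a + 1).
# Hence the remainder is 0 only when a + 1 divides 2, i.e. when a = 1.
for a in range(1, 20):
    for n in range(2, 12, 2):  # even exponents only
        r = (a**n + 1) % (a + 1)
        assert r == (0 if a == 1 else 2), (a, n, r)
print("a + 1 divides a^n + 1 (n even) only for a = 1")
```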
Isn't AlexMahone correct? I came to the same conclusion; let me expand on it a little.
Consider the polynomial P(a) = a^n + 1
a + 1 divides P(a) if and only if P(a - (a + 1)) = P(-1) = 0, by the factor theorem, a consequence of the remainder theorem.
Suppose a + 1 is a factor. Then P(-1) = 0. This is contradiction number one, since a > 0. Let us continue the argument nonetheless: P(-1) = (-1)^n + 1, and since n is even, (-1)^n = 1, thus P(-1) = 2, a contradiction. Thus a + 1 cannot be a factor.
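For what it's worth, the factor-theorem step can be checked symbolically; a small sketch, assuming sympy is available:

```python
# The polynomial remainder of a^n + 1 on division by a + 1 equals
# P(-1) = (-1)^n + 1, which is 2 (never zero) when n is even.
import sympy as sp

a = sp.symbols('a')
for n in [2, 4, 6, 8]:
    r = sp.rem(a**n + 1, a + 1, a)
    assert r == 2, (n, r)
print("remainder is 2 for every even n tested, so a + 1 is not a polynomial factor")
```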
I don't think anyone was disputing anything he said. In fact, I believe what I just posted was a slightly longer way of saying what you just said. I was just giving a proof for a > 1, though empirical evidence shows us that the statement cannot be true for a = 1: 1^n = 1 for any n, so a^n + 1 = 2 for all n. Now a + 1 = 2 when a = 1, and 2 is a factor of 2, so the statement is quite clearly false for a = 1.
The fallacy here is that you need to distinguish between algebraic factors and numerical factors. It is true that the linear polynomial a + 1 can never be a factor of the polynomial a^n + 1. But that does not stop the number a + 1 being a factor of the number a^n + 1 (for a given numerical value of a).
To take a simple example, x + 1 is not a factor of the polynomial x^2 + 1. But if you put x = 1 then you find that x + 1 = 2 is a factor of x^2 + 1 (which is also equal to 2).
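That example can also be seen with a quick check; a sketch assuming sympy, with the x = 1 substitution from the example above:

```python
# x + 1 is not an algebraic factor of x^2 + 1 (polynomial remainder is 2),
# yet at x = 1 the number x + 1 = 2 does divide the number x^2 + 1 = 2.
import sympy as sp

x = sp.symbols('x')
print(sp.rem(x**2 + 1, x + 1, x))   # 2 -> not an algebraic factor
print((1**2 + 1) % (1 + 1))         # 0 -> but a numerical factor at x = 1
```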
The original post by tareksha does not explicitly state whether this problem is about algebraic or numerical factors. But the condition a>0 (which should actually be a>1) in the statement of the problem implicitly indicates that this is a problem about numerical factors. At least, that is how I interpreted it.
This makes sense, since showing a + 1 is not an algebraic factor doesn't require a > 0 or n to be even, and that redundant information felt uncomfortable. I hadn't considered numerical factors, and that's a good distinction you've made.
Is the following logic flawed?
Let P(x) = x^n + 1 and Q(x) = x + 1. If Q(x) is a numerical factor of P(x), then P(a)/Q(a) = k for some integer k. Rearranging, P(a) = k Q(a), thus P(a) - k Q(a) = 0. Let R(a) denote this polynomial, thus:
R(a) = P(a) - k Q(a) = 0
R(a) = (a^n + 1) - k(a + 1) = 0. If this equation holds for some a and k, then we have found a numerical factor, i.e. Q(a) divides P(a).
R has an algebraic factor (a - 1), since R(a - (a - 1)) = R(1) = (1^n + 1) - k(1 + 1) = 2 - 2k. Let k = 1; then 2 - 2(1) = 0. Therefore, at a = 1, Q(a) is a numerical factor of P(a).
In order for (a + 1) to be an algebraic factor of R, we require R(a - (a + 1)) = 0, i.e. R(-1) = 0. Here is my question/issue: this argument does not get off the ground, since a = -1 and a > 0 is a contradiction. Therefore, am I correct in thinking you do not need to know that n is even? Is this sufficient to show that a + 1 does not generate a numerical factor?
Allowing a to be negative, R(-1) = ((-1)^n + 1) - k(0) = (-1)^n + 1 = 2; here we needed to know that n is even.
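To make my question concrete, here is a rough sketch of the claims about R (assuming sympy; k = 1 is the value chosen above):

```python
# R(a) = (a^n + 1) - k*(a + 1).  With k = 1, R(1) = 0, so (a - 1) is an
# algebraic factor; while R(-1) = (-1)^n + 1 = 2 for even n, whatever k is.
import sympy as sp

a, k = sp.symbols('a k')
for n in [2, 4, 6]:
    R1 = (a**n + 1) - 1*(a + 1)                       # k = 1
    assert R1.subs(a, 1) == 0                         # (a - 1) divides R
    assert ((a**n + 1) - k*(a + 1)).subs(a, -1) == 2  # R(-1) = 2 for even n
print("with k = 1, (a - 1) is a factor of R; R(-1) = 2 for every even n tested")
```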
Since a^3 + 1 = (a + 1)(a^2 - a + 1), it is pretty important that n is even to prove that a + 1 is not a factor. Since a + 1 is an algebraic factor in this case, it will always be a numerical factor as well.
It's also important that a > 1 and not a > 0.
Since 1^n = 1 for every n, a^n + 1 takes the same value for all n when a = 1. So it is sufficient to prove that a + 1 is an algebraic factor for any one n to conclude that a + 1 is a numerical factor for all n.
Since we just proved above that a + 1 is an algebraic factor when n = 3, a + 1 will be a numerical factor for all n when a = 1.
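A quick check of the two facts used here, as a rough sketch (ranges arbitrary):

```python
# For the odd exponent n = 3, a + 1 always divides a^3 + 1 = (a + 1)(a^2 - a + 1);
# and at a = 1, a^n + 1 = 2 for every n, so a + 1 = 2 divides it numerically.
for a in range(1, 20):
    assert (a**3 + 1) % (a + 1) == 0      # n = 3: a + 1 is a factor
for n in range(1, 20):
    assert (1**n + 1) % (1 + 1) == 0      # a = 1: 2 divides 2 for every n
print("n = 3 and a = 1 cases confirmed")
```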
Hi Opalg, you say it easily leads to a contradiction. Does the following prove a contradiction?
We know a + 1 is a factor of a^n - 1. We can write a^n + 1 = (a^n - 1) + 2. Let P(a) = (a^n - 1) + 2. Since a + 1 is a factor of the first term (a^n - 1), for a + 1 to be a factor of P(a) it would also have to be a factor of the second term, 2. But a + 1 cannot be a factor of 2 given a > 1, since a + 1 > 2, so the second term leaves a remainder, hence a + 1 does not divide P(a) = a^n + 1.
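Here is a rough numerical sketch of that decomposition (ranges arbitrary), just to convince myself:

```python
# a^n + 1 = (a^n - 1) + 2: for even n, a + 1 divides the first term,
# but it cannot divide the second term 2 once a > 1 (since a + 1 > 2).
for a in range(2, 40):
    for n in range(2, 16, 2):
        assert (a**n - 1) % (a + 1) == 0   # a + 1 | a^n - 1
        assert 2 % (a + 1) != 0            # a + 1 does not divide 2
print("decomposition confirmed: a + 1 | a^n - 1 but a + 1 does not divide 2")
```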
I know it seems like I'm beating a dead horse, but I just need this cleared up for my sanity. Thanks.