
Math Help - Prove if n is even and a > 0 then a + 1 is not a factor of a^n + 1

  1. #1
    Newbie
    Joined
    Oct 2011
    Posts
    5

    Prove if n is even and a > 0 then a + 1 is not a factor of a^n + 1

    Can somebody prove this:

    if n is even and a > 0 then a + 1 is not a factor of a^n + 1

    I tried contradiction but I cannot solve it.
    Last edited by mr fantastic; November 20th 2011 at 07:17 PM. Reason: Re-titled.

  2. #2
    MHF Contributor alexmahone's Avatar
    Joined
    Oct 2008
    Posts
    1,074
    Thanks
    7

    Re: Math proof help please

    Quote Originally Posted by tareksha View Post
    Can somebody prove this:

    if n is even and a > 0 then a + 1 is not a factor of a^n + 1

    I tried contradiction but I cannot solve it.
    Let f(a)=a^n+1.

    f(-1)=(-1)^n+1=2 (\because n is even).

    So, a+1 is not a factor of f(a)=a^n+1.
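    A quick numerical check of this remainder-theorem argument (my own sketch, not part of the thread): the remainder of a^n + 1 on division by a + 1 equals f(-1), which is 2 for every even n.

    ```python
    # Remainder theorem: dividing f(a) = a^n + 1 by (a + 1) leaves remainder f(-1).
    # For even n, f(-1) = (-1)^n + 1 = 2, which is nonzero, so (a + 1) is not
    # a polynomial factor of a^n + 1.
    def f(a, n):
        return a**n + 1

    for n in (2, 4, 6, 8):
        assert f(-1, n) == 2  # remainder is always 2, never 0
    ```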

  3. #3
    MHF Contributor alexmahone's Avatar
    Joined
    Oct 2008
    Posts
    1,074
    Thanks
    7

    Re: Math proof help please

    In post #2, I proved that a+1 is not a factor of a^2+1.

    However, when a=-3, we get that -2 is not a factor of 10, which is false. Could someone explain this anomaly?

  4. #4
    MHF Contributor

    Joined
    Mar 2011
    From
    Tejas
    Posts
    3,390
    Thanks
    757

    Re: Math proof help please

    a is assumed > 0, so using a = -1 as a counter-example is ineffective.

    also: what is being proven is false as stated (consider a = 1).

  5. #5
    MHF Contributor
    Opalg's Avatar
    Joined
    Aug 2007
    From
    Leeds, UK
    Posts
    4,041
    Thanks
    7

    Re: Math proof help please

    Quote Originally Posted by tareksha View Post
    Can somebody proof this :

    if n is even and a > 0 then a + 1 is not a factor of a^n +1

    i tried contradiction but i cannot solve it
    if n is even and a > 0 then a + 1 is a factor of a^n - 1. If it is also a factor of a^n + 1 then that easily leads to a contradiction (provided that, as Deveno points out, a is greater than 1).

  6. #6
    Junior Member
    Joined
    Nov 2011
    Posts
    53
    Thanks
    1

    Re: Math proof help please

    Opalg is perfectly correct, except it's only true for a > 1. But to flesh it out a little more:

    x^n-y^n= (x-y)(\sum_{i=0}^{n-1}x^{n-1-i}y^i)

    If n is even then, setting x = a and y = -1, we get x^n - y^n = a^n - (-1)^n = a^n - 1,
    and x - y = a - (-1) = a + 1 is a factor of a^n - 1.

    Let's first consider a = 1. Then a^n + 1 = 2 for all n. So a + 1 = 2 is a factor.

    For a > 1, a + 1 \ge 3.

    Since a + 1 is a factor of a^n - 1,
    a^n - 1 \equiv 0 \pmod{a+1}

    a^n + 1 = (a^n - 1) + 2, so
    a^n + 1 \equiv 2 \pmod{a+1}
    Since a + 1 \ge 3, it can't possibly be a factor of a^n + 1.

    So I cannot prove your statement for a > 0, but I can for a > 1.
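    The congruence argument above is easy to verify numerically; a small sketch of my own (assuming integer a), in Python:

    ```python
    # For even n: a == -1 (mod a+1), so a^n == 1, hence a^n - 1 == 0 and
    # a^n + 1 == 2 (mod a+1).
    for a in range(2, 50):          # a > 1, so a + 1 >= 3
        for n in (2, 4, 6, 8):
            assert (a**n - 1) % (a + 1) == 0   # a+1 divides a^n - 1
            assert (a**n + 1) % (a + 1) == 2   # remainder 2, so a+1 never divides a^n + 1

    # The boundary case a = 1: a + 1 = 2 does divide 1^n + 1 = 2.
    assert (1**4 + 1) % (1 + 1) == 0
    ```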

  7. #7
    Newbie
    Joined
    Nov 2011
    Posts
    7

    Re: Math proof help please

    Isn't AlexMahone correct? I came to the same conclusion; let me expand on it a little.

    Consider the polynomial P(a) = a^n + 1

    a+1 divides P(a) if and only if P(a -(a+1)) = 0 by the factor theorem, a consequence of the remainder theorem.

    Suppose a+1 is a factor. Then P(-1) = 0. This is contradiction number one, since a > 0. Let us continue the argument nonetheless. P(-1) = (-1)^n + 1, and since n is even, (-1)^n = 1, thus P(-1) = 2, a contradiction. Thus a+1 cannot be a factor.
    Last edited by bourbaki87; November 22nd 2011 at 02:05 AM.

  8. #8
    Junior Member
    Joined
    Nov 2011
    Posts
    53
    Thanks
    1

    Re: Math proof help please

    I don't think anyone was disputing anything he said. In fact, I believe what I just posted was a slightly longer way of saying what you just said; I was just giving a proof for a > 1. Direct computation shows the statement cannot be true for a = 1: 1^n = 1 for any n, so a^n + 1 = 2 for all n. Now a + 1 = 2 when a is 1, and 2 is a factor of 2, so the statement is quite clearly false for a = 1.

  9. #9
    MHF Contributor
    Opalg's Avatar
    Joined
    Aug 2007
    From
    Leeds, UK
    Posts
    4,041
    Thanks
    7

    Re: Math proof help please

    Quote Originally Posted by bourbaki87 View Post
    Isn't AlexMahone correct? I came to the same conclusion; let me expand on it a little.

    Consider the polynomial P(a) = a^n + 1

    a+1 divides P(a) if and only if P(a -(a+1)) = 0 by the factor theorem, a consequence of the remainder theorem.

    Suppose a+1 is a factor. Then P(-1) = 0. But P(-1) = (-1)^n + 1, and since n is even, (-1)^n = 1, thus P(-1) = 2, a contradiction. Thus a+1 cannot be a factor. No contradictions should arise since a>0.
    The fallacy here is that you need to distinguish between algebraic factors and numerical factors. It is true that the linear polynomial a+1 can never be a factor of the polynomial a^{2n}+1. But that does not stop the number a+1 being a factor of the number a^{2n}+1 (for a given numerical value of a).

    To take a simple example, a+1 is not a factor of the polynomial a^2+1. But if you put a=1 then you find that a+1\ (=2) is a factor of a^2+1 (which is also equal to 2).

    The original post by tareksha does not explicitly state whether this problem is about algebraic or numerical factors. But the condition a>0 (which should actually be a>1) in the statement of the problem implicitly indicates that this is a problem about numerical factors. At least, that is how I interpreted it.
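    Opalg's distinction is easy to see numerically; a minimal Python illustration (my own, not from the post):

    ```python
    # Algebraic: (a + 1) is not a polynomial factor of a^2 + 1, because the
    # remainder-theorem value at a = -1 is (-1)^2 + 1 = 2, not 0.
    assert (-1)**2 + 1 == 2

    # Numerical: yet at the specific value a = 1, the NUMBER a + 1 = 2 does
    # divide the NUMBER a^2 + 1 = 2.
    a = 1
    assert (a**2 + 1) % (a + 1) == 0
    ```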

  10. #10
    MHF Contributor alexmahone's Avatar
    Joined
    Oct 2008
    Posts
    1,074
    Thanks
    7

    Re: Math proof help please

    Quote Originally Posted by Opalg View Post
    The fallacy here is that you need to distinguish between algebraic factors and numerical factors. It is true that the linear polynomial a+1 can never be a factor of the polynomial a^{2n}+1. But that does not stop the number a+1 being a factor of the number a^{2n}+1 (for a given numerical value of a).

    To take a simple example, a+1 is not a factor of the polynomial a^2+1. But if you put a=1 then you find that a+1\ (=2) is a factor of a^2+1 (which is also equal to 2).

    The original post by tareksha does not explicitly state whether this problem is about algebraic or numerical factors. But the condition a>0 (which should actually be a>1) in the statement of the problem implicitly indicates that this is a problem about numerical factors. At least, that is how I interpreted it.
    If p(x) is an algebraic factor of q(x), does it follow that p(a) is a numerical factor of q(a) for any integer a?

  11. #11
    Newbie
    Joined
    Nov 2011
    Posts
    7

    Re: Math proof help please

    This makes sense, since showing a+1 is not an algebraic factor requires neither a>0 nor n to be even, and this redundant information feels uncomfortable. I didn't consider numerical factors; that's a good distinction you've made.

  12. #12
    MHF Contributor
    Opalg's Avatar
    Joined
    Aug 2007
    From
    Leeds, UK
    Posts
    4,041
    Thanks
    7

    Re: Math proof help please

    Quote Originally Posted by alexmahone View Post
    If p(x) is an algebraic factor of q(x), does it follow that p(a) is a numerical factor of q(a) for any integer a?
    Yes, that is correct (assuming that p(x) and q(x) have integer coefficients, of course). It's the converse that is false.
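    A small check of both directions (my own sketch): x + 1 divides x^3 + 1 algebraically, so a + 1 divides a^3 + 1 for every integer a, while the converse direction fails, as the a = 1 case of a^2 + 1 shows.

    ```python
    # Algebraic factor => numerical factor: x^3 + 1 = (x + 1)(x^2 - x + 1),
    # so (a + 1) | (a^3 + 1) for every integer a (skipping a = -1, where a + 1 = 0).
    for a in range(-10, 11):
        if a + 1 != 0:
            assert (a**3 + 1) % (a + 1) == 0

    # Converse fails: 2 = 1 + 1 divides 1^2 + 1 = 2 numerically,
    # but (x + 1) is not a polynomial factor of x^2 + 1.
    assert (1**2 + 1) % (1 + 1) == 0
    ```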

  13. #13
    Newbie
    Joined
    Nov 2011
    Posts
    7

    Re: Math proof help please

    Is the following logic flawed?

    Let P(x) = x^n + 1 and Q(x) = x + 1. If Q(a) is a numerical factor of P(a) then P(a)/Q(a) = k for some integer k. Rearranging, P(a) = k Q(a), thus P(a) - k Q(a) = 0. Let R(a) denote this polynomial, thus:

    R(a) = P(a) - k Q(a) = 0

    R(a) = (a^n + 1) - k(a+1) = 0. If this equation holds for some a and k, then Q(a) is a numeric factor of P(a).

    R has an algebraic factor (a-1), since R(a - (a-1)) = R(1) = (1^n + 1) - k(1+1) = 2 - 2k. Let k=1; then 2 - 2(1) = 0. Therefore at a=1, Q(a) is a numeric factor of P(a).

    In order for (a+1) to be an algebraic factor of R, we require R(a-(a+1))=0, i.e. R(-1)=0. Here is my question/issue: this argument does not get off the ground, since a=-1 and a>0 is a contradiction. Am I therefore correct in thinking you do not need to know that n is even? Is this sufficient to show that a+1 does not generate a numeric factor?

    Allowing a to be negative, R(-1) = ((-1)^n + 1) - k(0) = (-1)^n + 1 = 2; here we needed to know that n is even.
    Last edited by bourbaki87; November 22nd 2011 at 03:30 AM.

  14. #14
    Junior Member
    Joined
    Nov 2011
    Posts
    53
    Thanks
    1

    Re: Math proof help please

    Since a^3+1 = (a+1)(a^2-a+1), it is pretty important that n is even to prove that a+1 is not a factor. Since a+1 is an algebraic factor in this case, it will always be a numerical factor as well.

    It's also important that a>1 and not a>0.
    Since a^n+1 takes the same value (namely 2) for all n when a=1, it is sufficient to prove that a+1 is an algebraic factor for some n to conclude that a+1 is a numerical factor for all n when a=1.
    Since we just showed above that a+1 is an algebraic factor when n=3, a+1 will be a numerical factor for all n when a=1.

  15. #15
    Newbie
    Joined
    Nov 2011
    Posts
    7

    Re: Math proof help please

    Quote Originally Posted by Opalg View Post
    if n is even and a > 0 then a + 1 is a factor of a^n - 1. If it is also a factor of a^n + 1 then that easily leads to a contradiction (provided that, as Deveno points out, a is greater than 1).
    Hi Opalg, you say it easily leads to a contradiction; does the following prove a contradiction?

    We know a+1 is a factor of a^n-1. We can write a^n+1 = (a^n-1) + 2. Since a+1 is a factor of the first term (a^n-1), for a+1 to be a factor of a^n+1 it would also have to be a factor of the remaining term, 2. But a+1 cannot be a factor of 2 given a>1, since then a+1 > 2, so the second term leaves a remainder; hence a+1 does not divide a^n+1.

    I know it seems like I'm beating a dead horse, but I just need this cleared up for my sanity. Thanks.
