Math Help - factoring

  1. #1
    Senior Member
    Joined
    Jul 2010
    Posts
    357
    Thanks
    4

    factoring

    Hi all,
    Why can't a^2 + b^2 be factored? In fact, none of the following factor:


    a^4+b^4
    a^8+b^8
    a^16+b^16

    The exponents are all even numbers, and some even exponents can be broken up to produce an odd prime factor (e.g. 10 becomes 2 × 5), but I can't quite see why these sums can't be factored.
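
    To show what I mean, here is a quick sympy check (an illustration only, using sympy's default of factoring over the rationals):

    Code:
        from sympy import symbols, factor

        a, b = symbols('a b')

        # An even exponent with an odd factor does split: 10 = 2 * 5, so
        # a**10 + b**10 is a sum of fifth powers of a**2 and b**2.
        print(factor(a**10 + b**10))
        # -> (a**2 + b**2)*(a**8 - a**6*b**2 + a**4*b**4 - a**2*b**6 + b**8)

        # But the pure powers of two come back unfactored:
        for n in (2, 4, 8, 16):
            print(n, factor(a**n + b**n))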

    Thanks for the help.

  2. #2
    MHF Contributor

    Joined
    Apr 2005
    Posts
    15,298
    Thanks
    1276
    Remember that factoring allows you to solve equations: a^2 - b^2 = 0 can be solved by writing (a - b)(a + b) = 0, so that either a - b = 0 (giving a = b) or a + b = 0 (giving a = -b).

    If it were possible to factor a^2 + b^2 into linear factors with real coefficients (we can factor it over the complex numbers: a^2 + b^2 = a^2 - (bi)^2 = (a - bi)(a + bi)), then setting a factor equal to 0 would produce non-zero real values of a and b with a^2 + b^2 = 0, and of course that is not possible. Since the square of any non-zero real number is positive, if a and b are not both 0, then a^2 + b^2 must be positive, not 0.
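
    You can check this with a computer algebra system; here is a small sympy sketch (gaussian=True is sympy's option for allowing complex, i.e. Gaussian, coefficients):

    Code:
        from sympy import symbols, factor, solve

        a, b = symbols('a b')

        # a**2 - b**2 factors over the rationals, and its roots come out:
        print(factor(a**2 - b**2))    # -> (a - b)*(a + b)
        print(solve(a**2 - b**2, a))  # -> [-b, b]

        # a**2 + b**2 stays unfactored until complex coefficients are allowed:
        print(factor(a**2 + b**2))                 # -> a**2 + b**2
        print(factor(a**2 + b**2, gaussian=True))  # -> (a - I*b)*(a + I*b)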

  3. #3
    Senior Member
    Joined
    Jul 2010
    Posts
    357
    Thanks
    4
    Thanks, HallsofIvy. I also read something about there being no common factors; can you help me with that?

  4. #4
    MHF Contributor

    Joined
    Apr 2005
    Posts
    15,298
    Thanks
    1276
    I have no idea what you mean by that. Are you still referring to a^2 + b^2?

    a^2 - b^2 also has "no common factors", but it can be factored.

  5. #5
    Senior Member
    Joined
    Jul 2010
    Posts
    357
    Thanks
    4
    Why can a^2 - b^2 be factored when a^2 + b^2 can't? I thought there needed to be a common factor (or factors) in order to factor something.

  6. #6
    MHF Contributor

    Joined
    Apr 2005
    Posts
    15,298
    Thanks
    1276
    No, you do not have to have "common factors", at least not obvious ones.

    If you have a "common factor", like the a in ab + ac, then you can simply use the distributive law: ab + ac = a(b + c). But with binomials or trinomials it can be a bit more complicated.

    While a^2 - b^2 does NOT have an obvious "common factor", we can rewrite it as a^2 - ab + ab - b^2 = (a^2 - ab) + (ab - b^2). Now the first binomial has a "common factor" of a and the second has a "common factor" of b: a(a - b) + b(a - b). And now we can see that the two terms have a "common factor" of a - b, giving (a + b)(a - b). That is not always the simplest approach, though: for something like a^2 - b^2 it is easiest just to remember the formula a^2 - b^2 = (a - b)(a + b).
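
    Each step of that grouping can be verified by expanding in sympy (just a sketch; expand multiplies everything back out):

    Code:
        from sympy import symbols, expand

        a, b = symbols('a b')

        step1 = (a**2 - a*b) + (a*b - b**2)  # insert -ab + ab
        step2 = a*(a - b) + b*(a - b)        # factor each group
        step3 = (a + b)*(a - b)              # pull out the common a - b

        # All three expand back to the original expression:
        assert expand(step1) == expand(step2) == expand(step3) == a**2 - b**2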

    Notice that if we try to do the same thing with a^2 + b^2 we get a^2 + b^2 = a^2 - ab + ab + b^2 = a(a - b) + b(a + b), and now the two terms have no common factor, so we cannot continue.

    But the real reason we cannot factor a^2 + b^2, or more generally a^{2n} + b^{2n}, is what I said before: a^{2n} + b^{2n} = 0 has no real solutions other than a = b = 0. If there were a factor linear in a with real coefficients, then for every real value of b that factor would vanish at some real value of a, giving infinitely many non-zero real solutions of the equation. Note that this argument only rules out linear factors: over the reals, a^4 + b^4 does split into two quadratics, a^4 + b^4 = (a^2 + sqrt(2)ab + b^2)(a^2 - sqrt(2)ab + b^2), but none of the expressions in your list factor with rational coefficients.
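
    Sympy shows both halves of this (a sketch; the extension option adjoins sqrt(2) to the rationals, and factor works over the rationals by default):

    Code:
        from sympy import symbols, factor, sqrt

        a, b = symbols('a b')

        # No factorization over the rationals:
        print(factor(a**4 + b**4))  # -> a**4 + b**4

        # Adjoining sqrt(2) splits it into two real quadratics,
        # neither of which is linear in a:
        print(factor(a**4 + b**4, extension=sqrt(2)))
        # -> (a**2 - sqrt(2)*a*b + b**2)*(a**2 + sqrt(2)*a*b + b**2)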
