Why can't a^2 + b^2 be factored? In fact, none of the following factor.
The exponents are all even numbers, and some even exponents can be broken up to produce a prime factor, i.e. 10 becomes 2·5, but I can't quite see why these sums can't be factored.
Thanks for the help.
Remember that factoring allows you to solve equations: a^2 - b^2 = 0 can be factored as (a - b)(a + b) = 0, so that a - b = 0, or a = b, and a + b = 0, or a = -b, are solutions.
If it were possible to factor a^2 + b^2 with real coefficients (with complex coefficients we can factor it as (a + bi)(a - bi)), then setting one linear factor equal to 0 would make it possible to find non-zero a and b so that a^2 + b^2 = 0, and of course that is not true. Since the square of any non-zero number is positive, if a and b are not both 0, then a^2 + b^2 must be positive, not 0.
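Both claims can be verified with a quick symbolic check. This is a minimal sketch assuming sympy is installed: over the reals, factor() leaves a^2 + b^2 alone, while its gaussian=True option (which allows coefficients involving i) splits it as (a + bi)(a - bi):

```python
# Sketch using sympy (assumed installed) to check both factoring claims.
from sympy import symbols, factor, expand, I

a, b = symbols('a b', real=True)

# Over the reals/rationals, a^2 + b^2 is irreducible: factor() returns it unchanged.
real_attempt = factor(a**2 + b**2)

# Over the Gaussian rationals (coefficients may involve i), it splits into two factors.
complex_factored = factor(a**2 + b**2, gaussian=True)

# Multiplying the complex factors back out recovers a^2 + b^2.
check = expand((a + I*b) * (a - I*b))
```

Expanding complex_factored back out confirms the complex factorization is correct, even though no real factorization exists.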
Thanks, HallsofIvy. I also read something about there being no common factors; can you help on this matter?
I have no idea what you mean by that. Are you still referring to a^2 + b^2?
a^2 - b^2 also has "no common factors" but can be factored.
Why can a^2 - b^2 be factored while a^2 + b^2 can't? I thought there needed to be a common factor (or factors) to factor something.
No, you do not have to have "common factors"- at least not obvious ones.
IF you have a "common factor", like the "a" in ab+ ac, then you can simply use the "distributive law": ab+ ac= a(b+ c). But with binomials or trinomials it can be a bit more complicated.
While a^2 - b^2 does NOT have an obvious "common factor", we could rewrite it as (a^2 - ab) + (ab - b^2). Now the first binomial has a "common factor" of a and the second has a "common factor" of b: a(a - b) + b(a - b). And now we can see that the two terms have a "common factor" of a - b: (a - b)(a + b). But that is not always the simplest thing to do- for something like a^2 - b^2 it is simplest to remember the simple formula, a^2 - b^2 = (a - b)(a + b).
Notice that if we try to do the same thing with a^2 + b^2 we would have (a^2 - ab) + (ab + b^2) = a(a - b) + b(a + b), and now we cannot continue further because a - b and a + b are not a common factor.
But the real reason we cannot factor a^2 + b^2, or more generally a^(2n) + b^(2n), into linear factors with real coefficients is what I said before: a^2 + b^2 = 0 cannot have any real roots except a = b = 0. If we could factor it that way, then setting one linear factor equal to 0 would give an infinite number of real values of a and b satisfying the equation.
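That last point can be seen with a brute-force numeric sketch (plain Python, nothing assumed beyond the standard library): scanning an integer grid, a^2 + b^2 vanishes only at the origin, while a^2 - b^2 = (a - b)(a + b) vanishes along the entire lines a = b and a = -b:

```python
# Brute-force check: where do a^2 + b^2 and a^2 - b^2 vanish on an integer grid?
sum_zeros = []    # points where a^2 + b^2 == 0
diff_zeros = []   # points where a^2 - b^2 == 0

for a in range(-10, 11):
    for b in range(-10, 11):
        if a*a + b*b == 0:
            sum_zeros.append((a, b))
        if a*a - b*b == 0:
            diff_zeros.append((a, b))

# sum_zeros contains only (0, 0), while diff_zeros covers
# every grid point on the two lines a = b and a = -b.
```

The contrast is exactly why the factored form is useful for solving: each linear factor of a^2 - b^2 contributes a whole line of roots, and a^2 + b^2 has no such line.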