If (a^b)^c = a^(bc) held for all real numbers a, b, and c, then you could prove that -5 = 5:
-5 = (-5)^1
   = (-5)^(2·(1/2))
   = ((-5)^2)^(1/2)
   = 25^(1/2)
   = 5

so that -5 = |-5|. But something obviously went wrong: the move from (-5)^(2·(1/2)) to ((-5)^2)^(1/2) did not preserve the equality. My question is: why do many textbooks state the above rule when it allows this type of deduction? For example, see rule 3 at http://math.uww.edu/~mcfarlat/141/exponent.htm, where all numbers "must be reals". You can see in that list that rule 3 fails if you let a = -5. Also, I'd like to know if anyone has an improved (more rigorous) version or explanation of the rule.
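As a side note, the failure is easy to observe numerically. Here is a minimal sketch in Python (not part of the original question; Python's `**` happens to evaluate both sides directly, since (-5)^(2·(1/2)) reduces to the integer-power case (-5)^1):

```python
# Check whether (a^b)^c equals a^(bc) for a = -5, b = 2, c = 1/2.
a, b, c = -5, 2, 0.5

lhs = (a ** b) ** c   # ((-5)^2)^(1/2) = 25^(1/2) = 5.0
rhs = a ** (b * c)    # (-5)^(2 * 1/2) = (-5)^1 = -5.0

print(lhs, rhs, lhs == rhs)  # 5.0 -5.0 False
```

The two sides disagree exactly because squaring first discards the sign of the base before the square root is taken.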