Division by zero.
This is a silly proof that my thirteen-year-old cousin showed me the other day. I'm ashamed to admit that it took me a full ten minutes to figure it out.
Let a = 1
Let b = 1
a = b
multiply both sides by b:
ab = b^2
subtract a^2 from both sides:
ab - a^2 = b^2 - a^2
a(b - a) = (b - a)(b + a)
divide by (b - a):
[a(b - a)]/(b - a) = [(b - a)(b + a)]/(b - a)
a = b + a
returning to our initial values:
a = 1
b = 1
a = b + a
1 = 1 + 1
1 = 2
What is wrong with the above proof? (It's not difficult, but it's a good one to pull out to confuse people.)
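Each step of the trick can also be checked with concrete numbers. Here is a minimal Python sketch (the variable names mirror the proof; the code itself is my own illustration, not part of the original puzzle) that pins down exactly where the algebra breaks:

```python
# The fallacious "proof", verified step by step with concrete numbers.
a = 1
b = 1

assert a == b                                 # a = b
assert a * b == b ** 2                        # multiply both sides by b
assert a * b - a ** 2 == b ** 2 - a ** 2      # subtract a^2 from both sides
assert a * (b - a) == (b - a) * (b + a)       # factor; note both sides equal 0

# The fatal step: "divide both sides by (b - a)".
# Since a == b, the factor (b - a) is 0, and dividing by it is undefined.
try:
    a * (b - a) / (b - a)
except ZeroDivisionError as e:
    print("cannot divide by (b - a):", e)
```

Every line up to the factoring step is a true equation (both sides are 0 by then); the division step is the one move that has no meaning.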
By the way, did you spot the fallacy in the “proof”?
This is a logical 'proof' rather than a mathematical one...
Consider the following statement:
"If this statement is true, then 1 = 2." (let's call this A)
First, let's assume A to be true. On this supposition:
As statement A is true, "If statement A is true, then 1 = 2" is true.
And again, since statement A is true and "If statement A is true, then 1 = 2" is true, we obtain 1 = 2.
So, on the assumption that A is true, 1 = 2 follows; in other words, we have proved "If this statement is true, then 1 = 2", which is exactly statement A.
Then, as statement A is true, 1 = 2 (q.e.d.)
By using the same trick, anything can be proved.
This is called Curry's paradox.
See also: http://en.wikipedia.org/wiki/Curry%27s_paradox