So, someone was showing me a math "proof" in which he claimed he could get 1 = 2. Here's what he did:

$\displaystyle x = 1$

Thus

$\displaystyle x^2 = x^2$

So, subtracting $x$ from the left and 1 from the right (which are equal, since $x = 1$):

$\displaystyle x^2 - x = x^2 - 1$

Right?

Then, he factored:

$\displaystyle x(x -1) = (x - 1)(x + 1)$

Divided by (x - 1):

$\displaystyle \frac{x(x-1)}{x-1} = \frac{(x-1)(x+1)}{x-1}$

to cancel out (x - 1)

Resulted in:

$\displaystyle x = x + 1$
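(Already this line looks impossible to me: subtracting $x$ from both sides would give

$\displaystyle 0 = 1$

which no value of $x$ can satisfy, so something must have gone wrong earlier.)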

Then he plugged in x = 1:

$\displaystyle 1 = 1 + 1$

Therefore:

$\displaystyle 1 = 2$

Please explain exactly why this fails. It seems like it should fail somewhere, and my guess is the division: dividing by $\displaystyle (x - 1)$ when $x = 1$ is the same as dividing by zero, which isn't allowed. But he thinks he's pretty much found a weakness in math, so I need expert verification!
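To spell out what I suspect: with $x = 1$, the factor he cancelled is

$\displaystyle x - 1 = 1 - 1 = 0,$

so the "division" step is really $\displaystyle \frac{0}{0} = \frac{0}{0}$, which is undefined on both sides. Is that the whole story, or are other steps broken too?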