Hi -

Root2 is irrational, so a/b is not equal to root2 for any positive integers a and b.

Assume first that a/b is greater than root2. Then, squaring:

a^2/b^2 > 2

So a^2 > 2b^2

Then square (a + 2b)/(a + b).

The result is (a^2 + 4ab + 4b^2)/(a^2 + 2ab + b^2), which can be written as 1 + (2ab + 3b^2)/(a^2 + 2ab + b^2)

and this is less than

1 + (2ab + 3b^2)/(2ab + 3b^2), since replacing a^2 by the smaller quantity 2b^2 shrinks the denominator (a^2 > 2b^2, from above) and so enlarges the fraction.

But the right-hand side is exactly 2.

So (a + 2b)/(a + b) < root2.

QED
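The algebra above can be checked mechanically. Here is a small sketch of mine (not part of the proof) using Python's exact Fraction arithmetic: for every pair of small positive integers with a^2 > 2b^2, it confirms that the square of (a + 2b)/(a + b) equals 1 + (2ab + 3b^2)/(a + b)^2 and stays strictly below 2.

```python
from fractions import Fraction

def square_expanded(a, b):
    """The square of (a + 2b)/(a + b), written as 1 + (2ab + 3b^2)/(a + b)^2."""
    return 1 + Fraction(2 * a * b + 3 * b * b, (a + b) ** 2)

# Exact check of the first case: whenever a^2 > 2b^2, the expansion above
# really is the square of (a + 2b)/(a + b), and it is strictly less than 2.
for a in range(1, 30):
    for b in range(1, 30):
        if a * a > 2 * b * b:
            assert square_expanded(a, b) == Fraction(a + 2 * b, a + b) ** 2
            assert square_expanded(a, b) < 2
```

For example, a = 3, b = 2 gives (7/5)^2 = 49/25 = 1 + 24/25, safely under 2.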

Next, assume a/b is less than root2, so a^2 < 2b^2. Run the same argument with every inequality reversed; the new fraction now comes out greater than root2.

OK?
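To illustrate both cases together, here is a short sketch of mine (an addition, not part of the argument): iterating a/b -> (a + 2b)/(a + b) from 1/1, the fraction flips to the other side of root2 at every step, giving 1/1, 3/2, 7/5, 17/12, 41/29, ... Which side a fraction is on is decided exactly by comparing a^2 with 2b^2, so no floating point is involved.

```python
def side(a, b):
    """+1 if a/b > root2, -1 if a/b < root2 (exact test via a^2 vs 2b^2).

    Equality never happens for positive integers, since root2 is irrational.
    """
    return 1 if a * a > 2 * b * b else -1

def successor(a, b):
    """The map from the argument above: a/b -> (a + 2b)/(a + b)."""
    return a + 2 * b, a + b

# Start below root2 at 1/1 and check that each step lands on the opposite side.
a, b = 1, 1
for _ in range(10):
    na, nb = successor(a, b)
    assert side(na, nb) == -side(a, b)
    a, b = na, nb
```

The same check passes from any starting pair, e.g. 3/2 (above root2) maps to 7/5 (below).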