My first attempt (at the bottom) assumed that you're being asked to solve this exactly.

Let me guess - you're taking calculus. This is a linearization estimate problem.

Let f(x) = (1+2x)^(1/4).

If the problem is asking for which x we have 1 + x/2 - 1/10 < f(x) < 1 + x/2 + 1/10, as I assume it is, then

this is saying |f(x) - L(x)| < 0.1, where L(x) = x/2 + 1 is the 1st-degree Taylor polynomial of f about x = 0.

Then use what you know about error estimates for Taylor polynomials (or, if early in calculus, use the mean value theorem).
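For the Taylor-error route, and assuming the function really is f(x) = (1+2x)^(1/4) (my inference — it's the f whose linearization at 0 is 1 + x/2 and whose domain is x >= -1/2), the Lagrange form of the remainder gives a bound to work with:

```latex
% 1st-degree Taylor polynomial of f about a = 0, with Lagrange remainder:
% f(x) = L(x) + \frac{f''(c)}{2}\,x^2 for some c between 0 and x.
f(x) = (1+2x)^{1/4}, \qquad
f'(x) = \tfrac{1}{2}(1+2x)^{-3/4}, \qquad
f''(x) = -\tfrac{3}{4}(1+2x)^{-7/4}

|f(x) - L(x)| \;=\; \frac{|f''(c)|}{2}\,x^2 \;=\; \frac{3}{8}(1+2c)^{-7/4}\,x^2 \;<\; \frac{1}{10}
```

Bounding (1+2c)^(-7/4) over the interval you care about turns this into an inequality in x alone.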

***************************

That's ugly. If I were going to attempt it (I'm certainly not), I'd punt on trying to work with the three inequalities simultaneously (and I wouldn't blend them into a single absolute value either, as that would eventually produce an even more horrific polynomial), and instead work with each one separately:

1) (1+2x)^(1/4) < x/2 + 11/10

and 2) (1+2x)^(1/4) > x/2 + 9/10.

For each of those, I'd first clear the fractions:

x/2 + 11/10 = (5x + 11)/10, so multiplying 1) through by 10 gives 10(1+2x)^(1/4) < 5x + 11,

and x/2 + 9/10 = (5x + 9)/10, so multiplying 2) through by 10 gives 10(1+2x)^(1/4) > 5x + 9.

Thus:

1') 10(1+2x)^(1/4) < 5x + 11

and 2') 10(1+2x)^(1/4) > 5x + 9.

I'd then use the trick (really the Intermediate Value Theorem) that if a function is continuous on an interval, then once you've found all the points where it is 0, it must be strictly positive or strictly negative on each of the subintervals between those zeros. So determining where the inequality holds follows quickly once you know where equality holds.
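Here's a sketch of that zeros-then-sign-testing approach, assuming (as above) the function is f(x) = (1+2x)^(1/4), so that inequality 2 becomes g(x) = 10(1+2x)^(1/4) - (5x + 9) > 0 on x >= -1/2. The names `g` and `bisect` are mine:

```python
# IVT trick: locate the zeros of a continuous function, then test one
# sample point per subinterval to learn the sign on that whole subinterval.
# Assumes f(x) = (1+2x)**0.25, so inequality 2') says g(x) > 0.

def g(x):
    return 10 * (1 + 2 * x) ** 0.25 - (5 * x + 9)

def bisect(fn, lo, hi, tol=1e-12):
    """Zero of fn in [lo, hi]; fn(lo) and fn(hi) must have opposite signs."""
    flo = fn(lo)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        fmid = fn(mid)
        if (fmid > 0) == (flo > 0):
            lo, flo = mid, fmid
        else:
            hi = mid
    return (lo + hi) / 2

# g(-1/2) < 0 < g(0) and g(0) > 0 > g(1), so there is a zero in each bracket.
r1 = bisect(g, -0.5, 0.0)   # roughly -0.369
r2 = bisect(g, 0.0, 1.0)    # roughly  0.678

# One sample point between the zeros settles the sign on that subinterval:
print(r1, r2, g((r1 + r2) / 2) > 0)
```

Numerically, then, inequality 2 holds strictly for x between those two zeros and fails outside them.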

Thus I'd solve:

1'') 10(1+2x)^(1/4) = 5x + 11

and 2'') 10(1+2x)^(1/4) = 5x + 9.

Which, after raising both sides to the 4th power (both sides are nonnegative on the domain, so nothing spurious sneaks in), becomes:

1''') 10000(1+2x) = (5x + 11)^4 (and keep in mind that x >= -1/2)

and 2''') 10000(1+2x) = (5x + 9)^4 (and keep in mind that x >= -1/2).

And then I'd yell at my teacher (just kidding - I'd do that first. Errr... I mean, please don't yell at your teacher.).

Recall (a + b)^4 = a^4 + 4a^3·b + 6a^2·b^2 + 4a·b^3 + b^4.

So, looking at 1''', with a = 5x and b = 11, have:

10000 + 20000x = 625x^4 + 5500x^3 + 18150x^2 + 26620x + 14641.

Thus get: (1'''') 625x^4 + 5500x^3 + 18150x^2 + 6620x + 4641 = 0.

Have fun.