# Thread: linear approximation accuracy question

1. ## linear approximation accuracy question

I have some problems that ask you to verify a linear approximation (easy enough) and then to find the range of values of x for which it is accurate to within 0.1. The latter part has me somewhat stumped.

The problem is:
$\displaystyle (1-x)^\frac{1}{3} \approx 1-\frac{1}{3}x$, a=0

$\displaystyle f'(x) = \frac{1}{3}(1-x)^{-\frac{2}{3}}(-1)$ which is $\displaystyle f'(x) = -\frac{1}{3(1-x)^\frac{2}{3}}$

$\displaystyle f'(0) = -\frac{1}{3}$ and $\displaystyle f(0) = 1$
so
$\displaystyle L(x) = 1-\frac{1}{3}x$... so that's verified. To get a tolerance of $\pm 0.1$, you set up the inequality

$\displaystyle (1-x)^\frac{1}{3} - 0.1 < 1-\frac{1}{3}x < (1-x)^\frac{1}{3} + 0.1$

Here I'm stuck. Can someone point me in the right direction? At this point I'm probably supposed to be doing algebra, but I'm hitting a wall.
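For reference, here's a quick numerical check I tried (a Python sketch, not the algebra the exercise wants; the bisection brackets $[-2, 0]$ and $[0, 0.99]$ are just my own choice of search intervals). It finds where the error reaches 0.1 on each side of $a = 0$:

```python
# Numerical sanity check: find where |f(x) - L(x)| hits 0.1
# on each side of a = 0 by bisection.

def f(x):
    return (1 - x) ** (1 / 3)   # real cube root; fine for x < 1

def L(x):
    return 1 - x / 3            # the linear approximation at a = 0

def err(x):
    return abs(f(x) - L(x))

def solve(lo, hi, tol=1e-10):
    """Bisect for err(x) = 0.1, assuming err - 0.1 changes sign once on [lo, hi]."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if (err(lo) - 0.1) * (err(mid) - 0.1) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

left = solve(-2.0, 0.0)    # err(-2) > 0.1 and err(0) = 0
right = solve(0.0, 0.99)   # err(0) = 0 and err(0.99) > 0.1
print(left, right)         # endpoints come out near x ≈ -1.20 and x ≈ 0.71
```

So numerically the approximation seems good to within 0.1 on roughly $-1.20 < x < 0.71$, but I'd still like to see how to get there by hand.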

2. I think that the error of the linear approximation,
$\displaystyle \left| \, (1-x)^\frac{1}{3} - \left( 1-\frac{1}{3}x \right) \right|$, is approximately the size of the third (quadratic) term of the Taylor series.
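Spelling that out (my own sketch): by Taylor's theorem with the Lagrange remainder, for some $c$ between $0$ and $x$,

$\displaystyle (1-x)^\frac{1}{3} - \left(1-\frac{1}{3}x\right) = \frac{f''(c)}{2}x^2$, where $\displaystyle f''(x) = -\frac{2}{9}(1-x)^{-\frac{5}{3}}$.

Near $x = 0$ we have $f''(c) \approx f''(0) = -\frac{2}{9}$, so the error is roughly $\displaystyle \frac{1}{9}x^2$, and requiring $\displaystyle \frac{1}{9}x^2 \le 0.1$ gives $|x| \le \sqrt{0.9} \approx 0.95$ as a first estimate. It's only an estimate, though, since $f''(c)$ isn't constant; that's why the true range isn't symmetric about $0$.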