To calculate a planet's space coordinates, we have to solve the function.
Let the base point be a = x_i = pi/2 on the interval [0, pi]. Determine the highest-order Taylor series expansion resulting in a maximum error of 0.015 on the specified interval. The error is the absolute value of the difference between the given function and the corresponding Taylor series expansion. (Hint: solve graphically.)
I have no clue how to solve this, so I don't have much direction in my work; I've just been trying a few things and seeing what happens.
Here's what I've got so far. (See figure attached)
Can someone get me going in the right direction or show me how to do these types of questions?
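Edit: here's the Python experiment I've been poking at. Note the function itself got cut off when I pasted the problem above, so in the code I'm assuming it's f(x) = x - 1 - 0.5*sin(x) (the usual form of this planetary-coordinates problem); if yours is different, just swap out `f` and `deriv`. Instead of reading the error off a graph, it tabulates the maximum error of each Taylor order over [0, pi]:

```python
import math

# ASSUMPTION: the function (cut off from the problem statement above) is
# f(x) = x - 1 - 0.5*sin(x), the usual planetary-coordinates form.
def f(x):
    return x - 1.0 - 0.5 * math.sin(x)

def deriv(k, x):
    """k-th derivative of f at x."""
    if k == 0:
        return f(x)
    if k == 1:
        return 1.0 - 0.5 * math.cos(x)
    # For k >= 2 only the -0.5*sin(x) term contributes:
    # d^k/dx^k [-0.5*sin(x)] = -0.5*sin(x + k*pi/2)
    return -0.5 * math.sin(x + k * math.pi / 2)

def taylor(n, x, a):
    """n-th order Taylor expansion of f about a, evaluated at x."""
    return sum(deriv(k, a) * (x - a) ** k / math.factorial(k)
               for k in range(n + 1))

a = math.pi / 2                                   # base point x_i = pi/2
xs = [i * math.pi / 1000 for i in range(1001)]    # grid over [0, pi]

for n in range(7):
    max_err = max(abs(f(x) - taylor(n, x, a)) for x in xs)
    print(f"order {n}: max error = {max_err:.4f}")
```

For this assumed f, the max error (which sits at the endpoints x = 0 and x = pi) seems to drop below 0.015 once I reach the fourth-order expansion, but I'd appreciate a sanity check that this is the right way to attack it.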