Use the alternating series estimation theorem to approximate the sum of the series from n = 0 to infinity with an error less than the given tolerance. (The series and the error bound were images in the original post and didn't come through.)
I know that if the terms decrease in absolute value to 0, then after summing the first N terms the remainder (error) is bounded by the absolute value of the first omitted term, so I've set up the problem like this (setup was an image that didn't come through):
Is this right, and if so, what's next?
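Since the original series didn't render, here's a numerical sketch of the idea using a placeholder alternating series, sum over n of (-1)^n / n! (which converges to 1/e), and an assumed tolerance of 1e-4; both are stand-ins, not the values from the problem. The loop finds the smallest N whose first omitted term is below the tolerance, which is exactly the "what's next" step: solve a_{N+1} < tolerance for N, then sum the first N+1 terms.

```python
import math

def partial_sum(N):
    # Partial sum of the placeholder series: sum_{n=0}^{N} (-1)^n / n!
    return sum((-1) ** n / math.factorial(n) for n in range(N + 1))

tol = 1e-4  # assumed tolerance; the original value didn't render

# The theorem guarantees |S - S_N| <= a_{N+1} when the terms a_n
# decrease to 0, so stop at the first N with a_{N+1} < tol.
N = 0
while 1 / math.factorial(N + 1) >= tol:
    N += 1

approx = partial_sum(N)
exact = math.exp(-1)  # the placeholder series sums to 1/e
print(f"N = {N}, S_N = {approx:.8f}, actual error = {abs(exact - approx):.2e}")
```

For this placeholder series the loop stops at N = 7, and the actual error (about 2.2e-5) is indeed below both the tolerance and the first omitted term 1/8!, which is the theorem doing its job. With the real series, the same approach applies: write down a_{N+1}, set it less than the given error, and solve for N.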