Hi, I'm a little confused as to what the results of a series estimation with a certain precision tells us.
Say we want to estimate the series (n=1, inf) (-1)^(n-1)/n with a precision of .01.
So |S - S(n)| <= b(n+1), meaning the absolute difference between the entire series S and the partial sum through n is at most b(n+1), the absolute value of the (n+1)st term of the alternating series.
So in our example, you can say that b(n+1) = 1/(n+1) <= .01, right? That means n+1 >= 100, so n must be greater than or equal to 99.
So here's my question: is this test telling us that the absolute value of the tail, i.e. the sum of every term from n=100 to infinity, is at most .01? And so the partial sum through n=99 estimates the entire series with an error of at most .01?
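For a concrete sanity check of that interpretation, here's a short Python sketch (it assumes the known fact that this series converges to ln 2, so we can compute the actual error of the partial sum directly):

```python
import math

# Partial sum S_99 of the alternating harmonic series: sum of (-1)^(n-1)/n for n = 1..99
s99 = sum((-1) ** (n - 1) / n for n in range((1), 100))

# The full series sums to ln 2, so the true error is |ln 2 - S_99|.
# The estimation theorem says this is at most b(100) = 1/100 = .01.
error = abs(math.log(2) - s99)
print(s99, error)
```

Running this gives an error of roughly .005, comfortably within the .01 bound the theorem guarantees.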