Say I have the following function -

and I am asked to find the interval of convergence for f, f'' and f'''. What is the process for getting the IoC for f'' and f'''?

Here is what I think I should do, please correct me if I'm wrong -

1. Find where f(x) converges using the Ratio Test; I get |x| < 1 for this.

2. Then check the function for convergence at the endpoints -1 and 1. Neither of these converges for me.

3. So I have the interval of convergence (-1, 1).

4. **Here is where I am not sure what to do -** Rather than doing the entire process again for f(x), I can just differentiate

and then check the endpoints against this new function? Because the radius of convergence is the same for f, f'', and f''', the interval of convergence will still be between -1 and 1, and I only need to check the endpoints. Is this correct?
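To make the question concrete, here is a sketch of the same steps on a *hypothetical* series (the actual f above isn't shown in the post, so this is purely for illustration and not assumed to be the series in question):

```latex
% Hypothetical example, assumed only for illustration:
f(x) = \sum_{n=0}^{\infty} x^n
% Step 1 (Ratio Test): \lim_{n\to\infty} \left|\frac{x^{n+1}}{x^n}\right| = |x| < 1,
% so the radius of convergence is R = 1.
% Step 2: at x = \pm 1 the terms (\pm 1)^n do not tend to 0, so both endpoints diverge.
% Step 3: interval of convergence (-1, 1).
% Step 4: differentiate termwise:
f''(x) = \sum_{n=2}^{\infty} n(n-1)\, x^{n-2}
% Termwise differentiation preserves the radius, so R = 1 again;
% at x = \pm 1 the terms n(n-1)(\pm 1)^{n-2} still do not tend to 0,
% so both endpoints diverge and the interval of convergence stays (-1, 1).
```

In this hypothetical case the endpoint checks for f'' happen to come out the same as for f, but that need not happen in general, which is why the endpoints have to be rechecked against the differentiated series.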