Taylor expansion of ln(1+x)
Okay, so the question is regarding the relation between the interval of convergence and the interval on which the Taylor series actually equals $\ln(1+x)$. I know they are the same interval, $(-1, 1]$, but I am trying to show this by showing that the remainder function $R_n(x)$ tends to $0$ only when $x \in (-1, 1]$. So I have to show that the Lagrange remainder
$$R_n(x) = \frac{f^{(n+1)}(c)}{(n+1)!}\,x^{n+1}$$
tends to $0$, where $c$ lies between $0$ and $x$ (if this approach is wrong, please correct me). It can be observed that
$$f^{(n+1)}(t) = \frac{(-1)^n\, n!}{(1+t)^{n+1}},$$
so plugging that in,
$$R_n(x) = \frac{(-1)^n}{n+1}\left(\frac{x}{1+c}\right)^{n+1},$$
and I have to show that $\left(\frac{x}{1+c}\right)^{n+1}$ tends to $0$ (because if that tends to $0$, then so does $R_n(x)$). So the condition for that to tend to zero is that $\left|\frac{x}{1+c}\right| < 1$. And now this is where I get stuck: showing that $\frac{x}{1+c}$ must lie in $(-1, 1)$.
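As a numerical sanity check of the claim (just a sketch; `taylor_ln1p` and `remainder` are helper names I made up for this illustration), the actual remainder $\ln(1+x)$ minus the degree-$n$ partial sum does shrink for $x$ inside $(-1, 1]$ and blows up outside:

```python
import math

def taylor_ln1p(x, n):
    """Partial sum of the Taylor series of ln(1+x) up to degree n:
    sum_{k=1}^{n} (-1)^(k+1) x^k / k."""
    return sum((-1) ** (k + 1) * x ** k / k for k in range(1, n + 1))

def remainder(x, n):
    """Actual remainder R_n(x) = ln(1+x) - (partial sum of degree n)."""
    return math.log1p(x) - taylor_ln1p(x, n)

# Inside the interval the remainder shrinks as n grows...
print(abs(remainder(0.9, 50)))    # small
print(abs(remainder(-0.9, 200)))  # small (convergence is slow near -1)
# ...outside it the partial sums run away from ln(1+x).
print(abs(remainder(1.5, 50)))    # huge
```

This doesn't prove anything, of course, but it confirms that the interval I'm after is $(-1, 1]$.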
A secondary question: why does the series still converge to $\ln(1+x)$ on this interval even though the derivatives of $\ln(1+x)$ are not all bounded by a single constant on that interval?
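To make that secondary question concrete (this is my own computation, from the derivative formula above): on any interval $[-r, r]$ with $0 < r < 1$,
$$\sup_{|t|\le r}\left|f^{(n+1)}(t)\right| = \frac{n!}{(1-r)^{n+1}},$$
since $|f^{(n+1)}(t)| = \frac{n!}{(1+t)^{n+1}}$ is largest at $t = -r$. This grows without bound in $n$, so there is no single constant $M$ with $|f^{(n+1)}(t)| \le M$ for all $n$, which is the uniform bound one usually uses to force the remainder to $0$.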