I'm feeling quite confused by series in general: power series, Taylor series, Maclaurin series, etc.

If I have a series that correctly approximates a function f(x) at every point and is about x = 0, for example, does that just mean the two-sided limit at x = 0 converges to the same value? Or does being about x = 0 mean that the further the value you plug in for x is from 0, the less accurate the series becomes compared to the real function itself?
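To make the second interpretation concrete, here's a little numerical experiment of the kind I mean (a truncated Maclaurin series for e^x with a fixed number of terms, so the function and the expansion point are just ones I picked for illustration):

```python
import math

def maclaurin_exp(x, n_terms):
    # Truncated Maclaurin series for e^x: sum of x^k / k! for k = 0..n_terms-1
    return sum(x**k / math.factorial(k) for k in range(n_terms))

# With a fixed number of terms, the error seems to grow as x moves away from 0
for x in [0.5, 1.0, 2.0, 4.0]:
    approx = maclaurin_exp(x, 5)
    print(f"x = {x}: error = {abs(math.exp(x) - approx)}")
```

So with only 5 terms the approximation is very good near 0 and much worse at x = 4, even though the full (infinite) series converges to e^x everywhere. Is that the right way to think about "about x = 0"?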

Any input would be greatly appreciated!