Hi!
I understand the concept of these series but sometimes I don't quite understand the problems.
For example
f(x) = 1 + x + x^2
Represent this as a Taylor series.
But the series isn't infinite, it stops, right?
Also,
Find the Maclaurin series for:
f(x) = 1/(1+x)^2
f(x) = 1/(1+x)^3
Thanks
For the first one:
The derivatives of the function are all zero from n = 3 onward, so... that means the series isn't really infinite, right, since all the remaining terms would be zero?
And for the second problem: can we really expand functions like that into a series?? And why is it an alternating series? (I don't see where the negative signs come from.)
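A quick SymPy sketch (assuming SymPy is installed; the variable names are mine) shows where the alternating signs come from:

```python
import sympy as sp

x = sp.symbols('x')

# Maclaurin expansions of 1/(1+x)^2 and 1/(1+x)^3 (terms up to x^4).
# Each derivative of 1/(1+x)^k picks up a factor of -1 from the chain
# rule, which is why the coefficients flip sign at every order.
s2 = sp.series(1/(1 + x)**2, x, 0, 5).removeO()
s3 = sp.series(1/(1 + x)**3, x, 0, 5).removeO()
print(s2)  # 1 - 2x + 3x^2 - 4x^3 + 5x^4 (SymPy may print terms in another order)
print(s3)  # 1 - 3x + 6x^2 - 10x^3 + 15x^4
```

So the "negative" terms are just the odd-order derivatives carrying an odd number of those -1 factors.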
Oh, I understand. Mental slip: the function is infinitely differentiable, but the fact that it yields zeros for n ≥ 3 makes it easy to calculate.
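That's easy to check by machine, too. A small SymPy sketch (SymPy assumed installed) listing the Maclaurin coefficients f^(n)(0)/n! of f(x) = 1 + x + x^2:

```python
import sympy as sp

x = sp.symbols('x')
f = 1 + x + x**2

# Taylor coefficients f^(n)(0)/n! about 0: every derivative of order >= 3
# vanishes, so only the first three terms of the "infinite" series survive.
coeffs = [f.diff(x, n).subs(x, 0) / sp.factorial(n) for n in range(6)]
print(coeffs)  # [1, 1, 1, 0, 0, 0]
```

The series is formally infinite, but it collapses back to the polynomial itself.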
Okay, thanks! You've helped me conceptualize it better.
Now, why would it not always converge to the function? (If you don't mind; I find this stuff interesting, but my teacher is so vague when teaching it.)
Because values of x outside the interval of convergence make the series diverge.
For example, a series like x + x^2/2 + x^3/3 + ... = Σ x^n/n converges only on [-1, 1).
Since x = 1 is excluded, I'll show why: plugging in x = 1 gives 1 + 1/2 + 1/3 + ... = Σ 1/n,
which diverges by the integral test.
Thus, since the series diverges at x = 1, it can't be used there... and the same goes for every other value outside the interval of convergence.
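To see that divergence concretely, a short Python sketch (the function name is mine) compares the partial sums of Σ 1/n with ln(N), the integral the test compares against:

```python
import math

# Partial sums of the harmonic series 1 + 1/2 + 1/3 + ...
# The integral test compares them with the integral of 1/t, i.e. ln(N):
# both grow without bound, so the series diverges.
def harmonic(N):
    return sum(1.0 / n for n in range(1, N + 1))

for N in (10, 100, 1000, 10000):
    print(N, round(harmonic(N), 4), round(math.log(N), 4))
```

The partial sums never level off; they track ln(N) plus a constant, and ln(N) goes to infinity.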
Yes... I'm not quite sure what relevance that has, but yes... and just so you know,
there is a Taylor series for any infinitely differentiable function about any center c... this is because even if the series diverges basically everywhere, it still converges at its center.
For example, if you found a function whose series diverged at every other value, it would still converge at its center c... this follows from the definition of the Taylor series.
It doesn't matter if it diverges for other values... it will always converge at c, because if you input x = c you get f(c) + f'(c)·0 + (f''(c)/2!)·0 + ... = f(c),
which is a single finite number, so the series trivially converges there... thus every Taylor series converges at its center.
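Here's a SymPy sketch of that last step (the degree cutoff and the choice of exp(x) are mine, just for illustration; any smooth function works the same way):

```python
import sympy as sp

x, c = sp.symbols('x c')
f = sp.exp(x)  # stand-in for "any" infinitely differentiable function

# Degree-5 Taylor polynomial of f about x = c.
T = sum(f.diff(x, n).subs(x, c) * (x - c)**n / sp.factorial(n)
        for n in range(6))

# At x = c every term with n >= 1 carries a factor (c - c)^n = 0,
# so the whole series collapses to the single constant term f(c).
print(sp.simplify(T.subs(x, c) - f.subs(x, c)))  # 0
```

Evaluating at the center kills every term except f(c), which is exactly why convergence there is automatic.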