I'm rusty at this but I'll give it a shot -
Plug 1 into your fourth derivative to get the maximum M. Then multiply it by (b-a)^5 (which should just be 1 here, since 1^5 is 1). Then divide by 180n^4, set that less than or equal to your error tolerance, get n to the other side, and solve.
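A quick sketch of that arithmetic, for reference. The tolerance below is only an illustrative value (the problem's actual accuracy isn't quoted in the thread), and `simpson_min_n` is just a name I made up:

```python
import math

def simpson_min_n(M, a, b, tol):
    """Smallest even n with M*(b-a)**5 / (180*n**4) <= tol (Simpson error bound)."""
    n = math.ceil((M * (b - a) ** 5 / (180 * tol)) ** 0.25)
    return n + (n % 2)  # Simpson's rule needs an even number of sub-intervals

# Using the poster's max fourth derivative on [0, 1]; tol = 1e-4 is only an example
print(simpson_min_n(0.975525220034, 0, 1, 1e-4))
```

With these example numbers the bound at n = 2 is about 3.4e-4 (too big) and at n = 4 about 2.1e-5 (small enough), so the function returns 4.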
So I'm given this equation:
They want me to find the least number of sub-intervals needed to reach the stated accuracy. In the first part of the problem, they ask me to get the max of the fourth derivative on the interval from 0 to 1, right? I take derivatives up through the fifth (to locate the maximum of the fourth), and the max I get is approximately 0.975525220034. Then, looking back at my notes, I see that my teacher gave us this formula to calculate the error of Simpson's Rule: |E_S| <= M(b-a)^5 / (180N^4), where M is the max of |f''''| on [a, b].
The problem specifically says to use 2 to find the smallest number N of sub-intervals that gets the Simpson's Rule error within the required accuracy.
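For what it's worth, the max-of-the-fourth-derivative step can be cross-checked numerically. The actual integrand isn't quoted above, so `f4` below is a hypothetical stand-in (f(x) = e^{-x}, whose fourth derivative is also e^{-x}); swap in your own fourth derivative:

```python
import math

# Hypothetical stand-in for the poster's fourth derivative; replace with your own.
def f4(x):
    return math.exp(-x)  # f(x) = e^{-x} has f''''(x) = e^{-x}

# Crude but effective: sample |f''''| densely over [a, b] to estimate its max
def max_abs_on_interval(g, a, b, samples=10001):
    step = (b - a) / (samples - 1)
    return max(abs(g(a + i * step)) for i in range(samples))

M = max_abs_on_interval(f4, 0.0, 1.0)
print(M)  # ~1.0 for this stand-in, attained at x = 0
```

Dense sampling like this isn't a proof of the maximum, but it will catch a differentiation or arithmetic slip quickly.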
So what I did was that I set up my equation like this:
and then I tried to solve for N, but I got this:
I can't possibly have that many sub-intervals, so obviously I'm doing something wrong. I double-checked my differentiation with several people, and they all said it was correct, so I'm getting slightly frustrated. I don't understand what I'm doing wrong or why I can't get a good value for N.
If I can get any advice, guidance, or any help, I would greatly appreciate it!
Alright, so when I do that, I get 0.0333109337. The problem says to use 2, so where does 2 come into play here?
Sorry for asking so many questions; I'm just very confused. Normally, I'm pretty good at Calculus, but this is frustrating me. Moreover, why did you use 180 instead of 2880?
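One plausible source of the 180-versus-2880 discrepancy (an assumption on my part, since the notes themselves aren't shown): some texts state the Simpson bound in terms of the number n of sub-intervals, others in terms of the number m of parabolic arcs, where n = 2m. The two forms agree:

```latex
\frac{M(b-a)^5}{180\,n^4}
  = \frac{M(b-a)^5}{180\,(2m)^4}
  = \frac{M(b-a)^5}{2880\,m^4},
\qquad \text{since } 180 \cdot 2^4 = 2880.
```

So if your notes use 2880, the "2" in the problem may simply mean n = 2m: plug in the arc count rather than the sub-interval count, and both formulas give the same error bound.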