
Math Help - Introduction to Calculus Tutorial

  1. #16
    Global Moderator

    This shall be the last topic that I will give, followed by another lecture on its applications. This topic usually appears in Calculus III and at the beginning of Advanced Calculus. In Advanced Calculus it is presented in much more detail and is much more complicated. I will just mention some basic definitions and theorems.

    If you look up the dictionary definition of "Calculus" it will be defined as the mathematics that deals with rates of change and with areas and volumes of shapes. That is true from an applied-math point of view. But what I want you to see is that Calculus is the study of a (real) function. Notice that everything we have been doing involved a function. Thus, Calculus is the study of a function. The notation (slightly abused) is f:\mathbb{R}\to \mathbb{R}, meaning the function takes a real number (the bold R on the left) and sends it (the arrow) to another real number (the bold R on the right). This is why Calculus is also called "Real Analysis": it studies real functions and real numbers. There is another type of Calculus, which originated perhaps 100-150 years after Calculus, called "Complex Analysis". It is the analogue f:\mathbb{C}\to \mathbb{C}; it studies complex functions, that is, functions that take a complex number and transform it into another complex number. The set \mathbb{N}=\{0,1,2,...\} is called the set of natural numbers. With that we have the following definition.

    Definition: A sequence is a function f:\mathbb{N}\to \mathbb{R}, meaning it takes a natural number and transforms it into a real number. Note that the domain of a sequence is the set of all natural numbers.

    The standard way to write a sequence is a_n not f(n). Thus, if a_n=n^2 then we have,
    a_0=0, a_1=1, a_2=4. The nice thing about a sequence is that we can write out all the terms,
    a_0,a_1,a_2,... in a list. The reason I say it is nice is that you cannot do that with a real function: you cannot write out all of its values in a list, because the real numbers cannot be listed. Surprisingly, you can do it with the rational numbers. This concept is called being "countable"; it has nothing to do with what we will discuss, but I wanted you to know what it means. It tells us that there are more real numbers than there are natural numbers! That is a concept non-mathematicians often find very hard to accept.
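    If you like to experiment, here is a minimal Python sketch (not part of the lecture) that treats a sequence exactly as the definition says, namely as a function on the natural numbers; the name a is just an illustrative choice.
[code]
# a sequence is a function from the natural numbers to the reals; here a_n = n^2
def a(n):
    return n ** 2

# list the first few terms a_0, a_1, a_2, ...
print([a(n) for n in range(6)])   # [0, 1, 4, 9, 16, 25]
[/code]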

    When we deal with sequences it is useful to know what the limit of a sequence is, that is, \lim_{n\to \infty}a_n. Since we are dealing with the natural numbers we are only interested in the case n\to \infty, so all limits are taken as n\to \infty and we will simply write \lim a_n. The idea is to ask what number the sequence is tending to.

    Example 57: Consider the sequence 1, 1.4, 1.41, 1.414, ... (the decimal truncations of \sqrt{2}); then \lim a_n = \sqrt{2}.

    Definition: If a sequence a_n has a limit then it converges. Otherwise it diverges.

    Do not be confused by the word "diverges". In everyday English it suggests something that spreads apart or grows, so you might assume that \lim n^2 "diverges" because the terms get larger and larger without bound. When this happens we say the sequence diverges to infinity and write \pm \infty where appropriate. Thus, \lim n^2 = +\infty (note this does not mean the limit exists; it means it does not exist, but fails to exist in the special way of growing without bound). The sequence (-1)^n, which is 1,-1,1,-1,..., does not tend to a value, so \lim (-1)^n does not exist, nor does it diverge to infinity.

    One thing about math is that definitions are just as important as theorems, and in some cases even more important! Calculus/Analysis is a classic example. The foundation of Calculus is the meaning of a sequence and its limit. Yes, we understand intuitively what it means, but how do we define "tending to", "getting closer to", "approaching"? For a mathematician these terms need to be defined. Below I present the definition.

    Definition: Given a sequence a_n, we say it has a limit \lim a_n = L when: for any \epsilon>0 there exists a natural number N such that |a_n-L|<\epsilon for all n\geq N.

    Wow! That looks dangerous. You have probably never seen a more complicated definition in your life. This definition is attributed to Cauchy and Weierstrass. Behind this ugly-looking definition is one simple and beautiful idea. First, when we say two numbers \alpha and \beta are close to each other we mean that their difference is very small; since we do not know which one is larger we use the absolute value, so |\alpha - \beta| is small. When we say a sequence converges to a number, what we are saying is that the difference between the sequence a_n and its limit value L (if it exists) becomes small. Thus |a_n-L|<\epsilon, where \epsilon is some small number. Let me repeat: to say a sequence a_n has limit value L means that |a_n-L| can be made as small as we please, that is, smaller than any chosen \epsilon, for all sufficiently large n, that is, for n\geq N for some big enough natural number N. We will not be using this definition in the lecture; I mention it because I want you to see the mathematical side of Calculus. All theorems and all rules of Calculus are based on this and other similar definitions. Hence the most important result in Calculus is not really a theorem, it is a mere definition.

    Example 58: I will prove the following rigorously using the definition above. Given the sequence a_n=1/n it seems reasonable to say \lim 1/n=0. Hence we need to show |1/n-0|=|1/n|=1/n<\epsilon for all n\geq N. Solve the inequality 1/n<\epsilon to get n>1/\epsilon; thus if we choose N to be the first integer that makes this true we have completed the proof. The first such integer is N=[1/\epsilon]+1, where [\,\,\,] is the greatest integer function, that is, the integer part of the number. Again, here is how it works: for any \epsilon>0 that you name, I can choose the integer N=[1/\epsilon]+1 such that for all n\geq N we have |a_n-0|<\epsilon. And the proof is complete.
    For example if you choose \epsilon=.001 then I should choose N=[1/.001]+1=1001.
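    Here is a minimal numeric illustration (a sketch, not a proof, and not part of the lecture) of the \epsilon-N game just described for a_n=1/n. Fractions are used so that [1/\epsilon] comes out exactly, without floating-point rounding.
[code]
from fractions import Fraction

eps = Fraction(1, 1000)            # epsilon = 0.001, kept exact
N = int(1 / eps) + 1               # N = [1/eps] + 1 = 1001, as in the example
# finite spot-check that |a_n - 0| < eps for a range of n >= N
assert all(Fraction(1, n) < eps for n in range(N, N + 10000))
print(N)                           # 1001
[/code]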

    If you had difficulty understanding that, do not worry, we will not use it again. You may wonder whether mathematicians prove limits in this painful way. No; as you learn more math you will find that mathematicians create definitions, prove some useful theorems using the definitions, and then base the more complicated results that follow not on the definitions directly but on the theorems already proven from them.

    Definition: A sequence a_n has an upper bound means there exists a real number R such that a_n \leq R for all n. Analogously, it has a lower bound means there exists a real number r such that r\leq a_n for all n. A sequence is bounded if it has both an upper and a lower bound.

    The following should make sense: if a sequence has a limit then it cannot grow without bound, that is, it cannot diverge to infinity.

    Theorem: If a sequence has a limit then it is bounded.

    Example 59: The sequence a_n=1/n converges to zero. And indeed it is bounded: 1 is an upper bound and 0 is a lower bound. However, the other way around is not true; a bounded sequence need not converge. For example, a_n=(-1)^n is bounded but it does not converge.

    The converse (the theorem the other way around) is true when you have a monotone sequence.

    Definition: A sequence is non-increasing means that a_n\geq a_{n+1} for sufficiently large n. A sequence is non-decreasing means that a_n\leq a_{n+1} for sufficiently large n. A sequence is monotone when it is either non-increasing or non-decreasing.

    Example 60: The sequence 10,9,1,2,3,4,... is eventually increasing (and hence non-decreasing), because for sufficiently large n, in this case n\geq 2, the terms increase.

    A classic theorem in analysis.

    Monotone Convergence Theorem: A bounded monotone sequence is convergent (convergent means having a limit). (This result is often stated alongside the Bolzano-Weierstrass theorem, which concerns bounded sequences in general.)

    To illustrate this theorem, suppose you have a sequence of numbers that is increasing and has an upper bound. The terms get larger and larger, but not without bound (because the sequence is bounded), and hence it cannot diverge to infinity. Again, this is not a proof, but the reasoning is plausible.

    Example 61: Consider a_n=1/n again. It is decreasing (and hence non-increasing), thus it is monotone. And it is bounded. Thus, it is convergent; in this case it converges to zero. This theorem is an example of an existence theorem. An existence theorem says that something exists without saying what it is. They appear all over math; in fact, most theorems are existence theorems.

    Now we get to infinite series. Those are infinite sums.

    Definition: Given a sequence a_n we define a new sequence S_n (called the series) by S_n=a_0+a_1+...+a_n. This sequence (the series) is also called the sequence of partial sums.

    For example, if a_n=1,2,3,4,... then S_0=1, S_1=1+2=3, and so on.
    Hence, the infinite sum a_0+a_1+a_2+... is defined (if it exists) to be the limit of the partial sums, that is, \lim S_n. Hence the sum of infinitely many numbers exists only when the sequence of partial sums is convergent, and its value is defined to be the limit of the sequence of partial sums. The notation used is \sum_{n=0}^{\infty} a_n; note that a series can start from a different value, it does not need to be zero. For example, the following series (whose evaluation is beyond this lecture) converges: \sum_{n=1}^{\infty} \frac{1}{n^2}=\frac{\pi^2}{6}.
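    A quick numeric look at that last claim, as a sketch (not part of the lecture): the partial sums of 1/n^2 creep up toward \pi^2/6.
[code]
import math

# partial sums S_N = 1/1^2 + 1/2^2 + ... + 1/N^2 approach pi^2/6 ~ 1.6449...
for N in (10, 1000, 100000):
    S = sum(1 / n**2 for n in range(1, N + 1))
    print(N, S)
print(math.pi**2 / 6)
[/code]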

    The sad thing is that there is no general method for finding the sum of a series once you have established that it converges. Most of the time mathematicians do not care; all they care about is whether a series converges or diverges.

    Here is a simple theorem that gives a condition every convergent series must satisfy.

    Divergence Test (often called the nth-term test): If \sum_{n=k}^{\infty} a_n converges then \lim a_n=0.

    Proof: Note that n=k means the series can start from any value, not necessarily zero or one. Since the series is convergent, \lim S_n exists. But that means \lim S_{n+1} also exists and is equal to it. Thus,
    \lim S_{n+1}=\lim S_n
    \lim S_{n+1}-\lim S_n=0
    \lim (S_{n+1}-S_n)=0
    But, S_{n+1}-S_n = a_{n+1}+a_n+...+a_0-a_n-...-a_0=a_{n+1}.
    Thus, \lim a_{n+1}=\lim a_n = 0.

    This test is useful in the following way: if the limit of the terms is NOT zero then the series diverges. What I just said is a logically equivalent statement, called the contrapositive.

    Example 62: The series \sum_{n=1}^{\infty} 1 diverges. Because \lim 1 = 1 \not = 0. It also makes sense, if you keep adding 1's you get a larger and larger number without bound. Thus, there can be no limiting value.

    Example 63: The sum of reciprocals \sum_{n=1}^{\infty} \frac{1}{n} is called the "Harmonic Series". Note \lim \frac{1}{n}=0; however, in that case the theorem says nothing about whether the series converges or diverges, so we do not yet know. But soon we shall.

    Definition: A series of the form \sum_{n=k}^{\infty} (-1)^n a_n or \sum_{n=k}^{\infty}(-1)^{n+1} a_n, where a_n>0, is called an alternating series. The terms a_n are positive while the factor (-1)^n \mbox{ or }(-1)^{n+1} switches between positive and negative, so the terms of the series alternate in sign.

    Example 64: The series \sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} is an alternating series. It satisfies the condition of the definition; or write out the terms, 1-\frac{1}{2}+\frac{1}{3}-\frac{1}{4}+..., and you can see the signs alternate.

    Here is a really useful theorem about alternating series.

    Leibniz Alternating Series Test: If a series is alternating, \lim a_n=0 and a_n is non-increasing then the series converges.

    Example 65: The series \sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} is an alternating series, as explained before. Furthermore, a_n=\frac{1}{n} is decreasing because a_{n+1}<a_n, and hence non-increasing. And the limit is \lim a_n=0. Thus, the Leibniz test says it converges. This series is called the alternating harmonic series.

    The following definition is my own; you will not find it, or the term, in textbooks or anywhere else. So do not use it when you discuss convergence with other people, because they will not understand you.

    Definition: An extension function of a sequence a_n is a real function f such that f(n)=a_n for all natural numbers n.

    Example 66: Consider the sequence a_n=1/n, n\geq 1. Then the function f(x)=1/x is an extension function; that is, for all positive integers n we have f(n)=a_n. Similarly, if a_n=\sqrt{1+n^2} then f(x)=\sqrt{1+x^2} is an extension function.

    Here is why our discussion on improper integrals becomes useful.

    Integral Test: Given a series \sum_{n=k}^{\infty} a_n, let f(x) be an extension function of the sequence. If f(x) is continuous, decreasing and positive on [k,\infty), then \sum_{n=k}^{\infty} a_n and \int_k^{\infty} f(x)\, dx either both converge or both diverge.

    Just a warning: it DOES NOT mean they converge to the same value. The two values can be different; the test only says whether both converge or both diverge.

    Example 67: Let us determine whether \sum_{n=1}^{\infty} \frac{1}{n} converges or diverges. Note that an extension function is f(x)=1/x. This function is continuous, positive, and decreasing (its derivative is negative). Thus, we can compare the series with \int_1^{\infty} \frac{1}{x}\, dx = \lim_{t\to \infty} (\ln t -\ln 1) = \lim_{t\to\infty} \ln t. This grows without bound. Thus, the harmonic series diverges.
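    As a numeric aside (a sketch, not part of the lecture): the partial sums of the harmonic series really do grow roughly like \ln N, which is why they never level off even though they grow very slowly.
[code]
import math

# H_N = 1 + 1/2 + ... + 1/N grows without bound, roughly like ln N
for N in (10, 1000, 100000):
    H = sum(1 / n for n in range(1, N + 1))
    print(N, H, math.log(N))   # the gap between H and ln N settles near 0.577... but H keeps growing
[/code]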

    Example 68: Let us determine whether \sum_{n=2}^{\infty} \frac{1}{n\ln n} converges or diverges. Note that an extension function is f(x)=1/(x\ln x); it is continuous and positive. To show it is decreasing, show that its derivative is negative. (If a function is not decreasing everywhere but eventually decreases, you can still use the test, because we only care about what happens in the end, not at the beginning.) Thus, we compare the series with \int_2^{\infty} \frac{1}{x\ln x}\, dx. First we need to find \int \frac{1}{x\ln x}\, dx=\int\frac{1}{\ln x} \cdot \frac{1}{x}\, dx. Let u=\ln x, then du/dx=1/x. Thus, \int \frac{1}{u} \frac{du}{dx}\, dx=\int \frac{1}{u}\, du=\ln u+C=\ln \ln x +C. Thus, \lim_{t\to \infty} (\ln \ln t - \ln \ln 2) also increases without bound, but really, really slowly. Thus the series diverges.

    When a series converges it is reasonable to say its terms get small quickly enough. Thus the ratio of consecutive terms can help determine whether a series converges or diverges.

    Ratio Test: Given a series \sum_{n=k}^{\infty} a_n, if a_n is non-zero for sufficiently large n then consider the limit of the ratio \lim | a_{n+1}/a_n |. If this limit is strictly less than 1 the series converges. If it is strictly more than 1, or the ratio diverges to infinity, then the series diverges. The "bad" point is when the limit is precisely equal to 1; in that case the test is inconclusive, meaning the series may converge in some cases and diverge in others.

    Example 69: Consider \sum_{n=1}^{\infty} \frac{1}{n!}. We can show convergence by looking at the ratio,
    \lim \left| \frac{1}{(n+1)!} \cdot \frac{n!}{1} \right|=\lim \frac{n!}{(n+1)n!}=\lim \frac{1}{n+1}=0. Thus it converges.
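    The same ratio can also be inspected numerically, as a sketch (not part of the lecture):
[code]
import math

# ratio a_{n+1}/a_n for a_n = 1/n! equals 1/(n+1), which tends to 0 < 1
ratios = [(1 / math.factorial(n + 1)) / (1 / math.factorial(n)) for n in range(1, 21)]
print(ratios[:3], ratios[-1])   # starts at 1/2 and keeps shrinking toward 0
[/code]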

    Example 70: Consider \sum_{n=1}^{\infty} \frac{1}{\sqrt{n}}; note we do not need absolute value signs because everything here is positive. Thus a_{n+1}/a_n=\frac{\sqrt{n}}{\sqrt{n+1}}=\sqrt{\frac{n}{n+1}}, and the limit as n\to \infty is 1, so the ratio test is inconclusive. However, an extension function is f(x)=\frac{1}{\sqrt{x}}, and the improper integral \int_1^{\infty} \frac{1}{\sqrt{x}}\, dx=\lim_{t\to\infty}(2\sqrt{t}-2) diverges. Thus the series diverges.

    Example 71: We already know that the alternating harmonic series \sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} converges. If we try the ratio test, \left| \frac{(-1)^{n+2}}{n+1} \cdot \frac{n}{(-1)^{n+1}} \right| = \frac{n}{n+1}, and as n\to \infty the limit is 1, so the ratio test is inconclusive. However, we know that the series converges. Thus, the preceding example and this one show that a limit of 1 really is no good: one such series diverges and the other converges.

    Example 72: This is a really fun one, \sum_{n=1}^{\infty} \frac{n^n}{n!}. Using the ratio test (again no need for absolute values) we find that \frac{(n+1)^{n+1}}{(n+1)!} \cdot \frac{n!}{n^n}=\frac{(n+1)^{n+1}n!}{(n+1)n!n^n}. Some canceling gives
    \frac{(n+1)^n}{n^n}=\left( \frac{n+1}{n} \right)^n=(1+1/n)^n. The question is what the limit is as n\to \infty. If you remember, when I was discussing the exponential function y=e^x I mentioned that this limit is important and equals e\approx 2.718. Since e>1, by the ratio test this series diverges.
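    Here is a quick numeric look at that limit, as a sketch (not part of the lecture):
[code]
import math

# (1 + 1/n)^n approaches e ~ 2.71828..., which is > 1, so the ratio test gives divergence
for n in (10, 1000, 100000):
    print(n, (1 + 1 / n) ** n)
print(math.e)
[/code]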

    There are many more tests; these are the most important ones. Whenever you are given a series, always first check whether the limit of the terms is non-zero; in that case the divergence test settles it. If given an alternating series, check that the limit of the terms is zero and that the terms are non-increasing, then apply the Leibniz test. If given a series whose extension function is easy to integrate, compare it with the improper integral. If given a series involving factorials, you will not find a usable extension function (yes, the Gamma function extends the factorial, but how are you going to find its antiderivative!), so use the ratio test.
    ~~~
    Exercises

    1) Find \lim \frac{n+1}{n^2+2}.

    2*) Prove formally (using the definition of limit) that \lim \frac{1}{n^2}=0.

    3) Does \sum_{n=0}^{\infty} \frac{1}{\sqrt{n+1}} converge?

    4) Does \sum_{n=1}^{\infty} \frac{(n^2)!}{(n!)^2} converge?

    5) Does \sum_{n=1}^{\infty} \frac{(n!)^2}{(n^2)!} converge?

    6) Does \sum_{n=1}^{\infty} \frac{(-1)^n}{\sqrt{n}} converge?

    7) Does \sum_{n=0}^{\infty} \frac{n^5+1}{n^4+1} converge?

    8) Does \sum_{n=1}^{\infty} \frac{1}{e^n} converge?

    9) Does \sum_{n=1}^{\infty} \frac{n!}{(2n)!} converge?

    10) Does \sum_{n=1}^{\infty} \frac{9^n}{n!} converge?

    11*) Show that \sum_{n=0}^{\infty}\frac{kn}{n^2+1} converges only when k=0.

  2. #17
    Global Moderator

    This is the last lecture I will give in this tutorial, and it is a continuation of the previous one. Now that we have an understanding of what a sequence and a series are, we can talk about a very important topic that appears a lot in analysis-related areas: power series.

    Definition: A power series (centered at zero) is an "infinite polynomial" of the form y=\sum_{n=0}^{\infty} a_nx^n=a_0+a_1x+a_2x^2+.... The domain is defined to be the set of values of x for which the power series converges, and the value of the function at such an x is defined to be the sum the series converges to.

    The above definition is good because the series always converges at at least one point, namely x=0. Note that the a_n are the terms of some sequence.

    Here is a more general definition.

    Definition: A power series centered at c is y=\sum_{n=0}^{\infty}a_n (x-c)^n.

    Again it converges at least at one point, x=c.

    There is one interesting theorem about power series which is not even proved in the standard Calculus III course.

    Theorem: Given a power series, the set of points where it converges is one of the following: a single point (its center), some interval, or the entire number line.

    When I say "some interval" I mean to say an interval one of the forms: (a,b),[a,b),(a,b],[a,b]. For example, a power series can coverge for x=[1,2]. Another example, it can converge for x=(1,2). But it cannot converge for x=[1,2]\cup [3,4].The symbol \cup represents "union" it means 1\leq x\leq 2 \mbox{ or } 3\leq x\leq 4, because it is not one complete interval, that is, it is [1,4] with (2,3) missing which does not make it complete, and hence violates the theorem. When we say interval of convergence we means the interval for which the series converges, it can happen that the series converges for the entire number line (that is for all x) in that case we shall say the interval of convergence is infinite. And it can also happen that a power series converges at a single point (the center) and we shall say the interval of convergence is zero.

    Example 73: Given the power series \sum_{n=0}^{\infty} \frac{x^n}{n!}, find the interval of convergence. The typical way of doing these problems is through the ratio test, because the ratio test handles exponents and factorials well. Thus we have a series \sum_{n=0}^{\infty} a_n where a_n=\frac{x^n}{n!} are the terms of the sequence, and we find the ratio of these terms. But before we do that, one important point: the ratio test requires that the terms of the sequence be non-zero for sufficiently large n; that is the condition for it to work. In the case x=0 we cannot use the ratio test because every term past the first is zero, and hence the terms are never non-zero for sufficiently large n. But we can ignore that case, because x=0 is the center and we know the series converges at its center. Thus, it is safe to assume x\not =0. Now we use the ratio test, \lim \left| \frac{x^{n+1}}{(n+1)!}\cdot \frac{n!}{x^n} \right| = \lim \left| \frac{x\, n!}{(n+1)\,n!} \right|.
    Cancel to get \lim \left| \frac{x}{n+1} \right|; no matter what x you choose, this limit is zero, and by the ratio test that shows convergence. Thus, the interval of convergence is infinite, or if you prefer, (-\infty,\infty).

    Example 74: Given the power series \sum_{n=1}^{\infty} \frac{x^n}{n}, find the interval of convergence. Again, without any fear we can use the ratio test without first considering the case x=0 (where the ratio would involve division by zero), because we know the series converges there, as it is the center. The terms of the sequence for some fixed x are a_n=\frac{x^n}{n}, thus we need to evaluate the limit \lim \left| \frac{x^{n+1}}{n+1} \cdot \frac{n}{x^n} \right| =\lim \left| x\cdot \frac{n}{n+1} \right|. But the limit of \frac{n}{n+1} is one, thus the final limit we get is |x|. We know by the ratio test that if the limit is strictly less than 1 then we have convergence; thus for |x|<1 the power series converges, and for |x|>1 it diverges. But what about |x|=1? Remember, that is the "bad point", where the ratio test is inconclusive, so we check each such point separately. The solutions of |x|=1 are x=\pm 1. If x=1 then the power series evaluated at that point is \sum_{n=1}^{\infty} \frac{1}{n}; this is the infamous harmonic series we had before, and it diverges. If x=-1 then we have \sum_{n=1}^{\infty} \frac{(-1)^n}{n}; this is (up to sign) the alternating harmonic series, which we know converges. Thus, the series converges for |x|<1 and for x=-1; expressing the absolute value inequality as an interval we have (-1,1)\cup \{-1\}=[-1,1), and this is the interval of convergence. Note we got an interval, just as promised by the theorem.

    Example 75: Given the power series \sum_{n=0}^{\infty} (-1)^n n!x^n, find the interval of convergence. Again we would need to consider the case x=0, but as we have already seen we can ignore it without any fear. For x\not=0 the terms are non-zero for sufficiently large n, so we can use the ratio test: \left| \frac{(-1)^{n+1}(n+1)!x^{n+1}}{(-1)^n n!x^n} \right|=(n+1)|x|. The limit as n\to \infty diverges to +\infty unless x is zero. Thus, in this case the interval of convergence is zero; that is, the series converges only at its center x=0.

    If you are smart you should realize what I am about to say. Suppose \lim |a_{n+1}x^{n+1}/(a_n x^n)| exists; then there are two cases. If the limit is zero, the interval of convergence is infinite. If the limit is not zero (it must be positive because of the absolute value), then for convergence x must satisfy |x|<1/L, where L is the limit of |a_{n+1}/a_n| (and we also check the endpoints); in this case we get an interval. The remaining case is when \lim |a_{n+1}x^{n+1}/(a_n x^n)| diverges to \infty for every non-zero x; then the interval of convergence is zero. Thus we proved the theorem, because the cases are the full line, an interval, or a point. Theorem proved! But if you are Hacker smart then, no, it is not as simple as I just said. When \lim |a_{n+1}x^{n+1}/(a_n x^n)| does not exist, it need not diverge to infinity (remember the sequence (-1)^n: its limit does not exist, yet it does not increase without bound). In that situation we do not know at all what the interval is, because the ratio limit neither exists nor diverges to +\infty. And there is something else that can happen: the ratio test cannot be used at all if a_n is not non-zero for sufficiently large n. Thus, the theorem is powerful because it applies even in the cases where the ratio limit neither exists nor diverges to +\infty, and where the ratio test cannot be used.

    The nice thing about power series is that they are easy to differentiate: we can do it term by term. Though this might seem obviously valid, it actually requires proof. The same goes for integration.

    Example 76: If y=\sum_{n=0}^{\infty} x^n=1+x+x^2+x^3+... is a power series function, then term-by-term differentiation says y'=0+1+2x+3x^2+...=\sum_{n=1}^{\infty}nx^{n-1}.

    Theorem: If f(x)=\sum_{n=0}^{\infty}a_n x^n then f'(x)=\sum_{n=1}^{\infty} na_n x^{n-1}

    The power series, as mentioned, represents a function f(x)=\sum_{n=0}^{\infty}a_n (x-c)^n whose domain is the interval of convergence and whose value is the sum of the series. The question of interest to us is: given some arbitrary function f(x), can we find a power series representation for it? Let us assume that f(x) has a power series representation,
    f(x)=a_0+a_1(x-c)+a_2(x-c)^2+a_3(x-c)^3+...
    We will assume f(x) is infinitely differentiable at c, meaning we can take the derivative again and again.
    Evaluate f(x) at c to get,
    f(c)=a_0+a_1(0)+a_2(0)+...=a_0
    Next, take the derivative,
    f'(x)=a_1+2a_2(x-c)+3a_3(x-c)^2+...
    Evaluate at c to get,
    f'(c)=a_1+2a_2(0)+3a_3(0)+...=a_1
    Next, take the derivative again,
    f''(x)=2a_2+6a_3(x-c)+...
    Evaluate at c to get,
    f''(c)=2a_2+6a_3(0)+...=2a_2
    Next, take the derivative again,
    f'''(x)=6a_3+....
    Evaluate at c to get,
    f'''(c)=6a_3+0+...=6a_3.
    The pattern is clear, in general we get, for n\geq 0,
    f^{(n)}(c)=n!a_n
    Thus,
    \boxed{ a_n = \frac{f^{(n)}(c)}{n!} }
    This shows that if an infinitely differentiable function is expressible as a power series at c then such a representation must be unique.
    (Note the (n) in the exponent does not mean power, it means derivative).

    Definition: If f(x) is infinitely differenciable at c then the Taylor Series (centered at c) is defined by,
    \sum_{n=0}^{\infty} \frac{f^{(n)}(c) (x-c)^n}{n!} .

    Warning: do not think this says that the function is equal to that power series; it is simply a definition. Given a function f(x), we define its Taylor series to be that series. Of course, what we are really hoping for is that the Taylor series of a function and the function are the same, meaning the Taylor series represents the function. There is a way, given an infinitely differentiable function, to show that it matches its Taylor series, but since that procedure is a little too advanced we will simply assume it does. In fact we will do more than assume: we will assume it matches on its interval of convergence. Meaning, we will take a function, find its Taylor series, assume it matches the function, find the interval of convergence, and state that this power series represents the given function on that interval.

    Example 77: We will find the power series for f(x)=\frac{1}{1-x}. Again, we will assume that the Taylor series is the unique power series centered at x=0.
    f(x)=\frac{1}{1-x}
    f'(x)=\frac{1}{(1-x)^2}
    f''(x)=\frac{2}{(1-x)^3}
    f'''(x)=\frac{2\cdot 3}{(1-x)^4}
    ...
    Thus,
    f(0)=\frac{1}{1-0}=1
    f'(0)=\frac{1}{(1-0)^2}=1!
    f''(0)=\frac{2}{(1-0)^3}=2!
    f'''(0)=\frac{3\cdot 2}{(1-0)^4}=3!
    .....
    Thus, the coefficients in the Taylor series are,
    a_0=\frac{f(0)}{0!}=1
    a_1=\frac{f'(0)}{1!}=1
    a_2=\frac{f''(0)}{2!}=1
    a_3=\frac{f'''(0)}{3!}=1
    .....
    Thus,
    \frac{1}{1-x}=\sum_{n=0}^{\infty} x^n=1+x+x^2+x^3....
    We are almost finished. Remember I said to check the interval of convergence. Using the ratio-test technique shown before, we find that (-1,1) is the interval of convergence. Thus, the power series for the function only works for -1<x<1; for other values it fails. Again, we did not actually show that the function equals its power series, we just assumed it, because that is a little too advanced right now.

    The above series is extremely important; it is called the infinite geometric series. For example, we can use it in the following manner. What does .11111... represent? By the definition of decimals we can write
    \frac{1}{10}+\frac{1}{10^2}+\frac{1}{10^3}+...=\frac{1}{10}\left( 1+\frac{1}{10}+\frac{1}{10^2}+... \right)
    Thus, by the formula, the sum is \frac{1}{10} \cdot \frac{1}{1-\frac{1}{10}}=\frac{1}{9}.
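    A quick numeric sanity check of the geometric series formula, as a sketch (not part of the lecture):
[code]
# partial sums of the geometric series sum x^n compared with 1/(1-x), valid for |x| < 1
x = 0.3
print(sum(x**n for n in range(60)), 1 / (1 - x))

# the decimal 0.111... as the geometric sum 1/10 + 1/10^2 + ... = 1/9
x = 0.1
print(sum(x**n for n in range(1, 60)), 1 / 9)
[/code]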

    Example 78: We know that \sum_{n=0}^{\infty}x^n=\frac{1}{1-x} on (-1,1). Taking the derivative of both sides, \sum_{n=1}^{\infty} n x^{n-1}=1+2x+3x^2+4x^3+...=\frac{1}{(1-x)^2}. Multiply through by x to get
    x+2x^2+3x^3+...=\frac{x}{(1-x)^2}. This tells us, for instance (note -1<1/2<1), that
    (1/2)+2(1/2)^2+3(1/2)^3+...=\frac{1/2}{(1-1/2)^2}=2.

    Example 79: We will find the power series for the exponential function centered at x=0. Thus, the function that we are working with is f(x)=e^x. To find the coefficients for the power series we take derivatives and evaluate.
    f(x)=e^x
    f'(x)=e^x
    f''(x)=e^x
    f'''(x)=e^x.
    .....
    Thus,
    f(0)=1
    f'(0)=1
    f''(0)=1
    f'''(0)=1
    .....
    Thus, the coefficients in the Taylor series are,
    a_0=\frac{f(0)}{0!}=1
    a_1=\frac{f'(0)}{1!}=\frac{1}{1!}
    a_2=\frac{f''(0)}{2!}=\frac{1}{2!}
    a_3=\frac{f'''(0)}{3!}=\frac{1}{3!}
    .....
    Thus, we have,
    e^x=1+\frac{x}{1!}+\frac{x^2}{2!}+\frac{x^3}{3!}+...=\sum_{n=0}^{\infty} \frac{x^n}{n!}.
    Now we have to find the interval on which this formula is valid, that is, the interval of convergence. We did this computation in Example 73 earlier in this lecture: the interval of convergence is (-\infty,\infty). Thus, the formula works for all x. In particular, when x=1 we have a beautiful equation, 1+\frac{1}{1!}+\frac{1}{2!}+...=e. We can use this to approximate e; by taking just a few terms of the infinite series we get an awfully close approximation.
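    To see how fast it converges, here is a sketch (not part of the lecture) that sums only the first eleven terms:
[code]
import math

# partial sum 1 + 1/1! + 1/2! + ... + 1/10! already matches e to about 7 decimal places
approx = sum(1 / math.factorial(n) for n in range(11))
print(approx, math.e)
[/code]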

    Example 80: We will do a different example this time, f(x)=\ln x. The problem with this one is that ideally the nicest-looking series is the one centered at zero, as we had before; but f(x)=\ln x is not infinitely differentiable at x=0 (the would-be center) because it is not even defined there! Thus, we need to choose a different center at which the function is defined. We cannot choose x<0 for the same reason, so we will choose x=1. Thus, we have,
    f(x)=\ln x
    f'(x)=\frac{1}{x}
    f''(x)=-\frac{1}{x^2}
    f'''(x)=\frac{2}{x^3}
    .....
    Thus,
    f(1)=\ln 1 = 0
    f'(1)=\frac{1}{1}=1
    f''(1)=-\frac{1}{1^2}=-1
    f'''(1)=\frac{2}{1^3}=2
    .....
    Thus, the coefficients in the Taylor series are,
    a_0=\frac{f(1)}{0!}=0
    a_1=\frac{f'(1)}{1!}=\frac{1}{1}
    a_2=\frac{f''(1)}{2!}=-\frac{1}{2}
    a_3=\frac{f'''(1)}{3!}=\frac{1}{3}
    .....
    Thus, we have,
    \ln x = \frac{(x-1)}{1}-\frac{(x-1)^2}{2}+\frac{(x-1)^3}{3}-...=\sum_{n=1}^{\infty}(-1)^{n+1}\frac{(x-1)^n}{n}.
    By using the ratio test we find that to converge we need |x-1|<1 thus 0<x<2. Checking the endpoints we find that x=0 leads to negative harmonic series, which diverges to -\infty, and that x=2 leads to alternating harmonic series which converges. Thus, the interval of convergence is (0,2]. This is the interval on which this power series works. Specifically when x=2 we have,
    \ln 2 = 1-\frac{1}{2}+\frac{1}{3}-\frac{1}{4}+.... A beautiful formula, it also shows what the sum of the alternating harmonic series is.
    Unlike the power series for e, this one converges very slowly, meaning many terms are needed to obtain accuracy. A rough way to judge whether convergence is fast or slow is to look at how quickly the terms get small: the terms of the exponential series shrink very quickly because factorials grow fast, so it is reasonable to say it converges quickly, while here the terms shrink slowly (like 1/n), so it is reasonable to say the series converges slowly. Of course, this rule of thumb does not always work, but it is something to look out for.
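    A numeric comparison, as a sketch (not part of the lecture): the alternating harmonic series needs a huge number of terms to pin down \ln 2.
[code]
import math

# partial sums of 1 - 1/2 + 1/3 - ... ; the error shrinks only roughly like 1/N
for N in (10, 1000, 100000):
    s = sum((-1) ** (n + 1) / n for n in range(1, N + 1))
    print(N, s)
print(math.log(2))
[/code]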


    There are two extremely important series that appear a lot, those for sine and cosine. The standard way this is done in a Calculus III class is by using the facts (\sin x)'=\cos x and (\cos x)'=-\sin x. However, mathematicians do not really consider the derivations given in a Calculus I class to be fully rigorous, and hence mathematicians define sine and cosine as follows:
    \sin x = \sum_{n=0}^{\infty} \frac{(-1)^n}{(2n+1)!}x^{2n+1}
    \cos x =\sum_{n=0}^{\infty} \frac{(-1)^n}{(2n)!}x^{2n}.
    Note that by taking the derivatives term by term we recover the fundamental derivative identities for sine and cosine. Since we have done several Taylor series expansions, it is left as an exercise to find the series for \sin x and \cos x using the derivative identities above. Furthermore, show that the interval of convergence for both is (-\infty,\infty).
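    For a quick check that these series really do compute sine and cosine, here is a sketch (not part of the lecture); the names sin_series and cos_series are just illustrative choices.
[code]
import math

def sin_series(x, terms=20):
    # sum of (-1)^n x^(2n+1) / (2n+1)! for n = 0 .. terms-1
    return sum((-1) ** n * x ** (2 * n + 1) / math.factorial(2 * n + 1) for n in range(terms))

def cos_series(x, terms=20):
    # sum of (-1)^n x^(2n) / (2n)! for n = 0 .. terms-1
    return sum((-1) ** n * x ** (2 * n) / math.factorial(2 * n) for n in range(terms))

print(sin_series(1.2), math.sin(1.2))
print(cos_series(1.2), math.cos(1.2))
[/code]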

    If you wish to read the next section, it would help to know this definition.

    Definition: A function is analytic at a point if there exists a power series centered at that point, with a non-zero interval of convergence, that equals the function on that interval. Thus, for example, e^x is analytic everywhere; you can center the series anywhere.

    Application to Differential Equations*
    We can use the above power series concepts to solve, on some open interval I, the differential equation
    x^2y''+xy'-y=0.
    The idea here is to assume that there exists a function f(x), analytic at some point (say zero, for simplicity), that solves the differential equation on I.
    This is one of the more difficult equations, because usually we divide by the expression in front of y'' to get
    y''+\frac{1}{x}y'-\frac{1}{x^2}y=0.
    But the problem is that this can lead to division by zero, and we do not know if we are allowed to do that.
    When you study differential equations you will see that there is an entire theory that explains when an analytic solution (a solution expressible as a power series) exists and when it does not. Since this is not a lecture on differential equations we will just assume one exists; if our assumption leads to a contradiction we will know it was false. Thus, the assumption we are making is that the solution is analytic, that is,
    y=\sum_{n=0}^{\infty} a_nx^n
    Thus,
    y'=\sum_{n=1}^{\infty} na_n x^{n-1}
    y''=\sum_{n=2}^{\infty} n(n-1)a_n x^{n-2}
    Substitute these into the differential equation,
    x^2\sum_{n=2}^{\infty} n(n-1)a_nx^{n-2} + x\sum_{n=1}^{\infty} na_n x^{n-1} - \sum_{n=0}^{\infty} a_n x^n=0
    Multiply through,
    \sum_{n=2}^{\infty} n(n-1)a_n x^n + \sum_{n=1}^{\infty} na_n x^n - \sum_{n=0}^{\infty} a_nx^n=0
    To add these together we pull out the n=1 term from the second summation and the n=0,1 terms from the third,
    \sum_{n=2}^{\infty} n(n-1)a_n x^n +\sum_{n=2}^{\infty} na_n x^n - \sum_{n=2}^{\infty} a_n x^n +a_1x-a_0-a_1x =0
    Combine, term by term,
    -a_0+\sum_{n=2}^{\infty} [n(n-1)a_n x^n + na_n x^n - a_n x^n] = 0=0+0x+0x^2+0x^3+...
    Thus,
    -a_0+\sum_{n=2}^{\infty} [n(n-1)a_n+na_n-a_n]x^n = 0+0x+0x^2+...
    Because power series representations are unique, each coefficient must be zero,
    -a_0=0
    n(n-1)a_n + na_n-a_n=0 for n\geq 2
    a_n[n(n-1)+n-1]=0 for n\geq 2.
    Thus, a_n=0 for n\geq 2.
    We also found that a_0=0.
    Hence the only coefficient that can be non-zero is a_1.
    In fact, a_1 is arbitrary, because it ends up cancelling out when substituted into the equation.
    Thus, we have that, y=0+a_1x+0x^2+0x^3+...=a_1x is a solution.
    Meaning any line passing through the origin is a solution. In fact, it works out! Substitute it into the differential equation to confirm the solution. But are there any more solutions? If there are, they cannot be analytic at zero, because we found all of those. It turns out that 1/x is also a solution; the reason we did not get it is that 1/x is not analytic at x=0, it is not even defined there! There is a method for finding that other solution, but it does not interest us here because we only looked for analytic solutions.
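    If you want to confirm both solutions without grinding through the derivatives by hand, here is a sketch (not part of the lecture) that assumes the SymPy library is available; the helper name residual is just an illustrative choice.
[code]
import sympy as sp

x, a1 = sp.symbols('x a1')

def residual(y):
    # left-hand side of x^2 y'' + x y' - y = 0 for a candidate solution y
    return sp.simplify(x**2 * sp.diff(y, x, 2) + x * sp.diff(y, x) - y)

print(residual(a1 * x))   # 0, so any line through the origin works
print(residual(1 / x))    # 0, the non-analytic solution mentioned above
[/code]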

    ~~~
    Exercises

    1) Find the power series for \sin x and \cos x as explained above.

    2) Find the power series for \ln \left( \frac{1+x}{1-x} \right). (Hint: use the logarithm identities.)

    3) Find the power series for  \ln |1-x| by integrating term by term the infinite geometric series.
    Note, \frac{1}{1-x}=1+x+x^2+x^3+..., and integrate.

    4) Find the power series for e^x centered at x=1.

    5*) Solve the differential equation y''-2y'+y=0. It happens to give a slightly different result than the one we did together.

    6) What does .11345345345... represent as a rational number?

  3. #18
    Global Moderator

    I have finally finished my quest. I have finished this tutorial; it contains the most important concepts from all over Calculus. With more than 150 examples and exercises you should be able to master the basics of Calculus, assuming I have written it well. I give permission to anyone who wishes to convert this into a PDF file and use it for teaching purposes, on the sole condition that you give me the PDF document.

  4. #19
    Member
    I'm starting Calculus 2 this semester and forgot a lot of the stuff from Calculus 1 over the 3-4 week break we had... I'm struggling right now to try to remember all the stuff we learned last semester, and this tutorial is pretty helpful. It would have been nice to have the answers to the exercises to see if I did them correctly, though... even though they look pretty simple... I can't help feeling that I should know how to do those exercises =[

    ...On Example 33, did you make an error with the 1/2 turning into 1/3? I can't figure that problem out... how did you get \frac{1}{2} \int u^3 u' dx=\frac{1}{3} \int u^4 du

  5. #20
    Global Moderator

    Quote Originally Posted by jeph View Post

    ...On Example 33, did you make an error with the 1/2 turning into 1/3? I can't figure that problem out... how did you get \frac{1}{2} \int u^3 u' dx=\frac{1}{3} \int u^4 du
    Yes I made a mistake. I fixed it now.

  6. #21
    Super Member

    Re:

    ThePerfectHacker, how long did it take you to write this? A year?

  7. #22
    Global Moderator

    Quote Originally Posted by qbkr21 View Post
    ThePerfectHacker, how long did it take you to write this? A year?
    One month.

    I am hoping to write another one during my summer vacation (which at CCNY is longgggggg).

  8. #23
    Senior Member
    You should write one in Number Theory. So many cool applications. And after all, number theory in my opinion is the heart of mathematics.

  9. #24
    Global Moderator

    Quote Originally Posted by AfterShock View Post
    You should write one in Number Theory. So many cool applications. And after all, number theory in my opinion is the heart of mathematics.
    No. Set theory and Logic are the foundations of math.

    But if you are saying number theory is the most interesting, let me add to that. Thus far, out of all the things I have learned in math there is not a single thing that I disliked (except perhaps statistics, but that is not usually even considered math). Thus, it should not matter what you choose to learn.

  10. #25
    Senior Member
    Quote Originally Posted by ThePerfectHacker View Post
    No. Set theory and Logic are the foundations of math.

    But if you are saying number theory is the most interesting, let me add to that. Thus far, out of all the things I have learned in math there is not a single thing that I disliked (except perhaps statistics, but that is not usually even considered math). Thus, it should not matter what you choose to learn.
    Perhaps at your institution it's different, but set theory and logic are part of discrete math, which is incorporated into number theory.

  11. #26
    Global Moderator

    Quote Originally Posted by AfterShock View Post
    which is incorporated into number theory.
    I like to divide math into 3 parts:
    Algebra, Analysis, Geometry.

    Stuff such as: number theory, set theory, field theory, ... is algebra.

    Stuff such as: PDE's, Harmonic Analysis, Functional Analysis,... is analysis.

    Stuff such as: Topology*, Projective Geometry, Differential Geometry, ... is geometry.

    Thus, I think of myself as an algebraist.

  12. #27
    Super Member
    Quote Originally Posted by ThePerfectHacker View Post

    Example 12: An angry husband throws his wife upward with an initial velocity of 96 feet per second. They live on a cliff with an altitude of 960 feet. Find how much time passes until his wife reaches maximum height. And find the amount of time until she hits the ground and dies.
    The function is,
    s(t)=-16t^2+96t+960.
    Maximum height is when velocity is zero.
    Thus,
    s'(t)=v(t)=-32t+96=0
    The total amount of time that passes is found by setting the height to zero, when she comes crashing down.
    s(t)=-16t^2+96t+960=0.

    PerfectHacker, this problem doesn't make any sense. You stated she was thrown off the cliff with an initial velocity of 96 feet per second? No way!! That means that he threw her off at nearly 65.5 miles per hour...

  13. #28
    Senior Member ecMathGeek
    Quote Originally Posted by ThePerfectHacker View Post
    No. Set theory and Logic are the foundations of math.

    But if you are saying number theory is the most interesting, let me add to that. Thus far, out of all the things I have learned in math there is not a single thing that I disliked (except perhaps statistics, but that is not usually even considered math). Thus, it should not matter what you choose to learn.
    I'd like instruction on logic and proofs.

  14. #29
    Junior Member
    I would like to add my thanks for this thread, it's everything that my textbook is, and then some. I very much like the presentation, it's really easy to read. I have one question though.

    Throughout, I see [math ] [/math ] tags, are they supposed to be interpreted by some software I don't have, or do they serve as a warning to the reader of where text ends and math begins?

  15. #30
    Global Moderator

    Quote Originally Posted by turillian@gmail.com View Post
    I would like to add my thanks for this thread, it's everything that my textbook is, and then some. I very much like the presentation, it's really easy to read. I have one question though.

    Throughout, I see [math ] [/math ] tags, are they supposed to be interpreted by some software I don't have, or do they serve as a warning to the reader of where text ends and math begins?
    The math software on this site is disabled. Once it is fixed you will see all the equations.


