
Math Help - Differentiability

  1. #1
    Member
    Joined
    Mar 2010
    Posts
    107

    Differentiability

    Hey everyone. I am having trouble understanding the following statements from my textbook (Essential Calculus by Stewart):

    "..We showed that if f is differentiable at a, then

    \Delta y= f'(a)\Delta x + \epsilon\Delta x where \epsilon \rightarrow 0 as \Delta x \rightarrow 0"

    What does the \epsilon\Delta x term have to do with this equation? I divided everything by \Delta x and got this:

    \frac{\Delta y}{\Delta x} - f'(a) = \epsilon

    I'm assuming lim is involved so this is what it would be:

    \lim_{\Delta x\rightarrow 0}\frac{\Delta y}{\Delta x} - f'(a) = \epsilon

    So is epsilon error? If so, why would there be error? I'm definitely confused.
    The book goes on to say more:
    Now consider a function of two variables, z = f(x,y), and suppose x changes from a to a+\Delta x and y changes from b to b+\Delta y. Then the corresponding increment of z is

    \Delta z = f(a + \Delta x,b + \Delta y) - f(a,b)

    Thus, the increment \Delta z represents the change in the value of f when (x,y) changes from (a,b) to (a+\Delta x, b+\Delta y). By analogy with (5) (which is the first equation with the epsilon crap) we define the differentiability of a function of two variables as follows:

    If z = f(x,y), then f is differentiable at (a,b) if \Delta z can be expressed in the form

    \Delta z = f_x(a,b)\Delta x + f_y(a,b)\Delta y + \epsilon_1\Delta x + \epsilon_2\Delta y

    where \epsilon_1 and \epsilon_2 \rightarrow 0 as (\Delta x,\Delta y) \rightarrow (0,0).

    It makes sense except for the epsilon stuff.

    Any help is appreciated and thanks in advance!
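
    For concreteness, here is the one-variable formula worked out for a simple illustrative example (f(x)= x^2, not from the book): \Delta y= (a+\Delta x)^2- a^2= 2a\Delta x+ (\Delta x)\Delta x= f'(a)\Delta x+ \epsilon\Delta x with \epsilon= \Delta x, which does go to 0 as \Delta x\rightarrow 0.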

  2. #2
    MHF Contributor

    Joined
    Apr 2005
    Posts
    16,232
    Thanks
    1795
    Basically, yes, \epsilon is "error" (more precisely a "relative error": the actual error divided by the change in the variable). And there is error because we are approximating the possibly very complex function f by a linear function. We can approximate the function f(x) by the linear function f'(a)(x- a)+ f(a). The "error" is the true value, y= f(x), minus that: error(x)= y- [f'(a)(x-a)+ f(a)]= (y- f(a))- f'(a)(x- a). Now, taking \Delta y= y- f(a) and \Delta x= x- a, that says error(x)= \Delta y- f'(a)\Delta x.

    Since we know f(a), we would certainly want our error to be 0 there. And, of course, the further we are from x= a, that is, the further we are from the point where we have exact information, the larger we would expect our error to be. That is, we would expect our error to be some "relative error" function times \Delta x. That "relative error" is what they are calling \epsilon. The fact that the error, \epsilon\Delta x, goes to 0 as \Delta x goes to 0 simply means this is an approximation to f around x= a. The fact that \epsilon itself goes to 0 means this is the best possible linear approximation.
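
    To see this numerically, here is a quick sketch (the choice f(x)= x^2 with a= 1, so f'(a)= 2, is just an illustration, not from the book): it computes \epsilon= \frac{\Delta y}{\Delta x}- f'(a) for shrinking \Delta x and shows it going to 0.

    def f(x):
        return x * x              # illustrative example function f(x) = x^2

    a = 1.0
    fprime_a = 2.0                # f'(a) for this example

    for dx in [0.1, 0.01, 0.001, 0.0001]:
        dy = f(a + dx) - f(a)         # the increment Delta y
        eps = dy / dx - fprime_a      # the "relative error" epsilon
        print(dx, eps)                # algebraically eps = dx here, so it shrinks with dx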

    In the rest of what you write, they are approximating a function of two variables by a linear function of two variables, which you can think of as giving a tangent plane rather than a tangent line. The two \epsilons are the relative errors in the directions of the coordinate axes. The errors in all other directions can be calculated as a vector sum of those.
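
    As a concrete instance of the two-variable formula (with the illustrative choice f(x,y)= x^2+ y^2, not from the book): \Delta z= (a+\Delta x)^2+ (b+\Delta y)^2- a^2- b^2= 2a\Delta x+ 2b\Delta y+ (\Delta x)\Delta x+ (\Delta y)\Delta y, so f_x(a,b)= 2a, f_y(a,b)= 2b, and one valid choice is \epsilon_1= \Delta x, \epsilon_2= \Delta y, both of which go to 0 as (\Delta x,\Delta y)\rightarrow (0,0).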

  3. #3
    Member
    Joined
    Mar 2010
    Posts
    107
    Ah okay. But your equation is a bit different from the book's.

    The book has it: \Delta y= f'(a)\Delta x + \epsilon\Delta x

    We have it as one of these: (I actually drew all of that in Paint/MS Word... took forever)

    EDIT: Read the subsequent post.
    Last edited by lilaziz1; October 13th 2010 at 01:11 PM.

  4. #4
    Member
    Joined
    Mar 2010
    Posts
    107
    Wait, hold up. What does this have to do with continuity and differentiability? In Chapter 2 (what I pasted in the first post was from Ch. 11), it says:

    "Recall that if y = f(x) and x changes from a to a + \Delta x, we defined the increment of y as  \Delta y = f(a + \Delta x) - f(a)

    According to the definition of a derivative, we have \lim_{\Delta x \rightarrow 0} \frac{\Delta y}{\Delta x} = f'(a)

    So if we denote by epsilon the difference between the quotient and the derivative, we obtain

    \lim_{\Delta x \rightarrow 0} \epsilon = \lim_{\Delta x \rightarrow 0} \left(\frac{\Delta y}{\Delta x} - f'(a)\right) = f'(a) - f'(a) = 0

    But \epsilon = \frac{\Delta y}{\Delta x} - f'(a) \Rightarrow \Delta y = f'(a)\Delta x + \epsilon\Delta x

    If we define epsilon to be 0 when \Delta x = 0, then epsilon becomes a continuous function of \Delta x.

    Thus, for a differentiable function f, we can write

    \Delta y = f'(a)\Delta x + \epsilon\Delta x where \epsilon\rightarrow 0 as \Delta x \rightarrow 0

    and epsilon is a continuous function of delta x. This property of differentiable functions is what enables us to prove the Chain Rule."

    I'm lost after "If we define epsilon.."
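
    A worked version of that last step, with the illustrative choice f(x)= x^2 again (not from the book): for \Delta x\neq 0, \epsilon(\Delta x)= \frac{\Delta y}{\Delta x}- f'(a)= \frac{(a+\Delta x)^2- a^2}{\Delta x}- 2a= \Delta x. The quotient makes no sense at \Delta x= 0 (it would be division by zero), so \epsilon(0) is simply defined to be 0; since \epsilon(\Delta x)= \Delta x\rightarrow 0 as \Delta x\rightarrow 0, that choice agrees with the limit and makes \epsilon a continuous function of \Delta x.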
