
Thread: [SOLVED] Derivative Vectors/Gradient

  1. #1
    Member
    Joined
    Jul 2008
    Posts
    119

    [SOLVED] Derivative Vectors/Gradient

    Problem:
    For each of the following functions, find the derivative vector $\displaystyle \nabla f(x)$ for those points $\displaystyle x \in \mathbb{R}^n$ where it is defined:
    a. $\displaystyle f(x)=e^{\| x \| ^2}$
    c. $\displaystyle f(x) = \frac{1}{\| x \|^2}$
    ================================
    a. I know that $\displaystyle \| x \| = \sqrt{x_{1}^2+x_{2}^2+ ... + x_{n}^2}$, so $\displaystyle \| x \|^2 = x_{1}^2+x_{2}^2+ ... + x_{n}^2 $.

    So $\displaystyle f(x)=e^{\| x \| ^2} = e^{x_{1}^2+x_{2}^2+ ... + x_{n}^2}$.

    Taking the partial derivatives gives

    $\displaystyle \frac{\partial f}{\partial x_{1}}(x)= 2x_{1}e^{x_{1}^2+x_{2}^2+ ... + x_{n}^2} $

    ....

    $\displaystyle \frac{\partial f}{\partial x_{n}}(x)= 2x_{n}e^{x_{1}^2+x_{2}^2+ ... + x_{n}^2} $

    So, the gradient would be

    $\displaystyle \nabla f(x) = \left( \frac{\partial f}{\partial x_{1}}(x), \frac{\partial f}{\partial x_{2}}(x), ....,\frac{\partial f}{\partial x_{n}}(x) \right) = $

    $\displaystyle \left( 2x_{1}e^{x_{1}^2+x_{2}^2+ ... + x_{n}^2}, ..., 2x_{n}e^{x_{1}^2+x_{2}^2+ ... + x_{n}^2}\right)$
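
    If I'm not mistaken, this can be written more compactly in vector form, since $\displaystyle \nabla \left( \| x \| ^2 \right) = 2x$:

    $\displaystyle \nabla f(x) = 2e^{\| x \| ^2}\, x$

    and it is defined for every $\displaystyle x \in \mathbb{R}^n$.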

    However, I have a hard time understanding the gradient vector/derivative vector. I'm slightly confused because I thought $\displaystyle \nabla f(x)$ was called the gradient and not a vector. If I can just understand this, then I can do (c) as well.

    Thank you for your time.
    Last edited by Paperwings; Aug 22nd 2008 at 03:48 AM.

  2. #2
    Senior Member Spec's Avatar
    Joined
    Aug 2007
    Posts
    318
    Quote Originally Posted by Paperwings View Post
    However, I have a hard time understanding the gradient vector/derivative vector. I'm slightly confused because I thought $\displaystyle \nabla f(x)$ was called the gradient and not a vector. If I can just understand this, then I can do (c) as well.
    It is called the gradient, but it is also a vector. More precisely, at a point $\displaystyle \overline{x}$ the gradient $\displaystyle \nabla f(\overline{x})$ is normal to the level set of $\displaystyle f$ through $\displaystyle \overline{x}$, and it points in the direction in which $\displaystyle f$ increases fastest at that point.
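    For a concrete picture (a small example of my own, not part of the original problem): take $\displaystyle f(x,y)=x^2+y^2$. Then $\displaystyle \nabla f(x,y)=(2x,2y)$, which at every point is perpendicular to the level curve through that point (a circle centred at the origin) and points radially outward, the direction of fastest increase.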

  3. #3
    Member
    Joined
    Jul 2008
    Posts
    119
    Thank you, Spec. I realized that my book refers to the gradient as the gradient vector or derivative vector.
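
    For the record, here is my attempt at (c), in case anyone wants to check it: writing $\displaystyle f(x) = \frac{1}{\| x \|^2} = \left( x_{1}^2+x_{2}^2+ ... + x_{n}^2 \right)^{-1}$, the partial derivatives are $\displaystyle \frac{\partial f}{\partial x_{i}}(x) = -\frac{2x_{i}}{\| x \|^4}$, so $\displaystyle \nabla f(x) = -\frac{2x}{\| x \|^4}$, defined for all $\displaystyle x \neq 0$.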

  4. #4
    Super Member wingless's Avatar
    Joined
    Dec 2007
    From
    Istanbul
    Posts
    585
    It helps to recall the definition of del.

    For $\displaystyle \mathbb{R}^3$,
    $\displaystyle \nabla = \frac{\partial}{\partial x}\mathbf{i} + \frac{\partial}{\partial y}\mathbf{j} + \frac{\partial}{\partial z}\mathbf{k}$

    For $\displaystyle \mathbb{R}^n$,
    $\displaystyle \nabla = \left ( \frac{\partial}{\partial x_1}, \frac{\partial}{\partial x_2} ,\frac{\partial}{\partial x_3}, \dots , \frac{\partial}{\partial x_n} \right ) = \sum_{i=1}^{n} \frac{\partial}{\partial x_i}\vec{e}_i $
    (here $\displaystyle \vec{e}_i$ denotes the $\displaystyle i$-th standard basis vector of the space)

    So, as you can see, del is treated as a vector.


    Now we can derive the operators (for $\displaystyle \mathbb{R}^3$).

    Let $\displaystyle f(x,y,z)$ be a scalar function.
    $\displaystyle \text{grad} f = \nabla f = \left (\frac{\partial}{\partial x}\mathbf{i} + \frac{\partial}{\partial y}\mathbf{j} + \frac{\partial}{\partial z}\mathbf{k}\right ) f = \frac{\partial f}{\partial x}\mathbf{i} + \frac{\partial f}{\partial y}\mathbf{j} + \frac{\partial f}{\partial z}\mathbf{k}$

    What we did here was multiply a vector (del) by a scalar ($\displaystyle f$).
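
    For example (a quick illustration of my own): if $\displaystyle f(x,y,z) = xy + z^2$, then $\displaystyle \nabla f = y\,\mathbf{i} + x\,\mathbf{j} + 2z\,\mathbf{k}$.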


    Let F be a vector function such that $\displaystyle F(x,y,z) = M\mathbf{i} + N \mathbf{j} + P \mathbf{k}$.

    $\displaystyle \text{div} F = \nabla \cdot F = \left (\frac{\partial}{\partial x}\mathbf{i} + \frac{\partial}{\partial y}\mathbf{j} + \frac{\partial}{\partial z}\mathbf{k}\right )\cdot \left ( M\mathbf{i} + N \mathbf{j} + P \mathbf{k} \right ) = \frac{\partial M}{\partial x} + \frac{\partial N}{\partial y} + \frac{\partial P}{\partial z}$

    We just applied the dot product here.
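
    For example (again my own illustration): if $\displaystyle F(x,y,z) = x^2\,\mathbf{i} + xy\,\mathbf{j} + z\,\mathbf{k}$, then $\displaystyle \nabla \cdot F = 2x + x + 1 = 3x + 1$.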


    And curl,
    $\displaystyle \text{curl} F = \nabla \times F = \left (\frac{\partial}{\partial x}\mathbf{i} + \frac{\partial}{\partial y}\mathbf{j} + \frac{\partial}{\partial z}\mathbf{k}\right )\times \left ( M\mathbf{i} + N \mathbf{j} + P \mathbf{k} \right ) = \left|
    \begin{array}{ccc}
    \mathbf{i} & \mathbf{j} & \mathbf{k} \\
    \tfrac{\partial }{\partial x} & \tfrac{\partial }{\partial y} & \tfrac{\partial }{\partial z} \\
    M & N & P
    \end{array}
    \right|$
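
    Expanding the determinant, for reference, gives

    $\displaystyle \text{curl}\, F = \left( \frac{\partial P}{\partial y} - \frac{\partial N}{\partial z} \right)\mathbf{i} + \left( \frac{\partial M}{\partial z} - \frac{\partial P}{\partial x} \right)\mathbf{j} + \left( \frac{\partial N}{\partial x} - \frac{\partial M}{\partial y} \right)\mathbf{k}$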
