
Math Help - How do I differentiate a vector times a vector?

  1. #1
    Senior Member Twig's Avatar
    Joined
    Mar 2008
    From
    Gothenburg
    Posts
    396

    How do I differentiate a vector times a vector?

    Hi

    This came up as I was reading about non-linear least squares fitting.

    We have data points  (t_{i}, y_{i}) \mbox{ , } i = 1, \dots, m .
    We wish to find the vector \vec{x} of parameters that gives the best fit in the least squares sense.

    So we have our model function  f(t, \vec{x}) \mbox{ , } f : \mathbb{R}^{n+1} \longrightarrow \mathbb{R} .

    Now we define the residual function  \vec{r} : \mathbb{R}^{n} \longrightarrow \mathbb{R}^{m} by  r_{i}(\vec{x}) = y_{i} - f(t_{i}, \vec{x}) \mbox{ , } i = 1, \dots, m .

    Then we wish to minimize the function  \phi(\vec{x}) = \frac{1}{2} \vec{r}(\vec{x})^{T} \vec{r}(\vec{x}) .
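
    For concreteness, here is a hypothetical instance of this setup (a made-up example, not from the book): take the exponential model  f(t, \vec{x}) = x_{1} e^{x_{2} t} , so n = 2. Then the residuals are  r_{i}(\vec{x}) = y_{i} - x_{1} e^{x_{2} t_{i}} \mbox{ , } i = 1, \dots, m , and

     \phi(\vec{x}) = \frac{1}{2} \sum_{i=1}^{m} \left( y_{i} - x_{1} e^{x_{2} t_{i}} \right)^{2} .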

    Here is my question: how would I get the gradient of this function?

    The book says  \nabla \phi(\vec{x}) = J^{T}(\vec{x}) \vec{r}(\vec{x}) , where J denotes the Jacobian of \vec{r}.

    I don't really follow...

    thanks!

  2. #2
    MHF Contributor

    Joined
    May 2008
    Posts
    2,295
    Thanks
    7
    Quote Originally Posted by Twig
    [...] Here is my question: how would I get the gradient of this function? The book says \nabla \phi(\vec{x}) = J^{T}(\vec{x}) \vec{r}(\vec{x}), where J denotes the Jacobian.
    \phi(\vec{x}) = \frac{1}{2} \sum_{i=1}^{m} (r_i(\vec{x}))^2. Thus for any 1 \leq j \leq n we have \frac{\partial \phi}{\partial x_j} = \sum_{i=1}^{m} r_i(\vec{x}) \frac{\partial r_i}{\partial x_j}. Since the Jacobian of \vec{r} has entries J_{ij}(\vec{x}) = \frac{\partial r_i}{\partial x_j}, this sum is the j-th component of J^{T}(\vec{x}) \vec{r}(\vec{x}), which is exactly \nabla \phi(\vec{x}) = J^{T}(\vec{x}) \vec{r}(\vec{x}).
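
    To check the identity concretely, here is a minimal numerical sketch (an illustration, not from the book), using the hypothetical exponential model f(t, \vec{x}) = x_{1} e^{x_{2} t} from the post above; it compares J^T \vec{r} against a finite-difference gradient of \phi:

    # minimal sketch: verify grad(phi) = J^T r numerically for the
    # hypothetical model f(t, x) = x1 * exp(x2 * t)
    import numpy as np

    t = np.array([0.0, 0.5, 1.0, 1.5])   # data abscissae t_i
    y = np.array([2.0, 2.6, 3.4, 4.5])   # data values y_i
    x = np.array([1.8, 0.6])             # parameter vector (x1, x2)

    def r(x):
        # residuals r_i(x) = y_i - f(t_i, x)
        return y - x[0] * np.exp(x[1] * t)

    def phi(x):
        return 0.5 * r(x) @ r(x)

    # Jacobian of r: J[i, j] = dr_i/dx_j, computed by hand
    J = np.column_stack([-np.exp(x[1] * t),
                         -x[0] * t * np.exp(x[1] * t)])

    grad_formula = J.T @ r(x)

    # central finite differences on phi, for comparison
    h = 1e-6
    grad_fd = np.array([(phi(x + h * e) - phi(x - h * e)) / (2 * h)
                        for e in np.eye(2)])

    print(grad_formula)   # agrees with grad_fd to ~1e-8
    print(grad_fd)

    The two printed vectors agree up to finite-difference error, which is exactly the claimed \nabla \phi(\vec{x}) = J^{T}(\vec{x}) \vec{r}(\vec{x}).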
