
Math Help - Derivative of a matrix inverse and matrix determinant

  1. #1
    Newbie
    Joined
    Mar 2007
    Posts
    24

    Derivative of a matrix inverse and matrix determinant

    Hi,

    my multivariate statistics course strikes again with unproven linear algebra results!

    This time there's a calculation in which I have to differentiate the inverse of a matrix and the determinant of the same matrix, varying the matrix itself.

    Luckily, in my course we are working only with strictly positive definite symmetric matrices.

    I would like to ask if the following results are correct.

    Result 1:

    \frac{d{\Sigma ^ {-1}}}{d \Sigma}=-{\Sigma}^{-2}

    Proof:

    It is easy to show that the derivative of a matrix product AB follows the same rule as the derivative of a product of two scalars (by taking the limit of the variation of the product and neglecting higher-order infinitesimals), provided of course that A and B are both differentiable:


    \frac{d{AB}}{d \Sigma}=\frac{d{A}}{d \Sigma}B+A\frac{d{B}}{d \Sigma}
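    As a sanity check (not part of the argument itself), here is a small numpy sketch that verifies this product rule numerically for matrix-valued functions of a scalar parameter t; the particular curves A(t), B(t) are arbitrary choices of mine.

    import numpy as np

    # arbitrary smooth matrix-valued curves A(t), B(t) for a finite-difference check
    def A(t):
        return np.array([[1.0 + t, 2.0], [0.5 * t**2, 3.0 - t]])

    def B(t):
        return np.array([[np.sin(t), 1.0], [2.0, np.cos(t)]])

    def dA(t):  # exact derivative of A(t)
        return np.array([[1.0, 0.0], [t, -1.0]])

    def dB(t):  # exact derivative of B(t)
        return np.array([[np.cos(t), 0.0], [0.0, -np.sin(t)]])

    t, h = 0.7, 1e-6
    # central finite difference of the product A(t)B(t)
    fd = (A(t + h) @ B(t + h) - A(t - h) @ B(t - h)) / (2 * h)
    # product rule: (dA/dt) B + A (dB/dt)
    pr = dA(t) @ B(t) + A(t) @ dB(t)
    print(np.allclose(fd, pr, atol=1e-6))  # True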

    Now, if our matrix variable is constrained to be always invertible, we know that:

    \frac{ d{\left(\Sigma \Sigma^{-1}\right)}}{d \Sigma}=\Sigma^{-1}+\Sigma\frac{d{\Sigma^{-1}}}{d \Sigma}=\frac{ d{I}}{d \Sigma}=0

    where 0 is the zero square matrix. Solving this equation for the derivative of \Sigma^{-1} gives the result above. Notice that Result 1 only requires the variable to be invertible, not necessarily symmetric or positive definite.
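    Here is a quick numerical check of Result 1 (my own sketch, not part of the proof), reading d/d\Sigma as the derivative along the curve \Sigma(t)=\Sigma+tI, so that d\Sigma/dt=I as used in the product rule above; the test matrix and step size are arbitrary.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))
    Sigma = A @ A.T + 4 * np.eye(4)   # symmetric positive definite test matrix

    h = 1e-6
    I = np.eye(4)
    # central finite difference of Sigma(t)^{-1} along Sigma(t) = Sigma + t*I at t = 0
    fd = (np.linalg.inv(Sigma + h * I) - np.linalg.inv(Sigma - h * I)) / (2 * h)
    # Result 1: the derivative should be -Sigma^{-2}
    exact = -np.linalg.inv(Sigma) @ np.linalg.inv(Sigma)
    print(np.allclose(fd, exact, atol=1e-5))  # True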

    Result 2: (this one requires the matrix to be symmetric and invertible)

    \frac{ d{|\Sigma|}}{d \Sigma}=|\Sigma|tr(\Sigma^{-1})

    Proof:

    Let \Lambda be the diagonal form of \Sigma:

    |\Sigma|=\prod _{i=1}^{n}{\lambda_i}


    \frac{d|\Sigma|}{d\Sigma}=\left( \prod _{i=1}^{n}{\lambda_i}\right )\left(\sum _{i=1}^{n} {\left( \lambda_i^{-1}\frac{d\lambda_i}{d\Sigma} \right)}\right)=|\Sigma|tr\left(\Lambda^{-1}\frac{d\Lambda}{d\Sigma} \right )

    \Lambda=R' \Sigma R

    where R also depends on \Sigma, but is constrained by:

    R'R=I

    Differentiating the constraint R'R=I, we get:

    \frac{dR'}{d\Sigma}R+R'\frac{dR}{d\Sigma}=0 \quad\Rightarrow\quad W=\frac{dR'}{d\Sigma}R=-W'

    so W is antisymmetric.


     \frac{d\Lambda}{d\Sigma}=\frac{dR'}{d\Sigma}\Sigma R+R'R+R'\Sigma\frac{dR}{d\Sigma}=W\Lambda+I+\Lambda W'


    Combining these results and remembering that the trace is linear and satisfies tr(AB)=tr(BA), we finally get:

    tr\left(\Lambda^{-1}\frac{d\Lambda}{d\Sigma} \right )=tr(W)+tr(\Lambda^{-1})+tr(W')=tr(\Lambda^{-1})=tr(\Sigma^{-1})

    since tr\left(\Lambda^{-1}W\Lambda\right)=tr(W), tr(W)+tr(W')=0 because W is antisymmetric, and tr(\Lambda^{-1})=tr(R'\Sigma^{-1}R)=tr(\Sigma^{-1}).
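    A small numpy check of the last two equalities (my own sketch; the test matrix is arbitrary): the trace is invariant under the similarity \Lambda=R'\Sigma R, so tr(\Lambda^{-1})=tr(\Sigma^{-1}).

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((3, 3))
    Sigma = A @ A.T + 3 * np.eye(3)      # symmetric positive definite test matrix

    lam, R = np.linalg.eigh(Sigma)       # Sigma = R diag(lam) R', with R'R = I
    Lambda = np.diag(lam)
    print(np.allclose(R.T @ Sigma @ R, Lambda))          # Lambda = R' Sigma R
    print(np.isclose(np.trace(np.linalg.inv(Lambda)),
                     np.trace(np.linalg.inv(Sigma))))    # tr(Lambda^{-1}) = tr(Sigma^{-1})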


    We assumed that \Sigma varies over a set of matrices constrained to be symmetric and invertible.

    Actually, we can get the same result with similar calculations for any \Sigma constrained to be invertible, even if it is not diagonalizable, by using the triangular form instead (which always exists if we extend to complex variables).
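    And a numerical check of Result 2 itself (again my own sketch, with the same reading of d/d\Sigma as the derivative along \Sigma(t)=\Sigma+tI; matrix, seed and step size are arbitrary):

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((4, 4))
    Sigma = A @ A.T + 4 * np.eye(4)      # symmetric positive definite test matrix

    h = 1e-6
    I = np.eye(4)
    # central finite difference of |Sigma(t)| along Sigma(t) = Sigma + t*I at t = 0
    fd = (np.linalg.det(Sigma + h * I) - np.linalg.det(Sigma - h * I)) / (2 * h)
    # Result 2: the derivative should be |Sigma| * tr(Sigma^{-1})
    exact = np.linalg.det(Sigma) * np.trace(np.linalg.inv(Sigma))
    print(np.isclose(fd, exact, rtol=1e-4))  # True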
    Last edited by rargh; February 24th 2011 at 12:27 AM.

  2. #2
    Newbie
    Joined
    Mar 2007
    Posts
    24
    I'm sorry, I should have checked the textbooks first; this result has been known for ages. It's just a special case where the variable used in the differentiation is the whole matrix.
