Hi.

First off, this topic involves both linear algebra and statistics, so I'm not sure which forum it belongs in; apologies if this isn't the right place for it.

I have a matrix-valued random variable, C. The elements within each instance of C are dependent, but separate samples of C are independent, and each element has a known (say Gaussian) marginal distribution with known mean and variance:

C_{ij} \sim N(\mu_{C_{ij}},\sigma^2_{C_{ij}})
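
To make this concrete, here is a minimal numpy sketch of drawing one instance of C. mu_C and sigma_C are just placeholders for the known means and standard deviations, and the sketch samples the entries independently, which ignores the within-matrix dependence mentioned above (that would need the full joint covariance):

```python
import numpy as np

# Hypothetical placeholders for the known elementwise means and standard
# deviations of C; entries are sampled independently in this sketch,
# ignoring the within-matrix dependence.
rng = np.random.default_rng(0)
n = 3
mu_C = np.eye(n) + 0.1
sigma_C = np.full((n, n), 0.05)
C = rng.normal(mu_C, sigma_C)  # one instance of the matrix-valued variable
```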



I have another variable, v, which is a (rather complicated) function of C, such that:

v=\lambda(W^TCW)

where

W=(L^TC^{-1}L)^{-1}L^TC^{-1}

\lambda(\cdot) denotes the largest singular value of its argument and L is a constant matrix.
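
For concreteness, here is how v is computed for a single draw of C, continuing the sketch above. I've used a square, random stand-in for L so that all the products conform exactly as written:

```python
# Computing v for one draw of C. L_mat is a square, invertible stand-in
# for the constant L so the products conform exactly as written;
# np.linalg.svd returns singular values in descending order, so the
# first entry is lambda(W^T C W).
L_mat = rng.normal(size=(n, n))
C_inv = np.linalg.inv(C)
W = np.linalg.inv(L_mat.T @ C_inv @ L_mat) @ L_mat.T @ C_inv
v = np.linalg.svd(W.T @ C @ W, compute_uv=False)[0]
```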

I want to find an analytical expression for the mean and variance of v. So to break this down, what I fundamentally need to know is how the distribution of a matrix-valued random variable behaves under a matrix product, a matrix inverse, and an SVD.

Currently I am generating many instances of C, calculating v for each one, and estimating the moments of v from those samples. This takes a long time because the matrices are quite large, so an analytical expression would save me a lot of time.
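
In code, the procedure looks roughly like this (continuing the sketches above); this loop is what I'd like to replace with a closed-form expression:

```python
# Monte Carlo estimate of the mean and variance of v: draw many
# instances of C, evaluate v for each, and take sample moments.
def v_of(C):
    C_inv = np.linalg.inv(C)
    W = np.linalg.inv(L_mat.T @ C_inv @ L_mat) @ L_mat.T @ C_inv
    return np.linalg.svd(W.T @ C @ W, compute_uv=False)[0]

samples = [v_of(rng.normal(mu_C, sigma_C)) for _ in range(10_000)]
v_mean, v_var = np.mean(samples), np.var(samples, ddof=1)
```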


Is this possible? Any help would be greatly appreciated.

Mark