# Measuring model uncertainty

• December 22nd 2009, 05:19 AM
JohnQ
Measuring model uncertainty
I have a model with a number of parameters. I can calculate the posterior distribution of the parameter vector. This, in part, means that I can calculate the expected value and the variance of each parameter.

Question. How do I calculate model uncertainty as a single quantity? While I have some ad hoc ideas, I'd like to take a principled approach.

One idea is to look at the variance of the likelihood, $Var(L(\theta))$, but I'm not sure how to approach this other than numerically.

Another idea is to compute something like this:
$\sum_i \left | \frac {\partial L} {\partial \theta_i} \right | \sigma_i$

Any and all help appreciated. :)
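A numerical sketch of the sensitivity-weighted sum above. The log-likelihood, parameter values, and standard deviations here are all made up for illustration; the gradient is approximated by central differences.

```python
import numpy as np

# Hypothetical log-likelihood of a simple two-parameter model
# (the model and data are invented purely for illustration).
def log_likelihood(theta):
    mu, sigma = theta
    data = np.array([1.0, 2.0, 1.5, 2.5])
    return -0.5 * np.sum((data - mu) ** 2) / sigma**2 - len(data) * np.log(sigma)

def sensitivity_weighted_uncertainty(theta, sds, f, h=1e-6):
    """Approximate sum_i |dL/dtheta_i| * sigma_i via central differences."""
    total = 0.0
    for i in range(len(theta)):
        step = np.zeros_like(theta)
        step[i] = h
        # Central-difference estimate of the partial derivative dL/dtheta_i
        grad_i = (f(theta + step) - f(theta - step)) / (2 * h)
        total += abs(grad_i) * sds[i]
    return total

theta_hat = np.array([1.75, 0.6])     # posterior means (illustrative)
posterior_sds = np.array([0.3, 0.1])  # posterior standard deviations (illustrative)
print(sensitivity_weighted_uncertainty(theta_hat, posterior_sds, log_likelihood))
```

Note that a parameter the likelihood does not depend on contributes zero to the sum, as desired.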
• December 22nd 2009, 09:57 AM
hametceq
hi,
I think you are overcomplicating this: you already know the variance of each parameter. If the parameters are independent, then the system uncertainty would be the sum of the variances of the parameters. If they are correlated, you need to take the correlation into account too. Hope this helps. Merry Christmas.
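In code, the independent-case sum of variances is the trace of the covariance matrix, and accounting for correlation brings in the off-diagonal covariances. The covariance matrix below is made up for illustration.

```python
import numpy as np

# Illustrative posterior covariance matrix for three parameters
# (values are invented; in practice estimate it from posterior samples).
cov = np.array([
    [0.40, 0.10, 0.00],
    [0.10, 0.25, 0.05],
    [0.00, 0.05, 0.90],
])

# Independent case: total uncertainty as the sum of the variances,
# i.e. the trace of the covariance matrix.
total_if_independent = np.trace(cov)

# Correlated case: Var(sum of parameters) also includes the covariances.
ones = np.ones(cov.shape[0])
total_with_correlation = ones @ cov @ ones

print(total_if_independent)    # ≈ 1.55
print(total_with_correlation)  # ≈ 1.85
```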
Sincerely
Hametceq
• December 22nd 2009, 10:19 AM
JohnQ
I thought of the sum of the variances at first, but I do not understand why that should be the correct approach. Why not the sum of the standard deviations, for example?

Also, I do think the likelihood should figure into the calculation somehow, since different parameters may have different importance in the model. An important parameter with low variance and an unimportant one with high variance should be better than the important parameter having high variance and the unimportant one having low variance.

As an extreme example, consider a parameter that has no influence on the model at all. Its variance (or standard deviation) should not be counted in the sum. That is why I thought of weighting the standard deviations by the derivative of the likelihood with respect to each parameter: for the parameter with no influence on the model, the derivative would be zero.
• December 22nd 2009, 11:29 AM
hametceq
I see what the question is now. OK, do the following.
Strategy 1:
STEP 1: find the covariance matrix for all the parameters.
STEP 2: find the generalized variance (the determinant of that matrix).

Strategy 2 (detailed analysis; this is what I will do for my own analysis):
STEP 1: find the covariance matrix for all the parameters.
STEP 2: find the eigenvalues and eigenvectors of that matrix (you cannot decide by yourself which parameter is more or less important, so multivariate analysis should be applied).
STEP 3: eliminate the less important eigenvalues; correspondingly, you will reduce the dimension.
STEP 4: find the generalized variance from the covariance matrix restricted to the important directions.
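Both strategies can be sketched with NumPy. The covariance matrix and the number of retained eigenvalues below are made up for illustration; `np.linalg.eigh` is the symmetric-matrix eigendecomposition.

```python
import numpy as np

# Illustrative posterior covariance matrix (invented values).
cov = np.array([
    [2.0, 0.6, 0.1],
    [0.6, 1.0, 0.2],
    [0.1, 0.2, 0.05],
])

# Strategy 1: generalized variance = determinant of the covariance matrix.
gen_var = np.linalg.det(cov)

# Strategy 2: eigendecomposition, keeping only the dominant directions.
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh returns eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]       # reorder largest-first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = 2                                   # keep the two largest eigenvalues (illustrative cutoff)
reduced_gen_var = np.prod(eigvals[:k])  # generalized variance in the reduced space

print(gen_var, reduced_gen_var)
```

The determinant equals the product of all eigenvalues, so Strategy 2 amounts to dropping the smallest factors from that product.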

Merry Christmas
Sincerely
Hametceq

p.s.
Q: Why not the sum of the standard deviations, for example?
Ans: Not just in your case but in general: variances are additive for independent random variables, while standard deviations are not.
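A quick simulation of that last point. The distributions and sample size are arbitrary; the variances (4 and 9) add to 13, while the standard deviations (2 and 3) do not add to sqrt(13).

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 2.0, size=200_000)  # Var(X) = 4
y = rng.normal(0.0, 3.0, size=200_000)  # Var(Y) = 9, independent of X

# Variances add for independent variables: Var(X+Y) = 4 + 9 = 13 ...
print(np.var(x + y))                    # close to 13

# ... but standard deviations do not: sd(X+Y) = sqrt(13) ≈ 3.61, not 2 + 3 = 5.
print(np.std(x + y))                    # close to 3.61
```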