1. ## Simulation

Suppose you have some random vector $\bold{X} = (X_1, \dots, X_n)$ with density function $f(x_1, \dots, x_n)$, and you want to compute the expected value of some function of the random vector, i.e. $E[g(\bold{X})]$. We know that

$E[g(\bold{X})] = \int \int \cdots \int g(x_1, \dots, x_n) f(x_1, \dots, x_n) dx_1 \cdots dx_n$

What is a good way of developing approximation methods for computing $E[g(\bold{X})]$? If $g(\bold{X}) = \bold{X}$, then we are just computing the expected value of $\bold{X}$, so in that case the question reduces to: what are some good ways of approximating the expected value of a random vector?

2. Originally Posted by Sampras
Suppose you have some random vector $\bold{X} = (X_1, \dots, X_n)$ with density function $f(x_1, \dots, x_n)$, and you want to compute the expected value of some function of the random vector, i.e. $E[g(\bold{X})]$. We know that

$E[g(\bold{X})] = \int \int \cdots \int g(x_1, \dots, x_n) f(x_1, \dots, x_n) dx_1 \cdots dx_n$

What is a good way of developing approximation methods for computing $E[g(\bold{X})]$? If $g(\bold{X}) = \bold{X}$, then we are just computing the expected value of $\bold{X}$, so in that case the question reduces to: what are some good ways of approximating the expected value of a random vector?

If you can sample from the distribution with density $f(x_1, \dots, x_n)$, then the mean of $g$ evaluated at the sampled points is an unbiased estimator of $E[g(\bold{X})]$.

CB
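
CaptainBlack's suggestion is plain Monte Carlo integration: draw samples from $f$ and average $g$ over them. Here is a minimal sketch in Python with NumPy, using a hypothetical concrete case (the density $f$, the function $g$, and the sample size are my choices, not from the thread): $\bold{X} = (X_1, X_2)$ standard bivariate normal and $g(x_1, x_2) = x_1^2 + x_2^2$, for which the true value is $E[g(\bold{X})] = \text{Var}(X_1) + \text{Var}(X_2) = 2$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: X = (X1, X2) ~ standard bivariate normal,
# g(x) = x1^2 + x2^2, so the true value is E[g(X)] = 2.
def g(x):
    return x[..., 0] ** 2 + x[..., 1] ** 2

n = 100_000
samples = rng.standard_normal((n, 2))  # draws from the density f
estimate = g(samples).mean()           # unbiased Monte Carlo estimator

print(estimate)  # should land close to 2
```

The estimator is unbiased for any sample size; its standard error shrinks like $1/\sqrt{n}$, which is why the large $n$ above gives an estimate very near 2.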

3. Originally Posted by CaptainBlack
If you can sample from the distribution with density $f(x_1, \dots, x_n)$ then the mean of the function values for the sample is an unbiased estimator of the expectation of $g(\bold{X})$.

CB
In other words, if $\bold{X}^{(1)}, \bold{X}^{(2)}, \dots$ are i.i.d. samples from $f$, then $\lim\limits_{n \to \infty} \frac{g(\bold{X}^{(1)}) + \cdots + g(\bold{X}^{(n)})}{n} = E[g(\bold{X})]$?

4. Originally Posted by Sampras
In other words, if $\bold{X}^{(1)}, \bold{X}^{(2)}, \dots$ are i.i.d. samples from $f$, then $\lim\limits_{n \to \infty} \frac{g(\bold{X}^{(1)}) + \cdots + g(\bold{X}^{(n)})}{n} = E[g(\bold{X})]$?
Yes (almost surely, by the strong law of large numbers).

CB
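
The convergence in that limit can be seen numerically by tracking the running mean as the sample grows. A small sketch, again with a hypothetical example of my choosing (not from the thread): $X \sim \text{Uniform}(0,1)$ and $g(x) = x^2$, so $E[g(X)] = \int_0^1 x^2\,dx = 1/3$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical example: X ~ Uniform(0, 1), g(x) = x^2, so E[g(X)] = 1/3.
draws = rng.random(1_000_000)
values = draws ** 2

# Running mean after 1, 2, ..., n samples: (g(X^(1)) + ... + g(X^(k))) / k
running_mean = np.cumsum(values) / np.arange(1, len(values) + 1)

for k in (100, 10_000, 1_000_000):
    print(k, running_mean[k - 1])  # drifts toward 1/3 as k grows
```

Printing the running mean at increasing sample sizes shows it settling toward $1/3$, which is exactly the almost-sure convergence the strong law guarantees.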