# Thread: Integrals over independent random variables

1. Integrals over independent random variables

Suppose that for each $\displaystyle i\in[0,1]$, $\displaystyle U(i)$ is an independent, uniformly distributed random variable.

Am I right in thinking that for any continuous function $\displaystyle f$:

$\displaystyle \int_0^1{f(U(i))di}=\int_0^1{E[f(U(i))]di}$

where $\displaystyle E$ is the expectation operator? (That is, assuming these integrals are actually defined?) Is there a reference for this?

I imagine it's a straightforward application of the MCT and the LLN, but I'm a little worried by the fact that Brownian motion is often referred to as an integral of white noise, though perhaps, more strictly, it's an integral with respect to white noise, which is why that case is different?
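The LLN intuition can be sanity-checked numerically. The sketch below is illustrative only (the choice $\displaystyle f(u)=u^2$ is arbitrary, and the helper name is made up): discretise $\displaystyle i$ at $\displaystyle n$ grid points, draw an independent uniform at each point, and the Riemann sum should approach $\displaystyle E[f(U)]=\int_0^1 f(u)\,du=1/3$.

```python
import random

# Illustrative sketch (not from the thread): approximate the integral
# of f(U(i)) over i in [0,1] by a Riemann sum over n grid points, with
# an independent uniform draw U(i_k) at each grid point. By the LLN
# this average tends to E[f(U)] as n grows.
def riemann_sum_of_samples(f, n, seed=0):
    rng = random.Random(seed)
    return sum(f(rng.random()) for _ in range(n)) / n

f = lambda u: u * u  # an arbitrary continuous f; E[f(U)] = 1/3
approx = riemann_sum_of_samples(f, 200_000)
assert abs(approx - 1/3) < 1e-2  # agrees with E[f(U)] to Monte Carlo error
```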

Tom

2. Originally Posted by cfp
Suppose that for each $\displaystyle i\in[0,1]$, $\displaystyle U(i)$ is an independent, uniformly distributed random variable.

Am I right in thinking that for any continuous function $\displaystyle f$:

$\displaystyle \int_0^1{f(U(i))di}=\int_0^1{E[f(U(i))]di}$

where $\displaystyle E$ is the expectation operator? (That is, assuming these integrals are actually defined?) Is there a reference for this?

I imagine it's a straightforward application of the MCT and the LLN, but I'm a little worried by the fact that Brownian motion is often referred to as an integral of white noise, though perhaps, more strictly, it's an integral with respect to white noise, which is why that case is different?

Tom
What do you think $\displaystyle U(i)$ denotes? Or more precisely what do you think $\displaystyle f(U(i))$ denotes?

CB

There is a continuum of independent uniform random variables indexed by $\displaystyle i\in[0,1]$. For each such $\displaystyle i$, $\displaystyle U(i)$ is the realisation of that random variable, that is to say, a sample from it. (So a value in $\displaystyle [0,1]$.)

(If you want to be precise the state space $\displaystyle \Omega$ is the set of all functions from $\displaystyle [0,1]$ to $\displaystyle [0,1]$, and each random variable is a functional $\displaystyle p_i$ on this space defined by $\displaystyle p_i(\omega)=\omega(i)$. If $\displaystyle \omega\in\Omega$ is the realised state then $\displaystyle U(i)=p_i(\omega)=\omega(i)$.)

I perhaps should have been clearer in my original message. It's traditional in the field I work in to blur the distinction between random variables and samples from them, for notational convenience.

4. Originally Posted by cfp
There is a continuum of independent uniform random variables indexed by $\displaystyle i\in[0,1]$. For each such $\displaystyle i$, $\displaystyle U(i)$ is the realisation of that random variable, that is to say, a sample from it. (So a value in $\displaystyle [0,1]$.)

(If you want to be precise the state space $\displaystyle \Omega$ is the set of all functions from $\displaystyle [0,1]$ to $\displaystyle [0,1]$, and each random variable is a functional $\displaystyle p_i$ on this space defined by $\displaystyle p_i(\omega)=\omega(i)$. If $\displaystyle \omega\in\Omega$ is the realised state then $\displaystyle U(i)=p_i(\omega)=\omega(i)$.)

I perhaps should have been clearer in my original message. It's traditional in the field I work in to blur the distinction between random variables and samples from them, for notational convenience.
You have to specify the set on which your random variable is uniformly distributed, so here you want $\displaystyle U_i \sim U(0,1)$. Without specifying that $\displaystyle U_i$ is uniform over $\displaystyle [0,1]$ (or $\displaystyle (0,1)$) you have not given a statement of the problem.

CB

5. Ahh, I misunderstood your original point. But yeah, standard uniforms are fine (i.e. on $\displaystyle [0,1]$), though it's easy to see that if my claim holds for the standard uniform, then it holds for every univariate probability distribution admitting a density: write such a variable as $\displaystyle F^{-1}(U)$ for a standard uniform $\displaystyle U$ and absorb $\displaystyle F^{-1}$ into $\displaystyle f$.

Which brings us back to the original question, is my claim that:

$\displaystyle \int_0^1{f(U(i))di}=\int_0^1{E[f(U(i))]di}$

true?

Thanks,

Tom

6. Originally Posted by cfp
Ahh, I misunderstood your original point. But yeah, standard uniforms are fine (i.e. on $\displaystyle [0,1]$), though it's easy to see that if my claim holds for the standard uniform, then it holds for every univariate probability distribution admitting a density: write such a variable as $\displaystyle F^{-1}(U)$ for a standard uniform $\displaystyle U$ and absorb $\displaystyle F^{-1}$ into $\displaystyle f$.

Which brings us back to the original question, is my claim that:

$\displaystyle \int_0^1{f(U(i))di}=\int_0^1{E[f(U(i))]di}$

true?

Thanks,

Tom
To me both sides look like they are equal to $\displaystyle E(f(X)),\ X \sim U(0,1)$, but I'm no expert on stochastic calculus of any kind.

CB

7. Gah, I've actually just noticed that there was a mistake in my description of the problem. $\displaystyle f$ is also a function of $\displaystyle i$, so the question is actually whether:

$\displaystyle \int_0^1{f(i,U(i))di}=\int_0^1{E[f(i,U(i))]di}$
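The extended claim can be checked numerically the same way. This is an illustrative sketch with the arbitrary choice $\displaystyle f(i,u)=iu$, for which $\displaystyle E[f(i,U(i))]=i/2$, so the right-hand side is $\displaystyle \int_0^1 i/2\,di = 1/4$:

```python
import random

# Illustrative sketch: approximate the left-hand side, the integral of
# f(i, U(i)) over i, by a Riemann sum over grid points i_k = k/n, with
# an independent uniform draw at each grid point. With f(i, u) = i*u
# the right-hand side evaluates to 1/4.
def lhs_riemann(f, n, seed=1):
    rng = random.Random(seed)
    return sum(f(k / n, rng.random()) for k in range(n)) / n

f = lambda i, u: i * u
assert abs(lhs_riemann(f, 200_000) - 0.25) < 1e-2  # matches the RHS
```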