# Math Help - Integration problem

1. ## Integration problem

This must be a classical problem, but I don't know where to start. I have a simple (actually it's more complicated, but essentially like this) integral: $y(t) = \int_0^{\infty} x(t)\ dt$. It seems that the function x(t) could be best described as a normal (Gaussian) process with mean mu and some variance (so x(t) is a random variable with a Gaussian distribution). I think y(t) must then also be a random variable, but is there any way to define y(t) in closed form? Or is this problem reasonable at all? How should I continue? Thanks for your help!

2. Originally Posted by hapap
This must be a classical problem, but I don't know where to start. I have a simple (actually it's more complicated, but essentially like this) integral: $y(t) = \int_0^{\infty} x(t)\ dt$. It seems that the function x(t) could be best described as a normal (Gaussian) process with mean mu and some variance (so x(t) is a random variable with a Gaussian distribution). I think y(t) must then also be a random variable, but is there any way to define y(t) in closed form? Or is this problem reasonable at all? How should I continue? Thanks for your help!
I suspect that there is something missing here, can you provide more information about the real problem?

RonL

I'll try to explain in more detail: I have observations y(t), which are "known" (= assumed) to be of the previously presented integral form. Now, x(t) is the true value to be estimated, which is conventionally assumed to be some deterministic function. In that case it's easy to handle y. However, it is known that the observations are contaminated with some noise, so it would be natural to treat x(t) as a random variable (mu = true value) instead of a deterministic function, and therefore y(t) would be random as well(?). The next step will be to find the ML estimate for mu when only the observations y(t) are known (x, mu, sigma unknown), but I'd like to get rid of the integral before searching for the (at least numerical) ML estimate. If I can find a distribution for y, I could find ML estimates for mu (and sigma), right?

The problem would of course be easily solved by adding the noise term OUTSIDE the integral, but I don't want to do that because I need to separate the true variations of x from the measurement equipment noise, since the observation probably contains both.

I don't know if this makes the problem even less understandable...
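As an aside on the ML step mentioned above: *if* the observations could be reduced to i.i.d. Gaussian draws (a strong simplifying assumption, not something established in this thread), the ML estimates for mu and sigma have well-known closed forms. The sketch below is only an illustration of that special case; `gaussian_mle` and the sample values are hypothetical.

```python
import math

# Hypothetical sketch: if repeated observations could be modeled as
# i.i.d. N(mu, sigma^2) draws, the maximum-likelihood estimates are
# the sample mean and the 1/n sample variance (not 1/(n-1)).
def gaussian_mle(samples):
    n = len(samples)
    mu_hat = sum(samples) / n
    sigma2_hat = sum((s - mu_hat) ** 2 for s in samples) / n
    return mu_hat, math.sqrt(sigma2_hat)

mu_hat, sigma_hat = gaussian_mle([2.1, 1.9, 2.3, 1.8, 2.0])
print("mu_hat =", mu_hat, "sigma_hat =", sigma_hat)
```

The open question in this thread is precisely whether y(t) admits such a distribution at all, so this only shows what the final step would look like if it did.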

4. Originally Posted by hapap
I'll try to explain in more detail: I have observations y(t), which are "known" (= assumed) to be of the previously presented integral form. Now, x(t) is the true value to be estimated, which is conventionally assumed to be some deterministic function. In that case it's easy to handle y. However, it is known that the observations are contaminated with some noise, so it would be natural to treat x(t) as a random variable (mu = true value) instead of a deterministic function, and therefore y(t) would be random as well(?). The next step will be to find the ML estimate for mu when only the observations y(t) are known (x, mu, sigma unknown), but I'd like to get rid of the integral before searching for the (at least numerical) ML estimate. If I can find a distribution for y, I could find ML estimates for mu (and sigma), right?

The problem would of course be easily solved by adding the noise term OUTSIDE the integral, but I don't want to do that because I need to separate the true variations of x from the measurement equipment noise, since the observation probably contains both.

I don't know if this makes the problem even less understandable...
I will have to go away and think about this now, but at least there is a

RonL

5. Originally Posted by hapap
This must be a classical problem, but I don't know where to start. I have a simple (actually it's more complicated, but essentially like this) integral: . It seems that the function x(t) could be best described as a normal (Gaussian) process with mean mu and some variance (so x(t) is a random variable with Gaussian distribution). I think y(t) must then be also a random variable, but is there any way to define y(t) in a closed form? Or is this problem reasonable at all? How should I continue? Thanks for your help!
I see two problems with $y(t) = \int_0^{\infty} x(t)\ dt,\ x(t) \sim N(\mu,\sigma).$ First, $y(t)$ cannot be a function of $t$, since $t$ is the variable of integration. Second, the integral is (almost surely) not finite, since the $x(t)$ are identically distributed and the interval of integration is infinite.

To correct these problems, suppose we define $y = \int_0^1 x(t)\ dt,\ x(t) \sim N(\mu,\sigma),$ with the $x(t)$ independent. Now I want to argue informally that the random variable $y = \mu$ almost surely. Based on my brief reading of a bit of stochastic calculus, I think we need to do the following to evaluate the integral. Take $\Omega$ as the underlying probability space for the random variables $x(t)$ and write for each realization $\omega \in \Omega$, $y(\omega) = \int_0^1 x(t,\omega)\ dt.$ Using the Riemann integral, $y(\omega) = \lim_{n \to \infty} \sum_{i=1}^n x(i/n,\omega)/n .$ From the Strong Law of Large Numbers, that limit is $\mu$ for almost all $\omega,$ and therefore the random variable $y = \mu$ almost surely.
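A quick numerical check of the informal argument above (a sketch only, using Python's standard `random` module and assuming the sampled $x(i/n)$ are independent $N(\mu,\sigma)$ draws as in the setup):

```python
import random

# Riemann-sum approximation of y = integral_0^1 x(t) dt, where each
# sampled x(i/n) is an independent N(mu, sigma) draw.
def simulate_y(mu, sigma, n, rng):
    return sum(rng.gauss(mu, sigma) for _ in range(n)) / n

rng = random.Random(0)
mu, sigma = 2.0, 1.0
for n in (10, 1_000, 100_000):
    # As n grows, the Riemann sum concentrates around mu = 2.0,
    # consistent with y = mu almost surely.
    print(n, simulate_y(mu, sigma, n, rng))
```

The spread around $\mu$ shrinks like $\sigma/\sqrt{n}$, which is exactly why the limit is degenerate rather than a nontrivial random variable.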

So I think that, as you've formulated the problem, $y$ is constant almost surely and thus not much of a random variable and not getting at what you want.

6. Originally Posted by JakeD
$y(t)$ cannot be a function of $t$, since $t$ is the variable of integration. Second, the integral is (almost surely) not finite, since the $x(t)$ are identically distributed and the interval of integration is infinite.
Oh yes, of course. I might want to correct the formula to this then: $y(t) = \int_0^t x(s)\ ds$. Does this make any more sense? However, I'm interested (among other things) in what happens when t approaches infinity, although we won't get an infinite number of measurements y in real life.

7. Originally Posted by hapap
Oh yes, of course. I might want to correct the formula to this then: $y(t) = \int_0^t x(s)\ ds$. Does this make any more sense?
OK, now $y(t)$ is well-defined. But my second paragraph still applies, except that now $y(t) = \mu t$ almost surely.
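The same simulation as before can illustrate this: approximating $y(t) = \int_0^t x(s)\ ds$ with a Riemann sum of independent $N(\mu,\sigma)$ draws (again just a sketch under that independence assumption), the result hugs $\mu t$ for every $t$:

```python
import random

# Riemann-sum approximation of y(t) = integral_0^t x(s) ds with
# independent x(s) ~ N(mu, sigma), using n steps of width dt = t / n.
def simulate_y(t, mu, sigma, n, rng):
    dt = t / n
    return sum(rng.gauss(mu, sigma) for _ in range(n)) * dt

rng = random.Random(1)
mu, sigma = 2.0, 1.0
for t in (0.5, 1.0, 3.0):
    # Each estimate should sit close to mu * t.
    print(t, simulate_y(t, mu, sigma, 200_000, rng), mu * t)
```

So a white-noise model for $x(t)$ washes out in the integral; to get a genuinely random $y(t)$ one needs a process whose fluctuations survive integration (e.g. correlated noise), which is worth keeping in mind for the ML-estimation step.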