# Thread: Using discrete and continuous rvs in a distribution

1. ## Using discrete and continuous rvs in a distribution

Suppose that W, the amount of moisture in the air on a given day, is a gamma random variable with parameters $\displaystyle (t, \beta)$.
Suppose also that, given that $\displaystyle W = w$, the number of accidents during that day, call it N, has a Poisson distribution with mean $\displaystyle w$.
Show that the conditional distribution of W given that N = n is the gamma distribution with parameters $\displaystyle (t+n, \beta +\sum_{i = 1}^n x_i)$.

I would like some help to write the formula for the second supposition, as it goes from a continuous rv to a discrete one

Thanks

2. Originally Posted by FGT12
Suppose that W, the amount of moisture in the air on a given day, is a gamma random variable with parameters $\displaystyle (t, \beta)$.
Suppose also that, given that $\displaystyle W = w$, the number of accidents during that day, call it N, has a Poisson distribution with mean $\displaystyle w$.
Show that the conditional distribution of W given that N = n is the gamma distribution with parameters $\displaystyle (t+n, \beta +\sum_{i = 1}^n x_i)$.

I would like some help to write the formula for the second supposition, as it goes from a continuous rv to a discrete one

Thanks
So we know the distribution of W and N|W. To get the distribution of W|N note that $\displaystyle f_{W|N}(w|n) \propto f_{W, N} (w, n) = f_{N|W} (n|w) f_W (w)$. You should be able to recognize the RHS as the kernel of a gamma with the appropriate parameters.

3. When I evaluate the RHS I get
$\displaystyle \frac{e^{-(n+\beta)w} w^{x_1+...+x_n} (\beta)^t }{x_1!x_2!...x_n!\Gamma(t) }$

I do not see how to go further with this question

4. That shouldn't be what you get when you evaluate the RHS. Come to think of it, you didn't even define what $\displaystyle x_i$ is... I think what you intended was that $\displaystyle X_i | W$ is distributed Poisson with mean W, $\displaystyle i = 1, ..., n$, and you want the distribution of $\displaystyle W| X_1, ..., X_n$.

Can you post the question exactly as it is written? Among other things that don't make sense, if you interpret the question as I wrote it above you get $\displaystyle (t + \sum x_i, \beta + n)$ and not $\displaystyle (t + n, \beta + \sum x_i)$.

5. Question:
Let $\displaystyle W$ be a gamma random variable with parameters $\displaystyle (t, \beta )$, and suppose that conditional on $\displaystyle W = w, X_1, X_2, ..., X_n$ are independent exponential random variables with rate $\displaystyle w$.
Show that the conditional distribution of $\displaystyle W$ given that $\displaystyle X_1=x_1, X_2=x_2,..., X_n=x_n$ is gamma with parameters $\displaystyle (t+n, \beta + \sum_{i=1}^{n}x_i )$

6. Originally Posted by FGT12
Question:
Let $\displaystyle W$ be a gamma random variable with parameters $\displaystyle (t, \beta )$, and suppose that conditional on $\displaystyle W = w, X_1, X_2, ..., X_n$ are independent exponential random variables with rate $\displaystyle w$.
Show that the conditional distribution of $\displaystyle W$ given that $\displaystyle X_1=x_1, X_2=x_2,..., X_n=x_n$ is gamma with parameters $\displaystyle (t+n, \beta + \sum_{i=1}^{n}x_i )$
That is false; it should be that $\displaystyle W|X = x$ is $\displaystyle (t + \sum x_i, \beta + n)$.

$\displaystyle \displaystyle f_{W|X} (w|x) \propto f_{X|W} (x|w) f_W (w)$

$\displaystyle \displaystyle= \left(\prod_{i = 1} ^ n \frac{w^{x_i} e^{-w}}{x_i !}\right) \frac{\beta^t}{\Gamma(t)} w^{t - 1} e^{-\beta w} = \frac{\beta^t}{\Gamma(t) \prod_{i = 1} ^ n x_i !} w^{t + \sum x_i - 1} e^{-(\beta + n) w}$

$\displaystyle \displaystyle\propto w^{t + \sum x_i - 1} e^{-(\beta + n)w}$

(valid for positive w) which is the kernel of a Gamma $\displaystyle (t + \sum x_i, \beta + n)$.
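If it helps, here is a quick numeric sanity check of that proportionality argument in Python. All the numbers below (t, beta, and the observations) are made up for illustration; they do not come from the problem.

```python
import math

# Made-up values: prior is Gamma(t, beta) in the rate parameterization,
# and xs are hypothetical Poisson counts x_1, ..., x_n.
t, beta = 2.0, 1.5
xs = [3, 1, 4, 0]
n, s = len(xs), sum(xs)

def prior(w):
    """Gamma(t, beta) density."""
    return beta ** t / math.gamma(t) * w ** (t - 1) * math.exp(-beta * w)

def likelihood(w):
    """Product of Poisson(w) pmfs at the observed x_i."""
    p = 1.0
    for x in xs:
        p *= w ** x * math.exp(-w) / math.factorial(x)
    return p

def claimed_posterior(w):
    """Gamma(t + sum x_i, beta + n) density."""
    a, b = t + s, beta + n
    return b ** a / math.gamma(a) * w ** (a - 1) * math.exp(-b * w)

# If the claim is right, likelihood * prior is a constant multiple of the
# claimed posterior, so these ratios should all be equal:
ratios = [likelihood(w) * prior(w) / claimed_posterior(w) for w in (0.3, 1.0, 2.7)]
print(ratios)
```

The three ratios come out identical; that common value is the normalizing constant, which is why you never need to compute the marginal of X explicitly.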

7. What exactly do you mean by the kernel in this instance? How can I recognise the kernels of other distributions?

And is there no way of getting to

$\displaystyle f(w)=\frac{(\beta+n)e^{-(\beta+n)w}((\beta+n)w)^{t+\sum x_i -1}}{\Gamma(t+\sum x_i)}$

Could we integrate from zero to infinity so that it equals one?

8. When I say the "kernel" I mean the part of the density that matters, i.e. everything but a normalizing constant (in this case, the x_i are considered fixed, so you can drop anything that is only a function of the x_i, as well as any other fixed constants). You can retrieve the normalizing constant because the density must integrate to 1. So if you can show that a pdf is proportional to the kernel of something you know, then you know what the pdf is, because you can recover the normalizing constant by integrating the kernel.
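To illustrate recovering the normalizing constant by integration, here is a short Python sketch. The shape and rate values a and b are arbitrary (think a = t + n, b = beta + sum of the x_i): integrating the gamma kernel $\displaystyle w^{a-1}e^{-bw}$ over $\displaystyle (0, \infty)$ gives $\displaystyle \Gamma(a)/b^a$, so the normalizing constant is its reciprocal.

```python
import math

# Arbitrary illustrative shape and rate.
a, b = 3.5, 2.0

def kernel(w):
    """Unnormalized Gamma(a, b) density: w^(a-1) * exp(-b*w)."""
    return w ** (a - 1) * math.exp(-b * w)

# Trapezoid rule on (0, 50/b); the tail beyond that is negligible,
# and the kernel vanishes at w = 0.
n_steps, upper = 200_000, 50.0 / b
h = upper / n_steps
integral = h * (sum(kernel(i * h) for i in range(1, n_steps)) + kernel(upper) / 2)

# Analytically, the integral of the kernel is Gamma(a) / b^a.
exact = math.gamma(a) / b ** a
print(integral, exact)
```

So the normalized density is `kernel(w) * b**a / math.gamma(a)`, which is the usual gamma pdf.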

The technique I used is nice because it saves you from having to calculate the marginal of X, which requires integration.

To give another example, here are a couple of useful kernels for the normal distribution: $\displaystyle e^{\frac {-1} {2 \sigma^2} (x - \mu)^2}$ as well as $\displaystyle e^{\frac{-1}{2\sigma^2}(x^2 - 2x\mu)}$.
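As a small check that those two expressions really are kernels of the same density, the Python snippet below (with arbitrary example values of mu and sigma) shows their ratio does not depend on x:

```python
import math

# Arbitrary example values for the normal parameters.
mu, sigma = 1.2, 0.8

def k1(x):
    """First kernel: exp(-(x - mu)^2 / (2 sigma^2))."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

def k2(x):
    """Second kernel: exp(-(x^2 - 2 x mu) / (2 sigma^2))."""
    return math.exp(-(x ** 2 - 2 * x * mu) / (2 * sigma ** 2))

# Expanding the square: k1(x) = k2(x) * exp(-mu^2 / (2 sigma^2)) for every x,
# so the ratio is a constant (something not involving x).
ratios = [k1(x) / k2(x) for x in (-1.0, 0.5, 3.0)]
expected = math.exp(-mu ** 2 / (2 * sigma ** 2))
print(ratios, expected)
```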

9. ## Re: Using discrete and continuous rvs in a distribution

So I'm working on this same problem and wondering, how is the claim of proportionality that you use here justified? I thought that it would be

$\displaystyle f_{W|X}(w|x) P_{X}(x) = f_{X|W}(x|w)P_{W}(w)$

so that when you divide, you're not dividing by a constant but instead dividing by a function of $\displaystyle x$.

10. ## Re: Using discrete and continuous rvs in a distribution

Yeah, that's fine. For finding the law of W|X you can think of all the stuff on the right side of the conditioning bar as being constants when you do any proportionality stuff.

11. ## Re: Using discrete and continuous rvs in a distribution

Oh right, duh, the thing we are to prove is that the resulting distribution has parameters which are themselves functions of $\displaystyle x_{i}$! Making sense now, thank you!

12. ## Re: Using discrete and continuous rvs in a distribution

Okay, I lied, I've been off-and-on staring at this some more and I'm back to not really getting it. I intuitively understand the idea of how, conditional on $\displaystyle X$, things in terms of $\displaystyle X$ are like a constant, but I'm not sure how to make rigorous use of that idea.

Here's an outline of what I've done, followed by a more detailed description if it's helpful.

By some simple algebraic manipulation and Bayes's Law, I get

$\displaystyle P_{W|X}(w|x) = \frac{P_{X|W}(x|w)P_{W}(w)}{P_{X}(x)}$

Where the expressions in the numerator are described in the assumptions of the problem. From that, I combine expressions with a base of $\displaystyle w$ and with a base of $\displaystyle e$. The result is $\displaystyle w^{t+n-1}e^{-w(\beta +\sum x_{i})}$, which seems to me the (as you call it) "kernel" of a gamma distribution with parameters $\displaystyle t+n, \, \, \beta+\sum x_{i}$. Now I know that you earlier said this cannot be right, and maybe that's why I'm running into problems; however, I'm not seeing how what I've done is wrong or how anything else could work.

But as a result of so organizing my terms, my "coefficient" is now

$\displaystyle \frac{\beta^{t}}{\Gamma (t)P_{X}(x)}$

For this to truly be a gamma distribution in those parameters, I need my coefficient to be

$\displaystyle \frac{\Big( \beta+\sum x_{i} \Big)^{t+n}}{\Gamma(t+n)}$

So how do I do this? I don't have freedom to choose what any of the terms are, so it doesn't seem like I am able to compensate for this difference by assigning some value to a constant coefficient or anything like that.

A more detailed derivation of the expression that I ultimately obtain:

$\displaystyle P(X_{1}=x_{1}, ..., X_{n}=x_{n}|W=w)P(W=w) \quad = \\\\ P(W=w|X_{1}=x_{1}, ..., X_{n}=x_{n})P(X_{1}=x_{1}, ..., X_{n}=x_{n}) \quad \Longrightarrow \\\\ P(W=w|X_{1}=x_{1}, ..., X_{n}=x_{n}) \quad = \quad \frac{P(X_{1}=x_{1}, ..., X_{n}=x_{n}|W=w)P(W=w)}{P(X_{1}=x_{1}, ..., X_{n}=x_{n})} \quad = \\\\ \frac{w^{n}e^{-w(x_{1}+...+x_{n})}\frac{\beta^{t}}{\Gamma (t)}w^{t-1}e^{-\beta w}}{P(X_{1}=x_{1}, ..., X_{n}=x_{n})} \quad = \quad \frac{\beta^{t}}{\Gamma (t) P(X_{1}=x_{1}, ..., X_{n}=x_{n})}{w^{t+n-1}e^{-w(\beta + \sum_{i=1}^{n}x_{i})}}$

13. ## Re: Using discrete and continuous rvs in a distribution

By the way, I just noticed that in your earlier derivation you were using the Poisson distribution, which is what the original poster had first posted.

However, in the original poster's reply (time-stamped May 18th 8:51 AM) when he wrote exactly what the problem was asking, he wrote the exponential distribution. So really, this problem should be about each $\displaystyle X_{i}|W$ being exponential with rate $\displaystyle w$.
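With the exponential model, a quick numeric check (again with made-up numbers, not data from the problem) confirms the conjugacy the problem claims: the unnormalized posterior is proportional to a Gamma $\displaystyle (t+n, \beta+\sum x_i)$ density, so the marginal $\displaystyle P_{X}(x)$ is exactly the constant that makes the coefficient come out right.

```python
import math

# Made-up values: prior Gamma(t, beta), and xs are hypothetical
# exponential observations x_1, ..., x_n.
t, beta = 2.0, 1.5
xs = [0.7, 2.1, 0.3]
n, s = len(xs), sum(xs)

def unnormalized(w):
    """Likelihood times prior, with constants in the x_i dropped."""
    lik = w ** n * math.exp(-w * s)              # product of w * exp(-w * x_i)
    prior_kernel = w ** (t - 1) * math.exp(-beta * w)
    return lik * prior_kernel

def claimed_posterior(w):
    """Gamma(t + n, beta + sum x_i) density; note n and sum x_i
    swap roles compared with the Poisson case."""
    a, b = t + n, beta + s
    return b ** a / math.gamma(a) * w ** (a - 1) * math.exp(-b * w)

# Proportionality means these ratios are all the same constant:
ratios = [unnormalized(w) / claimed_posterior(w) for w in (0.4, 1.0, 2.5)]
print(ratios)
```

The ratios agree, so dividing by that common constant yields exactly the Gamma $\displaystyle (t+n, \beta+\sum x_i)$ density, coefficient and all.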