# Thread: Joint Density Expected Value

1. ## Joint Density Expected Value

The question is:

Let X be a binomial random variable based on n trials with success probability $\displaystyle p_x$; let Y be an independent binomial random variable based on m trials with success probability $\displaystyle p_y$. Find E(W) and Var(W), where W = 4X + 6Y.

(E(W) - expected value of W)

So from what I understand, that means $\displaystyle f_X(x)= \binom{n}{x}p_x^x(1-p_x)^{n-x}$, and the probability function for Y is similar.

I also know that given a variable say Z = X + Y, that $\displaystyle f_Z(z)=\int\limits_{-\infty}^{\infty} f_X(x)f_Y(z-x) \, dx$.

However, since I have W=4X+6Y, I thought maybe it should be something like $\displaystyle f_W(w)=\int\limits_{1}^{w} f_X(x)f_Y(\frac{w-4x}{6}) \, dx$. However, that becomes a really nasty expression which I'm not sure how to simplify. I feel like I am doing something wrong.

I should be able to figure out the variance if I can do the expected value, because I should be able to just use the equation for variance.
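You don't actually need the density of W for this: expectation is linear, so E(W) = 4E(X) + 6E(Y), and since X and Y are independent, Var(W) = 16 Var(X) + 36 Var(Y). A quick sketch checking that against a brute-force enumeration of the joint pmf, with made-up parameters (n = 5, p_x = 0.3, m = 4, p_y = 0.6 are illustrative, not from the problem):

```python
from math import comb

# Illustrative parameters -- not from the thread, chosen arbitrarily
n, px = 5, 0.3
m, py = 4, 0.6

def binom_pmf(k, N, p):
    """P(X = k) for X ~ Binomial(N, p)."""
    return comb(N, k) * p**k * (1 - p)**(N - k)

# Exact E(W) and E(W^2) for W = 4X + 6Y by summing over the joint pmf;
# independence means the joint pmf factors into binom_pmf * binom_pmf.
EW = sum((4*x + 6*y) * binom_pmf(x, n, px) * binom_pmf(y, m, py)
         for x in range(n + 1) for y in range(m + 1))
EW2 = sum((4*x + 6*y)**2 * binom_pmf(x, n, px) * binom_pmf(y, m, py)
          for x in range(n + 1) for y in range(m + 1))
VarW = EW2 - EW**2

# Compare with linearity / independence:
# E(W) = 4*n*px + 6*m*py, Var(W) = 16*n*px*(1-px) + 36*m*py*(1-py)
print(EW, 4*n*px + 6*m*py)                        # both ≈ 20.4
print(VarW, 16*n*px*(1-px) + 36*m*py*(1-py))      # both ≈ 51.36
```

The enumeration and the closed-form answers agree, so no convolution is needed for E(W) or Var(W).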

2. ## Re: Joint Density Expected Value

First, you seem to have missed that a Binomial Random Variable has a discrete distribution. The integral is quite inappropriate.

Second, given X, can you find the distribution of 4X? You don't HAVE to do it all at the same time. You may learn more from the intermediate exploration.

3. ## Re: Joint Density Expected Value

My teacher hinted that for a continuous variable, given W = 4X, $\displaystyle f_W(w)=\frac{1}{4}f_X\!\left(\frac{w}{4}\right)$. Looking through my text, I'm not sure how I'd find it for the discrete case. My guess would be something like $\displaystyle p_W(w)=\binom{n}{\frac{w}{4}}p_x^{ \frac{w}{4} }(1-p_x)^{n-\frac{w}{4}}$. However, that intuitively doesn't feel quite right, because you can't take the factorial of a non-integer. So I'm at an entire loss, to be honest.
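In the discrete case there is no Jacobian factor at all: W = 4X simply moves X's mass to the multiples of 4, so P(W = w) = P(X = w/4) when 4 divides w and 0 otherwise, and the "factorial of a non-integer" never comes up. A minimal sketch with made-up parameters (n = 5, p = 0.3 are illustrative):

```python
from math import comb

n, p = 5, 0.3   # illustrative values, not from the thread

def p_X(x):
    """pmf of X ~ Binomial(n, p)."""
    return comb(n, x) * p**x * (1 - p)**(n - x) if 0 <= x <= n else 0.0

def p_W(w):
    """pmf of W = 4X: mass only where w is a multiple of 4."""
    return p_X(w // 4) if w % 4 == 0 else 0.0

# W's support is {0, 4, 8, ..., 4n}; each point inherits X's mass unchanged.
support = [w for w in range(4*n + 1) if p_W(w) > 0]
print(support)                                        # [0, 4, 8, 12, 16, 20]
print(sum(p_W(w) for w in range(4*n + 1)))            # ≈ 1.0
```

Note that w/4 is always a whole number wherever p_W is nonzero, which is exactly the point made in the next reply.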

4. ## Re: Joint Density Expected Value

You're close.

Okay, let's start with $\displaystyle R = X_{1} + X_{2}$ where $\displaystyle X_{i}$ are IID Binomial and see what happens.

Eventually, we'll need to realize that if we have W = 4X, then W/4 is a Whole Number! No need to worry about Rational Numbers vs. Factorials.

5. ## Re: Joint Density Expected Value

Supposing $\displaystyle X_1$ has n trials and $\displaystyle X_2$ has m trials. Then $\displaystyle p_R(r)=\sum_{x=0}^{r}\binom{n}{x}\binom{m}{r-x}p^{r}(1-p)^{n+m-r}$.

I got that from doing:

$\displaystyle p_R(r)=\sum_{x=0}^{r}\binom{n}{x}p^x(1-p)^{n-x} \binom{m}{r-x}p^{r-x}(1-p)^{m-(r-x)}$ and simplifying.
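That convolution collapses nicely: by the Vandermonde identity, $\sum_{x}\binom{n}{x}\binom{m}{r-x}=\binom{n+m}{r}$, so R is itself Binomial(n + m, p). A quick numeric check of the formula above, with arbitrary small parameters (n = 5, m = 4, p = 0.3 are illustrative):

```python
from math import comb

n, m, p = 5, 4, 0.3   # illustrative values, not from the thread

def binom_pmf(k, N, prob):
    """P(X = k) for X ~ Binomial(N, prob)."""
    return comb(N, k) * prob**k * (1 - prob)**(N - k)

for r in range(n + m + 1):
    # the convolution sum from the post (x restricted to where both pmfs live)
    conv = sum(binom_pmf(x, n, p) * binom_pmf(r - x, m, p)
               for x in range(max(0, r - m), min(n, r) + 1))
    # it should equal the Binomial(n + m, p) pmf at r
    assert abs(conv - binom_pmf(r, n + m, p)) < 1e-12
print("convolution matches Binomial(n+m, p)")
```

This confirms the sum of two independent binomials with the same p is binomial, which is why the convolution simplifies at all.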

However, I distinctly remember my teacher saying something about how the pdf of X+X and the pdf of 2X are not the same.

edit: As a side note, I was thinking about it: given some pdf for X, X ranges over the sample space, so multiplying X by a constant, as in aX, means you're taking each individual value and multiplying it by a. At least, this is my understanding of it. Trying to apply this logic to a binomial distribution: say you have 5 trials and you're looking at 4X. If you multiply the values by 4, the only one that fits inside 5 is 4. So you only get the 4th trial?

6. ## Re: Joint Density Expected Value

You are right. X1 + X2 requires a convolution and 2X does not.
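The difference shows up immediately in the variances: for IID copies, Var(X₁ + X₂) = 2 Var(X), while Var(2X) = 4 Var(X). A sketch comparing the two pmfs directly, with made-up parameters (n = 3, p = 0.5 are illustrative):

```python
from math import comb

n, p = 3, 0.5   # illustrative values, not from the thread

def p_X(x):
    """pmf of X ~ Binomial(n, p)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

def p_sum(r):
    """pmf of X1 + X2 (IID copies) via discrete convolution."""
    return sum(p_X(x) * p_X(r - x) for x in range(max(0, r - n), min(n, r) + 1))

def p_2X(w):
    """pmf of 2X: mass only on even integers, inherited from X."""
    return p_X(w // 2) if w % 2 == 0 else 0.0

def variance(pmf, support):
    mean = sum(k * pmf(k) for k in support)
    return sum(k * k * pmf(k) for k in support) - mean ** 2

support = range(2 * n + 1)         # both variables live on 0..2n
var_sum = variance(p_sum, support)
var_2x = variance(p_2X, support)
print(var_sum)   # ≈ 1.5 = 2 * Var(X), with Var(X) = n*p*(1-p) = 0.75
print(var_2x)    # ≈ 3.0 = 4 * Var(X)
```

Same support, same mean, but 2X is lumpier (it skips the odd integers) and twice as spread out, which is exactly why the convolution and the scaling give different distributions.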