1. ## Continuous random variable

I'm stuck at this question, if anyone could help please.

$f_{XY}(x,y) = kxy, \quad 0 < x < 1,\ 0 < y < 1 - x$

Show k = 24 -- I proved this by taking the double integral over the support and setting it equal to 1.
Find the marginal probability density functions of X and Y.
Find the pdf of U = X + Y, where X + Y <= u (0 < u < 1).

My marginal pdf for X was $12x(1-x)^2 , 0 < x < 1$
and for Y, was $12y(1-y)^2 , 0 < y < 1$

Is this correct? And if so, how would I find the pdf of U = X + Y -- do I integrate $(x+y) \cdot 24xy$?

Also, my expected values E(X) and E(Y) both came out to 1/5.
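As a sanity check (my own sketch, not part of the original post, and assuming the support is the triangle 0 < x < 1, 0 < y < 1 - x), the normalization constant and the marginal of X can be verified numerically with a simple midpoint rule:

```python
# Sanity check (my own sketch, not from the thread): midpoint-rule
# integration of the joint pdf f(x, y) = 24*x*y over the assumed
# triangular support 0 < x < 1, 0 < y < 1 - x.
n = 400                      # grid resolution per axis
h = 1.0 / n                  # cell width
total = 0.0                  # should come out near 1 if k = 24 is right
for i in range(n):
    x = (i + 0.5) * h        # midpoint of cell i in x
    for j in range(n):
        y = (j + 0.5) * h    # midpoint of cell j in y
        if x + y < 1:        # keep only points inside the triangle
            total += 24 * x * y * h * h
print(total)                 # close to 1.0, consistent with k = 24

# Marginal of X at x = 0.3: integrate f(0.3, y) over 0 < y < 0.7 and
# compare with the claimed 12*x*(1-x)**2 = 12*0.3*0.49 = 1.764.
x = 0.3
m = sum(24 * x * ((j + 0.5) * h) * h for j in range(n) if (j + 0.5) * h < 1 - x)
print(m)                     # close to 1.764
```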

2. Originally Posted by mathsandphysics
I'm stuck at this question, if anyone could help please.

$f_{XY}(x,y) = kxy, \quad 0 < x < 1,\ 0 < y < 1 - x$

Show k = 24 -- I proved this by taking the double integral over the support and setting it equal to 1.
Find the marginal probability density functions of X and Y.
Find the pdf of U = X + Y, where X + Y <= u (0 < u < 1).

My marginal pdf for X was $12x(1-x)^2 , 0 < x < 1$
and for Y, was $12y(1-y)^2 , 0 < y < 1$

Is this correct? And if so, how would I find the pdf of U = X + Y -- do I integrate $(x+y) \cdot 24xy$?

Also, my expected values E(X) and E(Y) both came out to 1/5.
Your marginal distributions are correct. But I get E(X) = E(Y) = 2/5.

There are several options for finding the pdf of U = X + Y.

One option is to use the Change of Variable Theorem (see for example Walpole, Myers and Myers - Probability and Statistics for Engineers and Scientists): let U = X + Y and V = X, get the joint pdf of U and V, and then calculate the marginal pdf of U. I get g(u) = 4u^3 for 0 < u < 1.
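That answer can be cross-checked numerically (my own sketch, assuming the triangular support above): if g(u) = 4u^3, the CDF of U is u^4, so P(X + Y <= 0.5) should equal 0.5^4 = 0.0625. A midpoint-rule integral of the joint pdf over the region {x + y <= 0.5} confirms it:

```python
# Cross-check (my own sketch) of g(u) = 4*u**3: the implied CDF is u**4,
# so P(X + Y <= 0.5) should equal 0.5**4 = 0.0625.  Midpoint-rule
# integral of the joint pdf 24*x*y over the region {x + y <= 0.5}.
n = 400                      # grid resolution per axis
h = 1.0 / n                  # cell width
u = 0.5                      # point at which to check the CDF
p = 0.0                      # accumulates P(X + Y <= u)
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        if x + y <= u:       # keep only points in the event {X + Y <= u}
            p += 24 * x * y * h * h
print(p)                     # close to 0.0625 = u**4
```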

Another option (more work) would be to use the method of distribution functions (see for example Mendenhall, Scheaffer and Wackerly - Mathematical Statistics with Applications).
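For completeness, here is a sketch of how the distribution-function method works out for this problem (using the triangular support 0 < x < 1, 0 < y < 1 - x):

$$F_U(u) = P(X + Y \le u) = \int_0^u \int_0^{u-x} 24xy \, dy \, dx = \int_0^u 12x(u-x)^2 \, dx = u^4, \quad 0 < u < 1,$$

so $g(u) = F_U'(u) = 4u^3$, which agrees with the change-of-variable answer.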

3. Thanks, I made a typo for E(X) -- I got 2/5 too.

I don't think we have covered these methods so far, so I'm going to try the change of variables.
Thanks