(Note: I'm reading the problem as saying that both x and y lie in the interval [0, 1].)
I've been stuck on an assignment question for hours and I could use a little help.
The question is this:
------------------
Given:
f(x,y) = x + y   for 0 <= x <= 1 and 0 <= y <= 1
       = 0       otherwise
Find the marginal densities of X and Y.
------------------
My understanding is that the marginal density function f_X(x) is found by integrating f(x,y) over all values of y; in this case I took that to mean integrating x + y from -infinity to 1. But when I evaluate that definite integral, it diverges. The same thing happens when I try to find f_Y(y).
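To show where I'm at, here's a quick sympy check of my current guess: maybe I'm only supposed to integrate over the region where the density is actually nonzero, i.e. y in [0, 1], rather than all the way from -infinity (I'm not sure this is right, which is part of my question):

```python
from sympy import symbols, integrate, Rational

x, y = symbols("x y")

# Guess: integrate the joint density x + y over y only where it is
# nonzero, namely 0 <= y <= 1, instead of over (-infinity, 1].
marginal_x = integrate(x + y, (y, 0, 1))

print(marginal_x)  # prints x + 1/2
```

At least this gives a finite answer, x + 1/2, instead of diverging, but I don't know if dropping the (-infinity, 0) part of the range is actually justified.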
So if somebody could offer some guidance, I would be grateful.