Problem concerning Marginal Density function

I've been stuck on an assignment question for hours and I could use a little help.

The question is this:

------------------

Given:

f(x,y) = x + y   for 0 <= x <= 1 and 0 <= y <= 1

f(x,y) = 0       otherwise

Find the marginal densities of X and Y.

------------------

My understanding is that the marginal density f_X(x) is the integral of f(x,y) over all values of y. In this case, that's the integral of f(x,y) dy from -infinity to 1. However, when I take that definite integral it diverges to infinity. The same happens for the marginal density f_Y(y).
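To sanity-check things numerically, I also tried integrating out y only over the region where the density is actually nonzero (a quick sketch using scipy's quad; the function names and sample points are my own, and I'm not sure this is the right approach):

```python
import numpy as np
from scipy import integrate

def f(x, y):
    # joint density: x + y on the unit square, 0 everywhere else
    if 0 <= x <= 1 and 0 <= y <= 1:
        return x + y
    return 0.0

# Marginal of X at a few sample points: integrate y over the whole real
# line, but since f vanishes outside [0, 1] the limits reduce to 0 and 1.
for x in [0.0, 0.25, 0.5, 1.0]:
    fx, _ = integrate.quad(lambda y: f(x, y), 0, 1)
    print(x, fx)
```

This gives finite values (it looks like fx = x + 0.5 at each sample point), which makes me think my divergence comes from the limits of integration, but I'd like to understand why.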

So, if somebody could offer some guidance, I would be grateful.