Originally Posted by **mistykz**

Suppose X, Y ~ f(x, y) = k(x - y) for 0 ≤ x ≤ y ≤ 1, and f(x, y) = 0 otherwise.

What value of k makes this a density?

I'm not quite sure what I'm being asked to find. I know it asks for k, but what constraint or definition needs to be met in order for f to be a density? I set up a double integral of k(x - y) with each variable running from 0 to 1, but that came out to k*0, so I think my setup is incorrect. Can anyone point me in the right direction? Thanks!
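One way to sanity-check the setup (this is an assumption about where the error lies, not mistykz's own work): integrating both variables from 0 to 1 ignores the constraint x ≤ y, which is why the integral collapses to 0. Encoding the triangular region by letting the inner limit depend on the outer variable gives a nonzero integral, which can then be set equal to 1. A quick symbolic sketch with sympy:

```python
# Sketch: integrate k*(x - y) over the triangle 0 <= x <= y <= 1
# and solve for the k that makes the total probability mass equal 1.
from sympy import symbols, integrate, solve

x, y, k = symbols('x y k')

# Inner integral over x from 0 to y, outer over y from 0 to 1;
# the y-dependent inner limit is what encodes 0 <= x <= y <= 1.
total = integrate(k * (x - y), (x, 0, y), (y, 0, 1))
print(total)                 # -k/6
print(solve(total - 1, k))   # [-6]
```

Note that since x - y ≤ 0 on the region, a negative k is also what keeps f(x, y) ≥ 0 there, so the sign is consistent with the nonnegativity requirement for a density.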