Now this particular problem has done my head in. I just can't wrap my head around it and I'm completely lost. Here it is:
Assume that two continuous random variables X and Y are uniformly distributed with joint density:
f(x,y) = 1/((d-c)(b-a)), a < x < b, c < y < d
Show that the covariance between X and Y, Cov(X,Y), equals 0.
Any help is much appreciated. Thanks in advance, guys :)