Converse to variable independence implying covariance is zero

If X and Y are two independent random variables, then E[XY] = E[X]E[Y], i.e. their covariance is zero. The converse is not true in general, because there are examples of two random variables where this equality holds but they are dependent. However, in all the examples I've seen, E[XY] = E[X]E[Y] = 0, so I feel like this is kind of a "fluke". Can anyone give an example of two dependent random variables such that E[XY] = E[X]E[Y] ≠ 0?
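For context, a minimal sketch of the kind of example the question refers to (my own illustration, not taken from the thread): X uniform on {-1, 0, 1} and Y = X^2. Y is a function of X, so the two are clearly dependent, yet E[XY] = E[X]E[Y], and indeed both sides come out to zero.

```python
from fractions import Fraction

# X uniform on {-1, 0, 1}, Y = X^2: dependent but uncorrelated.
support = [-1, 0, 1]
p = Fraction(1, 3)  # each outcome equally likely

E_X = sum(p * x for x in support)          # 0
E_Y = sum(p * x**2 for x in support)       # 2/3
E_XY = sum(p * x * x**2 for x in support)  # E[X^3] = 0

# E[XY] = E[X]E[Y] holds, but only because E[X] = 0 drags both sides to 0.
print(E_X, E_Y, E_XY)
```

This is exactly the "fluke" pattern: the equality holds, but only because one of the means vanishes.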

Re: Converse to variable independence implying covariance is zero

Hey SworD.

Instead of an example, I'd recommend a much more general idea to show you how you can construct random variables systematically that follow this property.

Define two random variables U and V where V = U^2. Now perform a principal component analysis on them to make them uncorrelated; the resulting uncorrelated variables will always have the property you have provided, E[XY] = E[X]E[Y], for the transformed random variables X and Y.

You can make U and V have any relationship you want and you will still get the property to be true for the transformed random variables.
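The construction above can be sketched numerically (my own illustration, with sampled data standing in for the distributions): start from U and V = U^2, which are clearly dependent, then rotate the centered data onto its principal axes. The resulting components X and Y are both functions of U, hence still dependent, yet their sample covariance is zero by construction.

```python
import numpy as np

# Sample U and the dependent variable V = U^2.
rng = np.random.default_rng(0)
U = rng.uniform(-1.0, 1.0, size=10_000)
V = U**2
data = np.column_stack([U, V])          # shape (n, 2)

# Principal component analysis: eigendecompose the sample covariance
# and project onto the (orthonormal) eigenvectors.
centered = data - data.mean(axis=0)
cov = np.cov(centered, rowvar=False)    # 2x2 sample covariance of (U, V)
eigvals, eigvecs = np.linalg.eigh(cov)  # principal axes
components = centered @ eigvecs

# X and Y are dependent (both are deterministic functions of U),
# but their covariance is zero up to floating-point error.
X, Y = components[:, 0], components[:, 1]
print(np.cov(X, Y)[0, 1])  # ~0
```

The same code works no matter what relationship you give V, since the rotation diagonalizes the covariance matrix regardless of how the dependence arose.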

Principal Component Analysis does the equivalent of what Gram-Schmidt does to vectors: it creates an orthogonal basis (note - not orthonormal) ordered by how much variation each component contributes to the total variability of the data.

It might be useful for you to look at a good book on the subject if you are keen enough.