Uncorrelated, orthogonal and independent???

Jun 2010
Hi, I have a couple of questions...

I know that when two random vectors are independent, they are uncorrelated. The converse isn't necessarily true.

What about orthogonal random vectors?
Does orthogonality imply independence?
Does independence imply orthogonality?


I know, for example, that two random vectors are orthogonal when their correlation matrix is diagonal, and uncorrelated when their covariance matrix is diagonal. But I don't clearly understand the relation with independence...

It's a little confusing to me... Thanks.
Jun 2010
Orthogonality implies (linear) independence. The converse may not hold.
To prove it:
Take two nonzero vectors X and Y that are orthogonal,
i.e. X^T Y = 0.
Suppose, for contradiction, that they are linearly dependent,
i.e. there exists a scalar c (not equal to 0)
such that X = cY.
Substituting into X^T Y = 0, this means c Y^T Y = 0,
which implies Y^T Y = 0,
which implies every element of Y is 0, which is absurd since Y is nonzero.
Hence X and Y are linearly independent.
The converse can be disproved with a counterexample:
x = (1 0 2)
y = (0 1 1)
The two vectors above are linearly independent but not orthogonal (x^T y = 2).
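The counterexample can be checked numerically (a quick sketch using NumPy):

```python
import numpy as np

x = np.array([1, 0, 2])
y = np.array([0, 1, 1])

# Not orthogonal: the dot product is nonzero.
dot = x @ y
print(dot)  # 2

# Linearly independent: stacking the vectors gives a rank-2 matrix.
rank = np.linalg.matrix_rank(np.vstack([x, y]))
print(rank)  # 2
```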
Oct 2009
Dooti, I think the OP was referring to independence in the sense of random variables, not in the sense of linear independence.

I would think that what he means by orthogonal is something along the lines of either

\(\displaystyle Ex^\top y = 0\)

or

\(\displaystyle E(x - Ex)^\top (y - Ey) = 0\).

In neither case does orthogonality imply independence (this is easy to show taking x, y to be scalar random variables). In the first case, independence does not imply orthogonality, since independent variables with nonzero means give \(\displaystyle Ex^\top y = (Ex)^\top Ey \neq 0\); in the second case it does, because independence forces the covariance to vanish.
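A concrete scalar example of orthogonality without independence can be checked by direct enumeration (a Python sketch; the distribution is my own choice for illustration): take x uniform on {-1, 0, 1} and y = x^2. Then Exy = 0 and Cov(x, y) = 0, so x and y are orthogonal in both senses above, yet they are clearly dependent.

```python
from fractions import Fraction

# x uniform on {-1, 0, 1}, y = x^2; each outcome has probability 1/3.
outcomes = [(-1, 1), (0, 0), (1, 1)]
p = Fraction(1, 3)

Ex  = sum(p * x for x, _ in outcomes)      # E[x]  = 0
Ey  = sum(p * y for _, y in outcomes)      # E[y]  = 2/3
Exy = sum(p * x * y for x, y in outcomes)  # E[xy] = 0

cov = Exy - Ex * Ey
print(Exy, cov)  # 0 0  -> orthogonal in both senses

# But not independent: P(x=0, y=1) = 0, while P(x=0) P(y=1) = 1/3 * 2/3.
p_joint = sum(p for x, y in outcomes if x == 0 and y == 1)
p_prod = sum(p for x, _ in outcomes if x == 0) * sum(p for _, y in outcomes if y == 1)
print(p_joint, p_prod)  # 0 2/9
```

Exact rational arithmetic via `fractions` avoids any floating-point ambiguity in the comparison of the joint and product probabilities.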