# Uncorrelated, orthogonal and independent???

• June 6th 2010, 01:37 PM
wormhole
Uncorrelated, orthogonal and independent???
Hi, I have a couple of questions...

I know that when two random vectors are independent, they are uncorrelated. The converse isn't necessarily true.

Does orthogonality imply independence?
Does independence imply orthogonality?

Why?

I know, for example, that orthogonality holds when the correlation matrix is diagonal, and uncorrelatedness holds when the covariance matrix is diagonal. But I don't clearly understand the relation with independence...

It's a little confusing to me... Thanks....
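A standard counterexample to that converse ("uncorrelated but not independent"), sketched in Python with exact fractions — the choice X uniform on {-1, 0, 1} with Y = X² is just the usual textbook one:

```python
# Uncorrelated but NOT independent:
# X uniform on {-1, 0, 1}, and Y = X^2.
from fractions import Fraction

outcomes = [-1, 0, 1]          # equally likely values of X
p = Fraction(1, 3)

E_X  = sum(p * x for x in outcomes)           # 0
E_Y  = sum(p * x**2 for x in outcomes)        # 2/3
E_XY = sum(p * x * x**2 for x in outcomes)    # 0
cov = E_XY - E_X * E_Y
print(cov)                     # 0 -> X and Y are uncorrelated

# But they are clearly dependent:
# P(X=1, Y=0) = 0, while P(X=1) * P(Y=0) = 1/3 * 1/3 = 1/9.
P_X1_Y0 = Fraction(0)
P_X1 = p
P_Y0 = p                       # Y = 0 only when X = 0
print(P_X1_Y0 == P_X1 * P_Y0)  # False -> not independent
```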
• June 10th 2010, 09:52 AM
Dooti
Orthogonality implies (linear) independence. The converse may not hold.
To prove it:
Take two nonzero vectors x and y that are orthogonal,
i.e. x^T y = 0.
Suppose, for contradiction, that they are linearly dependent,
i.e. there exists a scalar c (not equal to 0)
such that
x = c·y
Then
x^T = c·y^T
so, c·y^T y = 0,
which implies y^T y = 0,
which implies every element of y is 0 — absurd, since y is nonzero.
Hence x and y are linearly independent.
The converse fails by a counterexample,
say,
x = (1 0 2)
y = (0 1 1)
the above two vectors are linearly independent but not orthogonal.
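A quick numerical check of that counterexample, sketched with numpy:

```python
# x and y from the counterexample above: linearly independent
# but not orthogonal.
import numpy as np

x = np.array([1, 0, 2])
y = np.array([0, 1, 1])

print(x @ y)   # 2, nonzero -> not orthogonal

# Stacking them gives a rank-2 matrix -> linearly independent.
print(np.linalg.matrix_rank(np.vstack([x, y])))   # 2
```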
• June 10th 2010, 11:16 AM
theodds
Dooti, I think the OP was referring to independence in the sense of random variables, not in the sense of "linear dependence."

I would think that what he means by orthogonal is either something along the lines of

$Ex^\top y = 0$

or

$E(x - Ex)^\top (y - Ey) = 0$.

In neither case does orthogonality imply independence (this is easy to see even when x and y are scalar random variables). In the first case, independence does not imply orthogonality, while in the second case it does, since independence forces the covariance to be zero.
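The first-case claim above can be checked with a tiny exact computation in Python — the choice of X and Y each uniform on {1, 2} and independent is just an illustrative one:

```python
# Independent X and Y with nonzero means: they satisfy the second
# (centered) notion of orthogonality but not the first.
from fractions import Fraction
from itertools import product

vals = [1, 2]                  # X and Y each uniform on {1, 2}
p = Fraction(1, 4)             # joint probability of each pair (independence)

E_XY = sum(p * x * y for x, y in product(vals, vals))
E_X = E_Y = Fraction(3, 2)

print(E_XY)                    # 9/4 != 0 -> not orthogonal in the first sense
print(E_XY - E_X * E_Y)        # 0        -> orthogonal in the second sense
```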