# Expected squared euclidean distance

• Jan 18th 2009, 05:46 AM
Kyle_Katarn
Expected squared euclidean distance
Hi,

can someone help me with this problem?

Quote:

Let $X, Y$ be two independent $d$-dimensional random variables with Gaussian distribution $N(\mu_{X},\sigma^2 \mathbb{I}), N(\mu_{Y},\sigma^2 \mathbb{I})$, where $\mathbb{I}$ is the identity matrix in $\mathbb{R}^d, \sigma \in \mathbb{R}$ with $\sigma > 0$, and $\mu_X , \mu_Y \in \mathbb{R}^d$.

Compute the expected squared distance $E(||X-Y||^2)$.

Hint: Use that $||X-Y||^2 = ||X||^2 + ||Y||^2 - 2\langle X, Y\rangle$
This is what I've got so far:

$E(||X-Y||^2) =$

$E(||X||^2 + ||Y||^2 - 2\langle X, Y\rangle) =$

$E(||X||^2) + E(||Y||^2) - 2E(\langle X, Y\rangle) =$

$E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2E(\sum_{i=1}^d X_i Y_i) =$

$E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2 \sum_{i = 1}^d E(X_i Y_i) =$

$E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2 \sum_{i = 1}^d E(X_i) E(Y_i) =$

$E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2 \sum_{i = 1}^d \mu_{X_i} \mu_{Y_i} =$

$E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2 \langle \mu_X , \mu_Y \rangle$

Now, what could be done further (especially regarding the 2 first addends)?

Thank you very much!
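The independence step above ($E(X_i Y_i) = E(X_i)E(Y_i)$, hence $E(\langle X,Y\rangle) = \langle \mu_X, \mu_Y \rangle$) can be sanity-checked with a quick Monte Carlo sketch in Python; NumPy is assumed, and the dimension, $\sigma$, and mean vectors below are arbitrary test values:

```python
import numpy as np

# Arbitrary test parameters (any d, sigma, and means would do)
rng = np.random.default_rng(0)
d, sigma = 3, 0.5
mu_x = rng.normal(size=d)
mu_y = rng.normal(size=d)

# Draw n independent samples of X ~ N(mu_x, sigma^2 I) and Y ~ N(mu_y, sigma^2 I)
n = 200_000
X = rng.normal(mu_x, sigma, size=(n, d))
Y = rng.normal(mu_y, sigma, size=(n, d))

# By independence, E(<X, Y>) should match <mu_X, mu_Y>
estimate = np.mean(np.sum(X * Y, axis=1))
exact = mu_x @ mu_y
print(estimate, exact)  # the two values should be close
```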
• Jan 18th 2009, 06:53 AM
Constatine11
Quote:

Originally Posted by Kyle_Katarn
Hi,

can someone help me with this problem?

This is what I've got so far:

$E(||X-Y||^2) =$

$E(||X||^2 + ||Y||^2 - 2\langle X, Y\rangle) =$
(...)
$E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2 \langle \mu_X , \mu_Y \rangle$

Now, what could be done further (especially regarding the 2 first addends)?

Thank you very much!

$E\left(\sum_{i = 1}^d X_i^2\right)=\sum_{i=1}^d E(X_i^2)=\sum_{i=1}^d \left[{\rm Var}(X_i) + E(X_i)^2\right]=\sum_{i=1}^d \left[\sigma^2 + (\mu_X)_i^2\right]=d\sigma^2 + \Vert \mu_X \Vert^2$
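The closed form $E(\Vert X \Vert^2) = d\sigma^2 + \Vert \mu_X \Vert^2$ can likewise be checked numerically; a minimal Python/NumPy sketch, with arbitrary parameter values:

```python
import numpy as np

rng = np.random.default_rng(1)
d, sigma = 4, 1.3            # arbitrary dimension and standard deviation
mu_x = rng.normal(size=d)    # arbitrary mean vector

# Draw n independent samples of X ~ N(mu_x, sigma^2 I)
n = 200_000
X = rng.normal(mu_x, sigma, size=(n, d))

# Monte Carlo estimate of E(||X||^2) vs. the closed form d*sigma^2 + ||mu_X||^2
estimate = np.mean(np.sum(X**2, axis=1))
closed_form = d * sigma**2 + np.sum(mu_x**2)
print(estimate, closed_form)  # should agree closely
```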
• Jan 18th 2009, 07:01 AM
Laurent
Quote:

Originally Posted by Kyle_Katarn
Hi,

can someone help me with this problem?

This is what I've got so far:

$E(||X-Y||^2) =$

$E(||X||^2 + ||Y||^2 - 2\langle X, Y\rangle) =$

$E(||X||^2) + E(||Y||^2) - 2E(\langle X, Y\rangle) =$

$E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2E(\sum_{i=1}^d X_i Y_i) =$
(...)
$E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2 \langle \mu_X , \mu_Y \rangle$

Now, what could be done further (especially regarding the 2 first addends)?

Thank you very much!

$E[\|X\|^2]=E\left[\sum_{i=1}^d X_i^2\right]=\sum_{i=1}^d E[X_i^2]$ and $E[X_i^2]={\rm Var}(X_i)+E[X_i]^2=\sigma^2+(\mu_X^{(i)})^2$, so that $E[\|X\|^2]=d\sigma^2+\|\mu_X\|^2$.

By the way: if you know it, you can use the fact that $X-Y$ is a $d$-dimensional r.v. with distribution $\mathcal{N}(\mu_X-\mu_Y,2\sigma^2 I_d)$, and apply the above computation to $X-Y$ instead of just $X$. This way you'll be able to get the result quicker (almost instantly), so you can check yours.
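Putting the pieces together, both routes give the same answer: $E(\|X-Y\|^2) = 2d\sigma^2 + \|\mu_X\|^2 + \|\mu_Y\|^2 - 2\langle \mu_X, \mu_Y \rangle = 2d\sigma^2 + \|\mu_X - \mu_Y\|^2$, which is also what the shortcut $X-Y \sim \mathcal{N}(\mu_X-\mu_Y, 2\sigma^2 I_d)$ yields directly. A Monte Carlo check of that final formula in Python/NumPy (parameter values arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
d, sigma = 5, 0.7            # arbitrary dimension and standard deviation
mu_x = rng.normal(size=d)    # arbitrary mean vectors
mu_y = rng.normal(size=d)

# Draw n independent samples of X ~ N(mu_x, sigma^2 I) and Y ~ N(mu_y, sigma^2 I)
n = 200_000
X = rng.normal(mu_x, sigma, size=(n, d))
Y = rng.normal(mu_y, sigma, size=(n, d))

# Monte Carlo estimate of E(||X - Y||^2)
estimate = np.mean(np.sum((X - Y) ** 2, axis=1))

# Closed form: 2*d*sigma^2 + ||mu_X - mu_Y||^2
closed_form = 2 * d * sigma**2 + np.sum((mu_x - mu_y) ** 2)
print(estimate, closed_form)  # should agree closely
```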
• Jan 18th 2009, 09:23 AM
Kyle_Katarn
Oh man, I totally forgot the identity $\mathrm{Var}(X) = E(X^2) - (E(X))^2$! Thanks so much to both of you!