# Expected squared Euclidean distance


• Jan 18th 2009, 04:46 AM
Kyle_Katarn
Expected squared Euclidean distance
Hi,

can someone help me with this problem?

Quote:

Let $\displaystyle X, Y$ be two independent $\displaystyle d$-dimensional random variables with Gaussian distribution $\displaystyle N(\mu_{X},\sigma^2 \mathbb{I}), N(\mu_{Y},\sigma^2 \mathbb{I})$, where $\displaystyle \mathbb{I}$ is the identity matrix in $\displaystyle \mathbb{R}^d, \sigma \in \mathbb{R}$ with $\displaystyle \sigma > 0$, and $\displaystyle \mu_X , \mu_Y \in \mathbb{R}^d$.

Compute the expected squared distance $\displaystyle E(||X-Y||^2)$.

Hint: Use that $\displaystyle ||X-Y||^2 = ||X||^2 + ||Y||^2 - 2\langle X,Y\rangle$
This is what I've got so far:

$\displaystyle E(||X-Y||^2) =$

$\displaystyle E(||X||^2 + ||Y||^2 - 2\langle X,Y\rangle) =$

$\displaystyle E(||X||^2) + E(||Y||^2) - 2E(\langle X,Y\rangle) =$

$\displaystyle E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2E(\sum_{i=1}^d X_i Y_i) =$

$\displaystyle E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2 \sum_{i = 1}^d E(X_i Y_i) =$

$\displaystyle E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2 \sum_{i = 1}^d E(X_i) E(Y_i) =$

$\displaystyle E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2 \sum_{i = 1}^d \mu_{X_i} \mu_{Y_i} =$

$\displaystyle E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2 \langle \mu_X , \mu_Y \rangle$
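The cross-term step, $E\langle X,Y\rangle = \langle \mu_X , \mu_Y \rangle$ for independent $X, Y$, can be sanity-checked numerically. A minimal NumPy sketch, where the dimension, means, and $\sigma$ are just illustrative values:

```python
import numpy as np

rng = np.random.default_rng(0)
d, sigma = 3, 1.5                      # illustrative dimension and noise level
mu_X = np.array([1.0, -2.0, 0.5])      # illustrative means
mu_Y = np.array([0.0, 3.0, -1.0])

# draw independent samples of X ~ N(mu_X, sigma^2 I) and Y ~ N(mu_Y, sigma^2 I)
X = rng.normal(mu_X, sigma, size=(200_000, d))
Y = rng.normal(mu_Y, sigma, size=(200_000, d))

est = np.mean(np.sum(X * Y, axis=1))   # Monte Carlo estimate of E<X, Y>
exact = mu_X @ mu_Y                    # <mu_X, mu_Y>, valid by independence
print(est, exact)
```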

Now, what can be done further (especially with the first two terms)?

Thank you very much!
• Jan 18th 2009, 05:53 AM
Constatine11
Quote:

Originally Posted by Kyle_Katarn
(...)

$\displaystyle E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2 \langle \mu_X , \mu_Y \rangle$

Now, what can be done further (especially with the first two terms)?

$\displaystyle E\left(\sum_{i = 1}^d X_i^2\right)=\sum_{i=1}^d E(X_i^2)=\sum_{i=1}^d \left[\operatorname{Var}(X_i) + E(X_i)^2\right] =\sum_{i=1}^d \left[\sigma^2 + (\mu_X)_i^2\right] =d\sigma^2 + \Vert \mu_X \Vert^2$
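This closed form, $E\Vert X \Vert^2 = d\sigma^2 + \Vert \mu_X \Vert^2$, can be checked against a simulation. A sketch with illustrative values for $d$, $\sigma$, and $\mu_X$:

```python
import numpy as np

rng = np.random.default_rng(1)
d, sigma = 4, 2.0                       # illustrative values
mu_X = np.array([1.0, 0.0, -1.0, 2.0])

X = rng.normal(mu_X, sigma, size=(500_000, d))   # X ~ N(mu_X, sigma^2 I)

est = np.mean(np.sum(X**2, axis=1))     # Monte Carlo estimate of E||X||^2
exact = d * sigma**2 + np.sum(mu_X**2)  # d*sigma^2 + ||mu_X||^2
print(est, exact)
```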

• Jan 18th 2009, 06:01 AM
Laurent
Quote:

Originally Posted by Kyle_Katarn
(...)

$\displaystyle E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2 \langle \mu_X , \mu_Y \rangle$

Now, what can be done further (especially with the first two terms)?

$\displaystyle E[\|X\|^2]=E\left[\sum_{i=1}^d X_i^2\right]=\sum_{i=1}^d E[X_i^2]$ and $\displaystyle E[X_i^2]={\rm Var}(X_i)+E[X_i]^2=\sigma^2+(\mu_X^{(i)})^2$, so that $\displaystyle E[\|X\|^2]=d\sigma^2+\|\mu_X\|^2$.

By the way: if you know it, you can use the fact that $\displaystyle X-Y$ is a $\displaystyle d$-dimensional r.v. with distribution $\displaystyle \mathcal{N}(\mu_X-\mu_Y,2\sigma^2 I_d)$, and apply the above computation to $\displaystyle X-Y$ instead of just $\displaystyle X$. This way you'll be able to get the result quicker (almost instantly), so you can check yours.
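Both routes lead to the same answer, $E\Vert X-Y \Vert^2 = 2d\sigma^2 + \Vert \mu_X - \mu_Y \Vert^2$, which a quick simulation confirms. A sketch with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(2)
d, sigma = 3, 1.0                      # illustrative values
mu_X = np.array([2.0, 0.0, -1.0])
mu_Y = np.array([0.0, 1.0, 1.0])

# independent X ~ N(mu_X, sigma^2 I) and Y ~ N(mu_Y, sigma^2 I)
X = rng.normal(mu_X, sigma, size=(500_000, d))
Y = rng.normal(mu_Y, sigma, size=(500_000, d))

est = np.mean(np.sum((X - Y)**2, axis=1))            # Monte Carlo E||X-Y||^2
exact = 2 * d * sigma**2 + np.sum((mu_X - mu_Y)**2)  # 2d*sigma^2 + ||mu_X - mu_Y||^2
print(est, exact)
```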
• Jan 18th 2009, 08:23 AM
Kyle_Katarn
Oh man, I totally forgot the identity $\displaystyle Var(X) = E(X^2) - (E(X))^2$! Thanks so much to both of you!
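For reference, that identity is easy to spot-check numerically as well (a sketch with illustrative parameters; `np.var` computes the population variance directly):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(1.0, 2.0, size=1_000_000)      # samples of X ~ N(1, 2^2)

var_direct = np.var(x)                         # Var(X), computed directly
var_identity = np.mean(x**2) - np.mean(x)**2   # E(X^2) - (E(X))^2
print(var_direct, var_identity)
```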