
Thread: Expected squared Euclidean distance

  1. #1
    Newbie
    Joined
    Jan 2009
    Posts
    8

    Expected squared Euclidean distance

    Hi,

    Can someone help me with this problem?

    Let $\displaystyle X, Y$ be two independent $\displaystyle d$-dimensional random variables with Gaussian distributions $\displaystyle N(\mu_{X},\sigma^2 \mathbb{I})$ and $\displaystyle N(\mu_{Y},\sigma^2 \mathbb{I})$, respectively, where $\displaystyle \mathbb{I}$ is the identity matrix in $\displaystyle \mathbb{R}^d$, $\displaystyle \sigma \in \mathbb{R}$ with $\displaystyle \sigma > 0$, and $\displaystyle \mu_X , \mu_Y \in \mathbb{R}^d$.

    Compute the expected squared distance $\displaystyle E(||X-Y||^2)$.

    Hint: Use that $\displaystyle ||X-Y||^2 = ||X||^2 + ||Y||^2 - 2<X,Y>$
    This is what I've got so far:

    $\displaystyle E(||X-Y||^2) = $

    $\displaystyle E(||X||^2 + ||Y||^2 - 2<X,Y>) = $

    $\displaystyle E(||X||^2) + E(||Y||^2) - 2E(<X,Y>) = $

    $\displaystyle E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2E(\sum_{i=1}^d X_i Y_i) = $

    $\displaystyle E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2 \sum_{i = 1}^d E(X_i Y_i) = $

    $\displaystyle E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2 \sum_{i = 1}^d E(X_i) E(Y_i) = $

    $\displaystyle E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2 \sum_{i = 1}^d \mu_{X_i} \mu_{Y_i} = $

    $\displaystyle E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2 <\mu_X , \mu_Y> $

    Now, what could be done further (especially regarding the first two addends)?

    Thank you very much!

  2. #2
    Member
    Joined
    May 2006
    Posts
    244
    Quote Originally Posted by Kyle_Katarn View Post
    (...)
    $\displaystyle E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2 <\mu_X , \mu_Y> $

    Now, what could be done further (especially regarding the first two addends)?
    $\displaystyle E\left(\sum_{i = 1}^d X_i^2\right)=\sum_{i=1}^d E(X_i^2)=\sum_{i=1}^d \left[{\rm Var}(X_i) + \left(E(X_i)\right)^2\right] = d\sigma^2 + \sum_{i=1}^d (\mu_X)_i^2 = d\sigma^2 + \Vert \mu_X \Vert^2$
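
    Substituting this (and the analogous expression $\displaystyle E\left(\sum_{i = 1}^d Y_i^2\right)=d\sigma^2 + \Vert \mu_Y \Vert^2$) into the last line of your computation gives

    $\displaystyle E(||X-Y||^2) = 2d\sigma^2 + \Vert \mu_X \Vert^2 + \Vert \mu_Y \Vert^2 - 2 <\mu_X , \mu_Y> = 2d\sigma^2 + \Vert \mu_X - \mu_Y \Vert^2$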

  3. #3
    MHF Contributor

    Joined
    Aug 2008
    From
    Paris, France
    Posts
    1,174
    Quote Originally Posted by Kyle_Katarn View Post
    (...)
    $\displaystyle E(\sum_{i = 1}^d X_i^2) + E(\sum_{i = 1}^d Y_i^2) - 2 <\mu_X , \mu_Y> $

    Now, what could be done further (especially regarding the first two addends)?
    $\displaystyle E[\|X\|^2]=E\left[\sum_{i=1}^d X_i^2\right]=\sum_{i=1}^d E[X_i^2]$ and $\displaystyle E[X_i^2]={\rm Var}(X_i)+E[X_i]^2=\sigma^2+(\mu_X^{(i)})^2$, so that $\displaystyle E[\|X\|^2]=d\sigma^2+\|\mu_X\|^2$.

    By the way: if you know it, you can use the fact that $\displaystyle X-Y$ is a $\displaystyle d$-dimensional r.v. with distribution $\displaystyle \mathcal{N}(\mu_X-\mu_Y,2\sigma^2 I_d)$, and apply the above computation to $\displaystyle X-Y$ instead of just $\displaystyle X$. That way you get the result much more quickly (almost instantly), and you can use it to check yours.
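
    If you want a quick numerical sanity check of the resulting closed form $\displaystyle E(||X-Y||^2) = 2d\sigma^2 + \Vert \mu_X - \mu_Y \Vert^2$, here is a minimal Monte Carlo sketch in Python with NumPy (the dimension, $\displaystyle \sigma$, and means below are arbitrary example values):

        import numpy as np

        rng = np.random.default_rng(0)

        d = 5                      # dimension (arbitrary example value)
        sigma = 1.3                # common standard deviation (arbitrary example value)
        mu_X = rng.normal(size=d)  # arbitrary example mean vectors
        mu_Y = rng.normal(size=d)

        n = 200_000                # number of Monte Carlo samples
        X = mu_X + sigma * rng.standard_normal((n, d))
        Y = mu_Y + sigma * rng.standard_normal((n, d))

        empirical = np.mean(np.sum((X - Y) ** 2, axis=1))            # sample average of ||X - Y||^2
        theoretical = 2 * d * sigma**2 + np.sum((mu_X - mu_Y) ** 2)  # closed form from the thread

        print(empirical, theoretical)  # the two values should be close; the gap shrinks as n grows

    The empirical average and the closed-form value agree up to Monte Carlo error.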

  4. #4
    Newbie
    Joined
    Jan 2009
    Posts
    8
    Oh man, I totally forgot the identity $\displaystyle Var(X) = E(X^2) - (E(X))^2$! Thanks so much to both of you!
