# Thread: Hellinger distance

1. ## Hellinger distance

Hi all,
I would appreciate it if someone could help me prove the following proposition.

Let $P$ and $Q$ be arbitrary distributions over $\Omega = \{w_1, w_2, \ldots, w_n\}$, and let $\|x\| = \|(x_1, x_2, \ldots, x_n)\| = \sqrt{\sum_{i=1}^n x_i^2}$ denote the usual Euclidean norm. How can we prove that the Hellinger distance $h(P,Q)$ is the norm of the vector whose $i$th coordinate is the difference of the square roots of the probabilities of seeing $w_i$ according to $P$ and $Q$, normalized by dividing by $\sqrt{2}$? That is, prove
$$h(P,Q) = \left\| \left( \sqrt{\tfrac{P(w_1)}{2}} - \sqrt{\tfrac{Q(w_1)}{2}},\ \ldots,\ \sqrt{\tfrac{P(w_n)}{2}} - \sqrt{\tfrac{Q(w_n)}{2}} \right) \right\|.$$
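The identity can at least be checked numerically. Below is a minimal sketch, assuming the standard definition $h^2(P,Q) = 1 - \sum_i \sqrt{P(w_i)\,Q(w_i)}$ (the Bhattacharyya-coefficient form); the function names and the example distributions are my own, not from the thread.

```python
import math

def hellinger(p, q):
    """Hellinger distance via the assumed definition
    h^2 = 1 - sum_i sqrt(p_i * q_i)."""
    bc = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))
    return math.sqrt(max(0.0, 1.0 - bc))  # clamp guards tiny negative round-off

def norm_form(p, q):
    """Euclidean norm of the vector with i-th coordinate
    sqrt(p_i / 2) - sqrt(q_i / 2), as in the proposition."""
    return math.sqrt(sum((math.sqrt(pi / 2) - math.sqrt(qi / 2)) ** 2
                         for pi, qi in zip(p, q)))

# Two arbitrary distributions on a 3-point sample space (made up for the check)
P = [0.2, 0.5, 0.3]
Q = [0.1, 0.1, 0.8]

print(hellinger(P, Q))
print(norm_form(P, Q))  # the two printed values should agree
```

Agreement for a few random choices of $P$ and $Q$ is of course not a proof, but it is a quick sanity check that the two expressions describe the same quantity.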

2. ## Re: Hellinger distance

Hey ghali.

Do you have a distributional definition of this? For example, a conditional distribution like P(W=w|P=p)?

3. ## Re: Hellinger distance

Hi chiro,
Thank you very much for your reply. Actually, we do not have a conditional distribution like P(W=w|P=p). Does that make the proof harder?

Thanks in advance. Regards.

4. ## Re: Hellinger distance

Hint: what is the length of the sum of two orthogonal unit vectors?
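A sketch of where the hint seems to lead, again assuming the definition $h^2(P,Q) = 1 - \sum_i \sqrt{P(w_i)\,Q(w_i)}$ (my assumption, since the thread never states which definition is in use). Expanding the squared norm term by term:

```latex
\left\| \left( \sqrt{\tfrac{P(w_i)}{2}} - \sqrt{\tfrac{Q(w_i)}{2}} \right)_{i=1}^{n} \right\|^2
= \frac{1}{2} \sum_{i=1}^{n} \left( \sqrt{P(w_i)} - \sqrt{Q(w_i)} \right)^2
= \frac{1}{2} \sum_{i=1}^{n} \left( P(w_i) + Q(w_i) - 2\sqrt{P(w_i)\,Q(w_i)} \right)
= 1 - \sum_{i=1}^{n} \sqrt{P(w_i)\,Q(w_i)} = h^2(P,Q),
```

using $\sum_i P(w_i) = \sum_i Q(w_i) = 1$. The hint's orthogonal unit vectors are $(\sqrt{P(w_i)})_i$ and $(\sqrt{Q(w_i)})_i$ when $P$ and $Q$ have disjoint supports: each has Euclidean length $1$, their difference then has length $\sqrt{2}$, and dividing by $\sqrt{2}$ is exactly what caps the distance at $1$.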