# euclidean distance in feature space

• Jan 18th 2009, 11:38 PM
roshi
Hi everyone,

I found this equation in the article "Nonlinear Dimensionality Reduction for Classification Using Kernel Weighted Subspace Method"; it gives the Euclidean distance between the means of classes i and j in feature space, where each class consists of several column vectors. Here is the equation:

Attachment 9716

(i.e. \$\displaystyle d(i,j) = \sqrt{ \frac{1}{N_i^2}\sum_{a,b} k(x^i_a, x^i_b) + \frac{1}{N_j^2}\sum_{a,b} k(x^j_a, x^j_b) - \frac{1}{N_i N_j}\sum_{a,b} k(x^i_a, x^j_b) - \frac{1}{N_i N_j}\sum_{a,b} k(x^j_a, x^i_b) }\$, with each sum running over all pairs of column vectors in the indicated classes)

where Ni and Nj are the numbers of column vectors in classes i and j, and each x is a column vector.

The above equation is evaluated using the "kernel trick", by substituting the appropriate kernel for k(xi, xj). In the article, the author uses the RBF kernel: \$\displaystyle k(x_i, x_j) = \exp( \|x_i - x_j\|^2 / 10^9 )\$
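As a quick sketch, the kernel exactly as quoted (with a positive exponent) can be coded as below. Note that the conventional RBF kernel carries a minus sign in the exponent, \$\displaystyle \exp( -\|x_i - x_j\|^2 / \sigma )\$, so the quoted form may be a transcription slip:

```python
import math

def rbf_kernel(xi, xj, scale=1e9):
    # Kernel exactly as quoted above: exp(||xi - xj||^2 / scale).
    # The conventional RBF kernel would be exp(-||xi - xj||^2 / scale).
    sq_dist = sum((a - b) ** 2 for a, b in zip(xi, xj))
    return math.exp(sq_dist / scale)

print(rbf_kernel([3, 4, 1], [3, 4, 1]))  # 1.0 (squared distance is zero)
```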

Given this example:
Class 1 consists of
x1 = { 3, 4, 1 } (column vector 1 in class 1)
x2 = { 9, 2, 3 } (column vector 2 in class 1)

Class 2 consists of
x1 = { 2, 6, 5 } (column vector 1 in class 2)
x2 = { 7, 1, 3 } (column vector 2 in class 2)

how do I obtain the Euclidean distance d(i,j) above?

I tried the following approach:

a) sum the column vectors elementwise within each class

sum of class 1 (denoted as s1) = { 12, 6, 4 }
sum of class 2 (s2) = { 9, 7, 8 }

b) plug the obtained sums into the equation, so that the sum of the k[i1,i2] terms becomes k[s1,s1]; the sum of the k[j1,j2] terms becomes k[s2,s2]; the sum of the k[i1,j1] terms becomes k[s1,s2]; and the sum of the k[j1,i1] terms becomes k[s2,s1].

But here comes the problem. Let k[s1,s1] = u, k[s2,s2] = v, k[s1,s2] = x, and k[s2,s1] = y.

Applying the RBF kernel to these, I get the following values:
u = \$\displaystyle \exp(0) = 1\$;
v = \$\displaystyle \exp(0) = 1\$;
x = \$\displaystyle \exp(26 / 10^9) = 1.000000026\$;
and y = \$\displaystyle \exp(26 / 10^9) = 1.000000026\$.

So d(i,j) = \$\displaystyle \sqrt{ \frac{u}{N_i^2} + \frac{v}{N_j^2} - \frac{x}{N_i N_j} - \frac{y}{N_i N_j} }\$
\$\displaystyle = \sqrt{ \tfrac{1}{4} + \tfrac{1}{4} - \tfrac{1}{4}(1.000000026) - \tfrac{1}{4}(1.000000026) }\$
\$\displaystyle = \sqrt{ -0.000000013 }\$
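These steps can be reproduced numerically; a small sketch (keeping the kernel exactly as quoted, with the positive exponent) confirms the radicand comes out negative, around -1.3e-8:

```python
import math

# Reproduce steps a) and b) above, with the kernel exactly as quoted
s1 = [12, 6, 4]   # elementwise sum of class 1
s2 = [9, 7, 8]    # elementwise sum of class 2

sq_dist = sum((a - b) ** 2 for a, b in zip(s1, s2))   # = 26
u = v = math.exp(0)                   # k[s1,s1] and k[s2,s2]
x = y = math.exp(sq_dist / 1e9)       # k[s1,s2] and k[s2,s1], ~1.000000026

radicand = u / 4 + v / 4 - x / 4 - y / 4
print(radicand)   # negative, so math.sqrt(radicand) would raise ValueError
```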

So I am stuck with the square root of a negative number. I would appreciate any help on this matter.
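For comparison, here is a minimal Python sketch that evaluates the equation directly, summing the kernel over every pair of column vectors rather than summing the vectors first. It assumes the conventional minus sign in the RBF exponent, \$\displaystyle \exp( -\|x_i - x_j\|^2 / 10^9 )\$, which the quoted form omits; with the positive exponent as quoted, the radicand comes out negative even with the pairwise sums:

```python
import math

def k(xi, xj, scale=1e9):
    # RBF kernel with the conventional minus sign (an assumption here;
    # the article as quoted in the post writes it without the minus)
    return math.exp(-sum((a - b) ** 2 for a, b in zip(xi, xj)) / scale)

def mean_distance(class_i, class_j):
    # d(i,j)^2 = (1/Ni^2) sum_ab k(xi_a, xi_b) + (1/Nj^2) sum_ab k(xj_a, xj_b)
    #          - (1/(Ni*Nj)) sum_ab k(xi_a, xj_b) - (1/(Ni*Nj)) sum_ab k(xj_a, xi_b)
    ni, nj = len(class_i), len(class_j)
    kii = sum(k(a, b) for a in class_i for b in class_i)
    kjj = sum(k(a, b) for a in class_j for b in class_j)
    kij = sum(k(a, b) for a in class_i for b in class_j)  # symmetric kernel,
    sq = kii / ni**2 + kjj / nj**2 - 2 * kij / (ni * nj)  # so cross terms combine
    return math.sqrt(sq)

class1 = [[3, 4, 1], [9, 2, 3]]
class2 = [[2, 6, 5], [7, 1, 3]]
d = mean_distance(class1, class2)   # small positive number, ~1.1e-4
```

Because the kernel values are all very close to 1 at this length scale (10^9), the resulting distance is tiny but well defined.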