Hi everyone,

I got this equation from the article "Nonlinear Dimensionality Reduction for Classification Using Kernel Weighted Subspace Method" for computing the Euclidean distance between the means of classes i and j in feature space. Each of classes i and j consists of several column vectors. Here is the equation:

d(i,j)^2 = (1/Ni^2) * sum_{a,b=1..Ni} k(x_a^i, x_b^i)
         + (1/Nj^2) * sum_{a,b=1..Nj} k(x_a^j, x_b^j)
         - (2/(Ni*Nj)) * sum_{a=1..Ni, b=1..Nj} k(x_a^i, x_b^j)

where Ni and Nj are the numbers of column vectors in classes i and j, and each x is a column vector.

The equation above is evaluated using the "kernel trick" by substituting the appropriate kernel for k(xi, xj). In the article itself, the author uses the RBF kernel:

k(xi, xj) = exp( -||xi - xj||^2 / (2*sigma^2) )

Given this example:

Class 1 consists of:

x1 = { 3, 4, 1 } (column vector 1 in class 1)

x2 = { 9, 2, 3 } (column vector 2 in class 1)

Class 2 consists of:

x1 = { 2, 6, 5 } (column vector 1 in class 2)

x2 = { 7, 1, 3 } (column vector 2 in class 2)
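Here is how I understand the pairwise double sums in Python (the sigma value is just an arbitrary pick of mine, not one taken from the article):

```python
import math

def rbf(a, b, sigma):
    # RBF kernel: k(a, b) = exp(-||a - b||^2 / (2 * sigma^2))
    sq_dist = sum((p - q) ** 2 for p, q in zip(a, b))
    return math.exp(-sq_dist / (2 * sigma ** 2))

def kernel_mean_distance(class_i, class_j, sigma):
    # d(i,j)^2 = (1/Ni^2) * sum_ab k(xi_a, xi_b)
    #          + (1/Nj^2) * sum_ab k(xj_a, xj_b)
    #          - (2/(Ni*Nj)) * sum_ab k(xi_a, xj_b)
    ni, nj = len(class_i), len(class_j)
    term_ii = sum(rbf(a, b, sigma) for a in class_i for b in class_i) / ni**2
    term_jj = sum(rbf(a, b, sigma) for a in class_j for b in class_j) / nj**2
    term_ij = sum(rbf(a, b, sigma) for a in class_i for b in class_j) / (ni * nj)
    d2 = term_ii + term_jj - 2 * term_ij
    # guard against tiny negative values from floating-point round-off
    return math.sqrt(max(d2, 0.0))

class1 = [(3, 4, 1), (9, 2, 3)]
class2 = [(2, 6, 5), (7, 1, 3)]
print(kernel_mean_distance(class1, class2, sigma=2.0))
```

Note that each double sum runs over every pair of vectors, so the within-class terms contain Ni*Ni and Nj*Nj kernel evaluations, and the cross term contains Ni*Nj of them.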

How do I obtain the Euclidean distance d(i,j) above?

I tried the following approach:

a) sum the column vectors element-wise within each class

sum of class 1 (denoted as s1) = { 12, 6, 4 }

sum of class 2 (s2) = { 9, 7, 8 }

b) substitute the obtained sums into the equation, so that the sum of k[i1,i2] becomes k[s1,s1]; the sum of k[j1,j2] becomes k[s2,s2]; the sum of k[i1,j1] becomes k[s1,s2]; and the sum of k[j1,i1] becomes k[s2,s1].
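To double-check step b), I compared the pairwise double sum within class 1 against the single substituted value k[s1,s1] (sigma is again an arbitrary pick of mine):

```python
import math

def rbf(a, b, sigma=2.0):
    # RBF kernel: k(a, b) = exp(-||a - b||^2 / (2 * sigma^2))
    sq_dist = sum((p - q) ** 2 for p, q in zip(a, b))
    return math.exp(-sq_dist / (2 * sigma ** 2))

class1 = [(3, 4, 1), (9, 2, 3)]
s1 = tuple(sum(col) for col in zip(*class1))  # element-wise sum: (12, 6, 4)

# sum of k over all pairs of vectors in class 1, evaluated pairwise
pairwise = sum(rbf(a, b) for a in class1 for b in class1)
# the single value that step b) substitutes for that whole sum
collapsed = rbf(s1, s1)

print(pairwise, collapsed)
```

The two quantities do not agree, so I am not sure the substitution in step b) is valid.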

But here comes the problem. Let's say that k[s1,s1] = u, k[s2,s2] = v, k[s1,s2] = x, and k[s2,s1] = y.

Applying the RBF kernel to those terms, I get these resulting values:

u = exp( -||s1 - s1||^2 / (2*sigma^2) ) = exp(0) = 1;

v = exp( -||s2 - s2||^2 / (2*sigma^2) ) = exp(0) = 1;

x = exp( -||s1 - s2||^2 / (2*sigma^2) ) = exp( -26 / (2*sigma^2) );

and y = exp( -||s2 - s1||^2 / (2*sigma^2) ) = exp( -26 / (2*sigma^2) );

So d(i,j) = sqrt( u + v - 2*(x + y) )

= sqrt( 1 + 1 - 4*exp( -26 / (2*sigma^2) ) )

= sqrt( 2 - 4*exp( -26 / (2*sigma^2) ) )

So I got stuck with this square root of a negative value. I would appreciate any help on this matter.

Thanks in advance!

(Sorry for my poor English)