# Minimum distance between two Catmull-Rom splines

• Aug 14th 2010, 08:08 AM
kipple3
Minimum distance between two Catmull-Rom splines
For a project I'm working on I need to find the minimum distance between two three-dimensional Catmull-Rom splines (interpolating cubic splines, each defined by at least four control points). At the moment I'm using a very naive algorithm: it recursively finds the closest pair of sampled points on the two curves, then builds a refined curve (set of points) around that pair, using the adjacent points as new control points. A few recursions give the true closest point to sufficient accuracy, but of course if there are multiple closest points it won't find them all.
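To make the setup concrete, here is a minimal Python sketch of the sampling step such a naive approach rests on (my own illustration, not the poster's actual code, and assuming the standard uniform Catmull-Rom basis): evaluate each interior segment, sample both curves, and find the closest sampled pair by brute force; the recursive refinement would then resample around that pair.

```python
import numpy as np

def catmull_rom_point(p0, p1, p2, p3, t):
    # Uniform Catmull-Rom basis; interpolates p1..p2 as t runs over [0, 1].
    t2, t3 = t * t, t * t * t
    return 0.5 * ((2 * p1)
                  + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t3)

def sample_spline(ctrl, samples_per_seg=16):
    # Sample every interior segment of a Catmull-Rom spline at uniform t.
    ctrl = np.asarray(ctrl, dtype=float)
    pts = []
    for i in range(len(ctrl) - 3):
        for t in np.linspace(0.0, 1.0, samples_per_seg, endpoint=False):
            pts.append(catmull_rom_point(ctrl[i], ctrl[i + 1],
                                         ctrl[i + 2], ctrl[i + 3], t))
    pts.append(ctrl[-2])  # the last interpolated control point
    return np.array(pts)

def closest_pair(pts_a, pts_b):
    # Brute-force closest pair between the two sampled point sets.
    d = np.linalg.norm(pts_a[:, None, :] - pts_b[None, :, :], axis=2)
    i, j = np.unravel_index(np.argmin(d), d.shape)
    return i, j, d[i, j]
```

The O(N·M) distance matrix is only viable because each recursion level keeps the sample counts small; like the poster's method, this finds one closest pair per run, not all of them.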

Attachment 18567

I'm hoping one of the math experts here can give me some advice on a way to solve this more efficiently.

Frustratingly, I was able to find a reference to a paper that supposedly describes an implementation of a multidimensional Newton's method for this problem, but the paper itself is nowhere to be found, and I don't understand how Newton's method can be applied here.

Thanks a lot for any help!
• Aug 23rd 2010, 02:15 PM
TriKri
Do you have formulas describing the paths of the two curves that you can show here? If the only thing you want is a numerical value, a multidimensional Newton–Raphson method will be good enough. In one dimension, the method works as follows:

$\displaystyle x_{n+1}=x_n - \displaystyle{\frac{f(x_n)}{f'(x_n)}}$

But if you instead have a function operating on a vector

$\displaystyle f: \mathbb{R}^n \rightarrow \mathbb{R}$

there is no single scalar derivative of f, so the original formula doesn't work. However, f does have directional derivatives, and it has a gradient, which points in the direction in which the function increases the fastest. Therefore, setting

$\displaystyle \bar{d}=\displaystyle{\frac{\nabla f}{\left|\nabla f\right|}}$

gives a unit vector telling in which direction the function increases fastest. Now, starting from the one-dimensional formula, two things have to be changed.

1. While $\displaystyle x_n$ becomes the vector $\displaystyle \bar{x}_n$, the second term of the subtraction is a scalar. It has to be made a vector too before the subtraction can be performed. Therefore, we multiply it by the (normalized) vector $\displaystyle \bar{d}$, since $\displaystyle \bar{d}$ is the direction in which the function increases. So

$\displaystyle \displaystyle{\frac{f(x)}{f'(x)}}$

becomes

$\displaystyle \bar{d}(\bar{x})\ \displaystyle{\frac{f(\bar{x})}{f'(\bar{x})}}$

2. There is no $\displaystyle f'(\bar{x})$. However, there is a directional derivative of $\displaystyle f$ in the direction $\displaystyle \bar{d}$ we are heading, $\displaystyle \nabla_{\bar{d}}f(\bar{x}),$ which should be used instead. Substitution gives us

$\displaystyle \bar{d}(\bar{x})\ \displaystyle{\frac{f(\bar{x})}{\nabla_{\bar{d}}f(\bar{x})}}$

Now, since $\displaystyle \bar{d}$ is the unit vector along the gradient, we can make the realization that $\displaystyle \nabla_{\bar{d}}f(\bar{x}) = \bar{d}\cdot\nabla f(\bar{x}) = \left|\nabla f(\bar{x})\right|$, so we will get

$\displaystyle \bar{d}\ \displaystyle{\frac{f(\bar{x})}{\left|\nabla f(\bar{x})\right|}}\ =\ \displaystyle{\frac{\nabla f(\bar{x})}{\left|\nabla f(\bar{x})\right|}}\ \displaystyle{\frac{f(\bar{x})}{\left|\nabla f(\bar{x})\right|}}\ =\ \displaystyle{\frac{\left(\nabla f(\bar{x})\right)f(\bar{x})}{\left|\nabla f(\bar{x})\right|^2}}$

Inserting this into the original formula yields

$\displaystyle \bar{x}_{n+1}=\bar{x}_n -\displaystyle{\frac{\left(\nabla f(\bar{x}_n)\right)f(\bar{x}_n)}{\left|\nabla f(\bar{x}_n)\right|^2}}$
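As a small numerical sketch of this update rule (a toy problem of my choosing, not the spline problem itself), here it is applied to $\displaystyle f(\bar{x})=|\bar{x}|^2-1$, whose root set is the unit circle:

```python
import numpy as np

def newton_root_step(f, grad_f, x, tol=1e-12, max_iter=50):
    # Root-finding for a scalar field f: R^n -> R along its gradient:
    #   x_{n+1} = x_n - grad f(x_n) * f(x_n) / |grad f(x_n)|^2
    for _ in range(max_iter):
        fx = f(x)
        g = grad_f(x)
        g2 = np.dot(g, g)
        if abs(fx) < tol or g2 == 0.0:
            break
        x = x - g * fx / g2
    return x

# Toy example: f(x) = |x|^2 - 1, gradient 2x; any point on the
# unit circle is a root. Starting from (2, 0) the iterates slide
# along the x-axis toward (1, 0).
root = newton_root_step(lambda x: float(np.dot(x, x)) - 1.0,
                        lambda x: 2.0 * x,
                        np.array([2.0, 0.0]))
```

The method finds *some* root of f, not a minimum; the follow-up post below the formula addresses how to target a stationary point of a distance function.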
• Aug 23rd 2010, 02:57 PM
TriKri
I forgot to mention it, but the function whose root you want to find is $\displaystyle \nabla f(\bar{x})$, where $\displaystyle f(\bar{x})$ is the distance between the two points as a function of the curve parameters stored in $\displaystyle \bar{x}$. I just realized that you can't use the method from above directly, since $\displaystyle \nabla f(\bar{x})$ is vector-valued. However, if you square all the elements of $\displaystyle \nabla f(\bar{x})$ and add them together into a new scalar function g, i.e. $\displaystyle g(\bar{x})=\left|\nabla f(\bar{x})\right|^2$, then you will be able to find the root using g. Also, since the gradient of g is itself the null vector at the root (g has a double root there), the plain steps shrink as you approach the solution; you might find it faster to double the step length when you get close to the root, i.e.:

$\displaystyle \bar{x}_{n+1}\ =\ \bar{x}_n - \lambda\ \displaystyle{\frac{(\nabla g(\bar{x}_n))g(\bar{x}_n)}{\left|\nabla g(\bar{x}_n)\right|^2}}$

where $\displaystyle 1\leq\lambda\leq 2$. You may simply set $\displaystyle \lambda$ to 2 all the time; however, if $\displaystyle \bar{x}$ is still far away from any root, a lower value will probably work better.
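A sketch of this $g=|\nabla f|^2$ iteration with the gradients taken numerically (my own illustration: the two curves are replaced by the parallel lines $A(s)=(s,0)$ and $B(t)=(t,1)$, and f is taken as the *squared* distance, a smooth stand-in for the distance):

```python
import numpy as np

def num_grad(fn, x, h=1e-6):
    # Central-difference gradient of a scalar function fn at x.
    g = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (fn(x + e) - fn(x - e)) / (2 * h)
    return g

def minimize_distance(f, x0, lam=2.0, max_iter=200, tol=1e-10):
    # Drive g(x) = |grad f(x)|^2 to zero with the boosted step
    #   x_{n+1} = x_n - lam * grad g(x_n) * g(x_n) / |grad g(x_n)|^2
    x = np.asarray(x0, dtype=float)
    g = lambda y: float(np.dot(num_grad(f, y), num_grad(f, y)))
    for _ in range(max_iter):
        gx = g(x)
        if gx < tol:
            break
        grad_g = num_grad(g, x)
        denom = np.dot(grad_g, grad_g)
        if denom == 0.0:
            break
        x = x - lam * grad_g * gx / denom
    return x

# x = (s, t); squared distance between A(s) = (s, 0) and B(t) = (t, 1).
f = lambda x: (x[0] - x[1]) ** 2 + 1.0
x_star = minimize_distance(f, [2.0, 0.0])
```

On this quadratic toy, lam = 2 reaches the stationary point in essentially one step, matching the remark about doubling the step length near the root; for curved splines a smaller lam far from the root is safer, as noted above.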