1. Programming gradient descent for local minima

I've been working on a Java program to do gradient descent on 4 different functions.

This is what I'm doing:
1. Calculating a step size (I need to do an adaptive step size, but for now I'm just using a small constant like 0.1).
2. Calculating the gradient of the function numerically.
3. Multiplying the gradient by the step size.
4. Adding that product to the current point (i.e. $p + \alpha \nabla g$, where $\alpha$ is the step size, $p$ is the current point, and $\nabla g$ is the numerical gradient of the function).
5. Repeating until the distance between successive points is less than a certain threshold (like $10^{-6}$).
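
The steps above, as a minimal Java sketch (the quadratic test function and starting point are just placeholders, not my actual assignment functions; note the update here *subtracts* $\alpha \nabla g$, since subtracting is what moves downhill):

```java
import java.util.function.ToDoubleFunction;

public class GradientDescent {
    static final double H = 1e-6;     // finite-difference width for the numerical gradient
    static final double ALPHA = 0.1;  // fixed step size (step 1)
    static final double TOL = 1e-6;   // stop once the point moves less than this (step 5)

    // Step 2: central-difference numerical gradient of f at p.
    static double[] gradient(ToDoubleFunction<double[]> f, double[] p) {
        double[] g = new double[p.length];
        for (int i = 0; i < p.length; i++) {
            double old = p[i];
            p[i] = old + H; double fPlus  = f.applyAsDouble(p);
            p[i] = old - H; double fMinus = f.applyAsDouble(p);
            p[i] = old;
            g[i] = (fPlus - fMinus) / (2 * H);
        }
        return g;
    }

    static double[] descend(ToDoubleFunction<double[]> f, double[] start) {
        double[] p = start.clone();
        for (int iter = 0; iter < 10_000; iter++) {  // cap iterations as a safety net
            double[] g = gradient(f, p);
            double movedSq = 0;
            for (int i = 0; i < p.length; i++) {
                double step = ALPHA * g[i];  // step 3
                p[i] -= step;                // step 4, with a minus sign for descent
                movedSq += step * step;
            }
            if (Math.sqrt(movedSq) < TOL) break;  // step 5
        }
        return p;
    }

    public static void main(String[] args) {
        // Simple bowl with its minimum at (1, -2).
        double[] min = descend(q -> (q[0] - 1) * (q[0] - 1) + (q[1] + 2) * (q[1] + 2),
                               new double[]{0, 0});
        System.out.printf("min ~ (%.4f, %.4f)%n", min[0], min[1]);
    }
}
```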

Now my question is - how do I search for the local minimum and not the global minimum?

In other words, some of my initial guesses just lead my algorithm off to negative infinity. How can I tell whether it's heading to infinity or converging to a local minimum?

2. Hmm, gradient descent should find local minima by definition. That's one of its pitfalls, kinda: it gets caught in local minima instead of finding the global one.

Did you try using a smaller step size? What kind of functions are you using?

3. Originally Posted by scskowron
I've been working on a Java program to do gradient descent on 4 different functions.

This is what I'm doing:
1. Calculating a step size (I need to do an adaptive step size, but for now I'm just using a small constant like 0.1).
2. Calculating the gradient of the function numerically.
3. Multiplying the gradient by the step size.
4. Adding that product to the current point (i.e. $p + \alpha \nabla g$, where $\alpha$ is the step size, $p$ is the current point, and $\nabla g$ is the numerical gradient of the function).
5. Repeating until the distance between successive points is less than a certain threshold (like $10^{-6}$).

Now my question is - how do I search for the local minimum and not the global minimum?

In other words, some of my initial guesses just lead my algorithm off to negative infinity. How can I tell whether it's heading to infinity or converging to a local minimum?
If your step size is positive, for minimisation you want:

$p_{\text{new}} = p_{\text{old}} - \alpha \nabla g$
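
In one dimension the difference is easy to see; a quick sketch on the hypothetical $f(x) = x^2$ (exact gradient $2x$ used for brevity, step size an illustrative choice):

```java
public class StepSign {
    // Repeated gradient steps on f(x) = x^2 (gradient 2x),
    // applying alpha * grad with the given sign.
    static double iterate(double x0, double sign, int steps) {
        double alpha = 0.1;
        double x = x0;
        for (int k = 0; k < steps; k++) {
            x += sign * alpha * (2 * x);
        }
        return x;
    }

    public static void main(String[] args) {
        System.out.println("minus: " + iterate(1.0, -1, 100)); // shrinks toward the minimum at 0
        System.out.println("plus:  " + iterate(1.0, +1, 100)); // runs away from it
    }
}
```

With the minus sign each step multiplies $x$ by $0.8$, so the iterates contract toward the minimum; with the plus sign each step multiplies by $1.2$ and they diverge.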

CB

4. Originally Posted by eigenvex
Hmm, gradient descent should find local minima by definition. That's one of its pitfalls, kinda: it gets caught in local minima instead of finding the global one.

Did you try using a smaller step size? What kind of functions are you using?
It does not converge if you start outside the basin of attraction of one of the local minima.

Consider:

$\mathrm{f}(x,y) := -\left(x^{2}+y^{2}\right) - \frac{100}{x^{2}+y^{2}+1}$

CB
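
A quick Java sketch on this function (the step size, norm bound, and iteration cap are illustrative choices of mine, not from the thread): this $f$ has a local minimum at the origin whose basin of attraction is roughly the disc of radius 3. Starting inside, the iteration settles; starting outside, $\|p\|$ grows without bound, which a simple norm check can detect:

```java
public class BasinDemo {
    // f(x, y) = -(x^2 + y^2) - 100 / (x^2 + y^2 + 1)
    static double f(double x, double y) {
        double u = x * x + y * y;
        return -u - 100.0 / (u + 1);
    }

    // Runs gradient descent with a numerical gradient.
    // Returns true if the point settles, false if ||p|| blows past a bound.
    static boolean converges(double x, double y) {
        double alpha = 1e-3, h = 1e-6, tol = 1e-6;
        for (int k = 0; k < 100_000; k++) {
            double gx = (f(x + h, y) - f(x - h, y)) / (2 * h);
            double gy = (f(x, y + h) - f(x, y - h)) / (2 * h);
            double nx = x - alpha * gx, ny = y - alpha * gy;
            if (Math.hypot(nx, ny) > 1e3) return false;        // clearly heading to -infinity
            if (Math.hypot(nx - x, ny - y) < tol) return true; // settled at the local min
            x = nx; y = ny;
        }
        return true; // ran out of iterations without blowing up
    }

    public static void main(String[] args) {
        System.out.println(converges(1, 1)); // starts inside the basin (r < 3)
        System.out.println(converges(4, 0)); // starts outside: slides off toward -infinity
    }
}
```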