# Programming gradient descent for local minima

• May 2nd 2010, 01:51 PM
scskowron
Programming gradient descent for local minima
I've been working on a Java program to do gradient descent on 4 different functions.

This is what I'm doing:
1. Calculating a step size (I need to do an adaptive step size, but for now I'm just using a small constant like 0.1).
2. Calculating the gradient of the function numerically.
3. Multiplying the gradient by the step size.
4. Adding that product to the current point (i.e. p + $\alpha \nabla g$, where $\alpha$ is the step size, p is the current point, and $\nabla g$ is the numerical gradient of the function).
5. Repeating until the distance between successive points is less than a certain threshold (like $10^{-6}$).
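For concreteness, step 2 can be sketched in Java with central differences; `Function2D`, `grad`, and the step `h` are illustrative names, not from the actual program:

```java
// Sketch of step 2: numerical gradient by central differences.
// Function2D, grad, and h are illustrative names, not from the OP's code.
public class GradientSketch {
    interface Function2D {
        double apply(double x, double y);
    }

    // Central-difference approximation of the gradient of f at (x, y).
    static double[] grad(Function2D f, double x, double y) {
        double h = 1e-6; // finite-difference step
        double dfdx = (f.apply(x + h, y) - f.apply(x - h, y)) / (2 * h);
        double dfdy = (f.apply(x, y + h) - f.apply(x, y - h)) / (2 * h);
        return new double[] { dfdx, dfdy };
    }

    public static void main(String[] args) {
        // f(x, y) = x^2 + y^2 has exact gradient (2x, 2y).
        Function2D f = (x, y) -> x * x + y * y;
        double[] g = grad(f, 1.0, 2.0);
        System.out.printf("grad = (%.4f, %.4f)%n", g[0], g[1]);
    }
}
```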

Now my question is - how do I search for the local minimum and not the global minimum?

In other words, some of my initial guesses just lead my algorithm off to negative infinity. How can I tell whether it's heading to infinity or converging to a local minimum?
• May 7th 2010, 12:59 PM
eigenvex
Hmm, gradient descent should find local minima by definition. That's one of its pitfalls, kinda: it gets caught in local minima instead of finding the global one.

Did you try using a smaller step size? What kind of functions are you using?
• May 7th 2010, 11:44 PM
CaptainBlack
Quote:

Originally Posted by scskowron
I've been working on a Java program to do gradient descent on 4 different functions.

This is what I'm doing:
1. Calculating a step size (I need to do an adaptive step size, but for now I'm just using a small constant like 0.1).
2. Calculating the gradient of the function numerically.
3. Multiplying the gradient by the step size.
4. Adding that product to the current point (i.e. p + $\alpha \nabla g$, where $\alpha$ is the step size, p is the current point, and $\nabla g$ is the numerical gradient of the function).
5. Repeating until the distance between successive points is less than a certain threshold (like $10^{-6}$).

Now my question is - how do I search for the local minimum and not the global minimum?

In other words, some of my initial guesses just lead my algorithm off to negative infinity. How can I tell whether it's heading to infinity or converging to a local minimum?

If your step size is positive, for minimisation you want:

$p_{new}=p_{old} -\alpha \nabla g$
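As a sketch in Java, the corrected update (note the minus sign) might look like the loop below; the quadratic test function and the names are illustrative, not from the original program:

```java
// Sketch of the corrected update: p_new = p_old - alpha * grad(p_old).
// The test function and names (alpha, tol, descend) are illustrative.
public class DescentStep {
    // Descend on f(x, y) = (x - 3)^2 + (y + 1)^2,
    // whose exact gradient is (2(x - 3), 2(y + 1)).
    static double[] descend(double x, double y) {
        double alpha = 0.1;  // constant step size, as in the OP's step 1
        double tol = 1e-6;   // stop when a step is shorter than this
        for (int i = 0; i < 10000; i++) {
            double gx = 2 * (x - 3);
            double gy = 2 * (y + 1);
            double nx = x - alpha * gx; // minus: move against the gradient
            double ny = y - alpha * gy;
            if (Math.hypot(nx - x, ny - y) < tol) break;
            x = nx;
            y = ny;
        }
        return new double[] { x, y };
    }

    public static void main(String[] args) {
        double[] p = descend(0, 0);
        System.out.printf("minimum near (%.3f, %.3f)%n", p[0], p[1]);
    }
}
```

With a plus sign instead, the same loop walks uphill and moves away from the minimum.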

CB
• May 8th 2010, 12:05 AM
CaptainBlack
Quote:

Originally Posted by eigenvex
Hmm, gradient descent should find local minima by definition. That's one of its pitfalls, kinda: it gets caught in local minima instead of finding the global one.

Did you try using a smaller step size? What kind of functions are you using?

It does not converge if you start outside the basin of attraction of one of the local minima.

Consider:

$\mathrm{f}\left( x,y\right) := -\left( {x}^{2}+{y}^{2}\right) - \frac{100}{{x}^{2}+{y}^{2}+1}$

http://www.mathhelpforum.com/math-he...7&d=1273302230
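One can watch the basin of attraction directly by running descent on this f from two starting points and flagging divergence once the iterate runs away; the escape bound and all names below are illustrative choices, not from the thread:

```java
// Sketch: gradient descent on f(x, y) = -(x^2 + y^2) - 100/(x^2 + y^2 + 1).
// Starting points inside the basin of the local minimum at the origin
// converge; points outside it head off to infinity. The escape bound and
// names are illustrative choices.
public class BasinDemo {
    // Exact gradient: df/dx = -2x + 200x/(x^2 + y^2 + 1)^2, likewise for y.
    static String run(double x, double y) {
        double alpha = 0.01, tol = 1e-6, bound = 1e3;
        for (int i = 0; i < 100000; i++) {
            double s = x * x + y * y + 1;
            double gx = -2 * x + 200 * x / (s * s);
            double gy = -2 * y + 200 * y / (s * s);
            double nx = x - alpha * gx, ny = y - alpha * gy;
            if (Math.hypot(nx, ny) > bound) return "diverged";
            if (Math.hypot(nx - x, ny - y) < tol) return "converged";
            x = nx;
            y = ny;
        }
        return "undecided";
    }

    public static void main(String[] args) {
        System.out.println(run(1, 1)); // starts inside the basin
        System.out.println(run(4, 4)); // starts outside the basin
    }
}
```

This also suggests one practical answer to the original question: treat any iterate whose norm blows past a large bound as heading to infinity rather than converging.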

CB