# Math Help - Constrained Steepest descent method - help in constraining it...

1. ## Constrained Steepest descent method - help in constraining it...

I am looking to find the minimum of the function:
f = x1 - 2x2 + 3x1^2 + 2x1x2 + x2^2 + 1
However, I would also like to add a constraint so that x1*x2 > 0.
In trying to implement this I have looked at the book Engineering Optimization by S. S. Rao, and one method shown there has sought to use a penalty parameter 'r', so that the function becomes: f(x1,x2) - r*(1/g(x1,x2)).
To this end I have recalculated gradf to give:

gradf = [1 + 6*x(1) + x(2)*(2 - r/((x(1)*x(2)-1)^2)); -2 + 2*x(2) + x(1)*(2 - r/((x(1)*x(2)-1)^2))];

and tried values of r ranging from 0 to 1000.

However, I still end up with a minimum in the region excluded by the constraint.
Any kind person out there willing to lend some assistance?

% minimise f = x1 - 2*x2 + 3*x1^2 + 2*x1*x2 + x2^2 + 1
clear
clf
x = [2 1]'; % initial solution

xx = x';
ll = [];
ss = [];
for icount = 1:30
gradf = [1 + 6*x(1) + 2*x(2); -2 + 2*x(1) + 2*x(2)];
s = -gradf; % steepest-descent direction

% minimise f(x + lambda*s): along the line, f is a quadratic in lambda
a = x(1); b = s(1); c = x(2); d = s(2);
% f(x + lambda*s) = A*lambda^2 + B*lambda + C
C = a - 2*c + 3*a^2 + 2*a*c + c^2 + 1;
B = b - 2*d + 6*a*b + 2*a*d + 2*b*c + 2*c*d;
A = 3*b^2 + 2*b*d + d^2;

lambda = -B/(2*A); % optimum step length

x = x + lambda * s;
xx = [xx;x'];
ll = [ll;lambda];
ss = [ss;s'];
end
xx = xx(1:end-1,:); % trim so xx, ll and ss all have one row per step
[xx ll ss]
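The exact step length used in the loop above relies on the fact that a quadratic objective restricted to a line is a parabola in lambda, minimised at -B/(2A). Those A, B, C coefficients can be checked numerically; the following is a Python sketch (not part of the original post), using the first iterate's point and steepest-descent direction as an example:

```python
import numpy as np

def f(x1, x2):
    # objective from the MATLAB code
    return x1 - 2*x2 + 3*x1**2 + 2*x1*x2 + x2**2 + 1

# point and direction, as in the loop: a, c = x; b, d = s
a, c = 2.0, 1.0
b, d = -15.0, -4.0   # s = -gradf at (2, 1)

# coefficients of f(x + lam*s) = A*lam^2 + B*lam + C
C = a - 2*c + 3*a**2 + 2*a*c + c**2 + 1
B = b - 2*d + 6*a*b + 2*a*d + 2*b*c + 2*c*d
A = 3*b**2 + 2*b*d + d**2

lam = -B / (2*A)     # vertex of the parabola: exact minimiser along s
# verify against a brute-force scan along the line
lams = np.linspace(lam - 1, lam + 1, 2001)
vals = f(a + lams*b, c + lams*d)
assert abs(lams[np.argmin(vals)] - lam) < 1e-3
```

Since A collects the pure second-order terms of f along the direction s and f is strictly convex along any line here, A > 0 and the vertex is always a minimum.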

icount = 0;
for ii =-2.5:.01:2.5
icount = icount + 1;
x1(icount) = ii;
jcount = 0;
for jj = -1.5:.01:3.5
jcount = jcount + 1;
x2(jcount) = jj;
f(icount,jcount)= ii - 2*jj + 3*ii^2 + 2*ii*jj + jj^2 + 1;
end
end
figure(1)
mesh(x2,x1,f)
figure(2)
contour(x2,x1,f)
xlabel('x2')
ylabel('x1')
hold on
plot(xx(:,2),xx(:,1),'ok-')
hold off
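As a side note on the penalty approach asked about above, here is a minimal sketch of the whole scheme in Python rather than MATLAB. The barrier form, the backtracking line search, and the shrinking schedule for r are my assumptions, in the spirit of Rao's interior penalty method: for constraints written g <= 0 Rao's form is f - r*sum(1/g), which for a constraint written as g = x1*x2 > 0 amounts to adding +r/(x1*x2).

```python
import numpy as np

def f(x):
    # objective from the MATLAB code
    x1, x2 = x
    return x1 - 2*x2 + 3*x1**2 + 2*x1*x2 + x2**2 + 1

def phi(x, r):
    # interior-penalty (barrier) objective: finite only where g = x1*x2 > 0
    x1, x2 = x
    return f(x) + r / (x1 * x2)

def grad_phi(x, r):
    x1, x2 = x
    g = x1 * x2
    return np.array([1 + 6*x1 + 2*x2 - r * x2 / g**2,
                     -2 + 2*x1 + 2*x2 - r * x1 / g**2])

def steepest_descent(x, r, iters=300):
    for _ in range(iters):
        s = -grad_phi(x, r)                 # steepest-descent direction
        lam = 1.0
        # backtrack until the step stays in the same (positive) branch
        # of the feasible region and decreases phi
        while True:
            xn = x + lam * s
            if xn[0] > 0 and xn[1] > 0 and phi(xn, r) < phi(x, r):
                break
            lam *= 0.5
            if lam < 1e-14:                 # no acceptable step: converged
                return x
        x = xn
    return x

x = np.array([2.0, 1.0])                    # feasible start: x1*x2 > 0
for r in [1.0, 0.1, 0.01, 0.001]:           # shrink the barrier weight
    x = steepest_descent(x, r)
print(x, f(x))  # should drift toward the boundary near (0, 1) as r shrinks
```

The key difference from the posted MATLAB is that the step length can no longer be computed from -B/(2A): with the barrier term the objective is not quadratic along a line, so some form of line search that also keeps the iterate strictly feasible is needed.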

2. Originally Posted by p123nky
I am looking to find the minimum of the function:
f = x1 - 2x2 + 3x1^2 + 2x1x2 + x2^2 + 1
However, I would also like to add a constraint so that x1*x2 > 0.
In trying to implement this I have looked at the book Engineering Optimization by S. S. Rao, and one method shown there has sought to use a penalty parameter 'r', so that the function becomes: f(x1,x2) - r*(1/g(x1,x2)).
To this end I have recalculated gradf to give:

gradf = [1 + 6*x(1) + x(2)*(2 - r/((x(1)*x(2)-1)^2)); -2 + 2*x(2) + x(1)*(2 - r/((x(1)*x(2)-1)^2))];

and tried values of r ranging from 0 to 1000.

However, I still end up with a minimum in the region excluded by the constraint.
Any kind person out there willing to lend some assistance?
What are you using for $g(x_1,x_2)$, and for $r$ for that matter?

3. Originally Posted by p123nky
I am looking to find the minimum of the function:
f = x1 - 2x2 + 3x1^2 + 2x1x2 + x2^2 + 1
However, I would also like to add a constraint so that x1*x2 > 0.
In trying to implement this I have looked at the book Engineering Optimization by S. S. Rao, and one method shown there has sought to use a penalty parameter 'r', so that the function becomes: f(x1,x2) - r*(1/g(x1,x2)).
To this end I have recalculated gradf to give:

gradf = [1 + 6*x(1) + x(2)*(2 - r/((x(1)*x(2)-1)^2)); -2 + 2*x(2) + x(1)*(2 - r/((x(1)*x(2)-1)^2))];

and tried values of r ranging from 0 to 1000.

However, I still end up with a minimum in the region excluded by the constraint.
Any kind person out there willing to lend some assistance?
There are a number of possibilities:

1. A calculus-style minimum could lie in the interior of the feasible region, but here it does not: the only critical point of $f$ lies outside the feasible region, so we need not consider it further.

2. The minimum occurs on the boundary of the feasible region. (This is why we like the feasible region to be closed, so I will assume your constraint is $x_1 x_2\ge 0$.)

So, as $x_1x_2=0$ implies that $x_1=0$ or $x_2=0$, you now need to examine the two one-dimensional problems:

Find the unconstrained global minimum of $x_1+3x_1^2+1$.

Find the unconstrained global minimum of $-2x_2+x_2^2+1$.

Both of these have minima; keep the smaller.

CB
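CB's two boundary subproblems are simple quadratics, so the comparison can be checked directly. The following is a Python sketch (the coefficient -2*x2 follows the posted MATLAB code, and the function names are mine):

```python
# f restricted to the two halves of the boundary x1*x2 = 0,
# with f = x1 - 2*x2 + 3*x1^2 + 2*x1*x2 + x2^2 + 1

def f_on_x2_zero(x1):
    return x1 + 3*x1**2 + 1          # f(x1, 0)

def f_on_x1_zero(x2):
    return -2*x2 + x2**2 + 1         # f(0, x2) = (x2 - 1)^2

# a quadratic a*t^2 + b*t + c with a > 0 is minimised at t = -b/(2a)
x1_star = -1 / (2*3)                 # -1/6, giving f = 11/12
x2_star = 2 / (2*1)                  # 1, giving f = 0

candidates = [((x1_star, 0.0), f_on_x2_zero(x1_star)),
              ((0.0, x2_star), f_on_x1_zero(x2_star))]
best_x, best_f = min(candidates, key=lambda c: c[1])
print(best_x, best_f)   # (0.0, 1.0) 0.0: the constrained minimum is at (0, 1)
```

Keeping the smaller of the two, the constrained minimum is f = 0 at (x1, x2) = (0, 1), which lies on the boundary x1*x2 = 0 and is therefore only attained if the constraint is taken as x1*x2 >= 0, exactly as CB assumes.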