1. Lagrange Multipliers

Could use some help! I'm close but not quite able to see what I'm doing wrong.

Here's the setup:

f(x) = 2x1^2 - x2^2

subject to:
g1(x) = x1^2 - x2 - 9
g2(x) = -x1 + x2 + 3

with xtau = {-2, -5}T

I found the gradient of f(x) to be {4x1, -2x2}T, which at xtau evaluates to {-8, 10}T.

I computed the gradients of g1(x) and g2(x) to be:
gradient g1(x) = {2x1, -1}T
gradient g2(x) = {-1, 1}T

Correct so far?

From this, I get a bit confused when setting up the computation for the Lagrange multiplier (meaning I don't get an answer that matches one of the possible choices).

My understanding is that it should be:
gradient of f(x) + lambda1*gradient of g1(x) + lambda2*gradient of g2(x) = the zero vector

In Matlab I enter the following:

b = [-8; 10];
A = [-4, -1; -1, 1]; % columns are the gradients of g1 and g2 at xtau
x = A \ b; % A is square and nonsingular, so this solves A*x = b exactly

At this point, I get {-0.4, 9.6}T, which doesn't appear to be correct.

Any help in understanding what I'm doing wrong will be appreciated.

Thanks!

2. Originally Posted by dsprice
Could use some help! I'm close but not quite able to see what I'm doing wrong.

Here's the setup:

f(x) = 2x1^2 - x2^2

subject to:
g1(x) = x1^2 - x2 - 9
g2(x) = -x1 + x2 + 3
Ok up to here, we have a constrained non-linear optimisation problem.

But what is all of this:

with xtau = {-2, -5}T

I found the gradient of f(x) to be {4x1, -2x2}T, which at xtau evaluates to {-8, 10}T.

I computed the gradients of g1(x) and g2(x) to be:
gradient g1(x) = {2x1, -1}T
gradient g2(x) = {-1, 1}T

Correct so far?

From this, I get a bit confused when setting up the computation for the Lagrange multiplier (meaning I don't get an answer that matches one of the possible choices).

My understanding is that it should be:
gradient of f(x) + lambda1*gradient of g1(x) + lambda2*gradient of g2(x) = the zero vector

In Matlab I enter the following:

b = [-8; 10];
A = [-4, -1; -1, 1]; % columns are the gradients of g1 and g2 at xtau
x = A \ b; % A is square and nonsingular, so this solves A*x = b exactly

At this point, I get {-0.4, 9.6}T, which doesn't appear to be correct.

Any help in understanding what I'm doing wrong will be appreciated.

Thanks!
Where is the Lagrangian, what does xtau represent, and what is T?

CB

3. Sorry. I need to learn how to enter equations in normal format.

xtau is x with tau as a superscript; it is the candidate solution for the optimization problem. T should also be a superscript: it denotes the transpose, as I didn't know how else to represent a one-column vector.

Okay, so the gradient of f is a column vector with entries df/dx1 and df/dx2, i.e. 4x1 and -2x2. At xtau this evaluates to the column vector {-8, 10}T. The gradient of g1(x) should be the column vector {2x1, -1}T, and the gradient of g2(x) should be {-1, 1}T. Hopefully right so far.

If I understand the computation of the Lagrange multipliers at xtau correctly, then gradient of f(xtau) + lambda1*(gradient of g1(xtau)) + lambda2*(gradient of g2(xtau)) should be set equal to the zero vector. At that point, the problem reduces to solving for lambda1 and lambda2.
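With the gradients above, that stationarity condition is just a 2x2 linear system, A*{lambda1, lambda2}T = -gradient of f(xtau), where the columns of A are the constraint gradients. A quick sketch of that solve in plain Python (rather than Matlab, so anyone can run it; the helper name solve_2x2 is just for illustration), using Cramer's rule:

```python
# Solve grad f + l1*grad g1 + l2*grad g2 = 0 at xtau = (-2, -5),
# i.e. the 2x2 system A*[l1; l2] = -grad f, where the columns of A
# are the constraint gradients. Plain Python; no toolboxes needed.

def solve_2x2(a11, a12, a21, a22, b1, b2):
    """Solve [[a11, a12], [a21, a22]] @ [x1, x2] = [b1, b2] by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - a12 * b2) / det,
            (a11 * b2 - b1 * a21) / det)

grad_f = (-8.0, 10.0)    # gradient of f at xtau
grad_g1 = (-4.0, -1.0)   # gradient of g1 at xtau: (2*x1, -1)
grad_g2 = (-1.0, 1.0)    # gradient of g2 at xtau

# The right-hand side is MINUS the gradient of f:
lam1, lam2 = solve_2x2(grad_g1[0], grad_g2[0],
                       grad_g1[1], grad_g2[1],
                       -grad_f[0], -grad_f[1])
print(lam1, lam2)  # 0.4 -9.6
```

Note the sign on the right-hand side: moving gradient of f to the other side of the stationarity equation flips it, which is easy to miss.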

Ah! I just figured out what I did wrong! In setting up the equation, instead of using the gradient of f(xtau), I used xtau directly! Now I get one of the 8 possible solutions and am comfortable that I now have it right.
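For anyone reading later: with the corrected right-hand side -gradient of f(xtau) = {8, -10}T, the multipliers work out to lambda1 = 0.4 and lambda2 = -9.6. Plugging them back into the stationarity condition should return the zero vector; a quick plain-Python check (assuming the gradients computed above):

```python
# Check that grad f + l1*grad g1 + l2*grad g2 = 0 at xtau = (-2, -5)
# for the multipliers obtained from the corrected system.
grad_f = (-8.0, 10.0)
grad_g1 = (-4.0, -1.0)
grad_g2 = (-1.0, 1.0)
l1, l2 = 0.4, -9.6

residual = tuple(f + l1 * a + l2 * b
                 for f, a, b in zip(grad_f, grad_g1, grad_g2))
print(residual)  # (0.0, 0.0) up to floating-point rounding
```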

Thanks! This did help to walk through it as I've been racking my brains out for several days trying to figure out what was wrong.