Hi,
I have the following set of constraints:
C1 = X1*X2*log(a) + X1*(1-X2)*log(b) >= T
C2 = X1*X2*log(c) + X2*(1-X1)*log(d) >= T
a,b,c,d are known constants. Now I want:
Max T over X1 and X2.
Any ideas?
Thanks in advance.
Let me see if I get this straight: you want to find the largest T such that both C's stay greater than or equal to T?
Then max T = max over X1, X2 of min{C1, C2}. If your X's range over the whole plane, just solve
dC1/dX1 = 0,  dC1/dX2 = 0,  dC2/dX1 = 0,  dC2/dX2 = 0
and compare the values of min{C1, C2} at the candidate points. The largest of these gives max T.
If instead your variables are restricted to some region, use Lagrange multipliers.
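Since the constants a, b, c, d were not given in the thread, here is a minimal numerical sketch of the maximin idea described above. The constants and the range X1, X2 in [0, 1] are made-up assumptions for illustration; the script brute-forces max over (X1, X2) of min{C1, C2} on a grid.

```python
import math

# Hypothetical constants (not from the thread); chosen > 1 so the logs are positive.
a, b, c, d = 4.0, 2.0, 4.0, 2.0

def C1(x1, x2):
    return x1 * x2 * math.log(a) + x1 * (1 - x2) * math.log(b)

def C2(x1, x2):
    return x1 * x2 * math.log(c) + x2 * (1 - x1) * math.log(d)

# Brute-force maximin: maximize min{C1, C2} over a grid on [0, 1]^2
# (assumed range; adjust to your actual domain for X1, X2).
best_T, best_x = -math.inf, None
n = 100
for i in range(n + 1):
    for j in range(n + 1):
        x1, x2 = i / n, j / n
        t = min(C1(x1, x2), C2(x1, x2))
        if t > best_T:
            best_T, best_x = t, (x1, x2)

print(best_x, best_T)  # with these constants: (1.0, 1.0) and T = log(4)
```

With these particular symmetric constants the optimum lands at the corner X1 = X2 = 1, where C1 = C2 = log(4); an analytic solve of the stationarity/crossing conditions above would replace the grid search in a real application.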