I have an equation that looks like:

$\displaystyle \frac{x m_1 + (1-x) m_2}{\sqrt{x^2 v_1 +(1-x)^2 v_2 + 2x(1-x)c}}$

where m1 and m2 are means, v1 and v2 are variances, and c is the covariance between the two underlying random variables (the ones whose means are m1 and m2).

My objective is to find the absolute maxima of this equation.

This can be easily solved by taking the first derivative, setting it to zero, and solving for x.
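If I carry that derivative through (and unless I have slipped in the algebra), the quadratic terms in x cancel, so the stationarity condition is linear in x and the critical point has the closed form

$\displaystyle x^* = \frac{m_2 c - m_1 v_2}{(m_1 + m_2)c - m_1 v_2 - m_2 v_1}$

(which should still be checked to be a maximum rather than a minimum, and assumes the variance term under the square root stays positive).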

Now, suppose that I have N such equations, where the index i = 1, ..., N refers to the ith equation.

My objective would be to find the maximum of the following sum:

$\displaystyle \sum_{i = 1}^N \omega_i\frac{x m_{1,i} + (1-x) m_{2,i}}{\sqrt{x^2 v_{1,i} +(1-x)^2 v_{2,i} + 2x(1-x)c_i}}$

where $\omega_i$ is the weight of the ith term in the sum. Note that I know all the values $\omega_i$, m1_i, m2_i, v1_i, v2_i and c_i. I need to find the value of x that maximizes the above-mentioned sum.

I can easily solve this numerically, but I believe there is a smarter way. Since my background is in Biomedical Sciences, I would be really grateful for any help from math experts. Is there any program/algorithm that I might use, for example?

Cheers!

Alonso

P.S. I can easily solve that problem numerically, because I know the value of x will be constrained to the interval [-5, +5].
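P.P.S. For completeness, here is the kind of numerical solution I had in mind, a minimal sketch using SciPy's bounded scalar minimizer (the arrays below are made-up stand-in values, not my real data):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative stand-in values for N = 3 equations (my real data differ).
w  = np.array([0.5, 0.3, 0.2])   # weights omega_i
m1 = np.array([1.0, 0.8, 1.2])   # means m1_i
m2 = np.array([0.5, 0.9, 0.4])   # means m2_i
v1 = np.array([1.0, 1.5, 0.8])   # variances v1_i
v2 = np.array([1.2, 0.7, 1.1])   # variances v2_i
c  = np.array([0.1, -0.2, 0.0])  # covariances c_i

def objective(x):
    """Negated weighted sum of the N ratios, so minimizing it maximizes the sum."""
    num = x * m1 + (1 - x) * m2
    var = x**2 * v1 + (1 - x)**2 * v2 + 2 * x * (1 - x) * c
    return -np.sum(w * num / np.sqrt(var))

# x is known to lie in [-5, +5], so a bounded 1-D search suffices.
res = minimize_scalar(objective, bounds=(-5, 5), method="bounded")
print("x* =", res.x, "  max value =", -res.fun)
```

This assumes the variance term stays positive over the whole interval for every i; if it can vanish, the objective needs guarding before the square root.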