Maximize the minimum value in a vector
Here's a problem I am trying to solve.
Maximize the minimum value of I:
I = C + R * P (* indicates coordinate-wise product)
where I, C, R, and P are n x 1 vectors, with the conditions that the entries of R sum to 1 and each entry of R lies in [0, 1]. C and P are given.
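To make the notation concrete, here is the operation written in numpy (the specific numbers below are just placeholders I made up):

```python
import numpy as np

# Placeholder data, only to illustrate the coordinate-wise product.
C = np.array([10.0, 20.0, 30.0])
P = np.array([5.0, 3.0, 8.0])
R = np.array([0.2, 0.5, 0.3])   # entries in [0, 1], summing to 1

I = C + R * P    # numpy's * is element-wise, so I_k = C_k + R_k * P_k
print(I.min())   # this is the quantity I want to maximize over all feasible R
```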
Here is an example with n = 2 (I am writing the vectors transposed so they are easier to type). C is given as [100 120] and P is given as [50 40]:
[I1 I2] = [100 120] + [R1 R2] * [50 40]
With n = 2, I can substitute R1 = 1 - R2, which reduces this to a system of two equations that I was able to solve:
I1 = 100 + (1 - R2) (50)
I2 = 120 + R2 (40)
I set I1 = I2, since equalizing the two maximizes the minimum of I1 and I2 (though even this approach breaks down under certain conditions, e.g. when the gap between C1 and C2 is large enough that equalizing them would require an R value outside [0, 1]).
100 + 50 - 50R2 = 120 + 40R2
150 - 50R2 = 120 + 40R2
30 = 90R2
R2 = 1/3, R1 = 2/3
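As a quick sanity check on the arithmetic, the solution can be plugged back in (a small numpy snippet, using numpy only for convenience):

```python
import numpy as np

C = np.array([100.0, 120.0])
P = np.array([50.0, 40.0])
R = np.array([2/3, 1/3])   # the solution derived above

I = C + R * P
print(I)          # both entries come out to 400/3 ~ 133.33, so I1 == I2 as intended
print(I.min())    # the maximized minimum value
```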
Is there a way to solve this problem in general, for arbitrary n? Algorithmic solutions would be helpful as well.
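In case it helps frame an answer: my guess is that the problem can be written as a linear program by introducing an auxiliary variable t for the minimum (maximize t subject to t <= C_k + R_k * P_k for every k). Below is a sketch of that idea using scipy.optimize.linprog; the function name maximize_min and the formulation itself are my own assumptions, not something I have verified beyond the n = 2 case.

```python
import numpy as np
from scipy.optimize import linprog

def maximize_min(C, P):
    """Sketch: maximize min_k (C_k + R_k * P_k) over R with sum(R) = 1, 0 <= R_k <= 1,
    by adding an auxiliary variable t constrained by t <= C_k + R_k * P_k."""
    n = len(C)

    # Decision variables: x = [R_1, ..., R_n, t]; linprog minimizes, so minimize -t.
    c = np.zeros(n + 1)
    c[-1] = -1.0

    # t - P_k * R_k <= C_k  for each k   (equivalent to t <= C_k + R_k * P_k)
    A_ub = np.hstack([-np.diag(P), np.ones((n, 1))])
    b_ub = np.asarray(C, dtype=float)

    # sum of the R_k must equal 1 (t gets coefficient 0)
    A_eq = np.append(np.ones(n), 0.0).reshape(1, -1)
    b_eq = [1.0]

    bounds = [(0.0, 1.0)] * n + [(None, None)]   # R_k in [0, 1], t free
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    R, t = res.x[:n], res.x[-1]
    return R, t

R, t = maximize_min([100, 120], [50, 40])
print(R, t)   # I expect something close to [2/3, 1/3] and 133.33
```

On the n = 2 example above this should reproduce R = [2/3, 1/3], but I have not checked the formulation beyond that case, so pointers to a more standard approach would be appreciated.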