
Linear programming?
A financier plans to invest up to $500,000 in two projects. Project A yields a return of 10% on the investment, whereas Project B yields a return of 15% on the investment. Because the investment in Project B is riskier than the investment in Project A, the financier has decided that the investment in Project B should not exceed 40% of the total investment. How much should she invest in each project in order to maximize the return on her investment? What is the maximum return?
I need help with this problem. I have set up:
max P = 0.10x + 0.15y
subject to: x + y ≤ 500,000
y ≤ 200,000
x, y ≥ 0
My professor says the constraint y ≤ 200,000 is not correct. It's supposed to be something like y ≤ 0.4(x + y)?
I am completely confused. Can someone help me clear this constraint up?
thanks.

Define x to be the amount invested in Project A and y to be the amount invested in Project B.
Notice that your first constraint is x + y ≤ 500,000, so she is not required to invest the entire $500,000. The 40% cap is on the total amount actually invested, which is x + y, not on $500,000. That is why the constraint must be y ≤ 0.4(x + y); your version y ≤ 200,000 only agrees with it when the whole $500,000 is invested.
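You can check the corrected setup numerically. A minimal sketch in plain Python (assuming the corrected constraint y ≤ 0.4(x + y), which rearranges to y ≤ (2/3)x): a linear objective over a polygon is maximized at a corner, so it is enough to evaluate the return at each vertex of the feasible region.

```python
# Minimal sketch: solve this 2-variable LP by checking its corner points.
# Assumes the corrected constraint y <= 0.4*(x + y), i.e. y <= (2/3)x.

def profit(x, y):
    """Return on investment: 10% on Project A (x), 15% on Project B (y)."""
    return 0.10 * x + 0.15 * y

# Feasible region: x + y <= 500_000, y <= (2/3)x, x >= 0, y >= 0.
# Its corner points:
vertices = [
    (0, 0),                  # invest nothing
    (500_000, 0),            # everything in Project A
    (300_000, 200_000),      # where x + y = 500_000 meets y = (2/3)x
]

best = max(vertices, key=lambda v: profit(*v))
print("invest", best, "return", profit(*best))
```

Evaluating the corners shows the best split is $300,000 in Project A and $200,000 in Project B, for a maximum return of $60,000.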

Would it be y ≤ 0.4(x + y), which simplifies to 0.6y ≤ 0.4x, i.e. y ≤ (2/3)x? I need to graph this as well.
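For the graph: plot the budget line x + y = 500,000 and the line y = (2/3)x (the 40% constraint rearranged), and the feasible region is the triangle between them in the first quadrant. The optimum sits where the two lines cross; a small sketch to compute that crossing point (plain Python, no plotting library assumed):

```python
# Where x + y = 500_000 meets y = (2/3)x:
# substitute y into the budget line:
#   x + (2/3)x = 500_000  ->  (5/3)x = 500_000  ->  x = 300_000
x = 500_000 * 3 / 5
y = 500_000 - x
print(x, y)   # 300000.0 200000.0  (the corner to mark on the graph)
```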