Hi there,
this is one of those problems I can't even figure out how to approach. Any idea would be highly appreciated.
Prove that f is convex on an interval if and only if, for all x and y in the interval, f(tx + (1 − t)y) < tf(x) + (1 − t)f(y) for 0 < t < 1.