Originally Posted by **transgalactic**

it is given that

$\displaystyle 1\leq p< +\infty$

$\displaystyle \alpha ,\beta >0$

$\displaystyle a,b\geq 0$

prove that

$\displaystyle (\alpha a+\beta b )^p\leq (\alpha +\beta )^p\left ( \frac{\alpha }{\alpha +\beta }a^p+\frac{\beta }{\alpha +\beta }b^p \right )$

hint: prove first that $\displaystyle f(t)=t^p$ is convex

on the interval $\displaystyle [0,+\infty)$

reminder: a function $f(t)$ is called convex on a region if for every $a,b$ in that region

and every

$\displaystyle 0\leq \lambda \leq 1$

we have

$\displaystyle f(\lambda a +(1-\lambda)b)\leq\lambda f(a)+(1-\lambda)f(b)$
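In case it helps, here is one way the hint might plug into the claim once convexity of $t^p$ is established (just a sketch): take $\displaystyle \lambda =\frac{\alpha }{\alpha +\beta }$, so that $\displaystyle 1-\lambda =\frac{\beta }{\alpha +\beta }$, and the convexity definition gives

$\displaystyle \left ( \frac{\alpha }{\alpha +\beta }a+\frac{\beta }{\alpha +\beta }b \right )^p\leq \frac{\alpha }{\alpha +\beta }a^p+\frac{\beta }{\alpha +\beta }b^p$

multiplying both sides by $\displaystyle (\alpha +\beta )^p$ turns the left side into $\displaystyle (\alpha a+\beta b)^p$ and yields exactly the inequality to be proved.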

my thoughts:

i know from calc 1 that a function is convex if its second derivative is nonnegative, or something like that (i am not sure)

i dont know

how to prove that $\displaystyle f(t)=t^p$ is convex

since it is a purely parametric thing ($p$ is arbitrary)

??
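For the convexity step, one standard route (assuming the calc 1 fact that $f''\geq 0$ on an interval implies convexity there) is to differentiate twice:

$\displaystyle f(t)=t^p,\qquad f'(t)=pt^{p-1},\qquad f''(t)=p(p-1)t^{p-2}$

Since $\displaystyle p\geq 1$, both $p$ and $p-1$ are nonnegative, and $\displaystyle t^{p-2}\geq 0$ for $t>0$, so $\displaystyle f''(t)\geq 0$ on $(0,+\infty)$; convexity extends to $t=0$ by continuity of $t^p$.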