Use the Mean Value Theorem to prove that if $\displaystyle p > 1$ then $\displaystyle (1+x)^p > 1+px$ for
$\displaystyle x \in (-1, 0) \cup (0, \infty)$.
Dear maximus101,
Using Taylor's theorem (with the Lagrange form of the remainder) you could write,
$\displaystyle (1+x)^p=1+px+\frac{p(p-1)}{2}(1+c)^{p-2}x^2~\text{where}~0<c<x~\text{or}~x<c<0$------(A)
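For reference, this is the first-order Taylor expansion about $0$ with the Lagrange remainder. Applied to $f(x)=(1+x)^p$ (a sketch of the bookkeeping, with $c$ strictly between $0$ and $x$):

```latex
% Taylor's theorem, first order about a = 0, Lagrange remainder:
f(x) = f(0) + f'(0)\,x + \frac{f''(c)}{2}\,x^2,
\quad c \text{ strictly between } 0 \text{ and } x.
% With f(x) = (1+x)^p:
f(0) = 1, \qquad
f'(x) = p(1+x)^{p-1} \Rightarrow f'(0) = p, \qquad
f''(c) = p(p-1)(1+c)^{p-2}.
```

Substituting these into the remainder formula gives equation (A), including the factor $\tfrac{1}{2}$ from $\tfrac{f''(c)}{2}$.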
$\displaystyle p>1\Rightarrow{p(p-1)>0}$-----(1)
If $\displaystyle 0<c<x~\text{then}~1<1+c$-----(2)
If $\displaystyle x<c<0~\text{then}~1+x<1+c<1$----(3)
$\displaystyle x>-1\Rightarrow{1+x>0}$------(4)
By (3) and (4); $\displaystyle x<c<0\Rightarrow{0<1+c<1}$----(5)
By (2) and (5); in both cases, $\displaystyle 0<c<x$ and $\displaystyle x<c<0$, we have $\displaystyle 0<1+c$------(6)
Therefore, by (1) and (6), since $\displaystyle 1+c>0$ makes $\displaystyle (1+c)^{p-2}>0$; $\displaystyle \frac{p(p-1)}{2}(1+c)^{p-2}x^2>0~\text{if}~x\neq{0}~\text{and}~x>-1$------(7)
By (A);
$\displaystyle (1+x)^p-1-px=\frac{p(p-1)}{2}(1+c)^{p-2}x^2>0~\text{if}~x\neq{0}~\text{and}~x>-1$
$\displaystyle (1+x)^p>1+px~\text{if}~x>-1~\text{and}~x\neq{0}$
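As a quick numerical sanity check (an illustration only, not part of the proof), you can sample a few values of $p>1$ and $x \in (-1,0)\cup(0,\infty)$ and confirm the gap $(1+x)^p - (1+px)$ is strictly positive:

```python
# Numerical spot-check of the inequality (1+x)^p > 1+px
# for p > 1 and x in (-1, 0) U (0, inf). Illustration only.

def bernoulli_gap(p, x):
    """Return (1+x)^p - (1+px); strictly positive for p > 1, x > -1, x != 0."""
    return (1 + x) ** p - (1 + p * x)

for p in (1.5, 2.0, 3.7):
    for x in (-0.9, -0.1, 0.1, 2.0, 10.0):
        gap = bernoulli_gap(p, x)
        assert gap > 0, (p, x, gap)
        print(f"p={p}, x={x}: (1+x)^p - (1+px) = {gap:.6f} > 0")
```

Every sampled pair satisfies the inequality, consistent with the proof above.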
There are several mean value theorems: Rolle's theorem, Cauchy's (generalized) mean value theorem, and Taylor's theorem with the mean-value (Lagrange) form of the remainder. Please refer to Mean value theorem - Wikipedia, the free encyclopedia. I have used Taylor's theorem with the Lagrange remainder in the above answer.