Guessing here, but: if n has to be an integer greater than 1, the smallest it can be is 2. Since x is greater than -1 but less than 0, we have 0 < 1 + x < 1, so 1 + x is positive. Check n = 2 directly: (1 + x)^2 = 1 + 2x + x^2 > 1 + 2x, because x^2 > 0. Now suppose (1 + x)^n > 1 + nx for some integer n >= 2. Multiplying both sides by the positive number 1 + x keeps the inequality, so (1 + x)^(n+1) > (1 + nx)(1 + x) = 1 + (n + 1)x + nx^2 > 1 + (n + 1)x, again because nx^2 > 0. So by induction (1 + x)^n > 1 + nx for every integer n > 1. I know it is a rough explanation and might be wrong, but it is something.
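As a quick sanity check (not a proof), here is a small Python sketch that spot-checks the inequality (1 + x)^n > 1 + nx for a few sample values of x in (-1, 0) and integers n > 1; the particular sample values are just my own choices:

```python
# Numerically spot-check Bernoulli's inequality (1 + x)^n > 1 + n*x
# for some x in (-1, 0) and integer n > 1. Sample points are arbitrary.
for x in [-0.9, -0.5, -0.1]:
    for n in range(2, 8):
        lhs = (1 + x) ** n
        rhs = 1 + n * x
        # The strict inequality should hold at every sampled point.
        assert lhs > rhs, (x, n, lhs, rhs)
        print(f"x={x}, n={n}: (1+x)^n = {lhs:.4f} > 1+nx = {rhs:.4f}")
```

If any sampled pair violated the inequality, the assert would fail, so a clean run is weak evidence (only evidence, not proof) that the claim is right.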