Originally Posted by
Mush $\displaystyle f(x) = \sin(x)(1+\cos(x)) {\color{red}= \sin(x) + \sin(x)\cos(x)}$ Mr F says: The red stuff in this line is unnecessary (since you don't use it in your solution) and in fact unhelpful and potentially confusing to the OP.
Differentiate!
$\displaystyle f'(x) = \cos(x)( \, 1+\cos(x) \, ) + \sin(x)(\, -\sin(x) \, )$
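As a quick sanity check (my addition, not part of the original working), the product-rule derivative above can be compared against a central finite difference:

```python
import math

def f(x):
    # the function being maximised
    return math.sin(x) * (1 + math.cos(x))

def f_prime(x):
    # product rule: cos(x)(1 + cos(x)) + sin(x)(-sin(x))
    return math.cos(x) * (1 + math.cos(x)) - math.sin(x) ** 2

# central-difference approximation agrees at a few sample points
h = 1e-6
for x in (0.3, 1.0, 2.5):
    approx = (f(x + h) - f(x - h)) / (2 * h)
    assert abs(approx - f_prime(x)) < 1e-6
```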
A maximum can only occur where $\displaystyle f'(x) = 0$
$\displaystyle f'(x) = \cos(x)(1+\cos(x)) + \sin(x)(-\sin(x)) = 0$
$\displaystyle \cos(x) + \cos^2(x) - \sin^2(x) = 0$ (Remember that $\displaystyle \cos^2(x) - \sin^2(x) = \cos(2x) $) Mr F says: Alternatively and more efficiently, remember the Pythagorean Identity $\displaystyle {\color{red}\cos^2 x + \sin^2 x = 1 \Rightarrow \sin^2 x = 1 - \cos^2 x}$. I'll pick up from here in my main reply below.
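For the record (my paraphrase of where that substitution leads, not Mr F's own working), using $\displaystyle \sin^2 x = 1 - \cos^2 x$ here gives

$\displaystyle \cos(x) + \cos^2(x) - \left(1 - \cos^2(x)\right) = 0$

$\displaystyle 2\cos^2(x) + \cos(x) - 1 = 0$

which is the same quadratic in $\displaystyle \cos(x)$ that the double-angle route below arrives at.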
$\displaystyle \cos(x) + \cos(2x) = 0$
$\displaystyle 2\cos^2(x) + \cos(x) - 1 = 0$ Remember that $\displaystyle \cos(2x) = 2\cos^2(x) - 1$
$\displaystyle \cos(x)(1+2\cos(x)) = 1$ Mr F says: This line does not lead anywhere helpful and may well reinforce a common misconception regarding the null factor law and its misuse.
Take it from here?