I am reviewing some problems on polynomial rings and hope someone can tell me if I am on the right track.

PROBLEM: Let $\displaystyle f(x) \in R[x]$. If $\displaystyle f(a) = 0$ and $\displaystyle f'(a) = 0$, show that $\displaystyle (x-a)^{2}$ divides $\displaystyle f(x)$.

What I have done:

$\displaystyle f(x) = g(x)(x-a)^{2}+r(x)$, where $\displaystyle g(x), r(x) \in R[x]$ and either $\displaystyle \deg r(x) < \deg (x-a)^{2} = 2$ or $\displaystyle r(x) = 0$
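As a quick sanity check on this division step, here is a sketch with SymPy; the polynomial $f$ and the value $a = 2$ below are arbitrary illustrative choices, not part of the problem:

```python
from sympy import symbols, div, expand

x = symbols('x')
a = 2                 # hypothetical choice of a, just for illustration
f = x**3 + x + 1      # an arbitrary example polynomial
q, r = div(f, (x - a)**2, x)
# the division algorithm guarantees deg r < deg (x-a)^2 = 2 ...
assert r.as_poly(x).degree() < 2
# ... and f = q*(x-a)^2 + r
assert expand(q * (x - a)**2 + r) == f
```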

$\displaystyle f'(x) = 2g(x)(x-a) + g'(x)(x-a)^{2} + r'(x)$

$\displaystyle f(a) = g(a)(0) + r(a) \implies f(a) = r(a) = 0$

This alone does not force $\displaystyle r(x) = 0$; it only shows $(x-a)$ divides $r(x)$. Since $\displaystyle \deg r(x) < 2$, write $\displaystyle r(x) = bx + c$ with $b, c \in R$.

$\displaystyle f'(a) = 2g(a)(0) + g'(a)(0) +r'(a) \implies f'(a) = r'(a) = 0$

Now $\displaystyle r'(x) = b$, so $b = 0$, and then $\displaystyle r(a) = c = 0$. Hence $\displaystyle r(x) = 0$, so $\displaystyle f(x) = g(x)(x-a)^{2}$, that is, $\displaystyle (x-a)^{2} \mid f(x)$.
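The whole claim can be spot-checked with SymPy; the particular $f$ and $a = 3$ below are hypothetical choices built to satisfy the hypotheses:

```python
from sympy import symbols, diff, rem, expand

x = symbols('x')
a = 3                                       # hypothetical root, chosen for illustration
f = expand((x - a)**2 * (x**2 + x + 1))     # constructed to have a double root at a
# the hypotheses of the problem hold:
assert f.subs(x, a) == 0
assert diff(f, x).subs(x, a) == 0
# and the conclusion: the remainder of f on division by (x - a)^2 is zero
assert rem(f, (x - a)**2, x) == 0
```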

Is this even close? Thanks in advance.