This is an exercise from Stein & Shakarchi's *Complex Analysis*.

II.7) Prove that $\displaystyle \int_{0}^{2\pi}\frac{1}{(a+\cos\theta)^2} d\theta=\frac{2\pi a}{(a^2-1)^{3/2}}$ for $\displaystyle a>1$

How do I start?

I know the integrand has a pole where $\displaystyle \cos \theta=-a$, which cannot happen for real $\theta$ when $a>1$.

I know the function can be rewritten as: $\displaystyle \frac{2}{2a^2+4a\cos(\theta)+\cos(2\theta)+1}$ or as $\displaystyle \frac{1}{(a+\frac{e^{i\theta}+e^{-i\theta}}{2})^2}$
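Starting from that second form, I think the substitution $z=e^{i\theta}$, with $d\theta=\frac{dz}{iz}$, should turn this into a contour integral over the unit circle (this is just my attempt at the first step, not a full solution):

$$\int_{0}^{2\pi}\frac{d\theta}{(a+\cos\theta)^2}
=\oint_{|z|=1}\frac{1}{\left(a+\frac{z+z^{-1}}{2}\right)^2}\,\frac{dz}{iz}
=\frac{4}{i}\oint_{|z|=1}\frac{z\,dz}{\left(z^2+2az+1\right)^2}.$$

If that is right, the poles are the roots of $z^2+2az+1=0$, i.e. $z=-a\pm\sqrt{a^2-1}$, and for $a>1$ only $z=-a+\sqrt{a^2-1}$ lies inside the unit circle. Is this the right way to set it up?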

What I do not know: what to do? Do I use the residue theorem? Do I look for a nice contour?

(Also, in general: how does one tackle integrals like this? A step-by-step outline would help.)
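As a sanity check, I verified the claimed identity numerically before attempting a proof (a quick sketch in Python; the function names `lhs`/`rhs` and the choice of the trapezoid rule are mine):

```python
import math

def lhs(a, n=100000):
    # Composite trapezoid rule on [0, 2*pi]. Because the integrand is
    # smooth and 2*pi-periodic, the endpoint terms coincide and the
    # rule reduces to a plain Riemann sum with spectral accuracy.
    h = 2 * math.pi / n
    return h * sum(1.0 / (a + math.cos(i * h)) ** 2 for i in range(n))

def rhs(a):
    # Closed form claimed in the exercise, valid for a > 1.
    return 2 * math.pi * a / (a * a - 1) ** 1.5

for a in (1.5, 2.0, 5.0):
    print(a, lhs(a), rhs(a))  # the two columns should agree closely
```

The two sides agree to many digits for every $a>1$ I tried, so the formula itself seems right; my question is about how to prove it.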