Solving an integral by calculating residues

The integral is $\displaystyle I=\int _0 ^{+\infty} \frac{dx}{x^2+b^2}$ with $\displaystyle b>0$.
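Before doing the contour computation, it helps to know what value to expect. A quick numerical sketch (assuming $b=1$ for concreteness; the arctan antiderivative gives the closed form $\pi/(2b)$):

```python
import math

# Numerical sanity check of I = ∫_0^∞ dx / (x^2 + b^2).
# b = 1.0 is an arbitrary illustrative choice; the closed form
# (from the antiderivative (1/b) arctan(x/b)) is pi/(2b).
b = 1.0

def f(x):
    return 1.0 / (x * x + b * b)

# Midpoint rule on [0, R]; the tail beyond R contributes about 1/R.
R = 1000.0
N = 200_000
h = R / N
approx = h * sum(f((i + 0.5) * h) for i in range(N))

print(approx, math.pi / (2 * b))  # approx ≈ 1.5698 vs pi/2 ≈ 1.5708
```

So the integral is clearly nonzero, which is why an answer of $0$ below cannot be right.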

Attempt: Let $\displaystyle f(z)=\frac{1}{z^2+b^2}$. The poles of $\displaystyle f$ are at $\displaystyle z_1=ib$ and $\displaystyle z_2=-ib$.

As a contour, I'm tempted to choose a circle of radius $R$ and let $R$ tend to infinity, so that the enclosed region covers the whole complex plane (there are only two poles, so it shouldn't be that hard).

$\displaystyle I=\lim _{R \to \infty} \int _0 ^R \frac{dz}{z^2+b^2} = \lim _{R \to \infty} \int _0 ^R \frac{dz}{(z-ib)(z+ib)}=2 \pi i \sum _{m=1}^2 \operatorname{Res}(f,z_m)$.

My problem is that $\displaystyle \operatorname{Res}(f,ib)= \lim _{z \to ib} (z-ib)\,\frac{1}{z^2+b^2}=\frac{1}{2ib}=\frac{-i}{2b}$. Similarly, $\displaystyle \operatorname{Res}(f,-ib)=\frac{i}{2b}$, so the residues sum to zero and the integral comes out to $0$, which is untrue. Where did I go wrong?
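The residue values themselves check out numerically, so the cancellation is real. A quick sketch (assuming $b=2$ for concreteness; any $b>0$ behaves the same) approximates the simple-pole limit $\operatorname{Res}(f,z_0)=\lim_{z\to z_0}(z-z_0)f(z)$ by evaluating just off each pole:

```python
# Numerical check of the residues of f(z) = 1/(z^2 + b^2)
# at z = ib and z = -ib. Assumes b = 2.0 for illustration.
b = 2.0
f = lambda z: 1.0 / (z * z + b * b)

def residue_at(z0, eps=1e-6):
    # For a simple pole, Res(f, z0) = lim_{z -> z0} (z - z0) f(z);
    # approximate the limit by evaluating slightly off the pole.
    z = z0 + eps
    return (z - z0) * f(z)

print(residue_at(1j * b))   # ≈ -i/(2b) = -0.25j
print(residue_at(-1j * b))  # ≈ +i/(2b) = +0.25j
```

The two residues are indeed $\mp i/(2b)$ and sum to (numerically) zero, so the issue must lie in the setup of the contour integral rather than in the residue arithmetic.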