I am having trouble with the following integral, which I would like to evaluate using the residue theorem:

$$\int_{-1}^{1}\frac{dx}{(1-x^{2})^{t}\,(a^{2}+x^{2})},$$

where $a > 0$ and $t$ is either $+1/2$ or $-1/2$.
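
Written out explicitly (just substituting $t = \pm 1/2$), the two cases are

$$\int_{-1}^{1}\frac{dx}{\sqrt{1-x^{2}}\,(a^{2}+x^{2})}
\qquad\text{and}\qquad
\int_{-1}^{1}\frac{\sqrt{1-x^{2}}}{a^{2}+x^{2}}\,dx.$$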

However, I am having trouble finding a contour that will work. Any suggestions? I realize that in the $t = -1/2$ case there will be a residue at infinity.

There are obviously branch points at $x = +1$ and $x = -1$.
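
In case it helps anyone answering, here is a quick numerical sanity check of both cases (a minimal sketch; the choice $a = 2$ is arbitrary), using the substitution $x = \sin\theta$ so that the endpoint factors $(1-x^{2})^{\pm 1/2}$ become smooth:

```python
# Numerical sanity check of both cases via x = sin(theta), dx = cos(theta) dtheta.
# The value a = 2.0 is an arbitrary test value, not part of the problem.
import numpy as np
from scipy.integrate import quad

a = 2.0

# t = +1/2: dx / (sqrt(1 - x^2) (a^2 + x^2)) becomes dtheta / (a^2 + sin^2(theta))
I_plus, _ = quad(lambda th: 1.0 / (a**2 + np.sin(th)**2), -np.pi/2, np.pi/2)

# t = -1/2: sqrt(1 - x^2) / (a^2 + x^2) dx becomes cos^2(theta) / (a^2 + sin^2(theta)) dtheta
I_minus, _ = quad(lambda th: np.cos(th)**2 / (a**2 + np.sin(th)**2), -np.pi/2, np.pi/2)

print(I_plus, I_minus)
```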