I need to compute this integral:
∫ (1 - x^2)/(x^3 + x) dx
I think I should use the partial fraction method to simplify the fraction. Since the denominator factors as x(1 + x^2), I write:

(1 - x^2)/(x^3 + x) = A/x + B/(1 + x^2)
Multiplying both sides by x(1 + x^2) gives:

A(1 + x^2) + Bx = 1 - x^2
Putting it in polynomial form:

Ax^2 + Bx + A = 1 - x^2
and by equating the coefficients:

x^2 term: A = -1
x term: B = 0
constant term: A = 1

However, A = -1 contradicts the constant term on the RHS (+1), so the system has no solution.
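To rule out an arithmetic slip, here is a quick check of my coefficient equations with sympy (assuming sympy is available), which confirms the system really is inconsistent:

```python
from sympy import symbols, Eq, solve

# Unknown coefficients from my ansatz (1 - x^2)/(x^3 + x) = A/x + B/(1 + x^2)
A, B = symbols('A B')

# Coefficient equations from A*(1 + x**2) + B*x = 1 - x**2
eqs = [
    Eq(A, -1),  # x^2 term
    Eq(B, 0),   # x term
    Eq(A, 1),   # constant term
]

# An inconsistent linear system has no solution, so solve returns an empty list
print(solve(eqs, [A, B]))  # -> []
```

So the equations themselves are copied correctly; the contradiction must come from an earlier step.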
So what did I do wrong?

Thanks in advance!