The problem is: (x + y) dx - x dy = 0
Let M = x + y, N = -x.
My first attempt at the solution was via non-exact equations and integrating factors; I noticed that
M_y = 1 and N_x = -1, so the equation is not exact. Since (M_y - N_x)/N = 2/(-x) = -2/x depends on x alone,
my integrating factor 'should' be mu(x) = exp[integral(-2/x) dx] = x^(-2).
Then M* = mu(x)·M = 1/x + y/x^2 and N* = mu(x)·N = -1/x.
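(Just to double-check that step: multiplying through by mu(x) really does make the equation exact. Here is a quick SymPy sanity check I wrote myself; the variable names are my own, not from the textbook:

    from sympy import symbols, diff, simplify

    x, y = symbols('x y', positive=True)

    mu = x**-2                    # candidate integrating factor
    M_star = (x + y) * mu         # mu*M = 1/x + y/x**2
    N_star = -x * mu              # mu*N = -1/x

    # exact iff d(M*)/dy == d(N*)/dx
    print(simplify(diff(M_star, y) - diff(N_star, x)))  # prints 0

So the integrating factor itself is fine.)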
Integrating M* with respect to x and N* with respect to y gives:
int [M* dx] = ln x - (1/3)* y/x^3 + g(y)
int [ N* dy] = -y/x + h(x)
which I could not succeed in matching up to find a one-parameter family of solutions...
Is there something wrong with my process, or am I not allowed to use the method of integrating factors for non-exact equations on homogeneous ODEs? My textbook supplies a simple solution using the fact that M and N are both homogeneous of degree 1. I had missed this when solving the problem and took the long way, but I'm not sure why my approach doesn't seem to work.
Oh wow, how embarrassing, I found my error: a simple integration miscalculation (int[(y/x^2) dx] is -y/x, not -(1/3)*y/x^3) stemming from a lack of coffee so late at night. Apologies, and thanks for the help on this example.
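For the record, with that fixed the two partial integrals do match up: int[M* dx] = ln x - y/x + g(y) and int[N* dy] = -y/x + h(x), so taking g(y) = const and h(x) = ln x gives the family ln x - y/x = C, i.e. y = x ln x - Cx, which does satisfy the original equation.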
As a relatively related aside: will there ever be scenarios in which the method of integrating factors would be unable to solve first order (or any order) linear homogeneous equations? Intuitively it feels to me as if the process for homogeneous ODEs [using the substitution y = xv and making a separable equation in x and v] is a sort of sub-case of the integrating factor method... I can't really think of any counterexamples, though my differential equations skills are still weak.
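For what it's worth, on this particular equation the two routes do agree: rewriting it as the linear equation y' - y/x = 1, the standard integrating factor exp[integral(-1/x) dx] = 1/x gives y = x ln x + Cx, the same family the y = xv substitution produces. Here is a quick SymPy check I ran (my own sketch, not from the textbook):

    from sympy import Function, dsolve, Eq, symbols

    x = symbols('x', positive=True)
    y = Function('y')

    # (x + y) dx - x dy = 0, rearranged as y' = 1 + y/x
    ode = Eq(y(x).diff(x), 1 + y(x)/x)
    print(dsolve(ode))  # expect y(x) = C1*x + x*log(x), up to the form SymPy picks

That is only one example though, so it doesn't settle the general question.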