Non-exact integrating factors vs. homogeneous first-order equations [example problem]

The problem is: (x + y) dx - x dy = 0

Let M = x+y , N = -x

My first attempt at the solution was via non-exact equations and integrating factors. I noticed that

M_y = 1 and N_x = -1, so the equation is not exact. Since (M_y - N_x)/N = (1 - (-1))/(-x) = -2/x depends on x alone,

my integrating factor 'should' be mu(x) = exp[integral(-2/x) dx] = x^(-2).

Then M* = mu(x)·M = 1/x + y/x^2 and N* = mu(x)·N = -1/x.
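As a sanity check on this step (not part of my pencil-and-paper work), here is a small sympy sketch that rebuilds the integrating factor from (M_y - N_x)/N and confirms that multiplying through by it makes the equation exact; the variable names are my own:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)

M = x + y
N = -x

# partial derivatives: M_y = 1, N_x = -1, so the equation is not exact
My, Nx = sp.diff(M, y), sp.diff(N, x)

# (M_y - N_x)/N = -2/x depends on x alone, so an integrating factor
# of the form mu(x) = exp(int (M_y - N_x)/N dx) should exist
mu = sp.exp(sp.integrate((My - Nx) / N, x))
print(mu)  # x**(-2)

# after multiplying through, M*_y - N*_x should vanish
Mstar, Nstar = mu * M, mu * N
print(sp.simplify(sp.diff(Mstar, y) - sp.diff(Nstar, x)))  # 0
```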

Integrating M* with respect to x and N* with respect to y gives:

int [M* dx] = ln x - (1/3)* y/x^3 + g(y)

int [ N* dy] = -y/x + h(x)
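A quick symbolic cross-check of these two antiderivatives (sympy again, assuming x > 0; the arbitrary functions g(y) and h(x) are omitted since integrate does not include them):

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)

Mstar = 1/x + y/x**2   # mu(x)*M
Nstar = -1/x           # mu(x)*N

# antiderivative of M* in x: equals log(x) - y/x
print(sp.integrate(Mstar, x))

# antiderivative of N* in y: equals -y/x
print(sp.integrate(Nstar, y))
```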

which I could not succeed in matching to find a family of solutions...

Is there something wrong with my process, or am I not allowed to use the method of integrating factors for non-exact equations on homogeneous ODEs? My textbook supplies a simple solution using the fact that M and N are both of degree 1; I had missed this when solving the problem and took the long way, but I'm not sure why my approach doesn't seem to work. (Angry)
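For comparison, the textbook's homogeneous-equation route can be checked end to end with sympy's dsolve. This is only a sketch: I rewrite (x + y) dx - x dy = 0 as y' = 1 + y/x, which is what the degree-1 homogeneity buys you:

```python
import sympy as sp

x = sp.symbols('x', positive=True)
y = sp.Function('y')

# (x + y) dx - x dy = 0  becomes  y' = (x + y)/x = 1 + y/x,
# a first-order homogeneous equation (M and N both of degree 1)
ode = sp.Eq(y(x).diff(x), 1 + y(x) / x)

# general solution: y = x*(ln x + C)
sol = sp.dsolve(ode, y(x))
print(sol)
```

Whatever family the integrating-factor route produces should agree with this one up to relabeling the constant.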