OK, I was making a silly mistake. It's solved.
Hi there. I was reviewing the proofs from my differential equations class from last year, and I have some doubts about this demonstration.
We show that, given a solution f(x) of
(1)
a linearly independent solution is given by:
We used Abel's theorem, together with the fact that for linear differential equations the solution is given by
(2)
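In case the equations don't display, this is what I believe (1) and (2) stand for, reconstructed from the rest of the argument (so possibly not verbatim from the notes):

```latex
% (1): the homogeneous second-order linear equation
y'' + P(x)\,y' + Q(x)\,y = 0
% (2): the general solution of a first-order linear equation
%      y' + p(x)\,y = q(x)
y = e^{-\int p\,dx}\left(\int q\,e^{\int p\,dx}\,dx + k\right)
```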
The demonstration goes as follows:
If g is a solution of (1), then by Abel's theorem:
Then
Because of (2) we have:
The problem comes with the next step. This is what follows in my notebook:
I know that the constant can be taken out of the integral, but what bothers me is the f that is multiplied outside the integral and divided inside the integrand, as if it were just multiplying by 1 (that's what I think the professor did, but I might be wrong). I don't think that preserves the equality, since f depends on x. Instead, taking k = 0 and k1 = 1, we get the result we were looking for:
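In case the equation doesn't display, here is the step as I would reconstruct it, assuming the first-order equation obtained above is $g' - \frac{f'}{f}\,g = \frac{k_1\,e^{-\int P\,dx}}{f}$ and applying formula (2) with $p = -f'/f$ and $q = k_1\,e^{-\int P\,dx}/f$:

```latex
% Applying (2): here e^{-\int p\,dx} = e^{\int (f'/f)\,dx} = f
% and e^{\int p\,dx} = 1/f, so
g = f\left(\int \frac{k_1\,e^{-\int P\,dx}}{f}\cdot\frac{1}{f}\,dx + k\right)
  = f\left(k_1 \int \frac{e^{-\int P\,dx}}{f^{2}}\,dx + k\right),
% and choosing k = 0, k_1 = 1:
g = f \int \frac{e^{-\int P\,dx}}{f^{2}}\,dx .
```

On this reading, the outer $f$ is the integrating factor $e^{-\int p\,dx}$ and the inner $1/f$ is $e^{\int p\,dx}$, so nothing is actually being pulled through the integral.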
I don't know if it's clear what I don't get. This is what I think the professor did:
And I think that's wrong: I don't believe the equality is preserved by multiplying by f outside the integral while dividing by it inside. Perhaps I failed to copy something; I don't really think my professor made a mistake, so maybe there is some middle step that I've lost. Anyway, I need to understand the demonstration.
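To convince myself the final formula works, I also checked it on a concrete example of my own (not from the notes): for y'' - 2y' + y = 0, with known solution f(x) = e^x, the formula g = f ∫ e^{-∫P dx}/f^2 dx should reproduce g(x) = x e^x. A quick numerical sketch in Python:

```python
import math

# Hypothetical concrete instance of equation (1): y'' - 2y' + y = 0,
# i.e. P(x) = -2 (constant), with known solution f(x) = e^x.
def P(t):
    return -2.0

f = math.exp

def second_solution(x, n=10_000):
    """g(x) = f(x) * integral_0^x exp(-int_0^t P ds) / f(t)^2 dt (trapezoidal rule)."""
    h = x / n
    def integrand(t):
        # Since P is constant here, int_0^t P ds = P*t; for a non-constant P
        # this inner antiderivative would need its own quadrature.
        return math.exp(-P(t) * t) / f(t) ** 2
    total = 0.5 * (integrand(0.0) + integrand(x))
    total += sum(integrand(i * h) for i in range(1, n))
    return f(x) * total * h

x0 = 1.5
# For this equation the formula should reproduce g(x) = x * e^x.
print(second_solution(x0), x0 * math.exp(x0))

# Check that g solves (1) via central finite differences: g'' - 2g' + g ~ 0.
h = 1e-3
g = second_solution
residual = ((g(x0 + h) - 2 * g(x0) + g(x0 - h)) / h**2
            - 2 * (g(x0 + h) - g(x0 - h)) / (2 * h)
            + g(x0))
print(residual)  # should be close to 0
```

The residual only vanishes up to quadrature and finite-difference error, but it is small enough to show the formula produces a genuine second solution here.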