Consider the DE

. Give two solutions: one regular and equal to 1 at the origin, and the other of the form

where

and

are regular at the origin. Give the first three terms of the series of

and

.

My attempt: Divide the DE by x:

. To solve this DE, I thought of proposing a solution of the form

where

would be the solution of this DE as x tends to infinity. It turns out this didn't simplify things as I'd hoped.

When

, the DE becomes

. I used Frobenius's method to solve this DE:

I assumed that

. I differentiated this once and twice and plugged the results into the DE.

I eventually reached

.

The indicial equation leads to

or

. At first glance it looks like both roots are acceptable.
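Since the original DE didn't render here, the following is a minimal sketch of the Frobenius bookkeeping on a hypothetical stand-in equation, Bessel's equation of order zero, x y'' + y' + x y = 0 (chosen because its second solution also carries a log term). Substituting y = Σ aₙ xⁿ⁺ʳ gives the indicial equation r² = 0 and, on the r = 0 branch, the recurrence aₙ = −aₙ₋₂ / n²:

```python
from fractions import Fraction

# Stand-in ODE (hypothetical, not the DE from the exercise):
#     x y'' + y' + x y = 0   (Bessel's equation of order zero)
# Frobenius: y = sum a_n x^(n+r), indicial equation r^2 = 0,
# and for r = 0 the recurrence a_n = -a_{n-2} / n^2 with a_1 = 0.

def frobenius_coeffs(n_max):
    """First coefficients of the regular solution, normalised to a_0 = 1."""
    a = [Fraction(0)] * (n_max + 1)
    a[0] = Fraction(1)           # regular solution, equal to 1 at the origin
    for n in range(2, n_max + 1):
        a[n] = -a[n - 2] / n**2  # odd coefficients stay 0 because a_1 = 0
    return a

coeffs = frobenius_coeffs(4)
print([str(c) for c in coeffs])  # ['1', '0', '-1/4', '0', '1/64']
```

This produces the first three terms of the regular series, y₁ = 1 − x²/4 + x⁴/64 − …; in your problem the same recipe applies once the recurrence is pinned down, with the log solution handled separately.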

So now I get a recurrence relation with

in terms of

and

, which isn't what I'd hoped for. Maybe I shouldn't have proposed a solution of the form

? How would you tackle this problem?