How do I solve this ODE?
$\displaystyle y'' + y = \cos x$
char. eqn: $\displaystyle r^2 + 1 = 0$
therefore $\displaystyle r = \pm j$
therefore $\displaystyle y_c = A\cos(jx) + B\sin(jx)$ ?
For the RHS:
$\displaystyle y_p = C\cos x + D\sin x$ <<<<< Do I need to multiply by $x$ or $x^2$ here?
Many thanks.
Wrong as written: your trial $y_p$ duplicates the homogeneous solution.
Here's a quote from Wikipedia's "Method of undetermined coefficients" article:

"If a term in the particular solution for $y$ appears in the homogeneous solution, it is necessary to multiply by a sufficiently large power of $x$ in order to make the two solutions linearly independent."
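Applied to this thread's equation (my concretization of the quoted rule, not part of the article): since $\cos x$ already appears in the homogeneous solution, you multiply the trial by a single factor of $x$,

$$y_h = A\cos x + B\sin x \quad\Rightarrow\quad y_p = x\,(C\cos x + D\sin x).$$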
I'm wondering why you say that $\displaystyle r=\pm j$ and then keep $j$ inside the angles; $j$ is just the engineering symbol for the imaginary unit $i$.

When you solve the homogeneous equation with complex roots $\displaystyle \alpha \pm \beta i$ you get a solution of the form $\displaystyle e^{\alpha x}(C_{1}\sin(\beta x)+C_{2}\cos(\beta x))$.

Here $\displaystyle \alpha = 0$ and $\displaystyle \beta = 1$, so $j$ is not required in your solution as part of the angles.
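Plugging this thread's roots into that formula (my arithmetic, just to make the claim concrete):

$$r = \pm j = 0 \pm 1\cdot i \quad\Rightarrow\quad \alpha = 0,\ \beta = 1 \quad\Rightarrow\quad y_c = e^{0\cdot x}\big(C_{1}\sin x + C_{2}\cos x\big) = C_{1}\sin x + C_{2}\cos x.$$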
As for your actual question, just to follow up on the answer Peritus gave: you never want to multiply by a higher power of $x$ than is needed. A double root can never occur here, because the roots are complex, so you will never have to multiply by more than $x$.

The only situation in which you would need $x^2$ is a double root of the homogeneous equation whose solution has the same form as the forcing term. For example, $\displaystyle y'' - 2y' + y = e^x$ has the double root $r = 1$, so $e^x$ and $xe^x$ both solve the homogeneous equation and you must take $\displaystyle y_p = Cx^2e^x$. Very rarely is $x^2$ needed in the method of undetermined coefficients, although I don't want you to think that it is never used.
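If you want to check the whole thing mechanically, here's a quick SymPy sketch (assuming SymPy is installed; the variable names are mine, not from the thread). It substitutes the trial $y_p = x(C\cos x + D\sin x)$ into the ODE, solves for the coefficients, and cross-checks against SymPy's own solver:

```python
import sympy as sp

x = sp.symbols('x')
C, D = sp.symbols('C D')
y = sp.Function('y')

# The ODE from the thread: y'' + y = cos(x)
ode = sp.Eq(y(x).diff(x, 2) + y(x), sp.cos(x))

# Because cos(x) already solves the homogeneous equation,
# the trial particular solution carries one extra factor of x:
yp = x * (C * sp.cos(x) + D * sp.sin(x))

# Substitute the trial into the ODE; the x-terms cancel, leaving
# -2*C*sin(x) + (2*D - 1)*cos(x), which must vanish identically.
residual = sp.expand(yp.diff(x, 2) + yp - sp.cos(x))
consts = sp.solve([residual.coeff(sp.sin(x)),
                   residual.coeff(sp.cos(x))], [C, D])
print(consts)            # C = 0, D = 1/2, so yp = x*sin(x)/2
print(yp.subs(consts))

# Cross-check against SymPy's general solver:
sol = sp.dsolve(ode, y(x))
print(sol)
```

So the particular solution is $\displaystyle y_p = \tfrac{x}{2}\sin x$, with the single factor of $x$ that the Wikipedia rule calls for, and no $x^2$ anywhere.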