Originally Posted by **Danshader**

Well, I got this question from my tutorial. The answer was given with it, but I still couldn't understand how my lecturer got it.

Using Laplace Transform methods, solve the following ode:

$\displaystyle tV'' + 2V' + tV = 0$

subject to the following boundary conditions:

$\displaystyle V(0) = 1, V( \pi ) = 1 $

*Hint*: You may assume that $\displaystyle L^{-1} \left\{ \tan^{-1} \left( \frac{1}{s} \right) \right\} = \frac {\sin t}{t} $
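As an aside (not part of the tutorial), the hint can be sanity-checked numerically: for the inverse transform claim to hold, the forward transform $\int_0^\infty e^{-st}\,\frac{\sin t}{t}\,dt$ must equal $\tan^{-1}(1/s)$. A quick check with `scipy`:

```python
# Sanity check (an aside): numerically verify that
# integral_0^inf e^{-s t} * sin(t)/t dt = arctan(1/s),
# which is equivalent to the hint L^{-1}{arctan(1/s)} = sin(t)/t.
import math
from scipy.integrate import quad

def laplace_of_sinc(s, upper=200.0):
    """Numerically evaluate integral_0^upper e^{-s t} sin(t)/t dt.

    For s > 0 the integrand decays like e^{-s t}, so a finite upper
    limit stands in for infinity."""
    # sin(t)/t -> 1 as t -> 0, so the integrand is well behaved there
    integrand = lambda t: math.exp(-s * t) * (math.sin(t) / t if t else 1.0)
    value, _err = quad(integrand, 0.0, upper, limit=400)
    return value

for s in (0.5, 1.0, 2.0):
    assert abs(laplace_of_sinc(s) - math.atan(1.0 / s)) < 1e-6
print("hint checked")
```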

This is what I did:

$\displaystyle - \frac{d}{ds} \{ s^2V(s) - sv(0) - v'(0) \} + 2\{ sV(s) - v(0) \} - \frac{d}{ds}V(s) = 0 $

Expanding the derivatives, the $2sV(s)$ terms cancel and the $v(0)$ terms combine:

$\displaystyle -\{ 2sV(s) + s^2V'(s) - v(0) \} + 2sV(s) - 2v(0) - V'(s) = 0 $

$\displaystyle V'(s)(-s^2 - 1) - v(0) = 0 $

$\displaystyle V(s) = -\int \frac{1}{s^2 +1}\, ds \quad \text{(using } v(0) = 1 \text{)} $

letting $\displaystyle u = \frac{1}{s} $, so that $\displaystyle ds = -\frac{du}{u^2} $ and $\displaystyle s^2 + 1 = \frac{1+u^2}{u^2} $,

$\displaystyle V(s) = \int \frac{1}{1+u^2}\, du $

$\displaystyle V(s) = \tan^{-1} \left( \frac{1}{s} \right) + C $
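As a quick check of this antiderivative (an aside, using `sympy`): $V'(s)(-s^2-1) - v(0) = 0$ with $v(0)=1$ requires $V'(s) = -\frac{1}{s^2+1}$, and differentiating $\tan^{-1}(1/s)$ indeed gives that:

```python
# Aside (not part of the tutorial): confirm with sympy that
# arctan(1/s) is an antiderivative of -1/(s^2 + 1), as required by
# V'(s)(-s^2 - 1) - v(0) = 0 with v(0) = 1.
import sympy as sp

s = sp.symbols("s", positive=True)
V = sp.atan(1 / s)
# d/ds arctan(1/s) = -1/(s^2 + 1), so the sum below simplifies to 0
assert sp.simplify(sp.diff(V, s) + 1 / (s**2 + 1)) == 0
print("antiderivative checked")
```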

Then, taking the inverse Laplace transform,

$\displaystyle V(t) = \frac{\sin t}{t} + L^{-1} \{ C \} $

The answer given was $\displaystyle V(t) = \frac{\sin t}{t} + \frac{\pi}{t} $.

I don't understand how $\displaystyle L^{-1} \{ C \} $ becomes $\displaystyle \frac{\pi}{t} $.
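As one more aside (a `sympy` check, not part of the original post), the $\frac{\sin t}{t}$ part of the solution can at least be confirmed symbolically against the ODE itself:

```python
# Aside: symbolically confirm that sin(t)/t satisfies the ODE
# t*V'' + 2*V' + t*V = 0 from the problem statement.
import sympy as sp

t = sp.symbols("t", positive=True)
V = sp.sin(t) / t
residual = t * sp.diff(V, t, 2) + 2 * sp.diff(V, t) + t * V
assert sp.simplify(residual) == 0
print("sin(t)/t satisfies the ODE")
```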