I am trying to integrate the following expression between the limits +D and -D:
Now intuition tells me to treat it like
Here is where I'm going wrong - the way I'm integrating gives me
But if I differentiate that I get
So I have somehow created an extra factor of . Therefore I'm doing something wrong, but I can't seem to find a rule that fits the situation.
This returns me to the original problem:
But that would imply that the basic integral is divided by , despite the fact that, for all intents and purposes,  is a constant.
Can anyone explain? I think I've horribly confused myself about something that isn't that complex.
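For what it's worth, one way to sanity-check an antiderivative is to differentiate it symbolically and compare with the original integrand. The expressions in my question didn't survive, so the sketch below uses cos(a·x) purely as a hypothetical stand-in integrand with a constant coefficient a; it shows where a 1/a factor legitimately appears (from undoing the chain rule) and confirms that differentiating the antiderivative recovers the integrand with no stray factor:

```python
# Hypothetical stand-in: the actual integrand from the question is unknown,
# so cos(a*x) is used to illustrate the constant-coefficient case.
import sympy as sp

x, a, D = sp.symbols('x a D', positive=True)

integrand = sp.cos(a * x)

# Indefinite integral: sin(a*x)/a -- the 1/a compensates for the
# chain-rule factor of a that differentiation would otherwise introduce.
antideriv = sp.integrate(integrand, x)

# Differentiating the antiderivative should give back the integrand exactly.
check = sp.diff(antideriv, x)

# Definite integral over the symmetric limits -D..D.
definite = sp.integrate(integrand, (x, -D, D))

print(antideriv)   # sin(a*x)/a
print(check)       # cos(a*x)
print(definite)
```

If the differentiated result had come back with an unexpected constant multiple, that would point to exactly the kind of missing chain-rule compensation I seem to be running into.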