Hi all, can't really figure this one out, any help will be much appreciated.
Find, using complex methods, lim as r --> infinity of the integral from -r to r of x/(1 + x + x^2) dx.
Also, why is the limit necessary?
Many thanks in advance!
Thanks for the prompt reply. I am aware that the integral diverges; I was thinking that's the reason why lim r --> infinity is used instead of taking the integral directly over (-infinity, infinity).
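To see concretely why the limit matters, here is a quick numeric sketch (plain Python with a composite midpoint rule; the helper names are mine, not from the thread): the one-sided integral keeps growing roughly like ln r, while the symmetric integral settles toward a finite value.

```python
import math

def f(x):
    # the integrand from the question
    return x / (1 + x + x * x)

def midpoint(a, b, n=100000):
    # composite midpoint rule for a rough numerical integral
    h = (b - a) / n
    return h * sum(f(a + (k + 0.5) * h) for k in range(n))

# One-sided: grows without bound (roughly like ln r), so the
# improper integral over (0, infinity) diverges.
for r in (10, 100, 1000):
    print("0 to", r, "->", midpoint(0, r))

# Symmetric: the values settle down as r grows, so the limit
# lim_{r -> infinity} of the integral from -r to r does exist.
for r in (10, 100, 1000):
    print(-r, "to", r, "->", midpoint(-r, r))
```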
However I'm afraid I don't see how this applies to the question. I understand your example and am able to solve it, but how does it apply to my question? I am unable to show that the semi-circle integral vanishes.
Say we have the integral from -infinity to infinity of 1/(1 + x^2) dx. The semi-circle integral is the integral of 1/(1 + z^2) dz over Gamma_R, the upper semi-circle of radius R.
Now use estimation techniques: |integral over Gamma_R of dz/(1 + z^2)| <= pi R * max over |z| = R of 1/|1 + z^2| <= pi R/(R^2 - 1). But now note pi R/(R^2 - 1) --> 0 as R --> infinity, because the degree of the numerator is less than the degree of the denominator.
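A numeric sanity check of that estimate (the parametrisation z = R e^{it} and the helper names are my own, not from the thread): the arc contribution for 1/(1 + z^2) shrinks as R grows.

```python
import cmath, math

def arc_integral(f, R, n=20000):
    # midpoint rule along the upper semi-circle z = R e^{it}, t in [0, pi]
    h = math.pi / n
    total = 0j
    for k in range(n):
        z = R * cmath.exp(1j * (k + 0.5) * h)
        total += f(z) * (1j * z * h)   # dz = i R e^{it} dt
    return total

# With a degree gap of two, the bound pi*R/(R^2 - 1) forces the
# arc contribution to zero as R grows:
for R in (10, 100, 1000):
    print(R, abs(arc_integral(lambda z: 1 / (1 + z * z), R)))
```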
The problem is that when you have x/(1 + x + x^2), trying to use the same estimation trick runs into trouble, because then you would end up with the bound pi R * R/(R^2 - R - 1) --> pi, which does not go to zero. Thus, the estimate no longer shows that the semi-circle integral vanishes (and in fact it does not). But that still is not the major problem. The serious problem is that the integral you are trying to find diverges, so there is no point in doing this computation anyway.
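You can also watch the non-vanishing happen numerically. On the big arc, z/(1 + z + z^2) behaves like 1/z, whose upper semi-circle integral is i*pi, so the arc contribution tends to i*pi rather than 0 (the parametrisation and helper names here are mine):

```python
import cmath, math

def arc_integral(f, R, n=20000):
    # midpoint rule along the upper semi-circle z = R e^{it}, t in [0, pi]
    h = math.pi / n
    total = 0j
    for k in range(n):
        z = R * cmath.exp(1j * (k + 0.5) * h)
        total += f(z) * (1j * z * h)   # dz = i R e^{it} dt
    return total

# The arc integral approaches i*pi, not 0, because z/(1+z+z^2) ~ 1/z there
for R in (10, 100, 1000):
    print(R, arc_integral(lambda z: z / (1 + z + z * z), R))
```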
I think I've solved it: the semi-circle integral doesn't vanish, but its value can be determined by taking a limit as R --> infinity. I tried two different approaches and got different answers when assuming the integral vanishes, so I realised it can't vanish, and guessed from the two values what the value of the integral should be. When I worked out that integral, I got exactly that value. I guess that's also why the follow-up to the question asks why it is necessary to take a limit.
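For what it's worth, here is one way the arithmetic can go under the principal-value reading (my own sketch, not the poster's working): close the contour with the big semi-circle, take the residue at the upper-half-plane pole, then subtract the limiting arc contribution i*pi instead of assuming the arc vanishes.

```python
import cmath, math

# z^2 + z + 1 has its upper-half-plane root at the primitive cube root of unity
a = cmath.exp(2j * math.pi / 3)        # a = -1/2 + i*sqrt(3)/2

# residue of z/(1 + z + z^2) at the simple pole a is a / (2a + 1)
res = a / (2 * a + 1)

# closed-contour value minus the limiting arc contribution i*pi
pv = 2j * math.pi * res - 1j * math.pi
print(pv)   # real part is -pi/sqrt(3), imaginary part ~ 0
```

So the symmetric limit exists even though the two one-sided integrals diverge, which is exactly why the problem is phrased with lim r --> infinity rather than a plain integral over (-infinity, infinity).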