L'Hospital's rule is applicable. Perhaps though you don't wish to go that route.
Hi everybody, thanks for reading.
I'm having trouble proving that $\lim_{z \to 0} \frac{\sin z}{z} = 1$ for complex $z$.
I tried using the definition with epsilons but got nowhere.
The regular proof, for one real variable, is of no help here. I cannot see how to prove that $|\sin z| < |z|$. I don't even think it's true (for $z = iy$ with $y > 0$ we get $|\sin(iy)| = \sinh y > y$)....
Also, writing things out with $u(x,y)$ and $v(x,y)$ isn't much help, for I get pretty long two-variable functions whose limit as $(x,y) \to (0,0)$ is no easier to calculate.
In other words - I'm lost :-\
Thank you!
Tomer.
I think it would be hard to do this from first principles. But it's easy enough if you're allowed to quote results for example about power series representations.
The power series $\sin z = z - \frac{z^3}{3!} + \frac{z^5}{5!} - \cdots$ has infinite radius of convergence. So $\frac{\sin z}{z} = 1 - \frac{z^2}{3!} + \frac{z^4}{5!} - \cdots$ also has infinite radius of convergence, and goes to 1 as $z$ goes to 0.
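If you want to turn that into an explicit epsilon-style estimate, the series tail gives a concrete bound (a sketch, assuming the power series for sine): for $0 < |z| \le 1$,

```latex
\left|\frac{\sin z}{z} - 1\right|
  = \left|\sum_{n=1}^{\infty} \frac{(-1)^n z^{2n}}{(2n+1)!}\right|
  \le |z|^2 \sum_{n=1}^{\infty} \frac{1}{(2n+1)!}
  < |z|^2,
```

which goes to $0$ as $z \to 0$ (the sum of the reciprocal odd factorials from $3!$ on is $\sinh 1 - 1 < 1$).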
1. Thanks
2. Unfortunately, we have not yet studied expansions of complex functions (this is also giving me a hard time proving the chain rule, bah).
However, a friend of mine has only 10 minutes ago told me of something he thought of:
$\lim_{z \to 0} \frac{\sin z}{z} = \lim_{z \to 0} \frac{\sin z - \sin 0}{z - 0} = \sin'(0) = \cos(0)$
of course I only need to show $\sin' x = \cos x$, but I think that's not gonna be hard...
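Not a proof, of course, but here's a quick numerical sanity check (a Python sketch using the standard `cmath` module; the direction `1+1j` is just an arbitrary choice) that $\sin z / z$ really does approach 1 through complex values:

```python
import cmath

# Let z -> 0 along an arbitrary complex direction and watch sin(z)/z approach 1.
direction = 1 + 1j
for k in range(1, 6):
    z = direction * 10 ** (-k)      # z shrinks toward 0
    value = cmath.sin(z) / z
    print(k, abs(value - 1))        # distance from the claimed limit 1
```

The printed distances shrink roughly like $|z|^2$, matching the leading $-z^2/6$ term of the series for $\frac{\sin z}{z} - 1$.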
Anyway, I really appreciate the suggestions, and if you have other original ideas I'd be glad to hear.
Gotta go right now
Thanks again!
Tomer.
I can see where you want to go from there, which is to slip the limit past the integral sign. But exchanging limiting operations (applied to a function of a complex variable in this case) is a delicate business, which needs careful justification. There's a substantial theorem being secretly used here!
In fact, this looks like a very neat way to show that $\lim_{z \to 0} \frac{\sin z}{z} = 1$, and the change-of-limiting-operations procedure is justified, essentially because the function $(z,y) \mapsto \cos(zy)$ is locally uniformly continuous. But that takes at least as much machinery to prove as the other methods proposed in the previous comments.
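For reference, the integral representation being discussed is easy to verify directly (this is my reconstruction of the identity the earlier post presumably used): for $z \ne 0$,

```latex
\int_0^1 \cos(zy)\,dy
  = \left[\frac{\sin(zy)}{z}\right]_{y=0}^{1}
  = \frac{\sin z}{z},
```

so exchanging the limit with the integral would give $\lim_{z \to 0} \frac{\sin z}{z} = \int_0^1 \cos(0)\,dy = 1$.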
Hi Aurora,
I shall give a [very] brief proof. Someone will pick me up if there are any mistakes.
If $0 < x < \frac{\pi}{2}$ then there is not much difficulty in showing that $\sin x < x < \tan x$. Since we are assuming $0 < x < \frac{\pi}{2}$ then we know that $\sin x > 0$ and thus $\frac{\sin x}{\sin x} < \frac{x}{\sin x} < \frac{\tan x}{\sin x}$. In other words,

$1 < \frac{x}{\sin x} < \frac{1}{\cos x}. \qquad (1)$

Note that all three quantities in $(1)$ are positive, so taking reciprocals reverses the inequalities.

Since we are assuming that $0 < x < \frac{\pi}{2}$ then $\cos x > 0$, and thus the reciprocals are defined, and hence from $(1)$ we have

$\cos x < \frac{\sin x}{x} < 1. \qquad (2)$

We have been assuming that $0 < x < \frac{\pi}{2}$, but equation $(2)$ also holds if $-\frac{\pi}{2} < x < 0$ since $\cos(-x) = \cos x$ and $\frac{\sin(-x)}{-x} = \frac{\sin x}{x}$. We know that $\lim_{x \to 0} \cos x = 1$, and so using equation $(2)$ and the squeeze theorem we can see that

$\lim_{x \to 0} \frac{\sin x}{x} = 1.$
Hope this helps.
If you've studied a bit of complex differentiation, you probably know how to differentiate the exponential function (same formula as for the exponential function on real numbers). Then, since you've probably defined $\sin z = \frac{e^{iz} - e^{-iz}}{2i}$ (and $\cos z = \frac{e^{iz} + e^{-iz}}{2}$), you can prove that indeed $\sin' z = \cos z$ for complex $z$, and apply your friend's idea.
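Spelled out, that calculation is short (a sketch, assuming the standard definitions $\sin z = \frac{e^{iz} - e^{-iz}}{2i}$ and $\cos z = \frac{e^{iz} + e^{-iz}}{2}$):

```latex
\sin' z
  = \frac{d}{dz}\,\frac{e^{iz} - e^{-iz}}{2i}
  = \frac{i e^{iz} + i e^{-iz}}{2i}
  = \frac{e^{iz} + e^{-iz}}{2}
  = \cos z,
```

so $\lim_{z \to 0} \frac{\sin z}{z} = \sin'(0) = \cos 0 = 1$.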
I do not understand why to use all the unnecessary proofs because, by definition, $\sin z$ is defined in terms of its power series*.
That is what Opalg did.
*) Every formal text on complex analysis would define sine in terms of a power series. If not, say through exponentials, then we can use the power series of the exponential to derive the power series for sine. Thus, the power series definition is basically the definition of sine. So why use any other approach?
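To spell out the derivation mentioned in the footnote (a sketch, using $\sin z = \frac{e^{iz} - e^{-iz}}{2i}$ and the exponential series $e^w = \sum_{n \ge 0} \frac{w^n}{n!}$):

```latex
\sin z
  = \frac{1}{2i} \sum_{n=0}^{\infty} \frac{(iz)^n - (-iz)^n}{n!}
  = \frac{1}{2i} \sum_{k=0}^{\infty} \frac{2i(-1)^k z^{2k+1}}{(2k+1)!}
  = \sum_{k=0}^{\infty} \frac{(-1)^k z^{2k+1}}{(2k+1)!},
```

since $i^n - (-i)^n$ vanishes for even $n$ and equals $2i(-1)^k$ for odd $n = 2k+1$.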