It turns out this limit exists:

\(\displaystyle \lim_{x \to 0} \frac{\sin(ax)}{\sin(bx)} = \frac{a}{b}\)

This limit can be confirmed via L'Hôpital's Rule, by expanding the numerator and denominator as Taylor series, or by expanding them as Padé approximants. The algebra gets real ugly real fast, but the first few terms are enough to see that the limit exists and equals the result above.
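As a quick numerical sanity check (a minimal Python sketch, using a = 0.5 and b = 1, the values assumed in the example further down), the ratio can simply be evaluated at progressively smaller x:

```python
import math

# Numerically probe lim_{x -> 0} sin(a*x)/sin(b*x) for a = 0.5, b = 1.
# The ratio should approach a/b = 0.5 as x shrinks toward 0.
a, b = 0.5, 1.0
for x in (0.1, 0.01, 0.001, 0.0001):
    ratio = math.sin(a * x) / math.sin(b * x)
    print(f"x = {x:>8}: sin(ax)/sin(bx) = {ratio:.10f}")
```

Each successive value lands closer to 0.5, consistent with the claimed limit a/b.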

However, this result does not seem to carry over to the corresponding limit for

\(\displaystyle \lim_{x \to 0} \frac{\cos(bx)}{\cos(ax)}\)

I was hoping the first limit would determine the second, yielding something like the following (taking a = 0.5 and b = 1):

\(\displaystyle \sqrt{1-0.5^{2}} = \sqrt{1-0.25} = \sqrt{0.75} = 0.8660254037844\)

However, it does not. As far as I can tell,

\(\displaystyle \lim_{x \to 0} \frac{\cos(bx)}{\cos(ax)} = 1\)

I have checked this limit via L'Hôpital's Rule and by expanding the numerator and denominator as Taylor series; both methods give 1.
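The same numerical check (again a sketch with a = 0.5, b = 1) supports this: the cosine ratio heads to 1, not to the square-root value computed above, because both cos(bx) and cos(ax) approach 1 as x goes to 0, so the quotient is not an indeterminate form at all.

```python
import math

# Probe lim_{x -> 0} cos(b*x)/cos(a*x) for a = 0.5, b = 1.
# Unlike the sine case, numerator and denominator each tend to 1,
# so the ratio tends to 1/1 = 1.
a, b = 0.5, 1.0
for x in (0.1, 0.01, 0.001, 0.0001):
    ratio = math.cos(b * x) / math.cos(a * x)
    print(f"x = {x:>8}: cos(bx)/cos(ax) = {ratio:.10f}")
```

The printed ratios creep up toward 1, and in particular never approach 0.8660254…, the value guessed above.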

Yet it seems counter-intuitive.

I mean, if the sine version of the limit does not go to 0, you'd expect the cosine version NOT to go to 1, wouldn't you? Or have I flubbed up the algebra?