Originally Posted by **rortiz81**

Kinda weird how little I could glean about this one on the webernet. Oh, there were TONS of other examples, but this one is, uhh, too simple? Maybe?

$\displaystyle \lim_{x\to 0^+}\frac{\ln x}{x}$

I know that the answer is $\displaystyle -\infty$. I know multiple ways to figure out the answer. My question to you is whether I can use l'Hôpital's rule on it.

If you use it without factoring first, you get $\displaystyle \frac{\frac{1}{x}}{1}$, i.e. $\displaystyle \frac{1}{x}$, which tends to $\displaystyle +\infty$ as $\displaystyle x\to 0^+$. But if you factor first, you get $\displaystyle \lim_{x\to 0^+}\frac{1}{x}\cdot\lim_{x\to 0^+}\ln x$. From here you could just say it's $\displaystyle \infty\cdot(-\infty)=-\infty$ and be done with it, but I don't think that holds up with a math teacher (does it?), so IF I continue and take derivatives, it becomes $\displaystyle \lim_{x\to 0^+}\left(-\frac{1}{x^2}\right)\cdot\lim_{x\to 0^+}\frac{1}{x}$, which can be written as $\displaystyle \lim_{x\to 0^+}\left(-\frac{1}{x^3}\right)=-\infty$.
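For what it's worth, here's a quick numerical sanity check of the limit itself (just my own script, the sample points are arbitrary); the values head off toward $-\infty$ as $x\to 0^+$:

```python
import math

# Evaluate ln(x)/x at a few points approaching 0 from the right.
# The quotient should grow more and more negative, consistent with -infinity.
for x in [1e-1, 1e-3, 1e-6, 1e-9]:
    print(f"x = {x:.0e}:  ln(x)/x = {math.log(x) / x:.4g}")
```

This obviously doesn't settle whether l'Hôpital's rule *applies*, it just confirms the direction of the limit.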

Intuitively this SEEMS to make sense, but am I missing something? I don't want to demonstrate this only to lose points for not seeing that l'Hôpital's rule doesn't apply. Can someone show me the light?

Thanks!