Prove the famous identity
$\displaystyle \lim_{n\rightarrow \infty} \left(1+\frac{x}{n}\right)^n = e^x$
Let $\displaystyle y = \lim_{n \to \infty} \Big(1+\frac{x}{n}\Big)^{n} $. Then
$\displaystyle \ln y = \lim_{n \to \infty} \ln \Big[\Big(1+\frac{x}{n}\Big)^{n}\Big] $
$\displaystyle = \lim_{n \to \infty} n \ln \Big[\Big(1+\frac{x}{n}\Big)\Big] $
$\displaystyle = \lim_{n \to \infty} \frac{\ln \Big[\Big(1+\frac{x}{n}\Big)\Big]}{\frac{1}{n}} $
$\displaystyle = \lim_{n \to \infty} \frac{\frac{1}{1+\frac{x}{n}}\frac{-x}{n^{2}}}{\frac{-1}{n^2}} $
$\displaystyle = \lim_{n \to \infty} \frac{x}{1+\frac{x}{n}} = x$
so $\displaystyle \ln y = x $, and hence $\displaystyle y = e^{x} $ (passing $\displaystyle \ln$ through the limit in the second line is justified by the continuity of $\displaystyle \ln$).
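As a quick numerical sanity check of the identity (an illustration only, not part of the proof; the sample value $x = 2$ and the helper name `compound` are mine):

```python
import math

def compound(x, n):
    """Evaluate (1 + x/n)^n for a finite n."""
    return (1 + x / n) ** n

# Compare against e^x at a large but finite n; floating point, so this
# only illustrates the convergence, it does not prove it.
x = 2.0
approx = compound(x, 10**6)
exact = math.exp(x)
print(approx, exact)  # the two values agree to about 4 decimal places
```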
Good!
Let's see in how many different ways we can prove it!
Here is my proof, without l'Hôpital's rule. Let $\displaystyle P_n = \left(1-\frac{x}{n}\right)^n$ (note the minus sign). Then
$\displaystyle \log P_n = n\log\left(1-\frac{x}{n}\right)$.
As soon as $\displaystyle n>|x|$, we can use the series expansion $\displaystyle \log(1-y)=-\sum_{k=1}^\infty\frac{y^k}{k}$ with $\displaystyle y=\frac{x}{n}$ to get
$\displaystyle \log P_n = -n\sum_{k=1}^\infty\frac{x^k}{kn^k} = -\sum_{k=1}^\infty\frac{x^k}{kn^{k-1}} $
$\displaystyle = -x\sum_{k=1}^\infty\frac{x^{k-1}}{kn^{k-1}}=-x\left(1+\sum_{k=2}^\infty\frac{x^{k-1}}{kn^{k-1}}\right)$
and $\displaystyle \left|\sum_{k=2}^\infty\frac{x^{k-1}}{kn^{k-1}}\right| \leq \sum_{k=2}^\infty\frac{|x|^{k-1}}{kn^{k-1}} \leq \sum_{k=2}^\infty\frac{|x|^{k-1}}{n^{k-1}} = \frac{|x|/n}{1-|x|/n} = \frac{|x|}{n-|x|} \rightarrow 0$ as $\displaystyle n \rightarrow \infty$.
So as $\displaystyle n \rightarrow \infty$, we have $\displaystyle \log P_n = -x\left(1+\sum_{k=2}^\infty\frac{x^{k-1}}{kn^{k-1}}\right) \rightarrow -x$, so $\displaystyle P_n \rightarrow e^{-x}$. Replacing $\displaystyle x$ by $\displaystyle -x$ then gives the identity as stated.
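Both ingredients of this argument can be observed numerically: the tail bound $\frac{|x|}{n-|x|}$ shrinks, and $P_n$ approaches $e^{-x}$. A small illustration (the sample value $x = 1.5$ is mine; this checks, it does not prove):

```python
import math

x = 1.5
for n in (10, 100, 1000, 10000):
    P_n = (1 - x / n) ** n              # the sequence from the proof
    tail_bound = abs(x) / (n - abs(x))  # the geometric-series tail bound
    print(n, P_n, tail_bound)

# P_n heads toward e^{-x} while the tail bound heads toward 0.
print(math.exp(-x))
```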
Hello,
$\displaystyle \ln P_n=n \ln\left(1+\frac xn\right)$, where now $\displaystyle P_n=\left(1+\frac xn\right)^n$.
As n goes to infinity, $\displaystyle \frac xn$ goes to 0.
Thus we have the asymptotic expansion: $\displaystyle \ln\left(1+\frac xn\right)=\frac xn +O\left(\frac 1n\right)$
Then $\displaystyle \ln P_n=x+O(1)$
Hence $\displaystyle \lim_{n\to\infty} \ln P_n=x$
And it follows that $\displaystyle \lim_{n\to\infty}P_n=e^x$
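The key step here, $n\ln\left(1+\frac xn\right)\to x$, is easy to watch numerically (sample value $x = 0.7$ chosen arbitrarily; an illustration, not a proof):

```python
import math

x = 0.7
for n in (10, 1000, 100000):
    # Error of n*ln(1 + x/n) against its limit x; shrinks roughly like 1/n.
    print(n, n * math.log(1 + x / n) - x)
```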
Thank you Moo, that is much simpler.
Here is another proof I thought of today. Taking the logarithmic derivative of $\displaystyle Q_n(z)=\left(1+\frac{z}{n}\right)^n$, we get $\displaystyle n\left(\frac{1/n}{\left(1+\frac{z}{n}\right)}\right)=\frac{1}{\left(1+\frac{z}{n}\right)}$, and this goes to $\displaystyle 1$ as $\displaystyle n\rightarrow \infty$. Hence the function $\displaystyle f(z)=\lim_{n\rightarrow \infty}Q_n(z)$ satisfies $\displaystyle \frac{f'(z)}{f(z)}=1$ for all complex $\displaystyle z$, and since $\displaystyle f(0)=1$, it follows that $\displaystyle f(z)=e^z$.
Edit : oops, this is exactly what Random Variable did
"To prove" means to argue for the validity of a statement based on prior knowledge. But what prior knowledge are we allowed to presuppose? In introductory courses, for example, this limit is sometimes used to define $\displaystyle \mathrm{e}^x$. Therefore, no prior knowledge of this function or its inverse may be used to show the existence of the limit in such a case.
Instead of involving the logarithm and its expansion in the neighbourhood of 1, we can use (in a more "algebraic" way) the power series definition of the exponential: $\displaystyle e^x=\sum_{k=0}^\infty \frac{x^k}{k!}$.
We expand $\displaystyle \left(1+\frac xn\right)^n=\sum_{k=0}^n{n\choose k}\frac{x^k}{n^k}$ $\displaystyle =\sum_{k=0}^n \frac{x^k}{k!}\frac{n(n-1)\cdots (n-k+1)}{n^k}$.
For a fixed $\displaystyle k$, the ratio $\displaystyle \frac{n(n-1)\cdots (n-k+1)}{n^k}$ tends to 1 (the numerator is a monic polynomial of degree $\displaystyle k$ in the variable $\displaystyle n$).
"Formally", we thus have $\displaystyle \left(1+\frac xn\right)^n\to_n \sum_{k=0}^\infty \frac{x^k}{k!}=e^x$.
The good point of this proof is that we can "see" why the limit is the exponential; this may have been Euler's way to prove the formula. However, it is not trivial to prove the limit rigorously (there is a summation in $\displaystyle k$, not a fixed index...). This can probably be done elementarily, but here is a short not-so-elementary way. We have $\displaystyle \sum_{k=0}^n {n\choose k}\frac{x^k}{n^k}=\sum_{k=0}^\infty a_{k,n}$ where $\displaystyle a_{k,n}=0$ for $\displaystyle k>n$, and $\displaystyle a_{k,n}={n\choose k}\frac{x^k}{n^k}$ otherwise. For fixed $\displaystyle k$, $\displaystyle a_{k,n}\to_n \frac{x^k}{k!}$. The previous ratio is bounded by 1, hence $\displaystyle |a_{k,n}|\leq \frac{|x|^k}{k!}$ for all $\displaystyle k,n$. We have $\displaystyle \sum_{k=0}^\infty \frac{|x|^k}{k!}<\infty$, therefore we can apply the dominated convergence theorem (for series) to take the limit in $\displaystyle n$ inside the summation.
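Both facts used above can be watched numerically: the ratio $\frac{n(n-1)\cdots(n-k+1)}{n^k}$ tends to 1 for fixed $k$, and the binomial sum tracks the exponential series. A small sketch (sample values are mine; an illustration, not a rigorous check):

```python
import math

def falling_ratio(n, k):
    """Compute n(n-1)...(n-k+1) / n^k, the coefficient ratio from the proof."""
    num = 1
    for j in range(k):
        num *= (n - j)
    return num / n**k

def binomial_sum(x, n):
    """Compute sum_{k=0}^{n} C(n, k) (x/n)^k, i.e. (1 + x/n)^n term by term."""
    return sum(math.comb(n, k) * (x / n) ** k for k in range(n + 1))

print(falling_ratio(10**6, 5))         # close to 1 for fixed k = 5
print(binomial_sum(1.0, 200), math.e)  # the sum is already close to e
```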
In passing: you probably meant little o's. Indeed, $\displaystyle x+O(1)$ is a bounded sequence, and not a sequence converging toward $\displaystyle x$ (contrary to $\displaystyle x+o(1)$). The first two lines were correct, but you couldn't deduce the limit with big O's.
If you use a function $\displaystyle f(x) = a^x$ and try to find its derivative, by defining $\displaystyle e$ as the value of $\displaystyle a$ that makes $\displaystyle f'(x) = f(x)$ we have...
$\displaystyle f'(x) = \lim_{h \to 0}\frac{f(x + h) - f(x)}{h}$
$\displaystyle = \lim_{h \to 0}\frac{a^{x + h} - a^x}{h}$
$\displaystyle = \lim_{h \to 0}\frac{a^x(a^h - 1)}{h}$
$\displaystyle = a^x \lim_{h \to 0}\frac{a^h - 1}{h}$.
For the function to be its own derivative, we would require that
$\displaystyle \lim_{h \to 0}\frac{a^h - 1}{h} = 1$
and for simplicity, we will now use the symbol $\displaystyle e$ to represent this particular value of $\displaystyle a$.
So we have
$\displaystyle \lim_{h \to 0}\frac{e^h - 1}{h} = 1$
$\displaystyle \lim_{h \to 0}(e^h - 1) = \lim_{h \to 0} h$
$\displaystyle \left[\lim_{h \to 0}e^h\right] - 1 = \lim_{h \to 0} h$
$\displaystyle \lim_{h \to 0}e^h = 1 + \lim_{h \to 0}h$
$\displaystyle \lim_{h \to 0}e^h = \lim_{h \to 0}\left(1 + h\right)$
$\displaystyle e = \lim_{h \to 0}(1 + h)^\frac{1}{h}$.
Now by letting $\displaystyle n = \frac{1}{h}$, we notice that as $\displaystyle h \to 0^{+}$, $\displaystyle n \to \infty$.
So $\displaystyle e = \lim_{n \to \infty}\left(1 + \frac{1}{n}\right)^n$
Can you go from here?
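The special case reached here, and the general identity it is meant to lead to, both check out numerically (finite $n$, floating point; an illustration only, and it says nothing about the rigor of the steps above):

```python
import math

n = 10**6
print((1 + 1 / n) ** n, math.e)       # special case: e = lim (1 + 1/n)^n

x = 3.0                                # sample value, chosen arbitrarily
print((1 + x / n) ** n, math.exp(x))  # general case via the same limit
```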
But if you separate the limits then you may change the speeds of the variables, i.e., you may use $\displaystyle \lim_{h\to0}h^{2}$ instead of $\displaystyle \lim_{h\to0}h$, which will lead you to a different result...
Also in (**), you cannot take the power $\displaystyle \cdot^{1/h}$ outside the limit either, since it is not a constant, i.e., it depends on the limit variable...
A simple note.
$\displaystyle \lim_{n\to0}\frac{f(n)}{g(n)}=\frac{\lim\limits_{n \to0}f(n)}{\lim\limits_{n\to0}g(n)}$ provided that both limits exist and $\displaystyle \lim_{n\to0}g(n)\neq0$.