## Bernoulli inequality

show that if $\displaystyle \left| \frac{a}{b} \right| <1$ then $\displaystyle \lim_{k\rightarrow \infty} \left( \frac{a}{b} \right)^k=0$

I'm questioning my proof because of the absolute value but here's what I have:

If $a=0$ the limit is trivially $0$, so assume $\displaystyle 0< \left| \frac{a}{b} \right| <1$. Let $\displaystyle q= \frac{a}{b}$; then $\displaystyle 0< |q| <1$.

Since $\displaystyle 0<|q|<1$, there is a $\displaystyle \delta >0$ with $\displaystyle |q|= \frac{1}{1+\delta}$ (namely $\displaystyle \delta = \frac{1}{|q|}-1$).

now applying the Bernoulli inequality I get:

$\displaystyle 0 <|q^k| = |q|^k= \frac{1}{(1+\delta)^k}\leq \frac{1}{1+k\delta}$

(Since $1+\delta >0$, every term here is positive, so the absolute values on the right can simply be dropped.)
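As a numeric sanity check (not part of the proof), here is the Bernoulli bound $|q|^k = \frac{1}{(1+\delta)^k} \le \frac{1}{1+k\delta}$ verified for a sample value $q=-\frac{2}{3}$; the choice of $q$ and the range of $k$ are arbitrary:

```python
# Sample q with 0 < |q| < 1; the sign is irrelevant since only |q| matters.
q = -2.0 / 3.0
delta = 1.0 / abs(q) - 1.0  # chosen so that |q| = 1/(1+delta)

# Check |q|^k <= 1/(1+k*delta) for a range of k (small tolerance for float error).
for k in range(1, 50):
    assert abs(q) ** k <= 1.0 / (1.0 + k * delta) + 1e-12
```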

Given $\displaystyle \epsilon >0$, choose $\displaystyle k_0 \in \mathbb{N}$ with $\displaystyle k_0 > \frac{1}{\delta\epsilon}$, so that $\displaystyle \frac{1}{k_0\delta} < \epsilon$. Then for all $\displaystyle k\geq k_0$:

$\displaystyle 0 <|q^k| \leq \frac{1}{1+k\delta} < \frac{1}{k\delta} \leq \frac{1}{k_0\delta} < \epsilon$

therefore $\displaystyle \left(\frac{a}{b} \right)^k \rightarrow 0$ as $\displaystyle k\rightarrow \infty$
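The $\epsilon$–$k_0$ recipe can also be checked numerically. A sketch with arbitrary sample values $q=0.9$ and $\epsilon=10^{-3}$: taking $\delta = \frac{1}{|q|}-1$ and any integer $k_0 > \frac{1}{\delta\epsilon}$ should force $|q|^k < \epsilon$ for every $k \ge k_0$:

```python
import math

# Arbitrary sample values for the check.
q = 0.9
eps = 1e-3

delta = 1.0 / abs(q) - 1.0            # so that |q| = 1/(1+delta)
k0 = math.ceil(1.0 / (delta * eps)) + 1  # any integer strictly above 1/(delta*eps)

# For k >= k0 the chain |q|^k <= 1/(1+k*delta) < 1/(k0*delta) < eps holds.
for k in range(k0, k0 + 100):
    assert abs(q) ** k < eps
```

In practice $|q|^{k_0}$ is far smaller than the bound $\frac{1}{k_0\delta}$; Bernoulli only gives the crude estimate needed for the proof.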