The problem goes like this:
Show that if f(x) - g(x) << 1 as x -> 0+, then exp(f(x)) ~ exp(g(x)) as x -> infinity.
thanks
You will have to be more precise about what
"f(x) - g(x) << 1"
means. (It seems likely that you mean "|f(x) - g(x)| << 1", that is, the difference is close to zero, rather than that f(x) - g(x) is of large absolute value and negative.) You also need to say what
" ~ "
means. What happens is that:
$\displaystyle \frac{\exp(f(x))}{\exp(g(x))} \to k$ as $\displaystyle x \to \infty$, with $\displaystyle k \approx 1$ under the first interpretation and $\displaystyle k \approx 0$ under the other (where $\displaystyle f(x) - g(x)$ is large and negative).
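A sketch of the key step, assuming the intended hypothesis is that $\displaystyle f(x) - g(x) \to 0$ as $\displaystyle x \to \infty$ and that "~" means the ratio tends to 1:

```latex
% Sketch under the assumption f(x) - g(x) -> 0 as x -> infinity.
\[
  \frac{\exp(f(x))}{\exp(g(x))}
  = \exp\bigl(f(x) - g(x)\bigr)
  \longrightarrow \exp(0) = 1
  \qquad (x \to \infty),
\]
% which is exactly exp(f(x)) ~ exp(g(x)) in the ratio-tends-to-1 sense.
% If instead f(x) - g(x) -> -infinity, the same identity gives a ratio
% tending to 0, which is the other interpretation mentioned above.
```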
RonL