The problem goes like this:
Show that if f(x) - g(x) << 1 (x -> 0+), then exp(f(x)) ~ exp(g(x)) (x -> infinity).
Thanks.
You will have to be more precise about what

"f(x) - g(x) << 1"

(it seems likely that you mean "|f(x) - g(x)| << 1", that is, the difference is close to zero, rather than that f(x) - g(x) is negative and of large absolute value)

and

" ~ "

mean. What happens is that

exp(f(x))/exp(g(x)) = exp(f(x) - g(x)) -> exp(0) = 1,

as x -> 0+ under one interpretation and as x -> infinity under another.
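A quick numerical sanity check of the ratio argument, under the interpretation that |f(x) - g(x)| -> 0 as x -> infinity, using the hypothetical pair f(x) = x + exp(-x) and g(x) = x (chosen only for illustration; it is not from the original problem):

```python
import math

def f(x):
    # f(x) = x + exp(-x); the exp(-x) term vanishes as x -> infinity
    return x + math.exp(-x)

def g(x):
    return x

# Here f(x) - g(x) = exp(-x) -> 0, so exp(f(x))/exp(g(x)) should tend to 1.
# Computing exp(f(x) - g(x)) instead of exp(f(x))/exp(g(x)) avoids overflow
# for large x, and the two are equal by the law of exponents.
for x in [1.0, 5.0, 10.0, 20.0]:
    ratio = math.exp(f(x) - g(x))
    print(x, ratio)
```

As x grows the printed ratio approaches 1, which is exactly the statement exp(f(x)) ~ exp(g(x)) in the sense that the ratio of the two sides tends to 1.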
RonL