Okay, so I have a sequence $(a_n)$, and I need to show, using the formal definition of a limit, that $(a_n)$ converges to $0$.
So I let $\epsilon > 0$. Unpacking the definition, I need $|a_n - 0| = |a_n|$, essentially getting within distance $\epsilon$ of $0$, for any integer $n$ where $n \geq N$. So then I need $|a_n| < \epsilon$.
Doing some manipulation, I eventually reduce $|a_n| < \epsilon$ to a simpler inequality in $n$ and $\epsilon$. This is where I get lost: I can't see how to show that this inequality actually holds for all $n \geq N$.
Now, my professor didn't really sufficiently explain how to do a formal proof even when we already assume we know the value of $N$. For instance, take the standard example $a_n = \frac{1}{n}$: we know $N > \frac{1}{\epsilon}$ works for this sequence. Knowing this, the inequality would become $\frac{1}{n} < \epsilon$. Do we know that this is true because $\frac{1}{n}$ is always less than $\frac{1}{N}$ since $n > N$, or is there more that needs to be shown? If that is sufficient, then how can that line of thought be applied when we assume we don't know the value of $N$ (like in this problem)?
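Spelled out, the chain of inequalities I have in mind for that illustrative $\frac{1}{n}$ case (not my actual problem) is:

```latex
% Given epsilon > 0, choose an integer N > 1/epsilon.
% Then for every n > N:
\left|\frac{1}{n} - 0\right| = \frac{1}{n} < \frac{1}{N} < \epsilon .
```

Is writing out a chain like this all that a "formal" proof requires, once $N$ has been chosen?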
Sorry if there is already a thread that asks and explains this. Our class pretty much went through sequences today in about 30 minutes, so I'm not too sure on this stuff. Thanks!