Originally Posted by **edudlive**

We started on this in class today, and my professor gave only a quick overview of how to solve these problems and of the questions that might come up in the homework.

When he graphed and explained it, the proof made perfect sense. It's when I try to do the homework that I run into problems. The instructions for the problem read:

1) lim of (x^2 + 1) as x approaches 0 (the value of L is given in these problems, and is of course 1 in this case)

The only method the professor showed us for solving these problems is this:

the definition requires that |f(x) - L| < epsilon whenever 0 < |x - a| < delta

So for my problem, with epsilon = 0.1:

|[x^2 + 1] - 1| < 0.1

We break the absolute value down into:

-0.1 < (x^2 + 1) - 1 < 0.1

which ends up looking like this:

-0.1 < x^2 < 0.1

From there he jumped straight to solving x^2 = 0.1 to get x = (0.1)^(1/2), which is the correct answer, i.e. delta = sqrt(0.1). For every problem in which one side of the inequality falls outside the range of the function (e.g. -1.1 as a value for cos x, or -0.1 as a value for x^2), this approach gives me the correct answer... but it's obvious to me that I'm missing a piece of the puzzle.
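To convince myself that delta = sqrt(0.1) really works for the first problem, I wrote a quick numerical sanity check in Python (just sampling points, not a proof; the function name `f` is my own):

```python
import math

# Problem 1: f(x) = x^2 + 1, L = 1 as x -> 0, epsilon = 0.1.
# Candidate from solving x^2 = 0.1: delta = sqrt(0.1).
def f(x):
    return x**2 + 1

eps = 0.1
delta = math.sqrt(eps)

# Sample points strictly inside (-delta, delta) and check |f(x) - 1| < eps.
xs = [-delta + 2 * delta * k / 10000 for k in range(1, 10000)]
print(all(abs(f(x) - 1) < eps for x in xs))  # expect True
```

It prints True, which matches the graph he drew in class.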

The second problem on the homework is this:

lim of (x^2 + 1) as x approaches 2, with L = 5 and epsilon = 0.1

I start with |(x^2 + 1) - 5| < 0.1, which breaks down into

-0.1 < x^2 - 4 < 0.1, i.e. 3.9 < x^2 < 4.1.

Now where do I go from here to find a correct delta?
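Not an answer, but here is the next step I suspect (please correct me if I'm wrong): since x is near 2, 3.9 < x^2 < 4.1 should become sqrt(3.9) < x < sqrt(4.1), and because that interval isn't symmetric about 2, I'd take delta = min(2 - sqrt(3.9), sqrt(4.1) - 2). A quick numerical check of that guess:

```python
import math

# Problem 2: f(x) = x^2 + 1, L = 5 as x -> 2, epsilon = 0.1.
# My guess: from 3.9 < x^2 < 4.1 (with x near 2), sqrt(3.9) < x < sqrt(4.1),
# so take the smaller of the two distances from 2 as delta.
def f(x):
    return x**2 + 1

eps = 0.1
delta = min(2 - math.sqrt(3.9), math.sqrt(4.1) - 2)

# Sample points with 0 < |x - 2| < delta and check |f(x) - 5| < eps.
xs = [2 - delta + 2 * delta * k / 10000 for k in range(1, 10000) if k != 5000]
print(delta)                                  # about 0.0248
print(all(abs(f(x) - 5) < eps for x in xs))   # expect True
```

The check passes, but I don't know whether taking the min of the two distances is the intended method or just a coincidence that works here.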

I've tried reading the textbook (which is useless), my companion book ("The Calculus Lifesaver"), and the internet, and none of these resources explain how to solve the problem... at least not in a way I can grasp.