Hi,
I have a question about an epsilon-delta proof of a limit of the form lim f(x) = L, where x goes to infinity and L is 1/3:
(x^2 + x - 1) / (3*x^2 - 1) -> 1/3 (as x goes to infinity)
For the question I did the following:
I wrote the whole expression (call it A) in the following way:
A - (1/3) < Epsilon
(for large x the difference A - 1/3 is positive, so the absolute value signs can be dropped).
The expression did not simplify nicely. Subtracting gives A - 1/3 = (3*x - 2) / (3*(3*x^2 - 1)), and requiring this to be less than Epsilon leads (for x > 1/sqrt(3)) to the quadratic inequality 9*Epsilon*x^2 - 3*x + 2 - 3*Epsilon > 0, which I solved with the quadratic formula.
I got this inequality for x:
x > (1 + sqrt(1 - 8*Epsilon + 12*Epsilon^2)) / (6*Epsilon)
From that point I didn't know how to continue. I would be more than glad if you could give me an idea.
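As a numerical sanity check on my bound (the names f and M_of_eps are just my own labels), this snippet confirms that for several values of Epsilon, every sampled x beyond the threshold keeps f(x) within Epsilon of 1/3, while just below the threshold the distance already exceeds Epsilon:

```python
import math

def f(x):
    # The function from the question: (x^2 + x - 1) / (3x^2 - 1)
    return (x**2 + x - 1) / (3 * x**2 - 1)

def M_of_eps(eps):
    # The derived threshold: x > (1 + sqrt(1 - 8*eps + 12*eps^2)) / (6*eps)
    return (1 + math.sqrt(1 - 8 * eps + 12 * eps**2)) / (6 * eps)

for eps in (0.1, 0.01, 0.001):
    M = M_of_eps(eps)
    # Sample points strictly beyond the threshold.
    assert all(abs(f(M + s) - 1/3) < eps for s in (0.001, 1.0, 10.0, 1e6))
    # At x = M equality holds, and f(x) - 1/3 is decreasing for x > 1,
    # so slightly below the threshold the distance exceeds eps.
    assert abs(f(0.99 * M) - 1/3) > eps
```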
Substitute $t = 1/x$ and work with $g(t) = f(1/t) = \frac{1+t-t^2}{3-t^2}$, so that $x \to \infty$ becomes $t \to 0^+$; set $g(0) = \frac13$. Clearly, you can see that $g$ is increasing in a neighborhood of $0$ (you can see it by differentiating: $g'(t) = \frac{(t-1)(t-3)}{(3-t^2)^2} > 0$ near $0$) and $g(0) = \frac13$. We have to show that for every $\varepsilon > 0$ there exists $\delta > 0$ such that $0 < |t| < \delta$ implies $|g(t) - \frac13| < \varepsilon$, where $g(t) = f(1/t)$ for $t \neq 0$. Note also that since $g$ is increasing, then $g - \frac13$ is also increasing and $g(0) - \frac13 = 0$, which implies that $g - \frac13$ is negative in a left neighborhood of $0$ while it is positive in a right neighborhood.
Now consider the case $t > 0$, but arbitrarily close to $0$. Then, for every $\varepsilon > 0$, we have to find $\delta_1 > 0$ such that $0 < t < \delta_1$ implies $g(t) - \frac13 < \varepsilon$.
By considering the increasing nature of $g$, we solve $g(\delta_1) - \frac13 = \varepsilon$ and get $\delta_1 = \frac{6\varepsilon}{1+\sqrt{1-8\varepsilon+12\varepsilon^2}}$ (we picked the positive root of the quadratic equation that tends to $0$ with $\varepsilon$).
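Assuming the argument runs through the substitution $t = 1/x$ with $g(t) = f(1/t) = \frac{1+t-t^2}{3-t^2}$, so that $g(t) - \frac13 = \frac{t(3-2t)}{3(3-t^2)}$, the right-hand case spelled out as a quadratic in $\delta_1$:

```latex
g(\delta_1)-\tfrac{1}{3}=\varepsilon
\iff \frac{\delta_1(3-2\delta_1)}{3\left(3-\delta_1^{2}\right)}=\varepsilon
\iff (2-3\varepsilon)\,\delta_1^{2}-3\,\delta_1+9\varepsilon=0,
\qquad
\delta_1=\frac{3-3\sqrt{1-8\varepsilon+12\varepsilon^{2}}}{2(2-3\varepsilon)}
        =\frac{6\varepsilon}{1+\sqrt{1-8\varepsilon+12\varepsilon^{2}}}.
```

Note that the rationalized form is exactly the reciprocal of the bound for $x$ in the question, $\delta_1 = 1/M$ with $M = \frac{1+\sqrt{1-8\varepsilon+12\varepsilon^2}}{6\varepsilon}$, which is reassuring.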
Now let $t < 0$ but again close to $0$.
Then, for every $\varepsilon > 0$, we have to find $\delta_2 > 0$ such that $-\delta_2 < t < 0$ implies $g(t) - \frac13 > -\varepsilon$.
In this case, we solve $g(-\delta_2) - \frac13 = -\varepsilon$ and get $\delta_2 = \frac{6\varepsilon}{1+\sqrt{1+8\varepsilon+12\varepsilon^2}}$ (we picked the root of $(2+3\varepsilon)\delta_2^2 + 3\delta_2 - 9\varepsilon = 0$ lying in a very small neighborhood of $0$).
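Under the same assumption that the proof goes through $g(t) = f(1/t) = \frac{1+t-t^2}{3-t^2}$, one can check numerically that this $\delta_2$ (here in rationalized form, with names of my own choosing) really solves the left-hand equation:

```python
import math

def g(t):
    # Substituted function g(t) = f(1/t) = (1 + t - t^2) / (3 - t^2)
    return (1 + t - t**2) / (3 - t**2)

def delta2(eps):
    # Candidate threshold from the left-hand case, rationalized:
    # delta_2 = 6*eps / (1 + sqrt(1 + 8*eps + 12*eps^2))
    return 6 * eps / (1 + math.sqrt(1 + 8 * eps + 12 * eps**2))

for eps in (0.1, 0.01, 0.001):
    d = delta2(eps)
    # g(-delta_2) - 1/3 should equal -eps up to rounding.
    assert math.isclose(g(-d) - 1/3, -eps, rel_tol=1e-9)
```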
We therefore have, for every $\varepsilon > 0$, that $|g(t) - \frac13| < \varepsilon$ whenever $0 < |t| < \delta$, where $\delta = \min\{\delta_1, \delta_2\}$. Since $t = 1/x$, the case $0 < t < \delta$ is exactly $x > 1/\delta$, which gives $\lim_{x\to\infty} f(x) = \frac13$ (and the case $-\delta < t < 0$ gives the limit as $x \to -\infty$ as well).
This completes the proof.
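Finally, assuming as above that $g(t) = f(1/t) = \frac{1+t-t^2}{3-t^2}$ and that the two one-sided thresholds are $\delta_1 = \frac{6\varepsilon}{1+\sqrt{1-8\varepsilon+12\varepsilon^2}}$ and $\delta_2 = \frac{6\varepsilon}{1+\sqrt{1+8\varepsilon+12\varepsilon^2}}$, here is a brute-force check of the conclusion (all names are mine):

```python
import math

def g(t):
    # g(t) = f(1/t) = (1 + t - t^2) / (3 - t^2)
    return (1 + t - t**2) / (3 - t**2)

def delta(eps):
    # delta = min(delta_1, delta_2) from the two one-sided cases.
    d1 = 6 * eps / (1 + math.sqrt(1 - 8 * eps + 12 * eps**2))
    d2 = 6 * eps / (1 + math.sqrt(1 + 8 * eps + 12 * eps**2))
    return min(d1, d2)

for eps in (0.1, 0.01, 0.001):
    d = delta(eps)
    for k in range(1, 1000):
        t = d * k / 1000  # samples with 0 < |t| < delta
        assert abs(g(t) - 1/3) < eps
        assert abs(g(-t) - 1/3) < eps
```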
But I have to say that this example is not a good starting point for learning the technique...