Hi,

Problem 1.

Let f be a continuous function on an interval I, let [x,y] be a subset of I, and let t be a real number strictly between f(x) and f(y). Then there exists s in (x,y) such that f(s) = t.

W.L.O.G. suppose that f(x) < t < f(y) (the other case is symmetric).

Let U = {u in [x,y] | f(u) < t}. Since f(x) < t, we have x in U, so U is non-empty.

Clearly, the set U is bounded above by y.

Let s = sup U, and fix n in N. Since s is the sup of U, s - 1/n is not an upper bound for U.

So (by the definition of supremum, no Bolzano-Weierstrass needed yet) there exists x_n in U such that s - 1/n < x_n <= s. For all n in N, x_n in U implies that f(x_n) < t. (This is how U is defined.)

You can see that x_n converges to sup U = s (squeeze: s - 1/n < x_n <= s). By continuity, f(x_n) converges to f(s), and since f(x_n) < t for every n, we get f(s) <= t.

Finish the proof: using a similar argument, you want to show that f(s) >= t. Together with f(s) <= t, this forces f(s) = t.

Hint: Use the fact that s = sup U. Any point y_n with s < y_n <= y cannot lie in U (s is an upper bound for U), so f(y_n) >= t. Pick such a sequence converging to s and use continuity again. (Note that s < y here, since f(y) > t means y is not in U.)
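If it helps to see the construction numerically, here is a minimal Python sketch (the function name sup_U and the example f(u) = u^2, t = 2 are my own choices, not part of the problem). Bisection maintains the invariant f(lo) < t <= f(hi), which mirrors how s = sup U is squeezed between points of U and points outside it; it pins down sup U when f crosses t exactly once on [x,y].

```python
def sup_U(f, x, y, t, n=60):
    """Approximate s = sup{u in [x, y] : f(u) < t} by bisection.

    Assumes f is continuous with f(x) < t <= f(y) and crosses t once,
    so the bisection invariant f(lo) < t <= f(hi) mirrors the proof.
    """
    lo, hi = x, y  # invariant: f(lo) < t <= f(hi)
    for _ in range(n):
        mid = (lo + hi) / 2
        if f(mid) < t:
            lo = mid   # mid is in U, so sup U >= mid
        else:
            hi = mid   # mid is outside U, an upper-bound candidate
    return (lo + hi) / 2

s = sup_U(lambda u: u * u, 0.0, 2.0, t=2.0)
# here s approximates sqrt(2), and f(s) = t as the theorem predicts
```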

Problem 2

We first need to define a dominant term. The nth term of a sequence is dominant if a_m < a_n for all m > n. In other words, a dominant term is greater than all of the terms that come after it. (Note that this notion may be defined a little differently in your book, or appear under a different name, such as "peak" term.) Either way, the proof is essentially the same.

Now, we continue our proof by inspecting two cases.

1. First case: the sequence contains infinitely many dominant terms.

- Find the first dominant term, then the second, then the third, and so on. Do you see the pattern? Listing all of the dominant terms in order gives a strictly decreasing subsequence (each dominant term is greater than every term after it, including the next dominant term).
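If you want to experiment with the first case, here is a toy Python sketch (the name dominant_indices and the sample list are my own). It lists the dominant indices of a finite list by scanning from the right and tracking the maximum of the tail; the corresponding values come out strictly decreasing:

```python
def dominant_indices(a):
    """Indices n with a[n] > a[m] for every m > n (dominant terms).

    Scanning right-to-left keeps the running maximum of the tail,
    so each index is tested in O(1). Toy version on a finite list.
    """
    idx = []
    tail_max = float("-inf")
    for n in range(len(a) - 1, -1, -1):
        if a[n] > tail_max:
            idx.append(n)
            tail_max = a[n]
    return idx[::-1]  # restore left-to-right order

seq = [5, 3, 4, 2, 1, 0]
peaks = dominant_indices(seq)          # indices of dominant terms
sub = [seq[n] for n in peaks]          # a strictly decreasing subsequence
```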

2. Second case: the sequence contains finitely many dominant terms.

- List all of the dominant terms. Then select an index k that comes right after the last dominant term. Since a_k is not dominant, there is some m > k with a_m >= a_k; since a_m is not dominant either, you can repeat this step. Construct your increasing subsequence with those elements. Do you see why this works?
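Here is a toy Python sketch of the second case (the helper name increasing_subseq and the sample list are my own, and a finite list only imitates the infinite-sequence argument, since the last entry of a finite list is always vacuously dominant). Starting past the last dominant term, every index has a later index with a value at least as large, and chaining those jumps yields a non-decreasing subsequence:

```python
def increasing_subseq(a, start):
    """Greedy construction from the proof's second case.

    Assumes 'start' lies past the last dominant term of the list, so
    every reachable index k has some m > k with a[m] >= a[k]; jumping
    to such an m repeatedly gives a non-decreasing subsequence.
    """
    out = [a[start]]
    k = start
    while True:
        # first later index whose value is at least a[k], if any
        nxt = next((m for m in range(k + 1, len(a)) if a[m] >= a[k]), None)
        if nxt is None:
            break  # a finite list eventually runs out; an infinite one would not
        out.append(a[nxt])
        k = nxt
    return out

# starting at index 0 of this sample list picks out a non-decreasing chain
chain = increasing_subseq([1, 3, 2, 5, 4, 7], 0)
```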

Let me know if you run into problems.