[HARDCORE] first-order recursive sequence problem

Here is a problem I need to investigate for my 300-level calculus course. But I'm a stats student!!!! I've forgotten most of the hardcore calculus from my first-year courses, especially the sequence-convergence material. Please help me with this.

Define the following first-order recursion that depends on the parameter z:
x_0 = 0
x_{n+1} = 1 - z*x_n^2 for all n ≥ 0. (Note: the * here means multiplication; the "z*" that appears below is the critical value of z, or at least that's what I believe.)
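Not part of the problem statement, but here is a minimal Python sketch of the recursion, just to see the terms concretely. The sample value z = 0.5 is an arbitrary choice for illustration, not the critical one:

```python
# Minimal sketch of the recursion x_{n+1} = 1 - z*x_n^2 with x_0 = 0.
# z = 0.5 is just a sample value, not part of the problem.

def iterate(z, n_steps):
    """Return the terms x_0, x_1, ..., x_{n_steps}."""
    x = 0.0
    terms = [x]
    for _ in range(n_steps):
        x = 1.0 - z * x * x
        terms.append(x)
    return terms

terms = iterate(0.5, 50)
print(terms[-3:])  # for z = 0.5 the tail appears to settle down
```

Printing the last few terms for various z is a quick way to get a feel for when the sequence settles and when it starts to oscillate.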

(a) Restrict attention to z and x both between 0 and 1. The set of values of z for which the sequence {x_n} converges has the form [0, z*]. Find z*, and argue carefully that the sequence does indeed converge for every z in [0, z*]. In your argument you may use, without proof, any results found in standard calculus texts, but these results should be clearly stated. You might well want to use other results that were developed in class; you should provide proofs for these, or at least give me a good sense of how the argument goes and what standard results are used. It's worth working on your arguments to find the cleanest and most natural path to the results you are aiming for.
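This is emphatically not a proof, but a numerical scan can at least suggest where z* sits before you start the careful argument. The sketch below treats "the last two iterates agree to a small tolerance" as a heuristic stand-in for convergence (both the tolerance and the step count are arbitrary choices), and bisects on z for the apparent threshold:

```python
# Heuristic only: "converged" here means the last two iterates of
# x_{n+1} = 1 - z*x^2 agree to within tol, which is not the same as a proof
# of convergence. Tolerance and iteration count are arbitrary choices.

def appears_to_converge(z, n_steps=5000, tol=1e-8):
    x = 0.0
    prev = x
    for _ in range(n_steps):
        prev, x = x, 1.0 - z * x * x
    return abs(x - prev) < tol

# Bisect on z for the boundary between apparent convergence and oscillation.
lo, hi = 0.0, 1.0
for _ in range(40):
    mid = 0.5 * (lo + hi)
    if appears_to_converge(mid):
        lo = mid
    else:
        hi = mid
print(lo)  # a rough numerical guess at z*
```

One caveat: convergence becomes very slow as z approaches z* from below, so this scan will slightly undershoot the true critical value; it narrows down where to look, nothing more.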

(b) For the case z = 0.85, it turns out that the odd-indexed and even-indexed terms of the sequence both converge, but the two limits are different. Find these two limits (approximately) and give the polynomial of which they are roots. Proofs are not needed.
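One standard observation that may help here: if the odd and even subsequences each converge, then by continuity each limit must satisfy f(f(x)) = x, where f(x) = 1 - z*x^2, and f(f(x)) - x is a degree-4 polynomial in x. A short numerical sketch to approximate the two limits at z = 0.85 (again just exploration, not a solution):

```python
# Numerical look at z = 0.85: iterate long enough for the oscillation to
# settle, then read off consecutive terms as the even- and odd-limit
# approximations. Each limit should satisfy f(f(x)) = x, a quartic equation.

z = 0.85

def f(x):
    return 1.0 - z * x * x

x = 0.0                # x_0 = 0, an even-indexed term
for _ in range(2000):
    x = f(x)
even_limit = x         # x_2000 approximates the limit of the even terms
odd_limit = f(x)       # x_2001 approximates the limit of the odd terms
print(even_limit, odd_limit)
```

Once you have the two numerical limits, you can check them against the roots of the quartic f(f(x)) - x; note that the two fixed points of f itself are also roots of that quartic, so the limits you want are the other pair.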