Given $\alpha > 0$, $x_1 > 0$, and $x_{n+1} = \frac{1}{2}\left(x_n + \frac{\alpha}{x_n}\right)$,
I need to show monotone decrease, i.e. $x_{n+1} \le x_n$ for all $n$, and
I can show monotone decrease like this:
$x_{n+1} \le x_n \iff \frac{1}{2}\left(x_n + \frac{\alpha}{x_n}\right) \le x_n \iff \frac{\alpha}{x_n} \le x_n \iff \alpha \le x_n^2 \iff \sqrt{\alpha} \le x_n,$
but I'm not sure if that's proper induction, because the assumption step ($\sqrt{\alpha} \le x_n$) doesn't match the final step.
Is that a rigorous proof?
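A quick numerical sanity check of the behaviour in question (a sketch in Python, assuming the recursion $x_{n+1} = \frac{1}{2}\left(x_n + \frac{\alpha}{x_n}\right)$ stated above; the function names are illustrative, not from the thread). Note that when $x_1 < \sqrt{\alpha}$ the very first step may increase, which is exactly why the induction worry is real:

```python
def step(x, alpha):
    """One step of the recursion x_{n+1} = (x_n + alpha/x_n) / 2."""
    return 0.5 * (x + alpha / x)

def iterate(x1, alpha, n):
    """Return the first n terms x_1, ..., x_n."""
    xs = [x1]
    for _ in range(n - 1):
        xs.append(step(xs[-1], alpha))
    return xs

xs = iterate(0.5, 2.0, 8)   # x_1 = 0.5 < sqrt(2): the first step increases
# From the second term on, the sequence is non-increasing:
assert all(a >= b for a, b in zip(xs[1:], xs[2:]))
```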
Right, that's what I thought. I'm wondering if it can be done by induction at all, because of the $x_n$ and the $1/x_n$ terms, which don't seem to cancel if you try the same computation for $x_{n+2}$. Anyway, maybe that will do.
The reason I wanted to prove monotonicity was that it's not safe to assume $\lim_{n\to\infty} x_{n+1} = \lim_{n\to\infty} x_n = L$ until you can prove the sequence converges. I showed that $x_n$ is always positive (bounded below by zero) by a similar method:
$x_{n+1} = \frac{1}{2}\left(x_n + \frac{\alpha}{x_n}\right) > 0,$
which is true since both $x_n$ and $\alpha$ are positive. Not sure if that's safe to say..
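The positivity argument propagates step by step, which can be sketched as a small induction-style check (illustrative Python; `step` is a hypothetical helper name, not from the thread):

```python
def step(x, alpha):
    """If x_n > 0 and alpha > 0, then x_{n+1} = (x_n + alpha/x_n)/2 > 0:
    the inductive step of the positivity claim."""
    assert x > 0 and alpha > 0
    nxt = 0.5 * (x + alpha / x)
    assert nxt > 0               # sum of two positive numbers, halved
    return nxt

x, alpha = 0.3, 5.0              # base case: x_1 > 0
for _ in range(20):
    x = step(x, alpha)           # each step preserves positivity
```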
I don't think so: we don't know that yet, since we only know $x_n > 0$ for $n = 1$; it should be established by induction.
Therefore the proof of the decreasing behaviour by naught101 is not actually a complete one.
Here is what I have; I hope you will find it rigorous.
Let
$f(\lambda) := \frac{1}{2}\left(\lambda + \frac{\alpha}{\lambda}\right)$ ....for $\lambda > 0$.
So the rational difference equation can be written as
$x_{n+1} = f(x_n)$ ....for $n \in \mathbb{N}$. .....(1)
Then, we see that
$f(\lambda) \ge f(\sqrt{\alpha}) = \sqrt{\alpha}$ for all $\lambda > 0$, and $f(\lambda) = \lambda \iff \lambda = \sqrt{\alpha}$, ....(2)
i.e., $f$ attains its minimum value at $\lambda = \sqrt{\alpha}$ with the minimum value $f(\sqrt{\alpha}) = \sqrt{\alpha}$, which is at the same time the unique equilibrium of the rational difference equation; i.e., there is no value $\lambda \neq \sqrt{\alpha}$ such that $f(\lambda) = \lambda$.
Now, define the identity $I(\lambda) := \lambda$ ....for $\lambda > 0$.
Here is the most important part.
Clearly, $f(\lambda) \ge \sqrt{\alpha}$ ....for all $\lambda > 0$.
And $f(\lambda) - \lambda = \frac{1}{2}\left(\frac{\alpha}{\lambda} - \lambda\right) \le 0$ for $\lambda \ge \sqrt{\alpha}$, which yields $f(\lambda) \le I(\lambda)$ ....for all $\lambda \ge \sqrt{\alpha}$, i.e.,
$\sqrt{\alpha} \le f(\lambda)$ for all $\lambda > 0$, and $f(\lambda) \le \lambda$ for all $\lambda \ge \sqrt{\alpha}$. .....(3)
See the following graphic.
Red: the identity $I(\lambda) = \lambda$, Blue: $f(\lambda)$, and Green: $\sqrt{\alpha}$ (drawn for $\alpha = 1$).
Code of the graphic for Mathematica 7.0:

Code:
Show[{Plot[{\[Lambda], 1/2 (\[Lambda] + 1/\[Lambda]), 1}, {\[Lambda], 0, 3},
   PlotRange -> {{0, 3}, {0, 3}}, PlotStyle -> {{Red}, {Blue}, {Green}},
   AxesOrigin -> {0, 0}, AxesLabel -> {\[Lambda], \[Nu]},
   LabelStyle -> Directive[White, Large]],
  Graphics[{Text[Style[Sqrt[\[Alpha]]], {2.7, 0.8}],
    Text[Style["f"], {2.7, 1.4}], Text[Style["I"], {2.7, 2.5}]}]}]

Now, we are ready to finalize the proof.
Let $x_1 > 0$; then, taking $f$ of both sides and considering (3), we have
$\sqrt{\alpha} \le f(x_1) = x_2$. .....(4)
Then, similarly considering (3) and (4), we have
$\sqrt{\alpha} \le x_3 = f(x_2) \le x_2$, and by the emerging pattern, we have in general that
$\sqrt{\alpha} \le x_{n+1} \le x_n$ ....for all $n \ge 2$,
which implies that the sequence is decreasing (from the second term on) and bounded below, and so it has a finite limit; more precisely, $\lim_{n\to\infty} x_n = L$, where $L \ge \sqrt{\alpha}$.
The rest is simple: passing now to the limit as $n \to \infty$ on both sides of (1) and using (2), we learn that $L = \sqrt{\alpha}$.
This completes the solution.
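For what it's worth, the conclusion of the proof can be checked numerically; a sketch assuming the recursion (1), with arbitrary sample values of $\alpha$ and $x_1$ (the names are illustrative):

```python
import math

def orbit(x1, alpha, n):
    """First n terms of x_{k+1} = (x_k + alpha/x_k)/2 starting from x_1."""
    xs = [x1]
    for _ in range(n - 1):
        xs.append(0.5 * (xs[-1] + alpha / xs[-1]))
    return xs

for alpha in (0.25, 2.0, 9.0):
    for x1 in (0.01, 1.0, 50.0):
        xs = orbit(x1, alpha, 30)
        r = math.sqrt(alpha)
        # sqrt(alpha) <= x_{n+1} <= x_n from the second term on ...
        assert all(r - 1e-9 <= b <= a + 1e-9 for a, b in zip(xs[1:], xs[2:]))
        # ... and the limit is sqrt(alpha):
        assert abs(xs[-1] - r) < 1e-9
```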
@redsoxfan325: Can you please tell me in which post we have shown boundedness or the decreasing nature of $(x_n)$?
I would like to draw your attention to the fact that it is not shown in the first post, since we don't know $\sqrt{\alpha} \le x_n$ for all $n$; we only know it for $n \ge 2$.
Please don't get me wrong, I just want to clarify the situation.
naught101 showed that it was monotonically decreasing in the first post. And it is bounded below by zero simply because the first term is positive and the sequence is defined recursively using only division and addition of positive numbers, so there is no possibility of any of the terms being less than zero.
Do you mean that boundedness from below implies convergence of a sequence?
If so, every nonnegative sequence would be convergent, which is not the case.
Please refer to the first post again: while naught101 was trying to prove the decreasing behaviour of the solution, he/she assumed $\sqrt{\alpha} \le x_n$ (which is not proved), so we still don't know that the sequence is decreasing (which would imply boundedness from above).
Therefore, we don't know whether the limit of the solution exists, and so we cannot pass to the limit on both sides of the equation.
That's all I want to mention.
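The remark above that boundedness from below alone does not guarantee convergence can be illustrated with the standard counterexample $a_n = (-1)^n$ (a minimal sketch):

```python
def a(n):
    """a_n = (-1)^n: bounded below by -1, but divergent."""
    return (-1) ** n

terms = [a(n) for n in range(1, 101)]
assert min(terms) >= -1          # bounded below
# Successive terms stay a fixed distance 2 apart, so the sequence
# is not Cauchy and therefore does not converge:
assert all(abs(a(n + 1) - a(n)) == 2 for n in range(1, 100))
```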
I think you have missed some parts of the discussion.
I read naught101's proof incorrectly. I thought he had proved the monotonically decreasing part but he sort of "spliced" two separate proofs and I didn't even notice. So you're right. We didn't yet know that it was monotonically decreasing.
The theorem I was referring to states that (for real sequences):
"If a sequence is monotonically decreasing and bounded below, then it converges." (and likewise, monotonically increasing and bounded above)