let's say that every subsequence of X = (x_n) has a subsequence that converges to 0. Show that lim X = 0.
If lim X does not equal 0, then X either converges to a real number not equal to 0, or X diverges.
So can I use a theorem that says that if X converges to a real number x, then any subsequence will also converge to x....since it was given that the subsequence converges to 0, this is a contradiction....?
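for intuition, here is a quick numeric sanity check of that theorem (just a sketch; the sequence x_n = 1/n and the index map n_k = k^2 are made-up examples, not from the problem):

```python
# if x_n -> x, then any subsequence x_{n_k} -> x as well.
# made-up illustration: x_n = 1/n converges to 0, and n_k = k^2 is a
# strictly increasing index map, so x_{n_k} = 1/k^2 is a subsequence;
# its tail terms also get arbitrarily close to 0.
def x(n):
    return 1.0 / n

tail_of_subsequence = [x(k * k) for k in range(100, 110)]
assert all(abs(term) < 1e-3 for term in tail_of_subsequence)
```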
But I am having trouble with divergence. Also, in the question it is referring to a subsequence of a subsequence. Is this important? Or can I simply view a subsequence of a subsequence as a subsequence of X?
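for what it's worth, yes: a subsequence of a subsequence of X is itself a subsequence of X, because composing two strictly increasing index maps gives another strictly increasing index map. a small sketch (the particular index maps are made-up):

```python
# a subsequence is determined by a strictly increasing index map.
# composing two such maps stays strictly increasing, so a subsequence
# of a subsequence is a subsequence of the original sequence.
def is_strictly_increasing(indices):
    return all(a < b for a, b in zip(indices, indices[1:]))

first = list(range(0, 300, 3))                        # picks x_0, x_3, x_6, ...
second = [first[j] for j in range(0, len(first), 2)]  # every other term of that

assert is_strictly_increasing(first)
assert is_strictly_increasing(second)  # still valid indices into the original X
```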
yes, and after re-reading your question, i realize that this doesn't help. haha. i should read more carefully
let me reconsider in light of this new information
(in case you are interested, the point i was getting at is this: if we assume all subsequences of the subsequences converge to zero, yet the sequence itself does not converge to zero, then we could find a subsequence of a subsequence that does not converge to zero and hence arrive at a contradiction. however, the problem says each subsequence has "a" subsequence that converges to zero, not that all subsequences of subsequences converge. so finding one that doesn't converge says nothing, since there may be some other subsequence that converges. with the "a" there, we only need one that converges, not all)
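(to see the "a" vs "all" distinction concretely: a sequence can have one subsequence converging to 0 and another that doesn't. the alternating 0,1,0,1,... below is a made-up illustration, not from the problem:)

```python
# x_n alternates 0,1,0,1,...: the even-indexed subsequence is all zeros
# (converges to 0), while the odd-indexed one is all ones (converges to 1).
# so exhibiting one subsequence that fails to converge to 0 says nothing
# about whether *some* subsequence converges to 0.
x = [n % 2 for n in range(1000)]
zeros = x[0::2]
ones = x[1::2]
assert all(t == 0 for t in zeros)  # this subsequence converges to 0
assert all(t == 1 for t in ones)   # this one converges to 1, not 0
```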
i still think contradiction is the way to go though. we just have to work a bit harder
yes. either of these should not be that bad. there is a third case i am worried about though. what if the limit simply does not exist? as in, the sequence alternates, or jumps up and down between values.
what about using limsup or liminf. have you done that in class? if we use those, then their limits will always exist, no need to worry about the bouncing up and down scenario
we do not need to know the supremum. it is enough to know that the limsup always exists in the extended reals: it is either a real number (particularly here, a non-zero one), or +∞, or -∞.
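here is a numeric sketch of that (the bouncing sequence x_n = (-1)^n + 1/n is a made-up example: its limsup is 1 even though lim x_n does not exist):

```python
# limsup x_n = lim_{N->inf} sup_{n>N} x_n: the tail suprema are
# non-increasing, so their limit always exists in the extended reals.
# made-up example: x_n = (-1)^n + 1/n bounces between roughly -1 and 1,
# has no limit, but its limsup is 1.
def tail_sup(x, N):
    return max(x[N:])

x = [(-1) ** n + 1.0 / n for n in range(1, 2001)]
sups = [tail_sup(x, N) for N in (100, 500, 1000)]
assert sups[0] >= sups[1] >= sups[2]  # tail suprema are non-increasing
assert abs(sups[-1] - 1.0) < 1e-2     # approaching limsup = 1
```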
thus, you have those 3 cases to deal with. we cannot fulfill the condition that each subsequence has a subsequence converging to zero in any of those cases.
assume lim X ≠ 0. then limsup x_n and liminf x_n cannot both equal 0 (that would force lim x_n = 0), so at least one of them is nonzero; say it is the limsup (the liminf argument is symmetric). thus, we have 3 cases:
(1) limsup x_n = +∞
(2) limsup x_n = -∞
(3) limsup x_n = L, where L ≠ 0 is some real number.
cases (1) and (2) are similar (in case (2) the terms become arbitrarily negative instead of arbitrarily large), and the same argument works for the liminf as well. thus, it suffices to deal with cases (1) and (3)
case (1):
if limsup x_n = +∞, then infinitely many terms of X become arbitrarily large. thus, we can make a subsequence from such arbitrarily large elements: say, a subsequence in which all the terms are greater than 1, for instance. then, clearly, no subsequence of that subsequence will converge to zero, which contradicts the hypothesis on X.
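the case (1) construction can be sketched in finite form like this (the sequence below, x_n = n for even n and 0 for odd n, is a made-up example with limsup = +∞):

```python
# case (1) sketch: if limsup x_n = +inf, infinitely many terms exceed 1,
# and those terms form a subsequence bounded away from 0, so none of its
# subsequences can converge to 0.
# made-up example: x_n = n for even n, 0 for odd n (limsup = +inf).
x = [n if n % 2 == 0 else 0 for n in range(1, 500)]
big = [t for t in x if t > 1]  # the subsequence of terms greater than 1
assert len(big) > 100          # "infinitely many" in the finite sketch
assert min(big) > 1            # bounded away from 0: no sub-subsequence -> 0
```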
case (3):
if limsup x_n = L (with L ≠ 0), then infinitely many values of the sequence get arbitrarily close to L. thus we can create a subsequence of terms "close" to L; in particular, terms that are closer to L than they are to 0, so all of them stay at distance at least |L|/2 from 0. then, for such a subsequence, no subsequence of that subsequence will converge to 0, which, again, contradicts how X was defined.
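and a finite sketch of the case (3) construction (the sequence below, with L = 3, is a made-up example):

```python
# case (3) sketch: if limsup x_n = L != 0, infinitely many terms land
# within |L|/2 of L, so they are closer to L than to 0 and hence bounded
# away from 0 -- no subsequence of them can converge to 0.
# made-up example: x_n = 3 + 1/n for even n, 0 for odd n, so L = 3.
L = 3.0
x = [L + 1.0 / n if n % 2 == 0 else 0.0 for n in range(1, 500)]
near_L = [t for t in x if abs(t - L) < abs(L) / 2]  # terms close to L
assert len(near_L) > 100                            # "infinitely many" here
assert min(abs(t) for t in near_L) > abs(L) / 2     # bounded away from 0
```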
thus, we have that lim X = 0.
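for reference, the whole contradiction argument can be condensed into a short write-up (a sketch; the notation X = (x_n) is assumed):

```latex
% compact write-up of the argument (sketch)
Suppose, for contradiction, that $\lim x_n \neq 0$. Then $\limsup x_n$
and $\liminf x_n$ are not both $0$; say $\limsup x_n \neq 0$ (the
$\liminf$ case is symmetric). If $\limsup x_n = \pm\infty$, infinitely
many terms satisfy $|x_n| > 1$; if $\limsup x_n = L \neq 0$, infinitely
many terms satisfy $|x_n - L| < |L|/2$, hence $|x_n| > |L|/2$. Either
way, these terms form a subsequence of $X$ bounded away from $0$, so no
subsequence of it converges to $0$, contradicting the hypothesis.
Therefore $\lim x_n = 0$.
```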