I am reviewing my analysis midterm to study for the final, and I can't seem to remember how to get the correct solution to one of the problems.

It says: consider the sequence {xn} with

xn = ((n+1)^2 + (-1)^n * n^2) / (n*(n+1))

Determine if it converges or diverges.

I got that it diverges, which is correct, but I didn't get credit because my method was invalid.

I just divided through by the highest power, n^2, and saw that the expression boils down to 1 + (-1)^n, which diverges. Of course this isn't rigorous: along the way I replaced terms like 1/n with their limit 0 inside the expression before taking the overall limit, and then read off what was left.
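Spelled out, the informal step was just splitting the fraction and dropping the vanishing pieces:

$$
\frac{(n+1)^2 + (-1)^n n^2}{n(n+1)}
  = \frac{n+1}{n} + (-1)^n \, \frac{n}{n+1}
  \;\approx\; 1 + (-1)^n .
$$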

The professor said the correct solution involves finding two subsequences which converge to different values. I am not sure how to do this. Any help please? Thanks.
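Edit: I did try a quick numerical experiment (just my own sanity check, not a proof), and the terms really do seem to split into two camps depending on the parity of n:

```python
# Quick numeric check of the hint: look at even- and odd-indexed terms separately.
def x(n):
    return ((n + 1) ** 2 + (-1) ** n * n ** 2) / (n * (n + 1))

even_terms = [x(2 * k) for k in (1, 10, 100, 1000)]      # seems to creep toward 2
odd_terms = [x(2 * k + 1) for k in (1, 10, 100, 1000)]   # seems to creep toward 0

print(even_terms)
print(odd_terms)
```

So I suspect the two subsequences the professor meant are the even- and odd-indexed terms, but I don't know how to turn this observation into a proof.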