Given that x_i - y_i goes to zero as i goes to infinity and that ∑ x_i converges, show that ∑ y_i may diverge.
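For what it's worth, here is how I would formalize the statement; since it says "may diverge", it seems a single concrete counterexample would settle it. One candidate I considered (not sure it's the intended one):

```latex
% Hypotheses: \lim_{i \to \infty} (x_i - y_i) = 0 and \sum_i x_i converges.
% Claim: \sum_i y_i need not converge.
%
% Candidate counterexample: take x_i = 0 and y_i = 1/i. Then
%   x_i - y_i = -\tfrac{1}{i} \to 0,
%   \sum_i x_i = 0 \quad \text{(converges)},
%   \sum_i y_i = \sum_i \tfrac{1}{i} \quad \text{(harmonic series, diverges)}.
```

If that reading is right, the difference tending to zero is too weak to transfer convergence, because 1/i goes to 0 yet its series diverges.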

I don't understand this question. Am I supposed to prove something about an arbitrary sequence (y_i), or is it enough to come up with a concrete example with real numbers?