
Question on series
Given that x_i − y_i goes to zero as i goes to infinity and that ∑ x_i converges, show that ∑ y_i may diverge.
I don't understand this question. Am I supposed to prove this for ∑ y_i as some arbitrary series, or can I just come up with a concrete example with real numbers?

Let $\displaystyle x_i=\frac1{i^2}$ and $\displaystyle y_i=\frac1i.$ The difference goes to zero (both terms individually tend to zero), $\displaystyle \sum x_i$ converges, but $\displaystyle \sum y_i$ is the harmonic series, which diverges.
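If it helps to see the counterexample numerically, here is a quick sketch (the function name and the cutoffs are just for illustration): the partial sums of $\sum 1/i^2$ stay bounded near $\pi^2/6$, while the harmonic partial sums keep growing, even though $x_n - y_n \to 0$.

```python
import math

def partial_sums(n):
    """Partial sums up to n of the counterexample series:
    x_i = 1/i^2 (convergent) and y_i = 1/i (harmonic, divergent)."""
    s_x = sum(1.0 / i**2 for i in range(1, n + 1))
    s_y = sum(1.0 / i for i in range(1, n + 1))
    return s_x, s_y

for n in (10, 1000, 100000):
    s_x, s_y = partial_sums(n)
    diff = 1.0 / n**2 - 1.0 / n  # x_n - y_n, tends to 0
    print(f"n={n}: sum x_i = {s_x:.6f}, sum y_i = {s_y:.6f}, x_n - y_n = {diff:.2e}")
```

The first column of sums approaches $\pi^2/6 \approx 1.6449$, the second grows roughly like $\ln n$ without bound.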

You simply have to give a concrete example in which this happens. Take $\displaystyle y_i=\frac{1}{i}$ and let $\displaystyle x_i$ be any sequence whose series converges; then $x_i \to 0$, and since $y_i \to 0$ as well, the difference $x_i - y_i$ automatically goes to zero.