By definition the derivative is additive, and it follows trivially that the derivative "moves in" to finite sums: $\frac{d}{dx}\sum_{i=0}^{n} f_i(x) = \sum_{i=0}^{n} f_i'(x)$.

What I don't fully understand is when the derivative commutes with infinite sums:

$$\frac{d}{dx}\left(\sum_{i=0}^{\infty} f_i(x)\right)\Bigg|_{x=a} \;=\; \sum_{i=0}^{\infty} f_i'(a). \tag{1}$$

We use this property all the time in Fourier analysis, in the study of solutions to PDEs, and so on. But I don't think it holds in general, even when all of the sums involved converge absolutely. For instance, consider

$$f_i(x) \;=\; \frac{x}{1+ix^2} \;-\; \frac{x}{1+(i-1)x^2} \quad \text{for } i > 0, \qquad \text{and} \qquad f_0(x) = x.$$

Then for all $x$ the sum of the $f_i$ converges absolutely to $0$: the partial sums telescope, $\sum_{i=0}^{N} f_i(x) = \frac{x}{1+Nx^2} \to 0$, and since the terms with $i > 0$ all have the same sign for fixed $x$, the same telescoping bounds $\sum_i |f_i(x)|$. But

$$\sum_{i=0}^{\infty} f_i'(0) \;=\; 1 \;\neq\; 0 \;=\; \frac{d}{dx}\left(\sum_{i=0}^{\infty} f_i(x)\right)\Bigg|_{x=0},$$

since $f_0'(0) = 1$ while $f_i'(0) = 0$ for every $i > 0$. So $(1)$ fails at $a = 0$.
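For concreteness, the failure is easy to see numerically. Here is a quick sanity-check sketch using the telescoping family $f_0(x) = x$ and $f_i(x) = \frac{x}{1+ix^2} - \frac{x}{1+(i-1)x^2}$ for $i > 0$; the derivative series is approximated by central finite differences, and the cutoff `N` and step `h` are arbitrary choices of mine:

```python
# Numerical check: the series sums to (essentially) 0 at every x,
# yet the series of derivatives at x = 0 sums to 1, not 0.
# Telescoping family: f_0(x) = x and, for i > 0,
#   f_i(x) = x/(1 + i x^2) - x/(1 + (i-1) x^2).

def f(i, x):
    if i == 0:
        return x
    return x / (1 + i * x**2) - x / (1 + (i - 1) * x**2)

def f_prime(i, x, h=1e-6):
    # central-difference approximation to f_i'(x)
    return (f(i, x + h) - f(i, x - h)) / (2 * h)

N = 10_000   # truncation of the infinite sum (arbitrary)
x = 0.5      # sample point (arbitrary)

# The partial sums telescope to x/(1 + N x^2), which tends to 0.
partial_sum = sum(f(i, x) for i in range(N + 1))
print(partial_sum)        # ~ x/(1 + N*x**2), small

# The derivative series at 0 converges to 1, not to (d/dx of 0) = 0.
deriv_sum_at_0 = sum(f_prime(i, 0.0) for i in range(N + 1))
print(deriv_sum_at_0)     # ~ 1.0
```

The point of the printout is that `partial_sum` can be made as small as you like by increasing `N`, while `deriv_sum_at_0` stays pinned near $1$, so no amount of truncation reconciles the two sides of $(1)$ at $x = 0$.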

What, then, are the conditions necessary for the derivative to commute with an infinite sum? Is absolute convergence of both sums in $(1)$ in a neighborhood of $a$ (and not just at $a$) enough? If so, how do I prove it?