It converges. Note that $\lfloor\sqrt{n}\rfloor = k$ exactly when
$$k^2 \le n \le (k+1)^2 - 1,$$
and that on this range the general term equals
$$\frac{(-1)^k}{n}.$$

The sign changes when
$$n$$
is a square, so that we find, grouping the terms between two consecutive squares:
$$\sum_{n=1}^{\infty}\frac{(-1)^{\lfloor\sqrt{n}\rfloor}}{n}=\sum_{k=1}^{\infty}(-1)^k u_k,$$
where
$$u_k=\sum_{n=k^2}^{(k+1)^2-1}\frac{1}{n}.$$

The idea is to prove that
$$\sum_{k=1}^{\infty}(-1)^k u_k$$
converges. There are different ways to do that:

- using the expansion
$$u_k=\frac{2}{k}+O\!\left(\frac{1}{k^2}\right)$$
to find that $(-1)^k u_k$ is the sum of the general terms of two convergent series (the alternating series with general term $\frac{2(-1)^k}{k}$, and an absolutely convergent one);

- applying the alternating series theorem directly, by showing that $(u_k)$ is decreasing and converges to $0$. Neither fact is trivial. To show that $(u_k)$ is decreasing, I found (if there is no mistake in my computation) that comparison with an integral works: show that $u_{k+1}$ is less than some integral ($u_{k+1}\le\int_{(k+1)^2-1}^{(k+2)^2-1}\frac{dx}{x}$), and that $u_k$ is greater than another one ($u_k\ge\int_{k^2}^{(k+1)^2}\frac{dx}{x}$), which is greater than the previous one (the usual comparison with the integral of a decreasing function)... The same comparison also shows that $u_k$ converges to $0$. [I let you try to fill in the details.]
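In case it helps, here is one way to fill in those details (this is my own filling-in, writing $u_k=\sum_{n=k^2}^{(k+1)^2-1}\frac{1}{n}$ and using the standard integral bounds for the decreasing function $x\mapsto 1/x$):

$$u_{k+1}=\sum_{n=(k+1)^2}^{(k+2)^2-1}\frac{1}{n}\le\int_{(k+1)^2-1}^{(k+2)^2-1}\frac{dx}{x}=\ln\frac{(k+1)(k+3)}{k(k+2)},$$

$$u_k=\sum_{n=k^2}^{(k+1)^2-1}\frac{1}{n}\ge\int_{k^2}^{(k+1)^2}\frac{dx}{x}=\ln\frac{(k+1)^2}{k^2},$$

and $\frac{(k+1)^2}{k^2}\ge\frac{(k+1)(k+3)}{k(k+2)}$ because $(k+1)(k+2)\ge k(k+3)$, so $u_{k+1}\le u_k$. Similarly, for $k\ge 2$,
$$u_k\le\int_{k^2-1}^{(k+1)^2-1}\frac{dx}{x}=\ln\frac{k(k+2)}{(k-1)(k+1)}\xrightarrow[k\to\infty]{}0.$$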

Once you've shown that
$$\sum_{k=1}^{\infty}(-1)^k u_k$$
converges, you must deduce that the initial series converges. Now, the difference between
$$\sum_{n=1}^{N}\frac{(-1)^{\lfloor\sqrt{n}\rfloor}}{n}$$
and
$$\sum_{n=1}^{p^2-1}\frac{(-1)^{\lfloor\sqrt{n}\rfloor}}{n}$$
(where $p^2$ is the greatest square less than or equal to $N$) is at most
$$u_p,$$
which converges to $0$. There may be mistakes with the indices here, but this is the main idea.
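Not a proof, but a quick numerical sanity check of the claimed convergence (the script and the name `partial_sum` are mine):

```python
import math

def partial_sum(N):
    """Partial sum S_N = sum_{n=1}^{N} (-1)^floor(sqrt(n)) / n."""
    return sum((-1) ** math.isqrt(n) / n for n in range(1, N + 1))

# Sample the partial sums just before a square, i.e. right after a
# whole block of constant sign has been added; they should cluster
# around the limit, with gaps shrinking as the blocks u_k shrink.
for p in (100, 200, 400):
    print(p, partial_sum(p * p - 1))
```

Successive sampled values get closer together, consistent with the alternating-blocks argument above.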