Hi.

So I'm stuck on a series problem. The series is:

sum from n = 1 to infinity of 1/(n + sqrt(n))

Can I simply conclude that, since 1/n is the smallest power of n whose series diverges (the p-test borderline, p = 1), the series above has to converge?
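To get a feel for the behaviour, I also looked at the partial sums numerically (a quick Python sketch; the cutoffs are arbitrary and of course a numerical check proves nothing about convergence):

```python
import math

def partial_sum(terms: int) -> float:
    """Partial sum of 1/(n + sqrt(n)) for n = 1 .. terms."""
    return sum(1.0 / (n + math.sqrt(n)) for n in range(1, terms + 1))

# Compare against ln(N), the growth rate of the harmonic series' partial sums.
for N in (10**2, 10**4, 10**6):
    print(N, partial_sum(N), math.log(N))
```

The partial sums keep creeping up at roughly the same pace as the harmonic series, which is what made me doubt my convergence reasoning.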

I just can't find a way to actually calculate this. I tried to treat it as a telescoping series by rewriting the general term as 1/(sqrt(n)*(sqrt(n) + 1)). However, this did not lead to any conclusive answer.
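For reference, here is the partial-fraction step I attempted (a purely formal decomposition of the general term):

```latex
\frac{1}{n+\sqrt{n}}
  = \frac{1}{\sqrt{n}\,\bigl(\sqrt{n}+1\bigr)}
  = \frac{1}{\sqrt{n}} - \frac{1}{\sqrt{n}+1}
```

But successive terms don't cancel, because sqrt(n+1) is not equal to sqrt(n) + 1, so the sum never actually telescopes.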

Any help will be greatly appreciated!