Using the divergence theorem to prove Gauss's law

Suppose I have the vector field F = R/r^3, where R = (x, y, z) and r = |R| is the magnitude of R. Then div(F) = 0 everywhere except at the origin, where F is not defined.
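As a sanity check on this claim, here is a sketch using sympy (my own choice of tool, not from the original question) that computes div(F) symbolically and confirms it vanishes away from the origin:

```python
import sympy as sp

x, y, z = sp.symbols('x y z', real=True)
r = sp.sqrt(x**2 + y**2 + z**2)

# F = R / r^3 with R = (x, y, z)
F = [x / r**3, y / r**3, z / r**3]

# Divergence: sum of partial derivatives dF_i/dx_i
div_F = sum(sp.diff(comp, var) for comp, var in zip(F, (x, y, z)))
print(sp.simplify(div_F))  # 0 (valid wherever r != 0)
```

Each term works out to 1/r^3 - 3x_i^2/r^5, and the three terms sum to 3/r^3 - 3r^2/r^5 = 0, which is exactly what the symbolic simplification returns.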

I want to show that the volume integral of div(F) over a solid sphere centered at the origin is 4pi. Using the divergence theorem, the flux of F through the surface of the sphere equals the volume integral of div(F) over the solid sphere, and it is easy to show directly that the flux of F through the surface of a sphere is 4pi. But here is the problem: F is not continuously differentiable at the origin, and the statement of the divergence theorem (at least the one I know) requires F to be defined on a neighborhood containing the entire solid sphere. That hypothesis fails here, since F is not defined at the origin, so I shouldn't be able to apply the divergence theorem, right? Worse, since div(F) = 0 everywhere it is defined, the volume integral taken as an ordinary integral would seem to be 0, not 4pi.
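The flux claim can also be checked numerically. The following sketch (my own illustration, not from the question) parameterizes a sphere of radius a in spherical coordinates and integrates F·n dA with a midpoint rule; on the sphere, F·n = 1/a^2, so the flux should come out to 4pi regardless of a:

```python
import numpy as np

def flux_through_sphere(a, n=400):
    """Numerically integrate F . n over a sphere of radius a,
    where F = R / r^3, using an n x n midpoint grid in (theta, phi)."""
    theta = (np.arange(n) + 0.5) * np.pi / n        # polar angle
    phi = (np.arange(n) + 0.5) * 2 * np.pi / n      # azimuthal angle
    T, P = np.meshgrid(theta, phi, indexing='ij')

    # Points on the sphere of radius a
    X = a * np.sin(T) * np.cos(P)
    Y = a * np.sin(T) * np.sin(P)
    Z = a * np.cos(T)
    r = np.sqrt(X**2 + Y**2 + Z**2)                 # equals a everywhere

    # Outward unit normal is R/a, so F . n = (R . R) / (r^3 * a) = 1/a^2
    F_dot_n = (X**2 + Y**2 + Z**2) / (r**3 * a)

    # Surface area element: a^2 sin(theta) dtheta dphi
    dA = a**2 * np.sin(T) * (np.pi / n) * (2 * np.pi / n)
    return np.sum(F_dot_n * dA)

print(flux_through_sphere(1.0))  # ~12.566, i.e. 4*pi
print(flux_through_sphere(5.0))  # same value: the flux is independent of a
```

The a^2 from the area element cancels the 1/a^2 from F·n, which is why the flux is the same for every radius, and why the singularity at the origin is what carries all of the "divergence."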

However, my physics book seems to have overlooked this and concludes that the volume integral of div(F) is indeed equal to 4pi by the divergence theorem, which leaves me confused.