I have no idea how I should solve this problem.
Please help, I seriously have no idea where to start.
Start from the definition of dr and take its magnitude:

|dr| = sqrt(dx^2 + dy^2 + dz^2) = sqrt((dx/dt)^2 + (dy/dt)^2 + (dz/dt)^2) dt

You can find the derivatives in terms of t easily. Then get |dr| on its own and integrate it between t = 0 and t = 1.
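The arc-length step above can be sketched numerically. The thread doesn't quote the actual curve, so the parameterization r(t) = (t, t^2, t^3) below is purely a made-up placeholder to show the mechanics:

```python
import math

# Hypothetical curve (the problem's actual r(t) isn't quoted in the thread):
# r(t) = (t, t**2, t**3), 0 <= t <= 1
def g(t):
    """|dr/dt| = sqrt(x'(t)^2 + y'(t)^2 + z'(t)^2) for the placeholder curve."""
    dx, dy, dz = 1.0, 2.0 * t, 3.0 * t**2
    return math.sqrt(dx * dx + dy * dy + dz * dz)

def simpson(f, a, b, n=1000):
    """Composite Simpson's rule; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# Arc length = integral of |dr| = g(t) dt from t = 0 to t = 1
arc_length = simpson(g, 0.0, 1.0)
print(arc_length)
```

For a real assignment you'd usually do this integral by hand; the numeric version is just a sanity check on your answer.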
Edit: Didn't see that the density wasn't constant.
After you get an expression for dr in the form |dr| = g(t) dt, then since the density is 1 + t, find the integral of (1 + t) g(t) dt between 0 and 1.
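The density-weighted integral works the same way, just with the extra (1 + t) factor inside. Again using the made-up placeholder curve r(t) = (t, t^2, t^3) (the real one isn't quoted in the thread):

```python
import math

# Hypothetical curve (the problem's actual r(t) isn't quoted in the thread):
# r(t) = (t, t**2, t**3), 0 <= t <= 1, with density rho(t) = 1 + t
def g(t):
    """|dr/dt| for the placeholder curve, so |dr| = g(t) dt."""
    dx, dy, dz = 1.0, 2.0 * t, 3.0 * t**2
    return math.sqrt(dx * dx + dy * dy + dz * dz)

def simpson(f, a, b, n=1000):
    """Composite Simpson's rule; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# Mass = integral of (1 + t) * g(t) dt from t = 0 to t = 1
mass = simpson(lambda t: (1.0 + t) * g(t), 0.0, 1.0)
print(mass)
```

Note the mass comes out larger than the plain arc length, as it should, since the density 1 + t is at least 1 everywhere on [0, 1].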