By changing to polar coordinates, evaluate the integral (where a > 0):
int[0,a] int[0,sqrt(a^2-x^2)] (x^2 + y^2) dy dx
Sorry, I don't know LaTeX, so I'll say it in words...
"...The integral from 0 to a (outer, in x) of the integral from 0 to the square root of (a squared minus x squared) (inner, in y) of the function x^2 + y^2, with respect to y, then x."
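I think in LaTeX it would be something like this (apologies if the syntax is off):

```latex
\int_{0}^{a} \int_{0}^{\sqrt{a^2 - x^2}} \left( x^2 + y^2 \right) \, dy \, dx
```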
I've got as far as changing my function to r^3 (the r^2 from x^2 + y^2 times the extra r from the Jacobian) and changing my inner limits to 0 and a, but I'm stuck on how to convert the outer limits (0 to a in x) into limits for the angle :S. I hope this makes sense.
The way I worked out the first set of limits: setting y = sqrt(a^2 - x^2) gives y^2 + x^2 = a^2. Then, using x = r cos(theta) and y = r sin(theta), I get r^2 = a^2 (since sin^2 + cos^2 = 1), so r = a because a > 0. In ordinary polar conversions I'd find the angle from tan(theta) = y/x, so what would be the systematic steps to find the angle limits in this case? And for other, more complicated regions? Thanks
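One thing I tried (not sure it counts as a real method) was to check a guessed angle range numerically: estimate the original Cartesian integral by brute force and compare it against the polar version evaluated over the guessed angles. `cartesian_integral` and `polar_result` are just names I made up; it's pure Python with midpoint sums:

```python
import math

def cartesian_integral(a, n=400):
    """Midpoint-rule estimate of int_0^a int_0^sqrt(a^2-x^2) (x^2+y^2) dy dx."""
    total = 0.0
    hx = a / n
    for i in range(n):
        x = (i + 0.5) * hx
        ymax = math.sqrt(a * a - x * x)  # upper y-limit for this x
        hy = ymax / n
        # Midpoint sum over the y-column at this x
        column = sum((x * x + ((j + 0.5) * hy) ** 2) * hy for j in range(n))
        total += column * hx
    return total

def polar_result(a, theta_lo, theta_hi):
    """Exact value of int_{theta_lo}^{theta_hi} int_0^a r^3 dr dtheta."""
    return (theta_hi - theta_lo) * a ** 4 / 4

a = 2.0
print(cartesian_integral(a))              # brute-force estimate
print(polar_result(a, 0.0, math.pi / 2))  # my guess: a quarter turn
```

If the two numbers agree, the guessed angle range covers the same region as the original limits, but I'd still like to know how to find the angles properly rather than by guessing.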