If I have Y(x2,x3,x4)=(sqrt(1-x2^2-x3^2-x4^2),x2,x3,x4), how do I find the magnitude of the gradient? Thanks for any advice!
Thank you for the reply, but I'm still confused. So with Pythagoras' theorem I would have Y=(x1,x2,x3,x4)? I know that for Y(s)=(sqrt(1-s^2),s) the gradient is (-s/sqrt(1-s^2),1), and the magnitude of the gradient is 1/sqrt(1-s^2). I'm supposed to get an expression similar to this, but I'm not sure how to rewrite the original equation. If I put r=(x2,x3,x4), then it would be Y(r)=(sqrt(1-r.r),r), and this is where I'm confused: how do I find the gradient of this?
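As a sanity check on the 1-D case, you can compare the analytic magnitude 1/sqrt(1-s^2) against a finite-difference approximation of Y'(s). This is just a sketch; the function names are my own:

```python
import math

def y1(s):
    # First component of Y(s) = (sqrt(1 - s^2), s)
    return math.sqrt(1.0 - s * s)

def grad_magnitude(s):
    # |Y'(s)| = sqrt((-s/sqrt(1-s^2))^2 + 1^2) = 1/sqrt(1-s^2)
    return 1.0 / math.sqrt(1.0 - s * s)

s, h = 0.5, 1e-6
# Central-difference derivative of the first component;
# the second component x -> x has derivative exactly 1.
d_first = (y1(s + h) - y1(s - h)) / (2 * h)
numeric = math.sqrt(d_first ** 2 + 1.0 ** 2)
print(numeric, grad_magnitude(s))
```

Both numbers should agree to several decimal places, confirming that the second slot of the gradient is 1 (not s).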
Consider ||grad Y|| with the gradient defined as in the Wikipedia article "Gradient", where in R^n the norm ||.|| is defined by ||x|| = sqrt(x1^2 + x2^2 + ... + xn^2).
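For the three-variable map Y: R^3 -> R^4, the "gradient" is the Jacobian matrix, and one natural reading of ||grad Y|| is the square root of the sum of the squared entries (the Frobenius norm), which matches the norm definition above applied entry-wise. A sketch under that assumption, with illustrative names, comparing a finite-difference Jacobian against the closed form sqrt(3 + r.r/(1 - r.r)):

```python
import math

def Y(x2, x3, x4):
    # The map from the question: R^3 -> R^4
    return (math.sqrt(1.0 - x2**2 - x3**2 - x4**2), x2, x3, x4)

def jacobian_frobenius_norm(x, h=1e-6):
    # Central-difference Jacobian of Y, then sqrt of the sum of squared entries
    total = 0.0
    for j in range(len(x)):
        xp = list(x); xp[j] += h
        xm = list(x); xm[j] -= h
        fp, fm = Y(*xp), Y(*xm)
        for i in range(len(fp)):
            d = (fp[i] - fm[i]) / (2 * h)
            total += d * d
    return math.sqrt(total)

x = (0.2, 0.3, 0.4)
r2 = sum(v * v for v in x)  # r.r
# First row of the Jacobian: -x_j/sqrt(1 - r.r), contributing r.r/(1 - r.r);
# remaining rows are the 3x3 identity, contributing 3.
analytic = math.sqrt(3.0 + r2 / (1.0 - r2))
print(jacobian_frobenius_norm(x), analytic)
```

The closed form reduces to the 1-D answer's pattern: with one variable the "3" becomes "1" and you recover sqrt(1 + s^2/(1-s^2)) = 1/sqrt(1-s^2).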