Hey PerrisKaplan.
What did you get for the partial derivatives (i.e. the expression)? What happens if you evaluate this at (0,0)?
I need to show that the partial derivatives of f(x, y) = (xy)^{1/3} exist at the origin, but I don't know how to go about it. I have f_{x} and f_{y}, but I don't know what to do next. I'd appreciate it if you just told me how to do it and didn't solve the problem outright.
Remember that f(h, 0) = (h·0)^{1/3} is exactly 0 for every h, so the numerator of the difference quotient is 0 before you ever take the limit. You get 0/h = 0 for all h ≠ 0, and the limit of the constant 0 is 0.
You should try using the limit definition for f_x and f_y: for f_x fix y = 0, for f_y fix x = 0, and then prove that both limits are 0.
Back to the definition:
$$f_x(x, y) = \lim_{h \to 0} \frac{f(x + h, y) - f(x, y)}{h}$$
What happens when you set x = y = 0?
It might help to think of what happens if h is really small (positive or negative). That's essentially what a limit does - when h=0, you have nonsense, but you have a clearly defined value when h is nonzero.
- Hollywood
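Hollywood's point about small h can be checked numerically. Here is a quick sketch (the function names `f` and `quotient` are mine): the difference quotient for f_x at (0, 0) turns out to be exactly 0 for every nonzero h, positive or negative.

```python
import math

def f(x, y):
    # f(x, y) = (xy)^(1/3), using a sign-aware real cube root
    # since x*y may be negative.
    p = x * y
    return math.copysign(abs(p) ** (1.0 / 3.0), p)

def quotient(h):
    # Difference quotient for f_x at (0, 0): [f(h, 0) - f(0, 0)] / h.
    # f(h, 0) = 0 for every h, so this is 0/h = 0 whenever h != 0.
    return (f(h, 0.0) - f(0.0, 0.0)) / h

for h in (0.1, 0.001, -0.001):
    print(h, quotient(h))  # prints 0.0 for each h
```

The same computation with the roles of x and y swapped handles f_y(0, 0).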
You obviously mean "-2/3" not "2/3".
However, the "partial derivatives" at (a, b) are given by
$$f_x(a, b) = \lim_{h \to 0} \frac{f(a + h, b) - f(a, b)}{h}$$
and
$$f_y(a, b) = \lim_{h \to 0} \frac{f(a, b + h) - f(a, b)}{h}$$
which are especially easy to evaluate for a = b = 0.
Note that, in Calculus 1, we say that a function is "differentiable at x = a" if and only if its derivative exists there. However, in calculus "of several variables", the mere existence of the partial derivatives is not sufficient. What one can show is this: if both partial derivatives exist and are continuous on some neighborhood of (a, b), then the function is differentiable at (a, b); the converse is false in general. This function has partial derivatives f_x(0, 0) = f_y(0, 0) = 0 but is NOT differentiable at (0, 0): along the line y = x, f(h, h)/\sqrt{h^2 + h^2} = h^{2/3}/(|h|\sqrt{2}) = |h|^{-1/3}/\sqrt{2}, which blows up as h → 0, so no linear approximation works there.
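The failure of differentiability along y = x can also be seen numerically. This sketch (names `f` and `remainder_ratio` are mine) evaluates f(h, h)/||(h, h)||, the ratio that would have to tend to 0 if f were differentiable at the origin with both partials 0; instead it grows without bound as h shrinks.

```python
import math

def f(x, y):
    # f(x, y) = (xy)^(1/3), using a sign-aware real cube root.
    p = x * y
    return math.copysign(abs(p) ** (1.0 / 3.0), p)

def remainder_ratio(h):
    # Along y = x: f(h, h) = h^(2/3) and ||(h, h)|| = |h|*sqrt(2),
    # so the ratio is |h|^(-1/3)/sqrt(2), which -> infinity as h -> 0.
    return f(h, h) / math.hypot(h, h)

for h in (0.1, 0.001, 1e-6):
    print(h, remainder_ratio(h))  # the ratio grows as h shrinks
```

Since the ratio diverges instead of tending to 0, the candidate linear approximation (the zero map) fails, confirming non-differentiability at (0, 0).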