Hi,
I just started learning calculus of variations, and got this problem on the first day:
Given
and
,
find the and that minimize .
My approach so far:
First find and , then integrate .
Define the answer as a new function and find the point where its partial derivatives are both zero:
This turns out to be a lot of work, and it's hard not to make any mistakes along the way. So I was wondering if there's a better way to get the solution. A shortcut, if you will.
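For what it's worth, the brute-force workflow above is mechanical enough to hand to a computer algebra system. Here's a sketch in sympy; the functional, the boundary conditions, and the trial function are all made up for illustration, since the original formulas didn't survive in this thread:

```python
import sympy as sp

x, a, b = sp.symbols('x a b', real=True)

# Everything below is hypothetical: assume the functional is
# J[y] = Integral((1 + x) * y'**2, (x, 0, 1)) with boundary conditions
# y(0) = 0 and y(1) = 1, and pick a trial function with two free
# parameters that already satisfies both endpoints.
y = x + a * x * (1 - x) + b * x**2 * (1 - x)

# Step 1: differentiate, then square y'.
yp = sp.diff(y, x)
integrand = (1 + x) * yp**2

# Step 2: integrate over x, leaving a and b as unknowns.
J = sp.integrate(integrand, (x, 0, 1))

# Step 3: J is now an ordinary function of a and b, so set both
# partial derivatives to zero and solve (a linear system here,
# since J is quadratic in the parameters).
sol = sp.solve([sp.diff(J, a), sp.diff(J, b)], [a, b], dict=True)[0]
print(sol, J.subs(sol))
```

The same three steps work for any trial function that is linear in its parameters: whenever the integrand is quadratic in y', step 3 stays a linear solve.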
Thanks,
TwoPlusTwo
I think you can avoid having to square y' by doing this:
So you only need to find the first derivative, and you don't need to square it. Note that the limits of the integral change, because you change the variable of integration from x to y.
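If I'm reading the tip right (the formulas are missing from the thread), the identity being used is the substitution dy = y' dx: assuming the integrand has the shape f(x)(y')², one factor of y' gets absorbed into the measure,

\[
\int_{x_0}^{x_1} f(x)\,(y')^2\,dx
\;=\; \int_{x_0}^{x_1} f(x)\,y'\,\bigl(y'\,dx\bigr)
\;=\; \int_{y(x_0)}^{y(x_1)} f\bigl(x(y)\bigr)\,y'\,dy,
\]

so only the first power of y' appears, and the limits become y-values. (This needs y to be monotonic on the interval, so that the inverse function x(y) is well defined.)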
Thanks for the tip!
Since and , I'm left with this:
But this is a polynomial in x with two unknown parameters, and I'm not sure where to go from here.
I tried setting x = 1, but then it seems like I can make the answer as small as I want by just increasing and .
Oh yes, that was a big oversight on my part. You can't treat the (1+x) like a constant when integrating with respect to y; you would need to put x in terms of y before integrating. I think you do have to do it the original way you were going to, which will be a lot of work, like you said.
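Spelling out the catch in symbols (with the surviving fragment (1+x) standing in for the real prefactor): after the substitution, both the prefactor and y' have to be expressed through the inverse function x(y),

\[
\int_{x_0}^{x_1} (1+x)\,(y')^2\,dx
\;=\; \int_{y(x_0)}^{y(x_1)} \bigl(1 + x(y)\bigr)\,y'\bigl(x(y)\bigr)\,dy,
\]

so unless the prefactor is constant, nothing is saved: you still need x in terms of y before you can integrate.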