• July 12th 2012, 10:51 AM
bugatti79
Folks,

I need to show that $\int_\Omega (\nabla G)w dxdy=-\int_\Omega (\nabla w) G dxdy+\int_\Gamma \hat{n} w G ds$ given

$\int_\Omega \nabla F dxdy=\oint_\Gamma \hat{n} F ds$ where $\Omega$ and $\Gamma$ are the domain and boundary respectively. $F$, $G$ and $w$ are scalar functions... any ideas?

I attempted to expand the LHS but I didn't feel it was leading me anywhere...

$\displaystyle \int_\Omega (\hat{e_x}\frac{\partial G}{\partial x}+\hat{e_y}\frac{\partial G}{\partial y})w dx dy$....?
• July 23rd 2012, 12:55 PM
bugatti79
If we let $u=w$ then $du=dw=\nabla w$???

$\displaystyle dv=\nabla G dxdy$ then

$\displaystyle v=\int_\Omega \nabla G dxdy=\int_\Gamma (\hat{n_x} \hat{e_x}+ \hat{n_y} \hat{e_y})Gds$

Thus

$\displaystyle \int_\Omega(\nabla G)w dxdy= \int_\Gamma (\hat{n_x} \hat{e_x}+ \hat{n_y} \hat{e_y})G w ds- \int_\Gamma (\hat{n_x} \hat{e_x}+ \hat{n_y} \hat{e_y})G \nabla w ds$

Clearly I have gone wrong somewhere....? Thanks
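A possible diagnosis: in integration by parts, $v$ has to be a pointwise antiderivative defined on $\Omega$; the step $v=\int_\Omega \nabla G \,dxdy$ replaces it by a boundary quantity, which is where things appear to break. One route that avoids choosing $u$ and $dv$ altogether is to apply the quoted theorem to the product $F=Gw$ and use the product rule $\nabla(Gw)=(\nabla G)w+G(\nabla w)$:

$\displaystyle \oint_\Gamma \hat{n}\, G w\, ds=\int_\Omega \nabla(Gw)\, dxdy=\int_\Omega (\nabla G)w\, dxdy+\int_\Omega G(\nabla w)\, dxdy$

Rearranging then gives exactly the required identity:

$\displaystyle \int_\Omega (\nabla G)w\, dxdy=-\int_\Omega (\nabla w)G\, dxdy+\oint_\Gamma \hat{n}\, w G\, ds$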

Note: This was posted at PF approximately a week and a half ago and is unlikely to be answered there at this stage: Gradient and Divergent Identities
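As a sanity check, here is a minimal numerical verification of the $x$-component of the identity on the unit square, using hypothetical test functions $G=x^2 y$ and $w=x+y^2$ (my choice for illustration, not from the thread):

```python
# Numerical spot-check of the x-component of the identity
#   int_Omega (dG/dx) w dxdy = -int_Omega (dw/dx) G dxdy + int_Gamma n_x G w ds
# on the unit square, with hypothetical test functions (my choice):
#   G(x, y) = x^2 * y,  w(x, y) = x + y^2.
# Midpoint-rule quadrature; derivatives are hand-coded for these G, w.

def G(x, y):  return x * x * y
def Gx(x, y): return 2 * x * y      # dG/dx
def w(x, y):  return x + y * y
def wx(x, y): return 1.0            # dw/dx

def area_integral(f, n=400):
    """Midpoint rule over the unit square [0,1] x [0,1]."""
    h = 1.0 / n
    return sum(f((i + 0.5) * h, (j + 0.5) * h)
               for i in range(n) for j in range(n)) * h * h

def line_integral(f, n=400):
    """Midpoint rule over [0,1]."""
    h = 1.0 / n
    return sum(f((i + 0.5) * h) for i in range(n)) * h

lhs = area_integral(lambda x, y: Gx(x, y) * w(x, y))

# On the unit square n_x = +1 on the edge x=1, -1 on x=0, 0 on y=0 and y=1,
# so the boundary term reduces to two edge integrals in y.
boundary = (line_integral(lambda y: G(1.0, y) * w(1.0, y))
            - line_integral(lambda y: G(0.0, y) * w(0.0, y)))

rhs = -area_integral(lambda x, y: wx(x, y) * G(x, y)) + boundary

print(lhs, rhs)  # both sides come out to approximately 7/12
```

The same check applies verbatim to the $y$-component with the roles of $x$ and $y$ swapped.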