1. ## Vector/Tensor Proof

I'm trying to prove what is shown in figure 1, where V is a vector and T is a tensor. However, if I arbitrarily assign values to the vectors and tensors and try to prove it by performing the operation, I find that all three terms come out exactly the same, which obviously cannot be right. Input, anyone?

2. Originally Posted by Jensen
I'm trying to prove what is shown in figure 1, where V is a vector and T is a tensor. However, if I arbitrarily assign values to the vectors and tensors and try to prove it by performing the operation, I find that all three terms come out exactly the same, which obviously cannot be right. Input, anyone?
I'm lost on something here. How can you calculate $\displaystyle \nabla \cdot ( V \cdot T^T )$? $\displaystyle V \cdot T^T$ is a scalar. You can't take the divergence of a scalar.

If this is supposed to be a gradient operation instead of a divergence, notice that
$\displaystyle \nabla ( V \cdot T^T ) = V \cdot ( \nabla T^T ) + T \cdot ( \nabla V )$
is an identity. You can easily show this by working it out component by component.
$\displaystyle \partial _i (V_j T_j) = V_j \partial _i T_j + T_j \partial _i V_j$
using the summation convention.
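The product rule Dan is invoking, with the free index $i$ staying on the derivative, is $\partial _i (V_j T_j) = V_j \partial _i T_j + T_j \partial _i V_j$. Here is a rough finite-difference spot check (the field choices and the evaluation point are arbitrary, just for illustration):

```python
# Spot-check of the component identity d_i(V_j T_j) = V_j d_i T_j + T_j d_i V_j,
# with T treated as a vector field so that V . T is a scalar.
# The fields below are arbitrary illustrative choices, not from the thread.

h = 1e-6  # step for central differences

def V(x, y, z):
    return [x * y, y + z, x * z]

def T(x, y, z):
    return [x + y * z, x * x, y * z]

def partials(f, p):
    """Central-difference derivatives of a vector field f at point p.
    Returns grads with grads[i][j] = d_i f_j."""
    grads = []
    for i in range(3):
        qp = list(p); qp[i] += h
        qm = list(p); qm[i] -= h
        fp, fm = f(*qp), f(*qm)
        grads.append([(a - b) / (2 * h) for a, b in zip(fp, fm)])
    return grads

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

p = (0.7, -1.3, 2.1)
dV, dT = partials(V, p), partials(T, p)
Vp, Tp = V(*p), T(*p)

# Left side: d_i (V_j T_j), the gradient of the scalar V . T
def S(x, y, z):
    return dot(V(x, y, z), T(x, y, z))

lhs = []
for i in range(3):
    qp = list(p); qp[i] += h
    qm = list(p); qm[i] -= h
    lhs.append((S(*qp) - S(*qm)) / (2 * h))

# Right side: V_j d_i T_j + T_j d_i V_j (sum over j, free index i)
rhs = [dot(Vp, dT[i]) + dot(Tp, dV[i]) for i in range(3)]

print(max(abs(a - b) for a, b in zip(lhs, rhs)))  # ~0 up to round-off
```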

-Dan

3. As I understand it, the dot product of a vector and a (rank-2) tensor is a vector, so you could take the divergence of what is shown in the parentheses.
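Jensen's point can be seen by counting indices: contracting a vector with a rank-2 tensor uses up one index and leaves one free, so the result is a vector. A minimal numerical illustration (the values are arbitrary):

```python
# Contracting a vector with a rank-2 tensor leaves one free index,
# i.e. the result is a vector. Values are arbitrary illustrative choices.

V = [1.0, 2.0, 3.0]
T = [[1.0, 0.0, 2.0],
     [0.0, 3.0, 1.0],
     [4.0, 1.0, 0.0]]

# (V . T^T)_i = sum_j T_ij V_j  -- the free index i survives the contraction
VdotTT = [sum(T[i][j] * V[j] for j in range(3)) for i in range(3)]
print(VdotTT)  # -> [7.0, 9.0, 6.0], a 3-component vector
```

Since the result is a vector (of fields, in the original problem), taking its divergence is perfectly well defined.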

4. Originally Posted by Jensen
As I understand it, the dot product of a vector and a (rank-2) tensor is a vector, so you could take the divergence of what is shown in the parentheses.
Hmmmm.... I see what you are saying for the higher-dimensional case, and I'll admit that I didn't think it through, but vectors with the correct transformation properties are also tensors, and you can't do the problem when T is a vector.

Otherwise I think you can make the same argument that I did, just in a slightly different form. The only difference is that now the $\displaystyle T_j$ are a set of vectors.
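Since figure 1 isn't shown, one can only guess at the exact identity, but *if* it is the standard rank-2 one, $\nabla \cdot ( V \cdot T^T ) = V \cdot ( \nabla \cdot T ) + T : \nabla V$, or in components $\partial_i ( T_{ij} V_j ) = V_j \partial_i T_{ij} + T_{ij} \partial_i V_j$, then the same component-by-component argument goes through. A hedged finite-difference sketch under that assumption (fields and the evaluation point are arbitrary choices):

```python
# Hedged sketch: ASSUMING the identity in question is the standard rank-2 one,
#   d_i (T_ij V_j) = V_j d_i T_ij + T_ij d_i V_j   (summed over i and j),
# i.e. div(V . T^T) = V . div(T) + T : grad(V).
# The fields below are arbitrary illustrative choices, not from the thread.

h = 1e-6

def V(x, y, z):
    return [x * y, y * z, x + z]

def T(x, y, z):
    # a rank-2 tensor field, indexed T(...)[i][j]
    return [[x, y, z * z],
            [x * y, z, x],
            [y, x * z, y * y]]

def d(f, p, i):
    """Central difference of scalar function f in direction i at point p."""
    qp = list(p); qp[i] += h
    qm = list(p); qm[i] -= h
    return (f(*qp) - f(*qm)) / (2 * h)

p = (0.4, 1.2, -0.8)
Vp, Tp = V(*p), T(*p)

# Left side: sum_i d_i ( sum_j T_ij V_j )
def comp(i):
    return lambda x, y, z: sum(T(x, y, z)[i][j] * V(x, y, z)[j] for j in range(3))

lhs = sum(d(comp(i), p, i) for i in range(3))

# Right side: sum_j V_j (sum_i d_i T_ij)  +  sum_ij T_ij d_i V_j
dT = [[d(lambda x, y, z, i=i, j=j: T(x, y, z)[i][j], p, i)
       for j in range(3)] for i in range(3)]          # dT[i][j] = d_i T_ij
dVm = [[d(lambda x, y, z, j=j: V(x, y, z)[j], p, i)
        for j in range(3)] for i in range(3)]         # dVm[i][j] = d_i V_j

divT = [sum(dT[i][j] for i in range(3)) for j in range(3)]   # (div T)_j
rhs = sum(Vp[j] * divT[j] for j in range(3)) + \
      sum(Tp[i][j] * dVm[i][j] for i in range(3) for j in range(3))

print(abs(lhs - rhs))  # ~0 up to round-off
```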

-Dan