# Vector/Tensor Proof

• Aug 29th 2008, 05:07 PM
Jensen
Vector/Tensor Proof
I'm trying to prove what is shown in figure 1, where V is a vector and T is a tensor. However, if I arbitrarily assign values to the vectors and tensors and try to prove it by performing the operation, I end up finding that all three terms are exactly equal, which obviously cannot be right. Any input?
• Aug 31st 2008, 10:57 AM
topsquark
Quote:

Originally Posted by Jensen
I'm trying to prove what is shown in figure 1, where V is a vector and T is a tensor. However, if I arbitrarily assign values to the vectors and tensors and try to prove it by performing the operation, I end up finding that all three terms are exactly equal, which obviously cannot be right. Any input?

I'm lost on something here. How can you calculate $\nabla \cdot ( V \cdot T^T )$? $V \cdot T^T$ is a scalar. You can't take the divergence of a scalar.

If this is supposed to be a gradient operation instead of a divergence, notice that
$\nabla ( V \cdot T^T ) = V \cdot ( \nabla T^T ) + T \cdot ( \nabla V )$
is an identity. You can easily show this by working it out component by component.
$\partial _i (V_j T_j) = V_j \partial _i T_j + T_j \partial _i V_j$
using the summation convention.
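
The component identity above can be spot-checked numerically. The sketch below (my own check, not from the thread; the fields V and T are arbitrary smooth examples) approximates each partial derivative with a central finite difference and verifies $\partial_i (V_j T_j) = V_j \partial_i T_j + T_j \partial_i V_j$ at one point:

```python
# Numerical spot-check of the product rule
#   d_i (V_j T_j) = V_j d_i T_j + T_j d_i V_j   (sum over j)
# using central finite differences. V and T are arbitrary smooth
# example fields chosen only for this check.

def V(x):
    return [x[0] * x[1], x[1] ** 2 + x[2], x[2] * x[0]]

def T(x):
    return [x[2], x[0] * x[0], x[1] + x[2]]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def partial(f, x, i, h=1e-6):
    """Central-difference approximation of d f / d x_i."""
    xp, xm = list(x), list(x)
    xp[i] += h
    xm[i] -= h
    return (f(xp) - f(xm)) / (2 * h)

x0 = [0.3, -1.2, 0.7]
for i in range(3):
    lhs = partial(lambda x: dot(V(x), T(x)), x0, i)
    rhs = sum(V(x0)[j] * partial(lambda x, j=j: T(x)[j], x0, i)
              + T(x0)[j] * partial(lambda x, j=j: V(x)[j], x0, i)
              for j in range(3))
    assert abs(lhs - rhs) < 1e-6
```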

-Dan
• Sep 1st 2008, 07:02 AM
Jensen
As I understand it, the dot product of a vector and a tensor is a vector so you could take the divergence of what is shown in the parenthesis.
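
To illustrate the point (my own example, not from the thread): contracting a vector with a rank-2 tensor, $(V \cdot T)_j = V_i T_{ij}$, sums over one index and leaves one free index, so the result is indeed a vector:

```python
# Contracting a vector with a rank-2 tensor: (V . T)_j = V_i T_ij.
# One index is summed away, one remains free, so the result is a vector.

V = [1.0, 2.0, 3.0]
T = [[1.0, 0.0, 2.0],
     [0.0, 3.0, 1.0],
     [4.0, 1.0, 0.0]]

VdotT = [sum(V[i] * T[i][j] for i in range(3)) for j in range(3)]
print(VdotT)  # a 3-component vector: [13.0, 9.0, 4.0]
```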
• Sep 2nd 2008, 03:10 PM
topsquark
Quote:

Originally Posted by Jensen
As I understand it, the dot product of a vector and a tensor is a vector so you could take the divergence of what is shown in the parenthesis.

Hmmmm.... I see what you are saying for the higher-dimensional case, and I'll admit that I didn't think it through. But vectors with the correct transformation properties are tensors too, and you can't do the problem when T is a vector.

Otherwise I think you can make the same argument that I did, just in a slightly different form. The only difference is that the $T_j$ are now a set of vectors.

-Dan