## Vector Analysis

Prove the orthogonality condition

$\displaystyle \sum_i a_{ji}a_{ki} = \delta_{jk}$

I know that the related condition

$\displaystyle \sum_i a_{ij}a_{ik} = \delta_{jk}$

is shown by:

$\displaystyle \sum_i \frac{\partial{x_j}}{\partial{x^{'}_i}}\frac{\partial{x_k}}{\partial{x^{'}_i}}$

$\displaystyle = \sum_i \frac{\partial{x_j}}{\partial{x^{'}_i}}\frac{\partial{x^{'}_i}}{\partial{x_k}}$

$\displaystyle = \frac{\partial{x_j}}{\partial{x_k}}$

The last step follows from the chain rule of partial differentiation, assuming that $\displaystyle x_j$ is a function of $\displaystyle x^{'}_1$, $\displaystyle x^{'}_2$, $\displaystyle x^{'}_3$, and so on. The final result, $\displaystyle \frac{\partial{x_j}}{\partial{x_k}}$, is equal to $\displaystyle \delta_{jk}$, since $\displaystyle x_j$ and $\displaystyle x_k$ ($\displaystyle j \neq k$), as coordinate lines, are assumed to be perpendicular, i.e. orthogonal. Equivalently, we may assume that $\displaystyle x_j$ and $\displaystyle x_k$ ($\displaystyle j \neq k$) are totally independent variables. If $\displaystyle j = k$, the partial derivative is clearly equal to 1.
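As a numerical sanity check of this known condition (not a proof), here is a minimal sketch assuming the $\displaystyle a_{ij}$ are the direction cosines of a hypothetical rotation about the z-axis:

```python
import math

# Hypothetical direction cosines a_ij: a rotation about the z-axis
# by an arbitrary angle theta (an example, not part of the proof).
theta = 0.7
a = [
    [math.cos(theta),  math.sin(theta), 0.0],
    [-math.sin(theta), math.cos(theta), 0.0],
    [0.0,              0.0,             1.0],
]

def delta(j, k):
    """Kronecker delta: 1 if j == k, else 0."""
    return 1.0 if j == k else 0.0

# Check sum_i a_ij a_ik = delta_jk (sum runs over the FIRST index).
for j in range(3):
    for k in range(3):
        s = sum(a[i][j] * a[i][k] for i in range(3))
        assert abs(s - delta(j, k)) < 1e-12

print("sum_i a_ij a_ik = delta_jk holds for this rotation")
```

The inner sum is just the dot product of columns $\displaystyle j$ and $\displaystyle k$ of the matrix, which is why the check passes for any orthogonal matrix.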

But I have no idea how to prove it with the roles of $\displaystyle i$ and $\displaystyle j,k$ switched.
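The switched version can at least be verified numerically before attempting the proof; a minimal sketch, again assuming the $\displaystyle a_{ji}$ are direction cosines of a hypothetical rotation about the z-axis:

```python
import math

# Hypothetical direction cosines a_ij for a z-axis rotation.
theta = 1.1
a = [
    [math.cos(theta),  math.sin(theta), 0.0],
    [-math.sin(theta), math.cos(theta), 0.0],
    [0.0,              0.0,             1.0],
]

def delta(j, k):
    """Kronecker delta: 1 if j == k, else 0."""
    return 1.0 if j == k else 0.0

# Check sum_i a_ji a_ki = delta_jk (sum now runs over the SECOND index).
for j in range(3):
    for k in range(3):
        s = sum(a[j][i] * a[k][i] for i in range(3))
        assert abs(s - delta(j, k)) < 1e-12

print("sum_i a_ji a_ki = delta_jk holds for this rotation")
```

Here the sum is the dot product of rows $\displaystyle j$ and $\displaystyle k$, so the check succeeding reflects the fact that the rows of an orthogonal matrix are orthonormal whenever its columns are.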

By the way:

$\displaystyle \delta_{jk} = 1$ for $\displaystyle j = k$

$\displaystyle \delta_{jk} = 0$ for $\displaystyle j \neq k$.