Vector Analysis

Prove the orthogonality condition

$\sum_i a_{ji}a_{ki} = \delta_{jk}$

I know that the related condition

$\sum_i a_{ij}a_{ik} = \delta_{jk}$

is shown by writing:

$\sum_i \frac{\partial{x_j}}{\partial{x^{'}_i}}\frac{\partial{x_k}}{\partial{x^{'}_i}}$

$= \sum_i \frac{\partial{x_j}}{\partial{x^{'}_i}}\frac{\partial{x^{'}_i}}{\partial{x_k}}$

$= \frac{\partial{x_j}}{\partial{x_k}}$

The last step follows from the standard chain rule for partial differentiation, assuming that $x_j$ is a function of $x^{'}_1$, $x^{'}_2$, $x^{'}_3$, and so on. The final result, $\frac{\partial{x_j}}{\partial{x_k}}$, is equal to $\delta_{jk}$, since $x_j$ and $x_k$ as coordinate lines ($j \neq k$) are assumed to be perpendicular, or orthogonal. Equivalently, we may assume that $x_j$ and $x_k$ ($j \neq k$) are totally independent variables. If $j = k$, the partial derivative is clearly equal to 1.

But I have no idea how to prove it with the roles of $i$ and $j, k$ switched.

By the way:

$\delta_{jk} = 1$ for $j = k$

$\delta_{jk} = 0$ for $j \neq k$.
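To convince myself that the condition I'm asking about should at least be true, here is a quick numerical sanity check I wrote (a sketch only, using an arbitrary rotation about the $z$-axis as the sample orthogonal matrix; the angle is my own choice, not from the problem):

```python
import math

# Sample 3x3 orthogonal matrix: rotation about the z-axis by an
# arbitrary angle t. The entries a[i][j] play the role of the
# direction cosines a_{ij}.
t = 0.7  # arbitrary angle chosen for the check
a = [
    [math.cos(t), -math.sin(t), 0.0],
    [math.sin(t),  math.cos(t), 0.0],
    [0.0,          0.0,         1.0],
]

def delta(j, k):
    """Kronecker delta: 1 if j == k, else 0."""
    return 1.0 if j == k else 0.0

# The condition I know how to prove: sum_i a_{ij} a_{ik} = delta_{jk}
cols_ok = all(
    abs(sum(a[i][j] * a[i][k] for i in range(3)) - delta(j, k)) < 1e-12
    for j in range(3) for k in range(3)
)

# The condition I'm asking about: sum_i a_{ji} a_{ki} = delta_{jk}
rows_ok = all(
    abs(sum(a[j][i] * a[k][i] for i in range(3)) - delta(j, k)) < 1e-12
    for j in range(3) for k in range(3)
)

print(cols_ok, rows_ok)  # prints: True True
```

So both conditions hold numerically for this example; I just don't see how to get the second one analytically.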