# Math Help - Determinant, Change of basis

1. ## Determinant, Change of basis

Hi!

Let $V$ be a vector space, $\dim (V)=n$ and $\omega \in \mathrm{Alt}^n V,\ \omega \neq 0$ an alternating form. Let $M$ be the change of basis matrix from the basis $a=(a_1,...,a_n)$ for $V$ to the basis $b=(b_1,...,b_n)$ for $V$.

Prove that $\det M=\frac{\omega(a_1,...,a_n)}{\omega(b_1,...,b_n)}$

I would like to use the Leibniz formula
$\det M = \sum_{\sigma \in S_n} \mathrm{sign} (\sigma) \cdot M_{1 \sigma(1)}\cdot ... \cdot M_{n \sigma(n)}$
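(As a quick numerical sanity check of the Leibniz formula, independent of the proof: the snippet below is a sketch assuming NumPy is available, and compares a direct implementation of the sum over permutations against `numpy.linalg.det`.)

```python
import itertools
import numpy as np

def sign(perm):
    """Sign of a permutation given as a tuple of 0-based indices,
    computed by counting inversions."""
    s = 1
    perm = list(perm)
    for i in range(len(perm)):
        for j in range(i + 1, len(perm)):
            if perm[i] > perm[j]:
                s = -s
    return s

def leibniz_det(M):
    """det M = sum over sigma in S_n of sign(sigma) * M[1,sigma(1)] * ... * M[n,sigma(n)]."""
    n = M.shape[0]
    total = 0.0
    for sigma in itertools.permutations(range(n)):
        term = float(sign(sigma))
        for i in range(n):
            term *= M[i, sigma[i]]
        total += term
    return total

M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
print(leibniz_det(M), np.linalg.det(M))  # both ≈ 8.0
```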

Since $a=(a_1,...,a_n)$ and $b=(b_1,...,b_n)$ are bases for $V$, I can write

$a_i = \sum_{j=1}^{n}M_{ij}b_j$
and therefore the quotient becomes

$\frac{\omega(\sum_{j=1}^{n}M_{1j}b_j,..., \sum_{j=1}^{n}M_{nj}b_j)}{\omega(b_1,...,b_n)}$

$=\frac{\omega( M_{11}b_1+...+M_{1n}b_n ,..., M_{n1}b_1+...+M_{nn}b_n )}{\omega(b_1,...,b_n)}$

How should I continue? Can I write the last as $\frac{\sum_{j=1}^n M_{1j} \cdot ... \cdot M_{nj} \cdot \omega(b_1,...,b_n)}{\omega(b_1,...,b_n)}$ because $\omega$ is alternating?

Bye,
Lisa

2. Originally Posted by lisa
How should I continue? Can I write the last as $\frac{\sum_{j=1}^n M_{1j} \cdot ... \cdot M_{nj} \cdot \omega(b_1,...,b_n)}{\omega(b_1,...,b_n)}$ because $\omega$ is alternating?

The point here is that since $\omega$ is alternating, we have $\omega(x_1, \cdots , x_n)=0$ whenever $x_i=x_j$ for some $i \neq j.$ So when you use the multilinearity of $\omega$ to expand $\omega(\sum_{j=1}^{n}M_{1j}b_j,..., \sum_{j=1}^{n}M_{nj}b_j),$ every term in which some $b_j$ appears more than once vanishes. (In particular, the terms in your proposed sum, where every slot picks the same $b_j$, are exactly the ones that vanish.) Thus you are left with terms of the form $c_{\sigma}\, \omega(b_{\sigma(1)}, \cdots , b_{\sigma(n)}),$ where $\sigma \in S_n$ and $c_{\sigma}$ is a product of entries $M_{ij}.$ Again using the fact that $\omega$ is alternating, we have
$\omega(b_{\sigma(1)}, \cdots , b_{\sigma(n)})=\text{sgn}(\sigma) \cdot \omega(b_1, \cdots , b_n).$ The only thing you need to show now is that $c_{\sigma}=M_{1\sigma(1)} \cdots M_{n \sigma(n)}.$ (Left for you!)
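(To see the identity concretely, one can also check it numerically for a particular pair of bases of $\mathbb{R}^3$, taking $\omega$ to be the determinant of the coordinate matrix in the standard basis, which is a nonzero alternating 3-form. This is just an illustrative sketch assuming NumPy; the two bases below are chosen arbitrarily.)

```python
import numpy as np

def omega(*vectors):
    """omega(v1, ..., vn) = det[v1 | ... | vn], a nonzero alternating n-form on R^n."""
    return np.linalg.det(np.column_stack(vectors))

# Two arbitrary bases of R^3.
b = [np.array([1.0, 0.0, 0.0]),
     np.array([1.0, 1.0, 0.0]),
     np.array([1.0, 1.0, 1.0])]
a = [np.array([2.0, 1.0, 0.0]),
     np.array([0.0, 1.0, 1.0]),
     np.array([1.0, 0.0, 3.0])]

# M is defined by a_i = sum_j M[i, j] * b_j; stacking the vectors as rows,
# this reads A = M B, so M = A B^{-1}.
A = np.vstack(a)
B = np.vstack(b)
M = A @ np.linalg.inv(B)

print(np.linalg.det(M))        # ≈ 7.0
print(omega(*a) / omega(*b))   # ≈ 7.0, the same value
```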