# Thread: Proving when Gram's determinant is equal to zero

1. ## Proving when Gram's determinant is equal to zero

Task: Prove that Gram's determinant $\displaystyle G(x_1,\dots, x_n)=0$ if and only if $\displaystyle x_1, \dots, x_n$ are linearly dependent.

So I know that

$\displaystyle G(x_1,\dots, x_n)=\det \begin{pmatrix} \xi( x_1,x_1) & \xi( x_1,x_2) &\dots & \xi( x_1,x_n)\\ \xi( x_2,x_1) & \xi( x_2,x_2) &\dots & \xi( x_2,x_n)\\ \vdots&\vdots&\ddots&\vdots\\ \xi( x_n,x_1) & \xi( x_n,x_2) &\dots & \xi( x_n,x_n)\end{pmatrix}$

(where $\displaystyle \xi$ denotes an inner product), so $\displaystyle G(x_1,\dots, x_n)=0$ just says that this determinant vanishes. Why does that happen exactly when $\displaystyle x_1,\dots, x_n$ are linearly dependent, though?

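Not part of the question, but here is a quick numerical sanity check of the claim (my own illustration: the standard dot product on $\displaystyle \mathbb{R}^3$ stands in for $\displaystyle \xi$, and the example vectors are mine):

```python
# Sanity check: the Gram determinant vanishes for dependent vectors
# and is nonzero for independent ones (standard dot product as xi).

def inner(u, v):
    """Standard dot product, playing the role of xi(u, v)."""
    return sum(a * b for a, b in zip(u, v))

def gram_matrix(vectors):
    """Matrix with entries xi(x_j, x_k)."""
    return [[inner(x, y) for y in vectors] for x in vectors]

def det(m):
    """Determinant by cofactor expansion along the first row."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

independent = [(1, 0, 0), (0, 1, 0), (0, 0, 2)]
dependent = [(1, 2, 0), (2, 4, 0), (0, 0, 1)]  # second vector = 2 * first

print(det(gram_matrix(independent)))  # 4 (nonzero)
print(det(gram_matrix(dependent)))    # 0
```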

2. ## Re: Proving when Gram's determinant is equal to zero

Hi, I would try it this way, briefly:

If $\displaystyle x_1,\dots ,x_n$ are linearly dependent, then there exist scalars $\displaystyle \alpha_1,\dots ,\alpha_n$ such that
$\displaystyle \alpha_1 x_1+\alpha_2 x_2 +\dots + \alpha_n x_n =0$ and $\displaystyle \sum_{k=1}^n |\alpha_k|^2>0$.

Let $\displaystyle G$ denote the matrix whose determinant is $\displaystyle G(x_1,\dots ,x_n)$, let $\displaystyle (G)_j$ be its $j$-th row, and put $\displaystyle \alpha=(\bar{\alpha_1},\dots ,\bar{\alpha_n})^{T}$.

Then $\displaystyle G\alpha=0$, since for each $\displaystyle j\in\{1,\dots ,n\}$ we have
$\displaystyle (G)_j\alpha=\sum_{k=1}^n \bar{\alpha_k}\,\xi (x_j,x_k)=\xi \Big(x_j,\sum_{k=1}^n \alpha_k x_k\Big)=\xi (x_j,0)=0,$

so the nonzero vector $\displaystyle \alpha$ belongs to $\displaystyle \ker(G)$, and thus $\displaystyle G$ is a singular matrix.
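A concrete check of this step (my own example, real scalars so the conjugates are trivial): take dependent vectors with $\displaystyle 2x_1 - x_2 = 0$, so $\displaystyle \alpha = (2, -1)^T$, and verify that the Gram matrix annihilates $\displaystyle \alpha$:

```python
# Forward direction: a dependence among the x_k gives a kernel vector of G.

def inner(u, v):
    """Standard dot product, playing the role of xi(u, v)."""
    return sum(a * b for a, b in zip(u, v))

x1, x2 = (1, 3), (2, 6)   # x2 = 2 * x1, hence dependent
alpha = (2, -1)           # 2*x1 + (-1)*x2 = 0

G = [[inner(x, y) for y in (x1, x2)] for x in (x1, x2)]

# Each row (G)_j dotted with alpha is xi(x_j, 2*x1 - x2) = xi(x_j, 0) = 0.
G_alpha = [sum(g * a for g, a in zip(row, alpha)) for row in G]
print(G_alpha)  # [0, 0] -> alpha lies in ker(G), so det(G) = 0
```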

Conversely,
if $\displaystyle G$ is a singular matrix, then there exists a nonzero vector $\displaystyle \alpha=(\alpha_1,\dots ,\alpha_n)^T$ with $\displaystyle G\alpha=0$, i.e. for all $\displaystyle j\in\{1,\dots ,n\}$ we have
$\displaystyle \xi (x_j,\beta)=0$, where $\displaystyle \beta=\sum_{k=1}^n\bar{\alpha_k} x_k$. But then $\displaystyle \xi (\beta,\beta)=\sum_{j=1}^n \bar{\alpha_j}\,\xi (x_j,\beta)=0$, so $\displaystyle \beta=0$;
and since not all of the $\displaystyle \bar{\alpha_k}$ vanish, $\displaystyle x_1,\dots ,x_n$ are linearly dependent.
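The converse can be checked numerically as well (again my own example, real scalars so $\displaystyle \bar{\alpha_k}=\alpha_k$): start from a singular Gram matrix, pick a nonzero kernel vector $\displaystyle \alpha$, and confirm that $\displaystyle \beta=\sum_k \bar{\alpha_k} x_k$ really is the zero vector:

```python
# Converse direction: a kernel vector of a singular Gram matrix
# yields an explicit linear dependence among the x_k.

def inner(u, v):
    """Standard dot product, playing the role of xi(u, v)."""
    return sum(a * b for a, b in zip(u, v))

x1, x2 = (1, 1), (2, 2)
G = [[inner(x, y) for y in (x1, x2)] for x in (x1, x2)]  # [[2,4],[4,8]], singular

alpha = (2, -1)  # in ker(G): 2*2 - 1*4 = 0 and 2*4 - 1*8 = 0
assert all(sum(g * a for g, a in zip(row, alpha)) == 0 for row in G)

# beta = sum_k alpha_k * x_k must vanish, proving the dependence.
beta = tuple(alpha[0] * a + alpha[1] * b for a, b in zip(x1, x2))
print(inner(beta, beta))  # 0, so beta = 0 and the x_k are dependent
```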