# Thread: Proving when Gram's determinant is equal to zero

1. ## Proving when Gram's determinant is equal to zero

Task: Prove that Gram's determinant $G(x_1,\dots, x_n)=0$ if and only if $x_1, \dots, x_n$ are linearly dependent.

So I know that

$G(x_1,\dots, x_n)=\begin{vmatrix} \xi( x_1,x_1) & \xi( x_1,x_2) &\dots & \xi( x_1,x_n)\\
\xi( x_2,x_1) & \xi( x_2,x_2) &\dots & \xi( x_2,x_n)\\
\vdots&\vdots&\ddots&\vdots\\
\xi( x_n,x_1) & \xi( x_n,x_2) &\dots & \xi( x_n,x_n)\end{vmatrix}$

(where $\xi$ denotes an inner product), so $G(x_1,\dots, x_n)=0$ just means that this determinant vanishes. Why does that happen exactly when $x_1,\dots,x_n$ are linearly dependent, though?

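For intuition before the proof, the equivalence can be checked numerically. Here is a small sketch (the specific vectors and the use of the standard dot product as $\xi$ are my own illustrative choices, not from the thread):

```python
def inner(u, v):
    """Standard real dot product, playing the role of xi."""
    return sum(a * b for a, b in zip(u, v))

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def gram(vs):
    """Gram matrix: entry (j, k) is inner(x_j, x_k)."""
    return [[inner(u, v) for v in vs] for u in vs]

x1 = [1.0, 0.0, 2.0]
x2 = [0.0, 1.0, 1.0]
x3 = [2.0, -1.0, 3.0]   # x3 = 2*x1 - x2, so the three are dependent
y3 = [0.0, 0.0, 1.0]    # replacing x3 by y3 makes them independent

print(det3(gram([x1, x2, x3])))  # 0.0  (dependent set)
print(det3(gram([x1, x2, y3])))  # 1.0  (independent set)
```

The dependent triple yields a zero Gram determinant; the independent one does not.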

2. ## Re: Proving when Gram's determinant is equal to zero

Hi, I would try it this way, just briefly:

If $x_1,\dots ,x_n$ are linearly dependent, then there exist scalars $\alpha_1,\dots ,\alpha_n$ such that
$\alpha_1 x_1+\alpha_2 x_2 +\dots + \alpha_n x_n =0$ and $\sum_{k=1}^n |\alpha_k|^2>0$.

Let $G$ denote the matrix whose determinant is $G(x_1,\dots ,x_n)$, let $(G)_j$ be its $j$-th row, and set $\alpha=(\bar{\alpha}_1,\dots ,\bar{\alpha}_n)^{T}$.

Then $G\alpha=0$, since for each $j\in\{1,\dots ,n\}$ we have
$(G)_j\alpha=\sum_{k=1}^n \bar{\alpha}_k\,\xi (x_j,x_k)=\xi \Big(x_j,\sum_{k=1}^n \alpha_k x_k\Big)=\xi (x_j,0)=0$
(using that $\xi$ is conjugate-linear in its second argument),

so the nonzero vector $\alpha$ belongs to $\operatorname{Ker}(G)$, and thus $G$ is singular.
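This direction can be verified concretely: with a dependence relation $2x_1-x_2-x_3=0$ (my own example, real scalars so the conjugates change nothing), each row of the Gram matrix dotted with the coefficient vector gives $\xi(x_j,0)=0$:

```python
def inner(u, v):
    """Standard real dot product, playing the role of xi."""
    return sum(a * b for a, b in zip(u, v))

x1 = [1.0, 0.0, 2.0]
x2 = [0.0, 1.0, 1.0]
x3 = [2.0, -1.0, 3.0]            # x3 = 2*x1 - x2
vs = [x1, x2, x3]
alpha = [2.0, -1.0, -1.0]        # coefficients of 2*x1 - x2 - x3 = 0

G = [[inner(u, v) for v in vs] for u in vs]
# Each (G)_j . alpha equals xi(x_j, sum_k alpha_k x_k) = xi(x_j, 0) = 0
print([inner(row, alpha) for row in G])  # [0.0, 0.0, 0.0]
```

So $\alpha$ is a nonzero kernel vector of $G$, confirming singularity.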

Conversely,
if $G$ is singular, there exists a nonzero vector $\alpha=(\alpha_1,\dots ,\alpha_n)^{T}$ with $G\alpha=0$, which as above means that for all $j\in\{1,\dots ,n\}$
$\xi (x_j,\beta)=0$, where $\beta=\sum_{k=1}^n\bar{\alpha}_k x_k$. Since $\beta$ is itself a linear combination of the $x_j$, this gives $\xi (\beta,\beta)=\sum_{j=1}^n \bar{\alpha}_j\,\xi (x_j,\beta)=0$, so $\beta=0$,
and since not all of the $\bar{\alpha}_k$ vanish, $x_1,\dots ,x_n$ are linearly dependent.
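The converse step can also be sketched numerically: given a kernel vector $\alpha$ of the Gram matrix (same illustrative real vectors as above, my own choice), the combination $\beta=\sum_k\bar{\alpha}_k x_k$ comes out as the zero vector with $\xi(\beta,\beta)=0$, which is exactly the dependence relation:

```python
def inner(u, v):
    """Standard real dot product, playing the role of xi."""
    return sum(a * b for a, b in zip(u, v))

x1 = [1.0, 0.0, 2.0]
x2 = [0.0, 1.0, 1.0]
x3 = [2.0, -1.0, 3.0]
alpha = [2.0, -1.0, -1.0]        # a kernel vector of the Gram matrix

# beta = sum_k alpha_k * x_k, computed componentwise
beta = [sum(a * c for a, c in zip(alpha, comp)) for comp in zip(x1, x2, x3)]
print(beta, inner(beta, beta))   # [0.0, 0.0, 0.0] 0.0
```

Here $\beta=0$ with nonzero coefficients, i.e., $x_1,x_2,x_3$ are linearly dependent.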