# Show element is invertible in a Clifford algebra

• Nov 22nd 2011, 04:16 PM
redsoxfan325
Show element is invertible in a Clifford algebra
Here is the problem:

Let $F$ be a field and $V$ a quadratic space with an anisotropic form $q$ not representing $1$ (i.e. $q(v)\neq1$ for all $v\in V$). Show that if $a\in F$ and $v\in V$ are not both zero, then $a+v$ is an invertible element in the Clifford algebra $C(V,q)$.

I'm not really sure how to start this problem. I don't fully understand Clifford algebras, so I don't really know what invertible elements look like. I understand the definition, $C(V,q)=T(V)/I$ where $I$ is the ideal of the tensor algebra $T(V)$ generated by the elements $v\otimes v-q(v)$, but I don't understand the resulting algebra (except when $q$ is a binary or nonzero unary form).

Any help is appreciated.
• Nov 22nd 2011, 06:47 PM
NonCommAlg
Re: Show element is invertible in a Clifford algebra
Quote:

Originally Posted by redsoxfan325
Here is the problem:

Let $F$ be a field and $V$ a quadratic space with an anisotropic form $q$ not representing $1$. Show that if $a\in F$ and $v\in V$ are not both zero, then $a+v$ is an invertible element in the Clifford algebra $C(V,q)$.

I'm not really sure how to start this problem. I don't fully understand Clifford algebras, so I don't really know what invertible elements look like. I understand the definition, $C(V,q)=T(V)/I$ where $I$ is the ideal of the tensor algebra $T(V)$ generated by the elements $v\otimes v-q(v)$, but I don't understand the resulting algebra (except when $q$ is a binary or nonzero unary form).

Any help is appreciated.

What do you mean by "not representing $1$"? Do you mean $q(v) \neq 1$ for all $v \in V$?
• Nov 22nd 2011, 06:48 PM
redsoxfan325
Re: Show element is invertible in a Clifford algebra
Quote:

Originally Posted by NonCommAlg
What do you mean by "not representing $1$"? Do you mean $q(v) \neq 1$ for all $v \in V$?

Yes.
• Nov 22nd 2011, 07:03 PM
NonCommAlg
Re: Show element is invertible in a Clifford algebra
Quote:

Originally Posted by redsoxfan325
Yes.

OK then. First note that $a^2 - q(v) \neq 0$: if $a \neq 0$ and $q(v) = a^2$, then $q(a^{-1}v) = a^{-2}q(v) = 1$, contradicting the hypothesis; and if $a = 0$, then $v \neq 0$, so $q(v) \neq 0$ because $q$ is anisotropic. Now let $b = a(a^2 - q(v))^{-1}$ and $c = -(a^2 - q(v))^{-1}$. Using $v^2 = q(v)$ in $C(V,q)$, check that $(a+v)(b+cv) = ab + cq(v) + (ac+b)v = 1.$
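If it helps to see the final identity checked symbolically: the subalgebra of $C(V,q)$ generated by $1$ and a fixed $v$ is spanned by $\{1, v\}$ with $v^2 = q(v)$, so elements can be modeled as pairs $(x_0, x_1)$ standing for $x_0 + x_1 v$. This is just a sanity check of the computation above (using SymPy, with a symbol `q` standing for $q(v)$ and assuming $a^2 - q(v) \neq 0$), not part of the proof:

```python
from sympy import symbols, simplify

def mul(x, y, q):
    # (x0 + x1*v)(y0 + y1*v) = x0*y0 + x1*y1*q  +  (x0*y1 + x1*y0)*v,
    # using v^2 = q(v) = q.
    return (x[0]*y[0] + x[1]*y[1]*q, x[0]*y[1] + x[1]*y[0])

a, q = symbols('a q')        # q stands for q(v); assume a**2 - q != 0
b = a / (a**2 - q)           # scalar part of the proposed inverse
c = -1 / (a**2 - q)          # coefficient of v in the proposed inverse

prod = mul((a, 1), (b, c), q)            # (a + v)(b + c*v)
print(simplify(prod[0]), simplify(prod[1]))  # expect: 1 0
```

The scalar part simplifies to $1$ and the $v$-coefficient to $0$, confirming $(a+v)(b+cv)=1$ for generic $a$ and $q(v)$.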