Formal Power Series

• Feb 11th 2010, 05:27 PM
mathman88
Formal Power Series
Let $F$ be a field. Consider the ring $R = F[[t]]$ of formal power series in $t$; i.e., if $a \in F[[t]]$, then $a=\sum_{n=0}^{\infty} a_nt^n$ where $a_n\in F$.

Multiplication is defined by $\left(\sum_{n=0}^{\infty} a_nt^n\right)\cdot\left(\sum_{n=0}^{\infty} b_nt^n\right) = \sum_{n=0}^{\infty}\left(\sum_{k=0}^{n} a_k b_{n-k}\right)t^n$.

(a) Prove that $\alpha \in R$ is a unit if and only if the constant term $a_0 \neq 0$. (e.g., $1-t$ is the inverse of $1+t+t^2+t^3+t^4+\cdots$)
(b) Prove that $R$ is a Euclidean domain with respect to the norm $N(\alpha) = n$, where $a_n$ is the first non-zero coefficient of $\alpha$.
(c) In the polynomial ring $R[x]$, prove that $x^n-t$ is irreducible.
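(A quick sanity check of the multiplication rule, not part of the problem: representing a truncated series $\sum a_nt^n$ by its coefficient list $[a_0, a_1, \dots]$, the Cauchy product is a few lines of Python. The function name is my own.)

```python
def cauchy_product(a, b):
    """Coefficients of the product of two truncated series, to len(a) terms.
    Implements c_n = sum_{k=0}^{n} a_k * b_{n-k} (assumes len(b) >= len(a))."""
    return [sum(a[k] * b[n - k] for k in range(n + 1)) for n in range(len(a))]

# (1 + t) * (1 - t) = 1 - t^2, truncated to four terms:
print(cauchy_product([1, 1, 0, 0], [1, -1, 0, 0]))  # [1, 0, -1, 0]
```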
• Feb 11th 2010, 07:14 PM
tonio
Quote:

Originally Posted by mathman88
Let $F$ be a field. Consider the ring $R = F[[t]]$ of formal power series in $t$ i.e. if $a \in F[[t]]$, then $a=\sum_{n=0}^{\infty} a_nt^n$ where $a_n\in F$.

Multiplication is defined by $\left(\sum_{n=0}^{\infty} a_nt^n\right)\cdot\left(\sum_{n=0}^{\infty} b_nt^n\right) = \sum_{n=0}^{\infty}\left(\sum_{k=0}^{n} a_k b_{n-k}\right)t^n$.

(a) Prove that $\alpha \in R$ is a unit if and only if the constant term $a_0 \neq 0$. (e.g., $1-t$ is the inverse of $1+t+t^2+t^3+t^4+\cdots$)
(b) Prove that $R$ is a Euclidean domain with respect to the norm $N(\alpha) = n$, where $a_n$ is the first non-zero coefficient of $\alpha$.
(c) In the polynomial ring $R[x]$, prove that $x^n-t$ is irreducible.

Good. What have you tried so far, and where are you stuck? Anyone seeing this stuff must be at an intermediate undergraduate level in algebra, at least, so you must have some ideas. Let's see.

Tonio
• Feb 11th 2010, 07:32 PM
NonCommAlg
Quote:

Originally Posted by mathman88
Let $F$ be a field. Consider the ring $R = F[[t]]$ of formal power series in $t$ i.e. if $a \in F[[t]]$, then $a=\sum_{n=0}^{\infty} a_nt^n$ where $a_n\in F$.

Multiplication is defined by $\left(\sum_{n=0}^{\infty} a_nt^n\right)\cdot\left(\sum_{n=0}^{\infty} b_nt^n\right) = \sum_{n=0}^{\infty}\left(\sum_{k=0}^{n} a_k b_{n-k}\right)t^n$.

(a) Prove that $\alpha \in R$ is a unit if and only if the constant term $a_0 \neq 0$. (e.g., $1-t$ is the inverse of $1+t+t^2+t^3+t^4+\cdots$)
(b) Prove that $R$ is a Euclidean domain with respect to the norm $N(\alpha) = n$, where $a_n$ is the first non-zero coefficient of $\alpha$.
(c) In the polynomial ring $R[x]$, prove that $x^n-t$ is irreducible.

(a): let $\alpha=\sum_{n=0}^{\infty}a_nt^n.$ it's obvious that if $a_0=0,$ then there exists no $\beta \in F[[t]]$ such that $\alpha \beta = 1,$ because the constant term of $\alpha \beta$ is $a_0b_0=0 \neq 1.$

if $a_0 \neq 0,$ then $a_0$ is invertible in $F.$ define $b_0=a_0^{-1}$ and inductively $b_n=-a_0^{-1}\sum_{j=1}^n a_jb_{n-j}, \ n \geq 1.$ let $\beta=\sum_{n=0}^{\infty}b_nt^n$ and see that $\alpha \beta = 1.$
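(a sanity check of this recursion in Python, if anyone wants to see it run; `series_inverse` is my name for it, and exact arithmetic is done with `fractions`:)

```python
from fractions import Fraction

def series_inverse(a, n_terms):
    """First n_terms coefficients of the inverse of sum a_n t^n.
    Implements b_0 = a_0^{-1}, b_n = -a_0^{-1} * sum_{j=1}^{n} a_j b_{n-j}."""
    assert a[0] != 0, "not a unit: constant term is zero"
    b = [1 / Fraction(a[0])]  # b[0] = a_0^{-1}
    for n in range(1, n_terms):
        b.append(-b[0] * sum(Fraction(a[j]) * b[n - j]
                             for j in range(1, n + 1) if j < len(a)))
    return b

# the example from the problem: the inverse of 1 - t is 1 + t + t^2 + ...
print(series_inverse([1, -1], 5))  # five coefficients, all equal to 1
```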

(b) straightforward.

(c) this part is nice! suppose $x^n - t = r(x) s(x),$ for some $r(x)=\sum_{i=0}^k r_ix^i, \ s(x)=\sum_{i=0}^m s_i x^i, \ k, m > 0, \ r_i, s_i \in F[[t]].$ then $r_0s_0=-t,$ and since $N(r_0)+N(s_0)=N(t)=1,$ exactly one of $r_0, s_0$ is a unit. hence, without loss of generality, we may assume that $r_0=tu, \ s_0=-u^{-1},$ for some invertible element $u \in F[[t]].$ to complete the proof of (c), use induction to show that, in $F[[t]],$ we have $t \mid r_i$ for all $i,$ which is the contradiction we need.
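(for anyone filling in that induction, here is one way to write the step; this is my own write-up, using only what's already in the thread:)

```latex
% Base case: r_0 s_0 = -t with s_0 = -u^{-1} a unit, so t | r_0.
% Step: fix 0 < i <= k and assume t | r_j for all j < i. Since k, m > 0
% force k < n, the coefficient of x^i in x^n - t is 0, so comparing
% coefficients of x^i on both sides of x^n - t = r(x)s(x):
\[
0 \;=\; \sum_{j=0}^{i} r_j s_{i-j} \;=\; r_i s_0 + \sum_{j=0}^{i-1} r_j s_{i-j}.
\]
% Every term of the right-hand sum is divisible by t by hypothesis,
% hence t | r_i s_0; since s_0 is a unit, t | r_i.
```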
• Feb 11th 2010, 08:06 PM
mathman88
Quote:

Originally Posted by NonCommAlg
(c) this part is nice! suppose $x^n - t = r(x) s(x),$ for some $r(x)=\sum_{i=0}^k r_ix^i, \ s(x)=\sum_{i=0}^m s_i x^i, \ k, m > 0, \ r_i, s_i \in F[[t]].$ then $r_0s_0=-t,$ and hence, without loss of generality, we may assume that $r_0=tu, \ s_0=-u^{-1},$ for some invertible element $u \in F[[t]].$ to complete the proof of (c), use induction to show that, in $F[[t]],$ we have $t \mid r_i$ for all $i,$ which is the contradiction we need.

Very nice!

So the contradiction arises when we write $r(x) = t\cdot p(x)$ for some $p(x) \in R[x]$ (possible since $t \mid r_i$ for all $i$): then $x^n-t = t\cdot p(x)\cdot s(x)$, so $t$ divides every coefficient of $x^n-t$, in particular the leading coefficient $1$, which is impossible since $t$ is not a unit (by part (a)). $\longrightarrow\longleftarrow$
• Feb 11th 2010, 08:09 PM
NonCommAlg
Quote:

Originally Posted by mathman88
Very nice!

So the contradiction arises when we write $r(x) = t\cdot p(x)$ for some $p(x) \in R[x]$ (possible since $t \mid r_i$ for all $i$): then $x^n-t = t\cdot p(x)\cdot s(x)$, so $t$ divides every coefficient of $x^n-t$, in particular the leading coefficient $1$, which is impossible since $t$ is not a unit (by part (a)). $\longrightarrow\longleftarrow$

correct!
• Feb 11th 2010, 08:33 PM
ChrisBickle
Tonio,

Medium undergraduate?!? I'm in the last of my undergraduate algebra classes, and we just started rings with 3 weeks left, so we must not be going that far, because this makes no sense to me... What school did you go to?
• Feb 12th 2010, 03:44 AM
tonio
Quote:

Originally Posted by ChrisBickle
Tonio,

Medium undergraduate?!? I'm in the last of my undergraduate algebra classes, and we just started rings with 3 weeks left, so we must not be going that far, because this makes no sense to me... What school did you go to?

Hebrew University in Jerusalem. Indeed, we studied formal power series in some algebra course in the second half of undergraduate studies, and got deeper into it in graduate courses. I wrote "at least" since imo there could be some school somewhere where this stuff is studied in the 2nd-3rd year.
Now, basic ring theory we studied as a final chapter of linear algebra 1, and then again, more thoroughly, as part of Algebraic Structures in 2nd year, as preparation for fields and Galois theory, so it could be they're going to teach you guys all this stuff in your last year... but it looks like a little too little to me.
My school, though, is considered rather strong in algebra in general, with some emphasis on group theory, and perhaps this has something to do with it.

Tonio
• Feb 14th 2010, 09:46 AM
Jim63
Quote:

Originally Posted by NonCommAlg
(b) straightforward.

What do you mean by that?
• Feb 14th 2010, 09:45 PM
NonCommAlg
Quote:

Originally Posted by Jim63
What do you mean by that?

well, it means "it's straightforward but i'm too lazy to write it down!" haha ... anyway, you need to check two conditions.

1) $N(\alpha \beta) \geq N(\alpha),$ for all $0 \neq \alpha, \beta \in F[[t]]$: well, from the definition of $N$ it's clear that $N(\alpha \beta)=N(\alpha) + N(\beta),$ because the lowest-order term of $\alpha\beta$ is $a_{N(\alpha)}b_{N(\beta)}t^{N(\alpha)+N(\beta)} \neq 0$ ($F$ is a field, so it has no zero divisors). so this part is done.

2) suppose that $\alpha=\sum_{i=m}^{\infty}a_it^i, \ \beta=\sum_{i=k}^{\infty}b_it^i \in F[[t]],$ where $k,m \geq 0, \ a_mb_k \neq 0.$ we want to prove that there exist $\gamma, \delta \in F[[t]]$ such that $\alpha=\gamma \beta + \delta$ and either $\delta=0$ or $N(\delta) < N(\beta)$:

if $m=N(\alpha) < N(\beta)=k,$ then choose $\gamma=0, \ \delta=\alpha.$ if $m \geq k,$ then put $\alpha=t^m \alpha_0, \ \beta=t^k\beta_0,$ where $\alpha_0=a_m + a_{m+1}t + \cdots , \ \beta_0=b_k + b_{k+1}t + \cdots.$ now $\beta_0$ is invertible since $b_k \neq 0.$

so there exists $u \in F[[t]]$ such that $\beta_0u=1.$ thus $\alpha=t^{m-k}\alpha_0 u \beta.$ so in this case we can choose $\delta=0, \ \gamma=t^{m-k} \alpha_0 u. \ \Box$
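(the two cases of this division can also be checked in code; a sketch with my own function names, truncating everything to a fixed number of terms and using exact rationals:)

```python
from fractions import Fraction

def norm(a):
    """N(alpha): index of the first non-zero coefficient, or None for the zero series."""
    return next((i for i, c in enumerate(a) if c != 0), None)

def inverse(a, n_terms):
    """First n_terms coefficients of 1/alpha, via b_0 = a_0^{-1} and
    b_n = -a_0^{-1} * sum_{j=1}^{n} a_j b_{n-j} (requires a[0] != 0)."""
    b = [1 / Fraction(a[0])]
    for n in range(1, n_terms):
        b.append(-b[0] * sum(Fraction(a[j]) * b[n - j]
                             for j in range(1, n + 1) if j < len(a)))
    return b

def euclidean_divmod(a, b, n_terms):
    """Return (gamma, delta) with alpha = gamma*beta + delta and either
    delta = 0 or N(delta) < N(beta); beta must be non-zero."""
    m, k = norm(a), norm(b)
    assert k is not None, "division by the zero series"
    if m is None or m < k:
        return [Fraction(0)] * n_terms, a             # gamma = 0, delta = alpha
    alpha0, u = a[m:], inverse(b[k:], n_terms)        # alpha = t^m alpha0, beta0 * u = 1
    # gamma = t^(m-k) * alpha0 * u (Cauchy product of alpha0 and u), delta = 0
    prod = [sum(Fraction(alpha0[j]) * u[n - j] for j in range(n + 1) if j < len(alpha0))
            for n in range(n_terms - (m - k))]
    return [Fraction(0)] * (m - k) + prod, [Fraction(0)] * n_terms

# dividing t^2 by t - t^2: gamma = t + t^2 + t^3 + ..., delta = 0
g, d = euclidean_divmod([0, 0, 1], [0, 1, -1], 4)
print(g, d)  # g == [0, 1, 1, 1], d == [0, 0, 0, 0]
```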