(a) Find a primitive root β of F3[x]/(x^2 + 1).
(b) Find the minimal polynomial p(x) of β in F3[x].
(c) Show that F3[x]/(x^2 + 1) is isomorphic to F3[x]/(p(x)).
Please help, I'm lost on this one.
I haven't really tried anything; I'm pretty lost when it comes to this topic, and the book is really confusing. I already handed in the assignment with this question blank, but I would still like to know how to solve it. Could you possibly show me how it's done, step by step?
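For part (a), one concrete way to get a feel for the problem is a brute-force search. This is a hedged sketch (my own helper names, not from any textbook): elements of F3[x]/(x^2 + 1) are written a + b*i with a, b in {0, 1, 2} and i^2 = -1, and a primitive root is a nonzero element whose multiplicative order is 8, the size of the group of units.

```python
# Sketch: brute-force search for a primitive root of F3[x]/(x^2 + 1).
# Elements a + b*i are stored as pairs (a, b); multiplication uses
# i^2 = -1, and all coefficient arithmetic is done mod 3.

def mul(p, q):
    a, b = p
    c, d = q
    # (a + b i)(c + d i) = (ac - bd) + (ad + bc) i,  since i^2 = -1
    return ((a * c - b * d) % 3, (a * d + b * c) % 3)

def order(p):
    """Multiplicative order of a nonzero element p."""
    k, x = 1, p
    while x != (1, 0):          # (1, 0) represents the identity 1
        x = mul(x, p)
        k += 1
    return k

nonzero = [(a, b) for a in range(3) for b in range(3) if (a, b) != (0, 0)]
primitive = [p for p in nonzero if order(p) == 8]
print(primitive)                # the elements of multiplicative order 8
```

Any element the search prints is a valid choice of β; its minimal polynomial for part (b) can then be found by hand from β^2.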
This concerns me. You seem to know nothing about what is, if you know the basic definitions, a fairly simple problem just involving "mod 2" arithmetic. Are you clear on what "Z2" is?
In Z2, the only "numbers" are 0 and 1: 0+0=0, 0+1=1+0=1, 1+1=0; 0*0=0*1=1*0=0, 1*1=1. For f(x) = x^5 + x^4 + x^3 + x^2 + x + 1: if x=0, then f(0)=0+0+0+0+0+1=1, but if x=1, then f(1)=1+1+1+1+1+1=0, just because there are an even number of terms. That's how we know x-1 is a factor and, by synthetic division, that f(x) is equal to (x - 1)(x^4 + x^2 + 1).

Now, if g(x) = x^4 + x^2 + 1, then g(0)=0+0+1=1 and g(1)=1+1+1=1, so g has no linear factors and hence no third-degree factors. The only possible way to factor g would be as two quadratic factors, and the only possible monic quadratic polynomials in Z2[x] are of the form x^2 + ax + b with a and b either 0 or 1. That is, there are only four possible quadratics: x^2, x^2 + 1, x^2 + x, and x^2 + x + 1. You could try dividing by those (being careful to use Z2 arithmetic) to see if it factors and, if so, what those factors are.
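The trial division suggested above can be sketched in code. This is just an illustrative sketch (the representation and helper name are my own): a polynomial over Z2 is a list of coefficients, lowest degree first, and subtraction mod 2 is XOR.

```python
# Sketch: trial division of g(x) = x^4 + x^2 + 1 over Z2 by each of
# the four monic quadratics x^2 + a x + b with a, b in {0, 1}.
# Polynomials are coefficient lists, lowest degree first.

def divmod2(num, den):
    """Divide num by den over Z2; return (quotient, remainder)."""
    num = num[:]                          # don't mutate the caller's list
    quot = [0] * (len(num) - len(den) + 1)
    for shift in range(len(num) - len(den), -1, -1):
        if num[shift + len(den) - 1]:     # leading coefficient present?
            quot[shift] = 1
            for i, c in enumerate(den):
                num[shift + i] ^= c       # subtraction mod 2 is XOR
    return quot, num[:len(den) - 1]

g = [1, 0, 1, 0, 1]                       # x^4 + x^2 + 1
for b, a in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    quadratic = [b, a, 1]                 # x^2 + a*x + b
    quot, rem = divmod2(g, quadratic)
    if not any(rem):
        print("divides:", quadratic, "quotient:", quot)
```

Running the loop does exactly the check described in the post: any quadratic that leaves a zero remainder is a factor of g, and the printed quotient is its cofactor.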