Please show me how to factor 2x^3 - 3x + 1
Thanks in advance!
It is easy to see that the expression equals 0 when x = 1, so by the factor theorem x - 1 is a factor. The second bracket must then begin with 2x^2 and end in -1.
So 2x^3 - 3x + 1 = (x - 1)(2x^2 + ? - 1), with the middle term still to be found. Multiplying out as it stands would give a term -2x^2, which we don't want. Putting +2x in the middle of the second bracket cancels it, so 2x^3 - 3x + 1 = (x - 1)(2x^2 + 2x - 1). Notice this also gives us the required -3x term.
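As a quick sanity check on the factorization above, here is a minimal Python sketch that compares the cubic against its factored form at several sample points (two cubics agreeing at four or more points are identical):

```python
import math

# The cubic from the question
def cubic(x):
    return 2 * x**3 - 3 * x + 1

# The proposed factorization (x - 1)(2x^2 + 2x - 1)
def factored(x):
    return (x - 1) * (2 * x**2 + 2 * x - 1)

# Check agreement at a handful of sample values
for x in [-2.0, -1.0, 0.0, 0.5, 2.0]:
    assert math.isclose(cubic(x), factored(x)), x
```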
Thank you, but I wanted to break it into three roots. The expression being factored is a derivative whose critical values I am trying to find. I got as far as you did but could not break it down further. Feel free to speak in calculus terms; what do I do next?
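The remaining step can be sketched as follows: x = 1 is one critical value (from the x - 1 factor), and the quadratic factor 2x^2 + 2x - 1 does not factor over the rationals, so the other two roots come from the quadratic formula. Here is a minimal illustration in Python:

```python
import math

# Solve 2x^2 + 2x - 1 = 0 with the quadratic formula to get the
# remaining critical values (x = 1 already comes from the x - 1 factor).
a, b, c = 2, 2, -1
disc = b * b - 4 * a * c               # 2^2 - 4*2*(-1) = 12 > 0: two real roots
r1 = (-b + math.sqrt(disc)) / (2 * a)  # simplifies to (-1 + sqrt(3)) / 2
r2 = (-b - math.sqrt(disc)) / (2 * a)  # simplifies to (-1 - sqrt(3)) / 2
print(sorted([1.0, r1, r2]))           # all three critical values
```

In exact form the roots are x = (-1 ± sqrt(3)) / 2, approximately 0.366 and -1.366; to classify each critical value you would then test the sign of the derivative on either side, or use the second derivative.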