If b and c are rational and nonzero,
prove that
[-2^(1/3)](b+c*2^(1/3)) is irrational
Edit: I realized a simpler way.
Then you have:
From the first and last equations, you have:
Simplifying, you have
Now, you can apply the rational roots theorem.
Oh, and I should also mention you should transform the rational equation into a polynomial equation before you can apply the rational roots theorem.
Let where and . Then
Let . Then . Multiplying out by , you have:
So, any rational root must be of the form p/q with gcd(p, q) = 1, where p divides the constant term and q divides the leading coefficient.
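For example, this is exactly how one shows that 2^(1/3) is irrational:

```latex
% The cube root of 2 satisfies
x^3 - 2 = 0.
% Leading coefficient 1 and constant term -2, so the only possible rational
% roots are p/q with p \mid 2 and q \mid 1, i.e. x \in \{\pm 1, \pm 2\}:
(\pm 1)^3 - 2 = \pm 1 - 2 \neq 0, \qquad (\pm 2)^3 - 2 = \pm 8 - 2 \neq 0.
% No candidate works, so the equation has no rational root, and therefore
% \sqrt[3]{2} is irrational.
```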
"Oh, and I should also mention you should transform the rational equation into a polynomial equation before you can apply the rational roots theorem. "
What do you mean? Isn't x just the unknown?
Doesn't the rational root theorem just tell me that the possible rational roots are ± the factors of the constant term?
If that's right, how does it help?
GCD means greatest common divisor. In other words, the fractions are irreducible.
The rational roots theorem requires that the equation be a polynomial equation, which means the coefficients must all be integers. Yes, x is the unknown, but it is also the value you are trying to show is irrational. So, if it is rational, the only possible rational values it can take are given by the rational roots theorem (once you transform the equation into a polynomial equation).
I am still not at all sure of what you are telling me. I am trying to self-teach and I don't have a very good student. Sorry.
Can't I just assume that x is rational - and then show the fallacy?
Find the solutions to
Transform the rational equation into a polynomial equation?
Doesn't the rational root theorem just tell me that the possible rational roots are ± the factors of the constant term?
I don't know how this helps
No, I know, I am not making sense. Maybe this stuff is just over my head.
Maybe I should try to make sense of your other solution. I haven't tried to work through that one yet.
I am trying to learn from your Latex as well, which is making the whole process longer and more difficult. (I don't belong to this century)
I did download a full copy, and I think I could learn it from YouTube clips, but my version doesn't work. I don't know why.
That's why I am mainly trying to learn from your posts, which are beautifully laid out.
Thanks for being patient with me.
That is the idea. I am not sure if it will work by finding the cubic polynomial. It was just the first thought I had.
The rational root theorem only applies to polynomials with integer coefficients. Currently, the coefficients are rational numbers, so you would need to multiply through by the denominators of each rational coefficient to get a polynomial with integer coefficients. Then you could apply the rational roots theorem: the idea is to plug in each possible rational root and find that none of them satisfies the equation. Alternatively, you could possibly use the solution to a cubic polynomial. Let . Plugging that in, you get:
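That procedure can be sketched as follows (my own illustration with hypothetical helper names, not code from the thread): clear the denominators using their least common multiple, list every p/q candidate the rational roots theorem allows, and test each one with exact Fraction arithmetic.

```python
from fractions import Fraction
from math import lcm

def rational_root_candidates(coeffs):
    """coeffs: polynomial coefficients as Fractions, highest degree first.
    Clears denominators to get integer coefficients, then lists every
    candidate p/q allowed by the rational roots theorem."""
    m = lcm(*(c.denominator for c in coeffs))
    ints = [int(c * m) for c in coeffs]    # now an integer-coefficient polynomial
    assert ints[0] != 0 and ints[-1] != 0  # assume nonzero leading/constant terms
    lead, const = abs(ints[0]), abs(ints[-1])
    ps = [d for d in range(1, const + 1) if const % d == 0]  # divisors of constant
    qs = [d for d in range(1, lead + 1) if lead % d == 0]    # divisors of leading
    return ints, sorted({Fraction(s * p, q) for p in ps for q in qs for s in (1, -1)})

def has_rational_root(coeffs):
    """True if some rational-root candidate actually satisfies the polynomial."""
    ints, cands = rational_root_candidates(coeffs)
    def value(x):                          # Horner evaluation, exact arithmetic
        total = Fraction(0)
        for c in ints:
            total = total * x + c
        return total
    return any(value(x) == 0 for x in cands)

# x^3 - 2 = 0: the only candidates are ±1 and ±2, and none is a root,
# so 2^(1/3) is irrational.
print(has_rational_root([Fraction(1), Fraction(0), Fraction(0), Fraction(-2)]))  # False
```

Here `has_rational_root` returning False for x^3 - 2 is exactly the "plug in each possible rational root" argument; for the cubic in this problem you would build the coefficient list from b and c and pass it in the same way.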
By adding a variable, we can choose the additional variable to ensure that certain parts of the equation "go away". So, separate the equations as follows: and .
From the second equation, . Hence .
Plugging that into the first equation:
Multiplying out by you have:
Now you can use the quadratic formula to get the solution for . Then you have . So, if is irrational, so too is . Also, you only need to worry about the positive solution to the quadratic; the other solutions will be picked up by multiplying by cube roots of unity.
Then
Again, I don't know if this will help you at all. But maybe you can show that is irrational.
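In general terms, the substitution described above is the standard one for a depressed cubic; writing p and q for the linear and constant coefficients (placeholder names of mine, since the specific cubic is not shown here), the steps are:

```latex
% Depressed cubic and the substitution x = u + v:
x^3 + px + q = 0, \qquad x = u + v
% Expanding and grouping:
(u+v)^3 + p(u+v) + q = u^3 + v^3 + (3uv + p)(u+v) + q = 0
% Choose the extra variable so the middle term "goes away":
3uv = -p \;\Rightarrow\; u^3 + v^3 = -q, \qquad v = -\frac{p}{3u}
% Substituting v into the first equation and multiplying by u^3
% gives a quadratic in u^3:
u^6 + qu^3 - \frac{p^3}{27} = 0
```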
I use Eclipse when writing LaTeX. I had to add the TeXlipse plugin. Maybe that will be easier.
Thanks Slip Eternal
I really appreciate all the effort you have put in. It'll take me a while to try and assimilate what you have said.
I'll try downloading Eclipse too.
Thanks again.