Hello, I hope you're having a good day. I need help solving these two problems.

The first: find real numbers a, b, c, d such that
sqrt(ab) + sqrt(cd) >= sqrt(a+b) + sqrt(c+d)

The second: solve
(a^3 + b)(a + b^3) = (a + b)^4
where a and b are integers.
I presume the solutions a = 0 (b arbitrary) and b = 0 (a arbitrary) are already known to you, so from here on assume that neither a nor b is 0. As scounged suggested, expand both sides. Cancelling $\displaystyle a^4 + b^4$ from both sides and dividing through by ab (which is nonzero) gives
$\displaystyle (b^2 - 4)a^2 - 6ab + (1 - 4b^2) = 0$
This is a quadratic in a. Now take a look at the discriminant:
$\displaystyle \displaystyle \Delta = 36b^2 - 4(b^2 - 4)(1 - 4b^2)$
$\displaystyle \Delta = 16b^4 - 32b^2 + 16 = 16(b^2 - 1)^2$

So $\displaystyle \Delta = \left( 4(b^2 - 1) \right)^2$ is a perfect square for every integer b. The quadratic formula then gives, for $\displaystyle b \neq \pm 2$ (so that the equation really is quadratic),

$\displaystyle a = \frac{6b \pm 4(b^2 - 1)}{2(b^2 - 4)} = \frac{3b \pm 2(b^2 - 1)}{b^2 - 4}$

Now look for the integers b that make this an integer. (Check b = 2 and b = -2 separately: there the equation becomes linear in a, and neither case gives an integer a.)
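Independent of the algebra, a quick brute-force scan (a rough sketch, not a proof; the search bound 10 is an arbitrary choice) shows which small nonzero integer pairs satisfy the original equation:

```python
# Brute-force scan: find all (a, b) with a, b nonzero in a small range that
# satisfy (a^3 + b)(a + b^3) = (a + b)^4. The bound 10 is arbitrary.
solutions = [
    (a, b)
    for a in range(-10, 11)
    for b in range(-10, 11)
    if a != 0 and b != 0 and (a**3 + b) * (a + b**3) == (a + b) ** 4
]
print(solutions)  # → [(-5, -3), (-3, -5), (-1, 1), (1, -1), (3, 5), (5, 3)]
```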
-Dan
Going back to the discriminant and expanding it fully:

$\displaystyle \Delta = 36b^2 - 4(b^2 - 4)(1 - 4b^2) = 16b^4 - 32b^2 + 16 = 16(b^2 - 1)^2$

This is a perfect square for every integer b, so for $\displaystyle b \neq \pm 2$ the quadratic formula gives two roots, and both simplify nicely:

$\displaystyle a = \frac{3b + 2(b^2 - 1)}{b^2 - 4} = \frac{(2b - 1)(b + 2)}{(b - 2)(b + 2)} = 2 + \frac{3}{b - 2}$

$\displaystyle a = \frac{3b - 2(b^2 - 1)}{b^2 - 4} = \frac{-(2b + 1)(b - 2)}{(b - 2)(b + 2)} = -2 + \frac{3}{b + 2}$

So a is an integer exactly when b - 2 divides 3 or b + 2 divides 3, which means b - 2 or b + 2 must be one of $\displaystyle \pm 1, \pm 3$.

Hint: only finitely many integers b satisfy this. Can you list them and check each one?
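One can also test directly, with exact integer arithmetic, which b admit an integer root of the quadratic $(b^2 - 4)a^2 - 6ab + (1 - 4b^2) = 0$ (a sketch; the search bounds 50 for b and 1000 for a are arbitrary):

```python
# For each b, find integer roots a of (b^2 - 4) a^2 - 6 a b + (1 - 4 b^2) = 0
# by direct substitution (exact integer arithmetic, no floating point).
def integer_roots(b, bound=1000):
    return [a for a in range(-bound, bound + 1)
            if (b * b - 4) * a * a - 6 * a * b + (1 - 4 * b * b) == 0]

good_b = {}
for b in range(-50, 51):
    roots = integer_roots(b)
    if roots:
        good_b[b] = roots
print(good_b)  # → {-5: [-3], -3: [-5], -1: [1], 1: [-1], 3: [5], 5: [3]}
```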
-Dan
I can't prove there are only finitely many values of b, or find the corresponding integer a for each one. Could you show the argument?
For $\displaystyle b \neq \pm 2$, the quadratic $\displaystyle (b^2 - 4)a^2 - 6ab + (1 - 4b^2) = 0$ has the two roots

$\displaystyle a = 2 + \frac{3}{b - 2}$ and $\displaystyle a = -2 + \frac{3}{b + 2}$

For the first root to be an integer, b - 2 must divide 3, so $\displaystyle b - 2 \in \{ \pm 1, \pm 3 \}$, giving b = 3, 1, 5, -1 with a = 5, -1, 3, 1 respectively.

For the second root, b + 2 must divide 3, so $\displaystyle b + 2 \in \{ \pm 1, \pm 3 \}$, giving b = -1, -3, 1, -5 with a = 1, -5, -1, -3 respectively.

The leftover cases b = 2 and b = -2 make the equation linear in a (-12a - 15 = 0 and 12a - 15 = 0 respectively), and neither has an integer solution.

Collecting the pairs and dropping duplicates, the integer solutions with neither a nor b zero are

$\displaystyle (a, b) \in \{ (1, -1),\ (-1, 1),\ (5, 3),\ (3, 5),\ (-5, -3),\ (-3, -5) \}$

Each one checks out; for example (a, b) = (5, 3) gives $\displaystyle (125 + 3)(5 + 27) = 128 \cdot 32 = 4096 = 8^4$.
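For $b \neq \pm 2$ the quadratic formula gives the roots $a = 2 + 3/(b - 2)$ and $a = -2 + 3/(b + 2)$, so the whole search reduces to enumerating the divisors of 3. That bookkeeping can be mechanized and verified against the original equation (a sketch mirroring the argument, not new mathematics):

```python
# Enumerate b with (b - 2) | 3 (root a = 2 + 3/(b - 2)) or (b + 2) | 3
# (root a = -2 + 3/(b + 2)), then verify each pair in the original equation.
pairs = set()
for d in (-3, -1, 1, 3):              # the divisors of 3
    pairs.add((2 + 3 // d, d + 2))    # case b - 2 = d
    pairs.add((-2 + 3 // d, d - 2))   # case b + 2 = d

verified = sorted(
    (a, b) for a, b in pairs
    if (a**3 + b) * (a + b**3) == (a + b) ** 4
)
print(verified)  # → [(-5, -3), (-3, -5), (-1, 1), (1, -1), (3, 5), (5, 3)]
```

Every candidate produced by the divisor conditions survives the verification step, confirming the list of six solutions.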
-Dan