If ac + bd = (b+d+a-c)(b+d-a+c),
we can expand the right-hand side, because (b+d+a-c)(b+d-a+c) is a difference of squares:
(b+d)^2-(a-c)^2
expands to b^2+d^2-a^2-c^2+2bd+2ac
there is a factor of 2 here:
b^2+d^2-a^2-c^2+2(bd+ac)
because there is a factor of two, the number is even and therefore cannot be prime
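As a quick sanity check, the expansion above can be verified numerically (a minimal sketch; the helper names lhs and rhs are mine, not from the thread):

```python
# Check that (b+d)^2 - (a-c)^2 equals b^2 + d^2 - a^2 - c^2 + 2bd + 2ac
# for a range of integer values.
import itertools

def lhs(a, b, c, d):
    return (b + d) ** 2 - (a - c) ** 2

def rhs(a, b, c, d):
    return b**2 + d**2 - a**2 - c**2 + 2*b*d + 2*a*c

for a, b, c, d in itertools.product(range(-5, 6), repeat=4):
    assert lhs(a, b, c, d) == rhs(a, b, c, d)
print("expansion verified")
```

Note that this only confirms the algebra of the expansion; it says nothing about the parity claim that follows.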
The question as given:
a, b, c, d integers with a > b > c > d > 0, and:
ac + bd = (b + d + a − c)(b + d − a + c).
Prove that ab + cd is not prime.
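The statement can be checked by brute force over a small range (a search sketch; the bound LIMIT = 25 is an arbitrary choice of mine): for every quadruple a > b > c > d > 0 satisfying the hypothesis, ab + cd should come out composite.

```python
# Search for quadruples a > b > c > d > 0 with
# ac + bd = (b+d+a-c)(b+d-a+c), and check ab + cd is not prime.

def is_prime(n):
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

LIMIT = 25
solutions = []
for a in range(4, LIMIT + 1):
    for b in range(3, a):
        for c in range(2, b):
            for d in range(1, c):
                if a*c + b*d == (b + d + a - c) * (b + d - a + c):
                    solutions.append((a, b, c, d))
                    assert not is_prime(a*b + c*d)

print(solutions)
```

For example, (a, b, c, d) = (21, 18, 14, 1) satisfies the hypothesis: ac + bd = 294 + 18 = 312 = 26 * 12 = (b+d+a-c)(b+d-a+c), and ab + cd = 378 + 14 = 392, which is not prime.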
What you have proven is that ac + bd is not prime, not that ab + cd is not prime, which was the whole point of Jane's first post.
It is an easy mistake to make, I know because I made it myself.
(Also, your argument could do with some work; it is not entirely clear what you are trying to do. And your language could do with some moderation as well; it is close to getting an infraction for being insulting.)
RonL
Therefore if (a,b,c,x) are integers, then x is divisible by 2.
You have
ac + bd = (b + d + a - c)(b + d - a + c) = (b + d)^2 - (a - c)^2.
This reduces to a quadratic equation for a:
a^2 - ac + (c^2 - b^2 - bd - d^2) = 0.   (1)
So a is an even number (from 1). The same can be proved for any of the (a,b,c,d).
Because of that a*b+c*d=even*even+even*even=even+even=even
Even numbers are not primes except for 2. But the smallest values of a,b,c,d are:
d=1
c=2
b=3
a=4
So:
=> ab + cd = 2 CAN'T occur
=> ab + cd is even and > 2 => ab + cd is not a prime
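The last step is a one-line computation (a tiny sketch; the variable names are mine):

```python
# With the smallest admissible values a=4, b=3, c=2, d=1,
# ab + cd is already above 2, so an even ab + cd can never
# equal the only even prime.
a, b, c, d = 4, 3, 2, 1
assert a > b > c > d > 0
value = a * b + c * d
print(value)  # 14
```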
Hope that helps