Use a common denominator to add 1/a + 1/b, then set it equal to 1/(a + b). What does that tell you?
I've just started MATH187 at university and was flipping through the exercises in our subject notebook when I came across a question that I'm unsure how to approach.
The question is:
"Let a, b be non-zero real numbers with a + b ≠ 0. Show that 1/a + 1/b is never equal to 1/(a + b)."
I was never particularly good at this style of question at school and would like some direction on how to set it out and approach it.
Any help would be much appreciated.
So once I use the common denominator it becomes b/ab + a/ab, which simplifies to (a + b)/ab. When I equate it to 1/(a + b) I get (a + b)/ab = 1/(a + b). How do I progress from here to complete the proof? Is this where the "a + b ≠ 0" condition comes into play?
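One way to finish from that equation (a sketch of a proof by contradiction, not necessarily the intended method): since a, b ≠ 0 and a + b ≠ 0, both denominators are non-zero, so you may cross-multiply, and the a + b ≠ 0 hypothesis is exactly what makes 1/(a + b) defined and the cross-multiplication valid.

```latex
\begin{align*}
\frac{a+b}{ab} &= \frac{1}{a+b}
  && \text{suppose, toward a contradiction} \\
(a+b)^2 &= ab
  && \text{cross-multiply; valid since } ab \neq 0,\ a+b \neq 0 \\
a^2 + 2ab + b^2 &= ab
  && \text{expand the left-hand side} \\
a^2 + ab + b^2 &= 0
  && \text{subtract } ab \text{ from both sides} \\
\left(a + \tfrac{b}{2}\right)^2 + \tfrac{3b^2}{4} &= 0
  && \text{complete the square in } a
\end{align*}
```

The last line is a sum of a square and the strictly positive term 3b²/4 (positive because b ≠ 0), so it can never equal 0 for real a, b. This contradiction shows the original equation has no solution. Equivalently, you can treat a² + ab + b² = 0 as a quadratic in a and note its discriminant b² − 4b² = −3b² is negative.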