Ancient Egyptians were fond of writing general fractions as sums of fractions with unit numerator. Specifically, they would write 2/3 = 1/2 + 1/6, 3/4 = 1/2 + 1/4 and, more exotically, 6/7 = 1/2 + 1/3 + 1/42. To be precise, we say that a rational number in the interval (0, 1) is written as an Egyptian fraction if it is written as a sum of a finite number of distinct unit fractions (so 2/17 = 1/17 + 1/17 is forbidden). Consider the following algorithm for writing a rational number (between 0 and 1) as an Egyptian fraction:
Algorithm E: If r = 1/d for some integer d, output "1/d". Otherwise, find the largest unit fraction 1/d that is less than r; output "1/d +"; repeat with the rational number r - 1/d.
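To make sure I understand the greedy step correctly, here is a small Python sketch of Algorithm E (my own code, not part of the assignment; it uses the standard `fractions` module for exact arithmetic):

```python
from fractions import Fraction

def egyptian(r):
    """Algorithm E (greedy): write r in (0, 1] as a sum of distinct unit fractions."""
    assert 0 < r <= 1
    terms = []
    while r.numerator != 1:
        # Largest unit fraction 1/d strictly less than r: since r is reduced
        # with numerator > 1, d = floor(denominator / numerator) + 1.
        d = r.denominator // r.numerator + 1
        terms.append(Fraction(1, d))
        r -= Fraction(1, d)
    terms.append(r)  # r is now of the form 1/d, so the algorithm stops
    return terms

print(egyptian(Fraction(6, 7)))  # [Fraction(1, 2), Fraction(1, 3), Fraction(1, 42)]
```

On the worked example from the problem, 6/7 does come out as 1/2 + 1/3 + 1/42, which matches the statement.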
Prove, using induction, that this process always terminates. (Hint: Try using induction on the numerator of the fraction.)
Alright, this is what I have so far:
Base case: r is of the form 1/d.
If r is of the form 1/d, then the algorithm terminates by definition.
So assume r = a/d, with a != 0 and d > 0.
If a = 1, then the algorithm terminates.
Otherwise a > 1 and we apply the step r - 1/d.
Inductive step: examine the case a + 1. Since a + 1 > 1, we apply the second part of the algorithm, namely we repeat with the new fraction r2 = r - 1/d. Since r = (a+1)/d,

r2 = (a+1)/d - 1/d = a/d + 1/d - 1/d = a/d
Since the algorithm terminates for a/d by the induction hypothesis, it terminates for all valid input.
I know I am probably way off, because this part of computer science is not my strong suit, so I'm asking for help; any guidance will be greatly appreciated. Thanks.
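One thing I did try, to convince myself the hint is on the right track: a brute-force check (my own code, not a proof) that a single greedy step strictly decreases the numerator. If r = a/b is reduced with a > 1, the greedy denominator is d = floor(b/a) + 1, and the claim is that r - 1/d has numerator smaller than a:

```python
from fractions import Fraction

# Check the hinted fact for all reduced fractions a/b with b < 60:
# one greedy step r -> r - 1/d strictly decreases the numerator.
for b in range(2, 60):
    for a in range(2, b):
        r = Fraction(a, b)          # Fraction reduces a/b automatically
        if r.numerator == 1:
            continue                # base case: algorithm stops here anyway
        d = r.denominator // r.numerator + 1   # greedy choice: largest 1/d < r
        r2 = r - Fraction(1, d)
        assert r2.numerator < r.numerator
print("numerator decreased in every case")
```

If that fact is true in general, then induction on the numerator would give termination, since the numerator is a positive integer that cannot decrease forever.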