Hi guys, I have two computer programming problems, but they are math related, so please help out if you can. Thanks!
1. Sometimes software optimization can dramatically improve the performance of a computer system. Assume that a CPU can perform a multiplication operation in 10 ns and a subtraction operation in 1 ns. How long will it take the CPU to calculate the result of d = a x b - a x c?
Could you optimize the equation so that it will take less time?
I know the first part will take 21 ns (two multiplications at 10 ns each, plus one subtraction at 1 ns), but is there a way to manipulate the equation so it takes less than 21 ns? I've tried but nothing comes up.
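One way to attack this is to count operations under the stated cost model. A minimal sketch (the `cost` helper is just for illustration) comparing the original form with the algebraically equivalent factored form d = a x (b - c):

```python
# Cost model from the problem statement: a multiplication costs 10 ns,
# a subtraction costs 1 ns.
MUL_NS = 10
SUB_NS = 1

def cost(muls, subs):
    """Total time in ns for a given count of multiplications and subtractions."""
    return muls * MUL_NS + subs * SUB_NS

# Original form d = a*b - a*c: two multiplications, one subtraction.
original = cost(muls=2, subs=1)   # 21 ns

# Factored form d = a*(b - c): one multiplication, one subtraction.
factored = cost(muls=1, subs=1)   # 11 ns

print(original, factored)
```

So factoring out the common term a trades one 10 ns multiplication for nothing at all, cutting the time from 21 ns to 11 ns while computing the same value.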
2. Moore's law, which is attributed to Intel co-founder Gordon Moore, states that computing power doubles every 18 months for the same price. An unrelated observation is that floating point instructions are executed 100 times faster in hardware than via emulation. Using Moore's law as a guide, how long will it take for computing power to improve to the point that floating point instructions are emulated as quickly as their hardware counterparts?
Is what I did right?
2^x = 100
x * ln(2) = ln(100)
x = ln(100) / ln(2)
From that, x ≈ 6.6439, so computing power has to double 6.6439 times. Multiplied by 1.5 years (18 months per doubling), that is about 9.966 years, or roughly 119.59 months.
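The arithmetic above is easy to double-check numerically, assuming one doubling every 18 months and a 100x gap to close:

```python
import math

# Doublings needed to close a 100x gap: solve 2**x = 100.
doublings = math.log(100) / math.log(2)   # equivalently math.log2(100)

# Moore's law: one doubling every 18 months (1.5 years).
years = doublings * 1.5
months = years * 12

print(doublings, years, months)  # ~6.644 doublings, ~9.966 years, ~119.59 months
```

This confirms the figures: about 6.644 doublings, so roughly ten years.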
Without a change in hardware architecture, emulation will never be as fast: hardware floating point rides the same Moore's-law curve, so the 100x gap stays constant. The only way out is if the emulation can be parallelised by a factor of 100 while the hardware floating point is not improved in any way.
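The point above can be made concrete with a small sketch. If both emulation and hardware speed follow the same exponential growth (an assumption of this illustration, not part of the original problem), their ratio is constant over time:

```python
def speedup(t_years, doubling_period=1.5):
    """Moore's-law growth factor after t_years (one doubling per 18 months)."""
    return 2 ** (t_years / doubling_period)

def emu_speed(t_years):
    """Emulated FP speed, normalized to 1 today."""
    return 1 * speedup(t_years)

def hw_speed(t_years):
    """Hardware FP speed, 100x faster today and improving at the same rate."""
    return 100 * speedup(t_years)

# The ratio stays exactly 100x at every point in time.
for t in (0, 5, 10, 20):
    assert abs(hw_speed(t) / emu_speed(t) - 100) < 1e-9
```

The ~10-year answer to the original question only holds under the problem's implicit assumption that hardware floating point is frozen at today's speed while general computing power keeps doubling.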