In this question I am talking about real numbers only. I also presume that every reader understands that $x \cdot 1 = x$ for any real number $x$, and that $100\% = \frac{100}{100} = 1$.
Why is it that, when converting a fraction to a percentage, the fraction should be "multiplied by $100\%$"? Here are some examples:
$$\frac{1}{2} = \frac{1}{2}\cdot 100\% = 50\%, \qquad \frac{3}{4} = \frac{3}{4}\cdot 100\% = 75\%.$$
What sense does it make to multiply a fraction by $100\%$, which equals $1$?
But here is what is far worse:
Many business books in my country actually suggest that, for example,
$$\frac{1}{2} = \frac{1}{2}\cdot 100 = 50\%.$$
Yes, multiplied by $100$, not $100\%$.
Am I incorrect if I state that
$$\frac{1}{2} = \frac{1}{2}\cdot 100\% = 50\%?$$
Multiplying by $100$, not $100\%$, instead gives
$$\frac{1}{2}\cdot 100 = 50 \ne 50\%.$$
How do you obtain a percentage the way they have done here:
Contribution margin - an example
Following their formula literally, I got $1300\%$.
If you multiply by $100$, things do change: $\frac{1}{2}\cdot 100 = 50 \ne \frac{1}{2}$.
Since multiplying by $100\% = 1$ does nothing, there is never any harm done; all you have is a computational aid for those who do not really understand what is going on.
Multiplying by $100$ and then appending a $\%$ sign at the end of the calculation is a slight abuse of notation, but if you know what you are doing, then it does not matter.
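That "multiply by $100$, then append a $\%$ sign" recipe can be made explicit as a formatting step, which is really all it is. Here is a minimal sketch (the function name `to_percent` is my own, not from any book):

```python
def to_percent(x: float) -> str:
    # The "abuse of notation" made explicit: scale the pure number by 100,
    # then append the "%" symbol purely as a display convention.
    # The :g format trims trailing zeros and hides float round-off.
    return f"{x * 100:g}%"

print(to_percent(1 / 4))   # the fraction 1/4 displayed as a percentage
print(to_percent(0.13))    # the ratio 0.13 displayed as a percentage
```

The point is that the number itself is never changed: $\frac14$ and $25\%$ are the same real number, and the "$\cdot\,100$" lives only in the string representation.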