In this message, I will be talking about real numbers only. I also presume that every reader understands that for any real number $\displaystyle a$, $\displaystyle 1a = a$ and that $\displaystyle 100\% = 1$.

Why is it that, when converting a fraction to a percentage, the fraction should be "multiplied by 100%"? Here are some examples:

http://www.cimt.plymouth.ac.uk/proje...gcse/bkb11.pdf
What sense does it make to multiply a fraction by 100%, which equals 1?

**But here is what is far worse:**
Many business books in my country actually suggest that, for example

$\displaystyle \frac{1}{2} \cdot 100 = 50\%$! Yes, multiplied by 100, not by 100%.

Am I incorrect if I state that

$\displaystyle \frac{1}{2} \cdot 100 = 50 = 5000\%$ and that

$\displaystyle \frac{1}{2} = 0.5 = 50\%$?
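As a quick sanity check, here is a small Python sketch that treats "$n\%$" as the number $n/100$ (so that $100\% = 1$), which is the convention assumed throughout this post:

```python
# Treat "n %" as the number n / 100, so that 100% == 1.
def percent(n):
    return n / 100

# Multiplying by 100% is multiplying by 1: the value is unchanged.
assert percent(100) == 1
assert (1/2) * percent(100) == 1/2

# By contrast, 1/2 * 100 = 50, and the number 50 written as a
# percentage is 5000%, not 50%.
assert (1/2) * 100 == 50
assert 50 == percent(5000)

# The correct conversion: 1/2 = 0.5 = 50%.
assert 1/2 == 0.5 == percent(50)
```

Under this convention, the two claimed equalities at the end of the post check out, and the business-book version ($\frac{1}{2} \cdot 100 = 50\%$) does not.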