In this message, I will be talking about real numbers only. I also presume that every reader understands that for any real number x we have x × 1 = x, and that 100% = 100/100 = 1.

Why is it that, when converting a fraction to a percentage, the fraction should be "multiplied by 100%"? Here are some examples:

http://www.cimt.plymouth.ac.uk/proje...gcse/bkb11.pdf

What sense does it make to multiply a fraction by 100%, which equals 1?

But here is what is far worse:

Many business books in my country actually suggest that, for example,

(1/4) × 100 = 25%

! Yes, multiplied by 100, not by 100%.

Am I incorrect if I state that

(1/4) × 100 = 25,

and that

(1/4) × 100% = (1/4) × 1 = 1/4 = 25%?
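The distinction can even be checked mechanically. Here is a minimal sketch (my own illustration, not taken from any of the books above) that reads the symbol % as the constant 1/100, which is exactly the convention under which 100% = 1:

```python
from fractions import Fraction

# Assumption: "%" denotes the constant 1/100, so that 100% = 1.
PERCENT = Fraction(1, 100)

# Multiplying by 100% is multiplying by 1: the value is unchanged.
assert 100 * PERCENT == 1
assert Fraction(1, 4) * (100 * PERCENT) == Fraction(1, 4)

# 1/4 equals 25%, i.e. 25 × (1/100):
assert Fraction(1, 4) == 25 * PERCENT

# Multiplying by the bare number 100 changes the value:
assert Fraction(1, 4) * 100 == 25  # 25, a plain number, not 25%
```

Exact rational arithmetic (`fractions.Fraction`) is used only to avoid floating-point noise; the point is that multiplication by 100% is the identity, while multiplication by the bare number 100 is not.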