So, here is the situation. I'm taking a string of five integer digits and using a formula to convert that string into a real-valued number.

The basic formulation is this:

$\displaystyle \text{String} = abcde $

$\displaystyle \text{Real} = (a+0.1\times b + 0.01 \times c + 0.001 \times d) \times 10^{\frac{e}{2}-2} $

So for example...

$\displaystyle \text{String} = 56796 $

$\displaystyle \text{Real} = (5+0.1\times 6 + 0.01 \times 7 + 0.001 \times 9) \times 10^{\frac{6}{2}-2} = 5.679 \times 10^{1} = 56.79 $
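To make the conversion concrete, here is a minimal sketch of the formula in Python (the function name `decode` and the string input are my own choices, not part of the original setup):

```python
import math

def decode(s):
    """Convert a 5-digit string 'abcde' to (a + 0.1b + 0.01c + 0.001d) * 10^(e/2 - 2)."""
    a, b, c, d, e = (int(ch) for ch in s)
    mantissa = a + 0.1 * b + 0.01 * c + 0.001 * d
    return mantissa * 10 ** (e / 2 - 2)

print(decode("56796"))  # matches the worked example: approximately 56.79
```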

Now, if I set up 10,000 such strings and populate each string with 5 randomly (and uniformly) selected integers in the interval $\displaystyle [0,9] $, then I would expect to get a nice uniform distribution of reals on the interval $\displaystyle [0, 3161.96] $, with the bounds of that interval corresponding to $\displaystyle 00000$ and $\displaystyle 99999$ respectively.

However, what I'm finding is not a uniform distribution at all. The distribution I'm getting is the one attached as a histogram, with small numbers receiving a far larger share of the selection.
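For what it's worth, a quick simulation reproduces the skew (this is a sketch in Python; the sample count matches my setup, but the seed and the 1%-of-range threshold are arbitrary choices for illustration):

```python
import random
import statistics

def decode(s):
    # Same formula as above: mantissa from the first four digits,
    # exponent 10^(e/2 - 2) from the fifth.
    a, b, c, d, e = (int(ch) for ch in s)
    return (a + 0.1 * b + 0.01 * c + 0.001 * d) * 10 ** (e / 2 - 2)

random.seed(0)
values = [decode("".join(random.choice("0123456789") for _ in range(5)))
          for _ in range(10_000)]

mean = statistics.fmean(values)
median = statistics.median(values)
# Under a uniform distribution on [0, 3161.96] the mean and median would
# both sit near 1581; here the median comes out far below the mean.
print(f"mean   = {mean:.1f}")
print(f"median = {median:.1f}")
print(f"share below 31.62 (1% of range) = {sum(v < 31.62 for v in values) / len(values):.0%}")
```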

Can anybody explain why this is not showing a uniform distribution, and what can be done to make it so?