Here is the situation: I'm taking a string of 5 integer digits and using a formula to convert that string of digits into a real-valued number.

The basic formulation is this:

So for example...

Now, if I set up 10,000 such strings and populate each string with 5 randomly (and uniformly) selected integers in the interval , then I would expect to get a nice uniform distribution of reals on the interval , with each bound of that interval corresponding to and respectively.
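Since the conversion formula itself isn't reproduced above, here is a minimal sketch of the experiment under an assumed mapping: interpret the 5 digits as a base-10 integer and scale it into [0, 1] by dividing by the largest possible value, 99999. The function name `digits_to_real` and the choice of mapping are my assumptions, not necessarily the formula in question — but with this particular linear mapping the result really is uniform, which is the baseline expectation described above.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def digits_to_real(digits):
    # Assumed mapping (not necessarily the original formula):
    # treat the digits as a base-10 integer, e.g. [1,2,3,4,5] -> 12345,
    # then scale into [0, 1] by dividing by the maximum value 99999.
    value = 0
    for d in digits:
        value = value * 10 + d
    return value / 99999

# 10,000 strings of 5 uniformly random digits in {0, ..., 9}
samples = [digits_to_real([random.randint(0, 9) for _ in range(5)])
           for _ in range(10_000)]

# For a linear mapping like this one, the sample mean sits near 0.5,
# as a uniform distribution on [0, 1] would predict.
mean = sum(samples) / len(samples)
print(mean)
```

If a histogram of `samples` were plotted, it would be flat — so any visible skew must come from the conversion formula, not from the digit-generation step.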

However, what I'm getting is not a uniform distribution at all. The distribution I'm seeing is the one attached as a histogram, with small numbers receiving a far larger share of the draws.
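For contrast, here is a hedged illustration of how a skew toward small values can arise even though the digits themselves are uniform: any *nonlinear* mapping distorts the distribution. The example below uses the product of the digits (scaled by 9^5) purely as an illustrative stand-in — it is not claimed to be the formula above — and it concentrates mass heavily near zero, because a single 0 digit zeroes the whole product and small digits shrink it multiplicatively.

```python
import random

random.seed(1)  # fixed seed for reproducibility

def product_map(digits):
    # Illustrative nonlinear mapping (NOT the original formula):
    # product of the 5 digits, scaled by the maximum 9**5 into [0, 1].
    p = 1
    for d in digits:
        p *= d
    return p / 9**5

samples = [product_map([random.randint(0, 9) for _ in range(5)])
           for _ in range(10_000)]

# Under a uniform distribution, only ~10% of samples would fall below 0.1.
# Here the fraction is far larger, reproducing a small-number bias.
frac_small = sum(s < 0.1 for s in samples) / len(samples)
print(frac_small)
```

The general point: uniform inputs only give uniform outputs when the mapping is linear (affine) over the input range; any curvature or multiplicative structure in the formula reshapes the distribution.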

Can anybody explain why this is not producing a uniform distribution, and what can be done to make it uniform?