Originally Posted by **jaxon4**

Here's a problem I've been struggling with:

If I select two numbers at random between 0 and 100 (inclusive, and they can be the same number), what is the average difference between them? Assume the difference is always positive, i.e. if I select 2 and 40, or 41 and 3, the difference is 38. So if I select 1,000,000 pairs, what will the average difference be? I'm pretty sure it's less than 50, but I'd like a proof, an integral, etc.

Ideally I'd like a general solution in terms of N, where the numbers range between 0 and N. (I really want the average for the range 0 to 1.)

Thanks.
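A minimal Monte Carlo sketch of the experiment described above, assuming both draws are uniform integers on [0, 100] (the function name `avg_abs_difference` and the default parameters are illustrative choices, not anything from the original post):

```python
import random

def avg_abs_difference(n_max=100, trials=1_000_000, seed=0):
    """Estimate E|X - Y| for X, Y independent uniform integers on [0, n_max]."""
    rng = random.Random(seed)  # seeded for reproducibility
    total = 0
    for _ in range(trials):
        # Draw a pair independently; the same number is allowed for both.
        total += abs(rng.randint(0, n_max) - rng.randint(0, n_max))
    return total / trials

estimate = avg_abs_difference()
# For uniform integers on {0, ..., n} the exact mean is n(n + 2) / (3(n + 1)),
# which for n = 100 is 10200/303, about 33.66 -- indeed well below 50.
# The continuous analogue on [0, N] is N/3 (so 1/3 for the range 0 to 1).
```

With 1,000,000 pairs the sample mean should land within a few hundredths of the exact value, so the simulation is a useful sanity check on any closed-form derivation.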