I have been looking at a game which in its simplest form is a "Lucky Dip".
Imagine a box containing 100 tickets, numbered 1 to 100.
A player picks tickets from the box and keeps taking them until the sum of the numbers on their tickets would exceed 100; the ticket that would take them over 100 is placed back in the box.
(For example: player one picks tickets 51, 25, 17 and 30. Since 51 + 25 + 17 + 30 = 123, ticket 30 is returned to the box and player one's turn is over with a total of 93.)
The next player then steps forward and draws tickets under the same rule, stopping when the next ticket would take their sum over 100.
This continues with more players until there are no tickets left.
I wrote some software to simulate this and was surprised to find that the first few players got the most tickets, while later players got progressively fewer.
Can anyone explain why this should be? I think I can see why, but I don't have enough math skill to work it out in any formal way.
Is there any way the game could be adjusted to balance it so the last players have an equal chance of getting a similar number of tickets as the first players?
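The effect is easy to reproduce in a short simulation. Below is a minimal Python sketch of the rules as described (the function name and parameters are my own, hypothetical choices): 100 tickets in a box, each player draws until the next ticket would push their running sum past 100, and that ticket goes back.

```python
import random

def play_game(n_tickets=100, limit=100, seed=None):
    # Hypothetical sketch of the Lucky Dip rules described above.
    # Returns a list of ticket counts, one entry per player in turn order.
    rng = random.Random(seed)
    box = list(range(1, n_tickets + 1))
    counts = []
    while box:
        total, taken = 0, 0
        while box:
            ticket = rng.choice(box)    # draw a random ticket from the box
            if total + ticket > limit:
                break                   # this ticket goes back; turn is over
            box.remove(ticket)
            total += ticket
            taken += 1
        counts.append(taken)
    return counts

# Compare the first and last player's average ticket count over many games.
runs = 2000
first, last = [], []
for s in range(runs):
    counts = play_game(seed=s)
    first.append(counts[0])
    last.append(counts[-1])
print("avg tickets, first player:", sum(first) / runs)
print("avg tickets, last player:", sum(last) / runs)
```

In runs of this sketch the first player consistently averages more tickets than the last, matching what the original software showed.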
Feb 18th 2009, 09:29 PM
I think the key to figuring this out is that each player returns to the box a ticket that was capable of pushing them over the limit. Depending on their current total, the size of this "returned" ticket can vary, but it is fairly unlikely that a small value will be returned.
This should skew the average ticket value in the pool upward as the game goes on.
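One way to check this intuition is to instrument the same simulation and record the mean value of the tickets still in the box after each player's turn. A hypothetical Python sketch (same assumed rules as above):

```python
import random

def pool_means(n_tickets=100, limit=100, seed=None):
    # Hypothetical sketch: play one Lucky Dip game and record the mean
    # value of the tickets remaining in the box after each player's turn.
    rng = random.Random(seed)
    box = list(range(1, n_tickets + 1))
    means = []
    while box:
        total = 0
        while box:
            ticket = rng.choice(box)
            if total + ticket > limit:
                break                   # returned ticket stays in the box
            box.remove(ticket)
            total += ticket
        if box:
            means.append(sum(box) / len(box))
    return means

# Averaged over many games, compare the pool mean after the first turn
# with the pool mean after the final recorded turn.
games = [pool_means(seed=s) for s in range(200)]
first_turn = sum(m[0] for m in games) / len(games)
final_turn = sum(m[-1] for m in games) / len(games)
print("mean pool value after first turn:", round(first_turn, 1))
print("mean pool value after final recorded turn:", round(final_turn, 1))
```

The mean of the remaining tickets climbs: small tickets are always accepted when drawn, while large tickets keep getting returned, so late players face a pool of mostly big numbers and bust after only one or two draws.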