OK, this might be too advanced a forum for this question, hence the 'easy' in the title.

I might answer this myself as I go along, but anyway, here goes.

Say two people each roll a die and the highest number wins; you would expect each person to win 50% of the time. But what if one person were luckier and won 55% of the time? You could say he was 5% luckier than average.

How do you extend that to 3 people? You would expect each person to win 33% of the time. Now say one person was luckier, as in the first example, and by the same amount. Would that be 5% luckier, i.e. 33 + 5 = 38%? I think this is too high; I think the answer is he would be 3.3% luckier, for it to be the same amount of 'luckiness' as in the first example.

Also, I think I am running into problems here with sample size, possibly.

Going back to the first example, another way to put it is that 55% is 10% luckier than normal (5/50 × 100). That's how I get the 3.3% value for the second example (10% of 33%). So for four players it would be 2.5%, for 5 players 2%, for 10 players 1%, etc...
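If I'm reading my own reasoning right, the rule I'm using is: relative luck = (actual − expected) / expected, held constant across player counts. A throwaway sketch of that arithmetic (my own code, just to make the scaling concrete):

```python
def expected_win_pct(players):
    """Baseline win percentage when everyone is equally lucky."""
    return 100.0 / players

def adjusted_win_pct(players, relative_luck):
    """Win percentage for someone `relative_luck` (e.g. 0.10 for 10%)
    luckier than baseline, scaled multiplicatively."""
    return expected_win_pct(players) * (1.0 + relative_luck)

for n in (2, 3, 4, 5, 10):
    print(n, round(adjusted_win_pct(n, 0.10), 2))
```

For 2 players this gives 55%, for 3 players about 36.67% (the 33.33% baseline plus the 3.33% extra), and for 10 players 11%.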

You see, I want to compare how lucky a person is in poker 'showdowns'. There would be no problem if it was just 2 players, but sometimes there are 3 or maybe 4, the maximum possible (but rather unlikely) being 10.

Problem X

------------

But let's take 10 players as an example: say one player wins 20% of the time, which is twice as lucky as expected. Now, if he got the same amount of luck in 2-player games, what would the figures be? It can't be twice, because twice 50% is 100%, and that doesn't seem right.

-----------------

I think I am running into sample-size problems here too; I never did statistics, unfortunately.

Well, I pretty much know I am into sample-size stuff, because I know that over, say, a million games, even winning 5% more than average would be practically impossible (statistically, anyway).

I guess I need to look at something like standard deviation (yuk), or something like that?
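Standard deviation does look like the right tool here. If each game is an independent win/lose trial, the number of wins is binomially distributed, with standard deviation sqrt(n·p·(1−p)). One hedged way to compare luck across player counts is to ask how many standard deviations above expectation a result sits (a z-score). A sketch, assuming independent games:

```python
import math

def z_score(wins, games, p):
    """How many binomial standard deviations `wins` sits above the
    expected win count, for `games` independent games at win prob p."""
    mean = games * p
    sd = math.sqrt(games * p * (1 - p))
    return (wins - mean) / sd

# 20 wins in 100 ten-player games (fair win prob 0.1):
z = z_score(20, 100, 0.1)  # (20 - 10) / 3 = 3.33 standard deviations

# The 2-player win count that sits the same number of sds above 50:
equivalent = 100 * 0.5 + z * math.sqrt(100 * 0.5 * 0.5)
print(z, equivalent)
```

On this measure the 2-player equivalent comes out around 66.7 wins in 100, though the normal approximation is rough that far out in the tails.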

OK, back to Problem X.

I can't even guess the right answer, so I am going to make it easier and say it is over a sample of 100 games, to see if that helps.

So in the 10-player games, normal is 10 wins in 100, but he wins 20 in 100. So in the 2-player games, normal is 50 in 100, but he wins ?? in 100 with the same amount of good luck.

What is the correct value of ??

One guess would be 60, but that seems too low; another guess is 100, but that seems way too high. 75 feels about right, but I can't explain why.

OK, I will have a crack at Problem X.

I think a better way of thinking about it is: how often would he win 20 in 100? That would have some value 'x', meaning it happens once in every x trials. Then I would need to work out the win count which happens once in every x trials for two players, and that would be the answer!??

Trouble is, I am not sure how to do it (yet). I think I could do a simulation though. On my computer :O)

I would just take the maximum value which occurred in x trials. Or, better still, obtain an average value over repeated trials??
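A simulation along those lines is easy enough to sketch. This estimates how rare "20 or more wins in 100" is at ten players, assuming each showdown is an independent trial at the fair win probability (my own throwaway sketch, not tested against real poker data):

```python
import random

def simulated_tail(wins, games, p, trials=100_000, seed=1):
    """Estimate P(at least `wins` out of `games`) by Monte Carlo."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        won = sum(rng.random() < p for _ in range(games))
        if won >= wins:
            hits += 1
    return hits / trials

# Rarity of 20+ wins in 100 ten-player games:
print(simulated_tail(20, 100, 0.1))
```

Then 1 divided by that estimate is the "once in every x trials" number, and you can hunt for the 2-player win count with roughly the same rarity.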

I expect there is a (well-known?) equation for this, which I might be able to work out if I spent a few years thinking about it!!!
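For what it's worth, I don't think it needs years: the "once in every x trials" match can be computed exactly with the binomial distribution, using only the standard library. Again my own sketch, assuming independent games:

```python
from math import comb

def tail_prob(wins, games, p):
    """Exact P(winning at least `wins` of `games` independent games)."""
    return sum(comb(games, k) * p**k * (1 - p)**(games - k)
               for k in range(wins, games + 1))

# How rare is 20+ wins in 100 ten-player games?
target = tail_prob(20, 100, 0.1)

# Smallest 2-player win count that is at least that rare:
answer = next(w for w in range(50, 101)
              if tail_prob(w, 100, 0.5) <= target)
print(target, answer)
```

This lands somewhere in the mid-60s, which would mean 75 was too high and 60 was not far off.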

Any thoughts/answers?