A friend and I have been discussing a scenario where you get a certain number of tries at rolling at or above a particular number on a die (for example, rolling a 3 or higher on a 6-sided die). We've been trying to figure out the average number of successes if you can retry one failed die roll.

We got this far:

Rolling a 3 or higher on a 6-sided die has a 2/3 chance (4 of the 6 faces succeed).

The average number of successes from x rolls is x*(2/3).

The chance to succeed x times in a row is (2/3)^x.

The chance to fail at least once in x tries (and thus be able to use the reroll) is 1-(2/3)^x.

The average number of successes gained from the reroll is thus (1-(2/3)^x)*(2/3).

Which leaves us with our end conclusion that the average number of successes is:

(x*(2/3)) + ((1-(2/3)^x)*(2/3)), or simplified: (x + 1 - (2/3)^x)*(2/3)
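To rule out arithmetic slips (and the ambiguity of writing 2/3^x on one line), here's the formula transcribed into Python with explicit parentheses and exact fractions. This is just our formula as stated, not a claim that the formula itself is right:

```python
from fractions import Fraction

def expected_successes(x, p=Fraction(2, 3)):
    """Our formula: x*p direct successes, plus p times the
    chance of having at least one failure available to reroll."""
    return x * p + (1 - p**x) * p
```

Using Fraction keeps the result exact, so any disagreement with the simulation can't be blamed on floating-point rounding.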

We've also concocted a quick-and-dirty computer program that runs the situation a million times to get a decent average result.
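Our actual program was messier, but the simulation logic was along these lines (a reconstruction, not our exact code; a roll of 3-6 counts as a success, and one failed die, if any, gets one reroll):

```python
import random

def simulate(x, trials=1_000_000):
    """Monte Carlo estimate: roll x dice, reroll one failure if any."""
    total = 0
    for _ in range(trials):
        # count dice that landed on 3 or higher
        successes = sum(random.randint(1, 6) >= 3 for _ in range(x))
        if successes < x:  # at least one die failed, so use the reroll
            successes += random.randint(1, 6) >= 3
        total += successes
    return total / trials
```
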

Now the problem is that our math and our program produce different results (3.06... vs. 3.20... for x=4), and we're not sure whether our math is wrong (math class has been a while for both of us) or the program is.

So I figured I'd ask some fancy mathy folk like yourselves to give an opinion on whether our math is on to something or way the heck off. (And in the case of the latter, what the actual proper formula would be.)

Also, as a bonus question: what if there were more than one reroll?