Originally Posted by lisakki

I have a statistics problem I can't solve for some reason:
A blackjack player at a Las Vegas casino learned that the house will provide a free room if play is for four hours at an average bet of $50. The player’s strategy provides a probability of .49 of winning on any one hand, and the player knows that there are 60 hands per hour. Suppose the player plays for four hours at a bet of $50 per hand.
b. What's the probability the player loses $1000 or more? Answer: .1788
I can't reproduce their answer. It looks like a straightforward normal-approximation-to-the-binomial question, but none of my attempts come out right.
I got the expected value of this game as −$240: he wagers $12,000 in total (240 hands × $50), and the difference between his expected winnings (12000 × .49 = $5,880) and his expected losses (12000 × .51 = $6,120) is −$240.
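Just to double-check that arithmetic, here's the expected-value step in Python (this is only my own reasoning, not anything from the book):

```python
# 4 hours x 60 hands/hour = 240 hands at $50 per hand
hands = 4 * 60        # 240 hands
bet = 50
p_win = 0.49

total_wagered = hands * bet                   # $12,000
expected_win = total_wagered * p_win          # $5,880
expected_loss = total_wagered * (1 - p_win)   # $6,120

ev = expected_win - expected_loss             # -$240
print(ev)
```

So at least the −$240 part seems to check out.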
This is the part where I don't know how I went wrong: I found the standard deviation using sqrt(12000 × .49 × (1 − .49)), from the formula sqrt(n·p·(1 − p)), and got 54.7613.
Next, I found the z-score using z = (x − μ)/σ, which comes out to −13.8689.
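Here's the standard-deviation / z-score part exactly as I did it, in case I typoed something into the calculator (Python actually gives me about −13.88 rather than the −13.8689 I wrote down, so I may have a small rounding slip somewhere, but either way it's enormous):

```python
import math

x = -1000    # boundary for losing $1000
mu = -240    # expected value from above

# my standard deviation: sqrt(n*p*(1-p)) with n = 12000 (total dollars wagered)
sd = math.sqrt(12000 * 0.49 * (1 - 0.49))
print(sd)    # about 54.7613

z = (x - mu) / sd
print(z)     # about -13.88
```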
−13.8689 is such an extreme z-value that it isn't even on my table. What did I do wrong?