Golf balls must meet a set of five standards in order to be used in professional tournaments. One of these standards is the distance traveled. When a ball is hit by a mechanical device, Iron Byron, with a 10-degree angle of launch, a backspin of 42 revolutions per second, and a ball velocity of 235 feet per second, the distance the ball travels may not exceed 291.2 yards. Manufacturers want to develop balls that will travel as close to 291.2 yards as possible without exceeding that distance. A particular manufacturer has determined that the distances traveled for the balls it produces are normally distributed with a standard deviation of 2.8 yards. This manufacturer has a new process that allows it to set the mean distance the ball will travel.
1.) If the manufacturer sets the mean distance traveled to be equal to 288 yards, what is the probability that a ball that is randomly selected for testing will travel too far?
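Not a full answer, but here's how you can check question 1 numerically. The ball travels "too far" when it exceeds 291.2 yards, so you want P(X > 291.2) for X ~ Normal(288, 2.8). A quick sketch using Python's standard-library `NormalDist` (the numbers printed are my own computation, not from the problem):

```python
from statistics import NormalDist  # Python 3.8+ standard library

# Distance traveled: Normal with mean 288 yd, standard deviation 2.8 yd.
dist = NormalDist(mu=288, sigma=2.8)

# "Too far" = exceeding the 291.2 yd limit.
# Equivalent by hand: z = (291.2 - 288) / 2.8 ≈ 1.14, then 1 - Φ(z).
p_too_far = 1 - dist.cdf(291.2)
print(round(p_too_far, 4))
```

You should get roughly 0.13, i.e. about a 1-in-8 chance a tested ball is nonconforming at a 288-yard mean.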
2.) Assume the mean distance traveled is 288 yards and that five balls are independently tested. What is the probability that at least one of the five balls will exceed the maximum distance of 291.2 yards?
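For question 2, the standard trick is the complement: with independent tests, P(at least one exceeds) = 1 − P(none exceed) = 1 − (1 − p)^5, where p is the single-ball probability from question 1. A sketch continuing in the same stdlib setup:

```python
from statistics import NormalDist  # Python 3.8+ standard library

dist = NormalDist(mu=288, sigma=2.8)
p = 1 - dist.cdf(291.2)          # single-ball P(X > 291.2), from question 1

# Five independent tests; "at least one" via the complement rule.
p_at_least_one = 1 - (1 - p) ** 5
print(round(p_at_least_one, 4))
```

This comes out close to one half, which is a nice sanity check: a ~13% per-ball failure rate compounds quickly over five independent trials.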
3.) If the manufacturer wants to be 99 percent certain that a randomly selected ball will not exceed the maximum distance of 291.2 yards, what is the largest mean that can be used in the manufacturing process?
No ideas? Wanna show me who's boss lol :P Thanks in advance!