Let $\displaystyle X_1,\dots,X_n$ be a random sample of size $\displaystyle n$ from a normal distribution $\displaystyle N(\mu, \sigma^2)$, where both $\displaystyle \mu$ and $\displaystyle \sigma^2$ are unknown. Find the minimum value of $\displaystyle n$ that guarantees, with probability $\displaystyle r$, that a $\displaystyle 100(1-\alpha)\%$ confidence interval for $\displaystyle \mu$ will have length no more than $\displaystyle k\sigma$, where $\displaystyle k>0$ is a specified constant.
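
A sketch of one possible setup, assuming the usual two-sided $\displaystyle t$ interval $\displaystyle \bar X \pm t_{\alpha/2,\,n-1}\,S/\sqrt{n}$ is intended: its length is $\displaystyle 2\,t_{\alpha/2,\,n-1}\,S/\sqrt{n}$, and since $\displaystyle (n-1)S^2/\sigma^2 \sim \chi^2_{n-1}$, the requirement that the length not exceed $\displaystyle k\sigma$ with probability $\displaystyle r$ becomes

$\displaystyle P\!\left(2\,t_{\alpha/2,\,n-1}\,\frac{S}{\sqrt{n}} \le k\sigma\right) = P\!\left(\chi^2_{n-1} \le \frac{n(n-1)k^2}{4\,t_{\alpha/2,\,n-1}^2}\right) \ge r,$

so the minimum $\displaystyle n$ would be the smallest integer for which this chi-square probability reaches $\displaystyle r$, found by stepping through successive values of $\displaystyle n$.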