## Help with a confidence interval question (statistics)

Let $X_1, \ldots, X_n$ be a random sample of size $n$ from a normal distribution $N(\mu, \sigma^2)$, where both $\mu$ and $\sigma^2$ are unknown. Find the minimum value of $n$ that guarantees, with probability $r$, that a $100(1-\alpha)\%$ confidence interval for $\mu$ will have length no more than $k\sigma$, where $k > 0$ is a specified constant.
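One way to set this up (a sketch of my own reasoning, not part of the problem statement): with $\sigma^2$ unknown, the standard $100(1-\alpha)\%$ interval is $\bar{X} \pm t_{1-\alpha/2,\,n-1}\, S/\sqrt{n}$, so its length is $2\, t_{1-\alpha/2,\,n-1}\, S/\sqrt{n}$, which is random through $S$. Requiring $P(\text{length} \le k\sigma) \ge r$ and using $(n-1)S^2/\sigma^2 \sim \chi^2_{n-1}$ turns the condition into a chi-square CDF inequality that can be checked numerically for increasing $n$. A minimal Python sketch is below; the values $r = 0.90$, $\alpha = 0.05$, $k = 0.5$ are illustrative, not taken from the problem.

```python
from scipy import stats

def min_n(r, alpha, k, n_max=100_000):
    """Smallest n such that P(CI length <= k*sigma) >= r.

    The t-based CI has length 2 * t_{1-alpha/2, n-1} * S / sqrt(n).
    Length <= k*sigma  iff  (n-1)S^2/sigma^2 <= n(n-1)k^2 / (4 t^2),
    and (n-1)S^2/sigma^2 ~ chi-square with n-1 degrees of freedom,
    so the probability is a chi-square CDF evaluated at that bound.
    """
    for n in range(2, n_max + 1):
        t = stats.t.ppf(1 - alpha / 2, df=n - 1)          # t critical value
        bound = n * (n - 1) * k**2 / (4 * t**2)           # chi-square cutoff
        if stats.chi2.cdf(bound, df=n - 1) >= r:
            return n
    return None  # no n up to n_max satisfies the requirement

# Illustrative example: 95% CI, length <= 0.5*sigma with probability 0.90
print(min_n(r=0.90, alpha=0.05, k=0.5))
```

Note that the cutoff depends on $n$ through both the $t$ quantile and the chi-square degrees of freedom, which is why a simple closed form is unlikely and a search over $n$ is the natural approach.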