The question is: A normal distribution has mean mu and variance sigma^2 > 0, and its 46th percentile equals 20*sigma. What is mu in terms of sigma? Using a standard normal table I get mu ≈ 20.1*sigma (since the 46th percentile of the standard normal is z ≈ -0.10, so mu + z*sigma = 20*sigma gives mu ≈ 20.1*sigma), but I need to obtain the same result by integration.
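Since the normal CDF has no closed form, "solving by integration" in practice means integrating the standard normal density numerically and then root-finding for the quantile. A minimal sketch (Simpson's rule for the integral, bisection for the root; the function names and tolerances are my own choices, not from the question):

```python
import math

def pdf(t):
    # standard normal density
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def cdf(z, n=2000):
    # Phi(z) = 1/2 + integral from 0 to z of pdf(t) dt, via Simpson's rule
    h = z / n  # n must be even for Simpson's rule
    s = pdf(0.0) + pdf(z)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * pdf(i * h)
    return 0.5 + s * h / 3

def quantile(p, lo=-10.0, hi=10.0, tol=1e-10):
    # bisection: find z with cdf(z) = p
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

z46 = quantile(0.46)        # z with Phi(z) = 0.46, about -0.1004
mu_over_sigma = 20 - z46    # from mu + z46*sigma = 20*sigma
print(z46, mu_over_sigma)
```

This reproduces the table-based answer: z46 comes out near -0.10, so mu ≈ 20.1*sigma.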