1) It was your goal to INCREASE variance?
2) What "Industry Standards"?
You have been working on a CPU utilization program that had a variance of 80 clock ticks. After you made the changes, you took a random sample of size 26, and the CPU utilization program now appears to be operating with a variance of 100 clock ticks. Based on your sample and industry standards, do you believe the hypothesis that the changes increased the program's variance?
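This is a one-sided chi-square test for a single variance. A minimal sketch, assuming α = 0.05 as the significance level (the question's "industry standards" are not actually specified, so that choice is an assumption):

```python
from scipy.stats import chi2

sigma0_sq = 80.0   # hypothesized (original) variance, in clock ticks
s_sq = 100.0       # sample variance after the changes
n = 26             # sample size
alpha = 0.05       # ASSUMED significance level; not given in the question

df = n - 1
# Test statistic for H0: sigma^2 = 80 vs H1: sigma^2 > 80
chi_sq = df * s_sq / sigma0_sq        # 25 * 100 / 80 = 31.25
critical = chi2.ppf(1 - alpha, df)    # upper-tail critical value, ~37.65
p_value = chi2.sf(chi_sq, df)         # P(chi2_25 >= 31.25)

print(f"statistic = {chi_sq:.2f}, critical = {critical:.2f}, p = {p_value:.3f}")
# The statistic falls below the critical value, so at alpha = 0.05 we fail
# to reject H0: the sample does not show a statistically significant
# increase in variance.
```

Note the conclusion depends on the assumed α; at a looser level (say α = 0.20) the same data would lead to rejecting H0, which is exactly why reply 2) asks which "industry standards" are meant.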