An electric utility is examining the relationship between temperature and electricity use in its service region on warm days. The utility has bivariate data detailing the maximum temperature (denoted by x, in degrees Fahrenheit) and the electricity use (denoted by y, in thousands of kilowatt hours) for a random sample of 28 warm days. For these data, the utility has computed the least-squares regression equation to be ŷ = 58.67 + 3.01x.

Tomorrow's forecast high temperature is 80 degrees Fahrenheit. With this in mind, utility managers have used the regression equation to predict tomorrow's electricity use, but they're also interested in both a prediction interval for the electricity use and a confidence interval for the mean electricity use on days for which the maximum temperature is 80 degrees Fahrenheit. The managers have computed the following for their data:

mean square error (MSE): 776.23

the value of the expression 1/n + (80 − x̄)² / Σ(xᵢ − x̄)²: 0.0516

First, I have to find the lower and upper limits of the 95% prediction interval for tomorrow's electricity use.

I know the first step is to find the critical value t.05(26) = 1.706.

Then the margin of error is (1.706)(√776.23)(√0.0516).

My first problem: is it always 0.0516 under the square root? I thought you added 1 sometimes.
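To see what that 1 actually changes, here is a quick Python sketch of both versions side by side (assuming my t value of 1.706 is right, that 0.0516 is the quantity that goes under the square root, and that the point prediction at 80 degrees just plugs into the regression equation; whether the 1 belongs is exactly what I'm unsure about):

```python
import math

# Quantities given in the problem
t_crit = 1.706   # the t value I looked up (this may be part of my mistake)
mse = 776.23     # mean square error
expr = 0.0516    # the 1/n + (80 - xbar)^2 / Sxx value the problem supplied

# Point prediction at x = 80 from the regression equation
y_hat = 58.67 + 3.01 * 80

# Margin without the extra 1 (what I computed)
margin_no1 = t_crit * math.sqrt(mse) * math.sqrt(expr)

# Margin with the extra 1 (what I think a prediction interval might use)
margin_with1 = t_crit * math.sqrt(mse) * math.sqrt(1 + expr)

print(f"y_hat = {y_hat:.2f}")
print(f"without the 1: margin {margin_no1:.2f}, "
      f"interval ({y_hat - margin_no1:.2f}, {y_hat + margin_no1:.2f})")
print(f"with the 1:    margin {margin_with1:.2f}, "
      f"interval ({y_hat - margin_with1:.2f}, {y_hat + margin_with1:.2f})")
```

The two margins come out very different, so which version is correct clearly matters for the answer.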

The problem goes on to ask: Consider, but do not actually compute, the 95% confidence interval for the mean electricity use when the maximum temperature is 80 degrees. How would that confidence interval compare to the prediction interval computed before? I'm lost here.

Then: For the maximum temperature values in this sample, 71 degrees is more extreme than 80 degrees; that is, 71 is farther from the sample mean than 80 is. How would the 95% confidence interval for the mean electricity use when the maximum temperature is 80 degrees compare to the 95% confidence interval for the mean electricity use when the maximum temperature is 71 degrees?

I know it looks like a lot, but I really need help, and the last two parts don't even involve computation. Thanks so much!