I understand how to construct a confidence interval in general, but not one for a regression coefficient. I also know how to calculate F-values from an ANOVA table (with SS, MS, etc. columns), but I can't work out how to do either of these from the R output below.
Anyone?
1)
y = crime rate = Annual number of crimes in county per 1000 population
x1 = education = Percentage of adults in county with at least a high school education
x2 = urbanization = Percentage in county living in an urban environment
Multivariate regression output from R is given below (note that the numbers are not the same as in the text book):
> summary(fitmc)
Call:
lm(formula = y ~ x1 + x2)
Residuals:
Min 1Q Median 3Q Max
-34.693 -15.742 -6.226 15.812 50.678
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) 43.181 20.720 2.084 0.0411 *
x1 -0.5834 0.4725 -1.235 0.2214
x2 0.6825 0.1232 5.539 6.11e-07 ***
---
Residual standard error: 19.32 on 64 degrees of freedom
Multiple R-squared: 0.5314, Adjusted R-squared: 0.4549
F-statistic: 29.21 on 2 and 64 DF, p-value: 1.19e-09
What is the F-value for the overall F-test of whether x1 and/or x2 explain at least part of the variation in y?
2)
Data collected from the state of Florida includes:
y = crime rate = Annual number of crimes in county per 1000 population
x1 = education = Percentage of adults in county with at least a high school education
x2 = urbanization = Percentage in county living in an urban environment
Multivariate regression output from R is given below (note that the numbers are not the same as in the text book):
> summary(fitmc)
Call:
lm(formula = y ~ x1 + x2)
Residuals:
Min 1Q Median 3Q Max
-34.693 -15.742 -6.226 15.812 50.678
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) 41.4353 18.5828 xxxxx xxxxx
x1 -0.8811 0.4840 xxxxx xxxxx
x2 0.5250 0.0573 xxxxx xxxxxx
---
Residual standard error: 15.40 on 64 degrees of freedom
Multiple R-squared: 0.5853, Adjusted R-squared: 0.4549
F-statistic: 29.21 on 2 and 64 DF, p-value: 1.19e-09
Find the lower bound for a 95% confidence interval for the coefficient for urbanization.
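A 95% CI for a single coefficient is estimate ± t(0.975, df) × SE, using the residual degrees of freedom (64 here). A minimal Python sketch using the urbanization (x2) row from the output above:

```python
from scipy.stats import t

# Values read from the R output in question 2 (x2 = urbanization row)
estimate = 0.5250   # coefficient for urbanization
se = 0.0573         # its standard error
df = 64             # residual degrees of freedom

# Two-sided 95% CI: estimate +/- t(0.975, df) * SE
t_crit = t.ppf(0.975, df)          # ~1.998 for df = 64
lower = estimate - t_crit * se
upper = estimate + t_crit * se
print(round(lower, 4), round(upper, 4))
```

In R itself, `confint(fitmc)` would report the same interval; the hand calculation above just makes the t-critical-value step explicit.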