There are two variables, X1 and X2, with sample size N = 1000. X1 has mean 72 and standard deviation 12; X2 has mean 62 and standard deviation 18. The correlation coefficient between the two variables is 0.600.
How can this information be used to find a regression equation X2 = B0 + B1(X1) with slope B1 and intercept B0?
Thanks for any help
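For what it's worth, the standard least-squares formulas give both coefficients straight from these summary statistics: the slope is B1 = r * sd(X2)/sd(X1), and the intercept makes the line pass through the point of means. A minimal Python sketch plugging in the numbers above:

```python
# Summary statistics from the problem
r = 0.6                  # correlation between X1 and X2
mean_x, sd_x = 72, 12    # X1: mean and standard deviation
mean_y, sd_y = 62, 18    # X2: mean and standard deviation

# Slope of the regression of X2 on X1
b1 = r * sd_y / sd_x

# Intercept: forces the line through (mean of X1, mean of X2)
b0 = mean_y - b1 * mean_x

print(round(b1, 6), round(b0, 6))  # slope 0.9, intercept -2.8
```

So the fitted line here is X2 = -2.8 + 0.9 * X1.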
ah, thanks a bunch.
If I remember correctly, cov(X1, X2) = (correlation coefficient) * (stddev of X1) * (stddev of X2).
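That's right, and it gives an equivalent route to the slope, since B1 = cov(X1, X2) / var(X1). A quick check with the numbers from the problem:

```python
r, sd_x, sd_y = 0.6, 12, 18

# cov(X1, X2) = r * sd(X1) * sd(X2)
cov = r * sd_x * sd_y        # 0.6 * 12 * 18 = 129.6

# Slope via covariance: B1 = cov(X1, X2) / var(X1)
b1 = cov / sd_x**2           # 129.6 / 144 = 0.9, same as r * sd(X2)/sd(X1)
```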
The first equation is basically saying that once we know the slope B1, the intercept B0 is chosen so that the line passes through the point of averages (mean of X1, mean of X2), right?
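Yes, exactly: B0 = mean(X2) - B1 * mean(X1) is just the line-through-the-means condition solved for the intercept. A one-line numeric check with this problem's values:

```python
mean_x, mean_y = 72, 62
b1 = 0.9                     # slope from r * sd(X2)/sd(X1)
b0 = mean_y - b1 * mean_x    # -2.8

# Plugging the mean of X1 into the fitted line recovers the mean of X2
fitted_at_mean = b0 + b1 * mean_x
```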
Related question: how would one find the root mean square error for this problem? That's the topic we're covering in class in a few days, and it would be nice to get a jump on preparation.
I know how to find the RMSE from an actual data set, but I'm not sure how to do it from summary statistics like these.
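You don't need the raw data: for the least-squares line, the r.m.s. error of the residuals is sd(X2) * sqrt(1 - r^2). (This assumes the same n-divisor convention as the standard deviations above; texts that divide by n - 2 for the regression standard error differ by a factor of sqrt(n/(n-2)), which is negligible at N = 1000.) With this problem's numbers:

```python
from math import sqrt

r, sd_y = 0.6, 18

# RMSE of the regression = sd(X2) * sqrt(1 - r^2)
rmse = sd_y * sqrt(1 - r**2)   # 18 * sqrt(0.64) = 18 * 0.8 = 14.4
```

Intuitively, sqrt(1 - r^2) is the fraction of X2's spread left over after the line has explained what it can.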