The estimators for the slope parameter and intercept parameter in a simple linear regression model were obtained in class using the method of Least Squares. The resulting estimators turned out to be unbiased. That is E[Alpha_hat]= Alpha and E[Beta_hat] = Beta. Now, suppose that in order to save some time a careless researcher decides to estimate the slope parameter Beta by connecting the first point collected (X1, Y1) by a straight line to the last point collected (Xn, Yn) and then using the slope of this line to estimate Beta. Suppose we call this estimator Beta*. Is Beta* an unbiased estimator of Beta? That is, is it true that E [Beta*] =Beta?
Select the answer from the following:
a. The estimator is not unbiased and the bias is σ².
b. The estimator is not unbiased and the bias is the mean square error, MSE.
c. The estimator is not unbiased and the bias is the sample variance of the n-2 points not included in the calculation of the slope.
d. The estimator is unbiased.
e. The estimator is not unbiased, but the bias cannot be obtained in closed form. It depends on where among the scatter plot of points the two points (X1, Y1) and (Xn, Yn) happen to lie.
f. It cannot be determined from the information given.
I've never seen this before and I am assuming that the model is still Y_i = Alpha + Beta X_i + e_i.
Now if we assume E[e_i] = 0 and we introduce the estimator Beta* = (Y_n - Y_1)/(X_n - X_1),
we have E[Y_i] = Alpha + Beta X_i for i=1 and i=n, actually for any two points we select.
So, if the x's are fixed we have
E[Beta*] = (E[Y_n] - E[Y_1])/(X_n - X_1) = Beta (X_n - X_1)/(X_n - X_1) = Beta.
So Beta* is unbiased, and the answer would be d.
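Since the algebra gives E[Beta*] = Beta, a quick Monte Carlo check should agree. Here is a minimal sketch in Python/NumPy; the design points, parameter values, and number of replications are illustrative assumptions, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed fixed design points and true parameters (for illustration only).
x = np.linspace(0.0, 10.0, 20)
alpha, beta, sigma = 2.0, 1.5, 1.0

n_reps = 20000
beta_star = np.empty(n_reps)
for r in range(n_reps):
    # Simulate Y_i = Alpha + Beta x_i + e_i with E[e_i] = 0.
    y = alpha + beta * x + rng.normal(0.0, sigma, size=x.size)
    # "Careless" estimator: slope of the line through the first and last points.
    beta_star[r] = (y[-1] - y[0]) / (x[-1] - x[0])

print(beta_star.mean())  # should be close to beta = 1.5
```

The average of Beta* across replications lands near the true Beta, consistent with unbiasedness; its variance is larger than the least-squares slope's, which is the real cost of the shortcut.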