Hey,

I'm having trouble with a question that asks me to find an estimator for \beta_1, show that it is unbiased, and find its variance. The data set consists of the points

 (1,Y_1) , (2,Y_2) , (3,Y_3) , (4,Y_4), (5,Y_5)

We want to fit the simple linear regression model:

 Y_i = \beta_0 + \beta_1 i + \epsilon_i

where \epsilon_i \sim N(0, \sigma^2).

The question asks me to estimate \beta_1 by joining each consecutive pair of points, writing down the slope of each line segment formed, and then taking the average of those slopes as the estimate of \beta_1:

\hat\beta_1 = \frac{1}{4}\sum_{i=1}^{4}\gamma_i

where \gamma_i is the slope of the segment joining (i, Y_i) and (i+1, Y_{i+1}).
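
In code, that averaged-slopes estimator is just the following (a minimal sketch in Python; `ys` is a list holding Y_1 through Y_5):

    def slope_average(ys):
        # gamma_i = (Y_{i+1} - Y_i) / 1, since the x-values 1,...,5 are one unit apart
        slopes = [ys[i + 1] - ys[i] for i in range(len(ys) - 1)]
        return sum(slopes) / len(slopes)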

My attempt:

Each slope \gamma_i is:
 \gamma_i = \frac{Y_{i+1} - Y_i}{(i+1)-i} = Y_{i+1} - Y_i

Thus the sum telescopes:

\sum\gamma_i = (Y_2 - Y_1) + (Y_3 - Y_2) + (Y_4 - Y_3) + (Y_5 - Y_4) = Y_5 - Y_1

This leads me to the estimate \hat\beta_1 = \frac{Y_5 - Y_1}{4}.
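
To sanity-check the algebra, I also ran a quick Monte Carlo experiment (the true values beta0 = 1, beta1 = 2, sigma = 1 are just made up for the test):

    import random

    beta0, beta1, sigma = 1.0, 2.0, 1.0   # made-up true parameters for the check
    n_reps = 100_000
    total = 0.0
    for _ in range(n_reps):
        # simulate Y_i = beta0 + beta1*i + eps_i for i = 1,...,5
        ys = [beta0 + beta1 * i + random.gauss(0, sigma) for i in range(1, 6)]
        total += (ys[4] - ys[0]) / 4      # \hat\beta_1 = (Y_5 - Y_1)/4
    print(total / n_reps)                 # should land near beta1 = 2 if unbiased

The printed average comes out very close to 2, which suggests the estimator is unbiased, but I'd like to see how to show that algebraically.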

Can someone tell me if I'm on the right track, and how I can determine the bias of this estimator? Aren't the observed values just constants...? Thanks