You are given a dependent variable y and an independent variable x, such that the regression is (x and y are endogenous):

$\displaystyle y=\beta_{0} + \beta_{1}x + \epsilon$

There is the possibility that the coefficients estimated from this model are biased because x and y are measured with error.

You are also given the one-period-lagged values of x and y (x_1 and y_1 respectively).

How would you go about fixing this error?

May I ask if my approach is correct?

I would use the lagged data to estimate the true values of x and y, so that the measurement errors are no longer correlated with the regressors. The model would then look as follows:

$\displaystyle y_1=\beta_{0} + \beta_{1}x_1 + \epsilon$

$\displaystyle x = x_1 + u_{1}$

$\displaystyle y = y_{1} + u_{2}$

Then this would be equivalent to running a two-stage least squares (2SLS) regression with x_1 and y_1 as instrumental variables?
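To make the question concrete, here is a minimal numpy-only simulation of the idea (all names, the AR(1) persistence parameter rho, and the error variances are my own assumptions, not part of the question): the true regressor is persistent, both x and y are observed with classical measurement error, and the lagged observed x is used as the instrument in a hand-rolled 2SLS.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000
beta0, beta1 = 1.0, 2.0

# True regressor follows an AR(1), so its lag is correlated with its
# current value -- this is what makes the lag a usable instrument.
rho = 0.8
e = rng.normal(size=n + 1)
x_star = np.empty(n + 1)
x_star[0] = e[0]
for t in range(1, n + 1):
    x_star[t] = rho * x_star[t - 1] + e[t]

y_star = beta0 + beta1 * x_star + rng.normal(size=n + 1)

# Observed series: true values plus independent measurement error.
x_obs = x_star + rng.normal(size=n + 1)
y_obs = y_star + rng.normal(size=n + 1)

# Current observation and its one-period lag.
x, x_lag = x_obs[1:], x_obs[:-1]
y = y_obs[1:]

def ols(X, y):
    """Least-squares coefficients of y on the columns of X."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Naive OLS of observed y on observed x: attenuated by measurement error.
b_ols = ols(np.column_stack([np.ones(n), x]), y)

# 2SLS: first stage regresses x on the instrument x_lag,
# second stage regresses y on the first-stage fitted values.
Z = np.column_stack([np.ones(n), x_lag])
x_hat = Z @ ols(Z, x)
b_iv = ols(np.column_stack([np.ones(n), x_hat]), y)

print(f"true beta1 = {beta1}")
print(f"OLS  beta1 = {b_ols[1]:.3f}  (biased toward zero)")
print(f"2SLS beta1 = {b_iv[1]:.3f}  (close to the true value)")
```

The key requirement is that the lagged measurement error u_{1,t-1} is uncorrelated with the current error u_{1,t}; if the measurement errors are serially correlated, the lag fails the exclusion restriction and the estimate is still inconsistent. In practice one would use a packaged estimator (e.g. `IV2SLS` from the `linearmodels` package) rather than the two manual regressions sketched here, not least to get correct standard errors.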

Thank you in advance for any feedback