Does anybody know how to do this question? Thank you for any help
Consider the model
y_i = α + β·x_i + u_i
where u_i is a disturbance that is independently and identically normally distributed with mean 0 and variance σ^2, and x_i is a stochastic regressor, independently and identically normally distributed with mean 0 and variance λ^2.
a) Suppose a researcher mistakenly regresses x on y, that is, runs the regression
x_i = γ + δ·y_i + v_i
(with its own intercept γ and disturbance v_i), and uses 1/δ̂ as an estimator for β, where δ̂ is the OLS estimate of δ. Prove that 1/δ̂ is not a consistent estimator for β.
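Not a full proof, but you can see the inconsistency numerically. A quick simulation sketch (the parameter values α = 0, β = 2, σ = λ = 1 are my own illustrative choices, not from the question):

```python
import numpy as np

# Simulate the model y = alpha + beta*x + u with x ~ N(0, lam^2), u ~ N(0, sigma^2),
# then run the mistaken reverse regression of x on y.
# Parameter values below are illustrative assumptions.
rng = np.random.default_rng(0)
n = 1_000_000
alpha, beta, sigma, lam = 0.0, 2.0, 1.0, 1.0

x = rng.normal(0.0, lam, n)    # stochastic regressor, N(0, lam^2)
u = rng.normal(0.0, sigma, n)  # disturbance, N(0, sigma^2)
y = alpha + beta * x + u

# OLS slope of the reverse regression of x on y
yc = y - y.mean()
delta_hat = (yc * (x - x.mean())).sum() / (yc ** 2).sum()

# plim delta_hat = Cov(x,y)/Var(y) = beta*lam^2 / (beta^2*lam^2 + sigma^2),
# so 1/delta_hat -> beta + sigma^2/(beta*lam^2) = 2.5 here, not beta = 2.
print(1.0 / delta_hat)
```

The probability limit of 1/δ̂ exceeds β by σ²/(β·λ²) whenever σ² > 0, which is the inconsistency the question asks you to establish formally.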
b) What is the asymptotic distribution of the least squares estimator when the correct regression of y on x is run?
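For part (b), the standard result to check against is that √n(β̂ − β) converges to N(0, σ²/λ²) when x is stochastic with variance λ². A Monte Carlo sketch (again with my own illustrative parameter values) that verifies the asymptotic variance:

```python
import numpy as np

# Check that sqrt(n)*(beta_hat - beta) is approximately N(0, sigma^2/lam^2)
# when y is correctly regressed on x. Parameter values are illustrative.
rng = np.random.default_rng(1)
n, reps = 500, 2000
alpha, beta, sigma, lam = 0.0, 2.0, 1.0, 1.0

stats = np.empty(reps)
for r in range(reps):
    x = rng.normal(0.0, lam, n)
    u = rng.normal(0.0, sigma, n)
    y = alpha + beta * x + u
    xc = x - x.mean()
    beta_hat = (xc * (y - y.mean())).sum() / (xc ** 2).sum()
    stats[r] = np.sqrt(n) * (beta_hat - beta)

# Sample variance should be close to sigma^2 / lam^2 = 1 here.
print(stats.mean(), stats.var())
```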