Regression dilution / errors in variables problem

Consider the following system:

y = b0 + x + b2 y(t-1) + e1

x = b3 x(t-1) + e2

where e1 and e2 are normal, mean-zero error terms.
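
For concreteness, here is a minimal simulation sketch of this system in Python (the particular values of b0, b2, b3 and the error standard deviations are my own illustrative assumptions, not part of the question):

import numpy as np

rng = np.random.default_rng(0)

b0, b2, b3 = 0.5, 0.4, 0.7    # assumed coefficient values (illustrative only)
sd1, sd2 = 1.0, 1.0           # assumed standard deviations of e1 and e2
T = 10000

y = np.zeros(T)
x = np.zeros(T)
for t in range(1, T):
    x[t] = b3 * x[t-1] + rng.normal(0.0, sd2)              # x = b3 x(t-1) + e2
    y[t] = b0 + x[t] + b2 * y[t-1] + rng.normal(0.0, sd1)  # y = b0 + x + b2 y(t-1) + e1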

Now we wish to find d E(y) / d y(t-1), where E(y) is the expected value of y.

Taking expectations of the first equation and differentiating with respect to y(t-1) gives

d E(y) / d y(t-1) = b2 + d E(x) / d y(t-1)

But how do we find

d E(x) / d y(t-1) ?

If not for the b2 y(t-1) term in the first equation, we could find it via the standard regression dilution equation.
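
To make that reference concrete: by the standard regression dilution (attenuation) result, if the b2 y(t-1) term were absent, y would just be x plus independent noise, and the linear-projection slope of x on y would be the reliability ratio var(x) / (var(x) + var(e1)). A quick numerical check of that standard result in Python (the variances below are assumed for illustration; this is not a solution to the question above):

import numpy as np

rng = np.random.default_rng(1)

n = 200000
var_x, var_e1 = 2.0, 1.0                           # assumed variances (illustrative only)
x = rng.normal(0.0, np.sqrt(var_x), n)             # the underlying regressor
y = 0.5 + x + rng.normal(0.0, np.sqrt(var_e1), n)  # y = b0 + x + e1, i.e. b2 = 0

slope = np.polyfit(y, x, 1)[0]                     # OLS slope from regressing x on y
reliability = var_x / (var_x + var_e1)             # dilution factor var(x) / (var(x) + var(e1))

print(slope, reliability)                          # both should be close to 2/3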
 