Task 1:

When linearly interpolating time series, a missing value is often replaced with the average of the two neighbouring values. Suppose that we examine an AR(1) process

y_t + a y_{t-1} = e_t,

where e_t is Gaussian white noise with variance σ², and that we wish to reconstruct a missing value using the two observations just before and just after it, i.e., reconstructing y_t using y_{t-1} and y_{t+1}.

a) Compute the reconstruction of the missing value that minimises the variance of the reconstruction error.

b) Compute the minimum variance of the reconstruction error, as well as the improvement in error variance obtained by using the optimal reconstruction from (a) compared to linear interpolation.

c) Are there AR(1) processes for which linear interpolation is optimal? Conversely, for which stationary AR(1) processes does linear interpolation perform the worst?

d) Suppose that you wish to use the immediately surrounding measurements symmetrically when reconstructing a missing value, i.e., reconstructing y_t using y_{t-m}, …, y_{t-1}, y_{t+1}, …, y_{t+m}. Give the resulting linear system of equations used to determine the coefficients of the reconstructor for a general value of m.
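A quick Monte Carlo sanity check for parts (a) and (b): simulate a long stationary AR(1) realisation (with the illustrative choices a = -0.5 and σ = 1, neither of which is given in the task), fit the best symmetric linear reconstructor ŷ_t = c (y_{t-1} + y_{t+1}) by least squares, and compare its empirical error variance with that of plain linear interpolation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Model y_t + a*y_{t-1} = e_t, i.e. y_t = -a*y_{t-1} + e_t.
a = -0.5          # illustrative choice, so the AR coefficient is -a = 0.5
sigma = 1.0
n = 100_000

# Simulate a long realisation; discard a burn-in to reach stationarity.
e = rng.normal(0.0, sigma, size=n + 100)
y = np.zeros(n + 100)
for t in range(1, n + 100):
    y[t] = -a * y[t - 1] + e[t]
y = y[100:]

prev, mid, nxt = y[:-2], y[1:-1], y[2:]

# Linear interpolation: replace y_t by the average of its neighbours.
err_interp = mid - 0.5 * (prev + nxt)

# Empirically fit the best symmetric linear reconstructor
# y_hat_t = c * (y_{t-1} + y_{t+1}) by least squares.
s = prev + nxt
c = (s @ mid) / (s @ s)
err_opt = mid - c * s

print(f"interpolation error variance: {err_interp.var():.4f}")
print(f"fitted coefficient c:         {c:.4f}")
print(f"optimal error variance:       {err_opt.var():.4f}")
```

The fitted coefficient and the two error variances can then be checked against the closed-form answers derived in (a) and (b).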

Task 2:

Let y(t) = a1(t) y(t-1) + a2(t) y(t-2) + e(t),

where e(t) is normally distributed white noise with unit variance, and assume that

a1(t+1) = a1(t) + w1(t),

a2(t+1) = a2(t) + w2(t),

where w1(t) and w2(t) are mutually independent sequences of normally distributed white noise with variances σ1² and σ2².

a) Give an optimal recursive reconstruction of a1(t) and a2(t).

b) Determine the one-step predictor ŷ(t+1 | t) and the variance of the prediction error.
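This is the standard state-space setup with state x(t) = [a1(t), a2(t)]' and time-varying observation row H(t) = [y(t-1), y(t-2)], so a Kalman filter gives the recursive estimate. A minimal numerical sketch, with illustrative (assumed) values for σ1², σ2², and the initial coefficients, none of which are specified in the task:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative variances (assumptions, not given in the task).
sigma1_sq, sigma2_sq = 1e-5, 1e-5
Q = np.diag([sigma1_sq, sigma2_sq])   # random-walk noise covariance
R = 1.0                               # Var e(t) = 1 (unit variance)

# Simulate the time-varying AR(2) process with illustrative initial a1, a2.
n = 500
a = np.array([0.5, -0.3])
y = np.zeros(n)
a_true = np.zeros((n, 2))
for t in range(2, n):
    a_true[t] = a
    y[t] = a[0] * y[t - 1] + a[1] * y[t - 2] + rng.normal()
    a += rng.multivariate_normal([0, 0], Q)

# Kalman filter with state x(t) = [a1(t), a2(t)]':
#   x(t+1) = x(t) + w(t),       Cov w = Q
#   y(t)   = H(t) x(t) + e(t),  H(t) = [y(t-1), y(t-2)], Var e = R
x = np.zeros(2)                       # filtered estimate x(t|t)
P = 10.0 * np.eye(2)                  # large initial uncertainty
for t in range(2, n):
    # Time update (random walk: predicted state is unchanged).
    P = P + Q
    # Measurement update with H(t) = [y(t-1), y(t-2)].
    H = np.array([y[t - 1], y[t - 2]])
    S = H @ P @ H + R                 # innovation variance
    K = P @ H / S                     # Kalman gain
    x = x + K * (y[t] - H @ x)
    P = P - np.outer(K, H @ P)

# One-step prediction: y_hat(t+1|t) = [y(t), y(t-1)] x(t|t),
# with error variance H(t+1) (P(t|t) + Q) H(t+1)' + R.
H_next = np.array([y[-1], y[-2]])
y_pred = H_next @ x
pred_var = H_next @ (P + Q) @ H_next + R
print("estimated a(t):", x, " true a(t):", a_true[-1])
print("y_hat(t+1|t) =", y_pred, " prediction error variance =", pred_var)
```

Because the coefficients follow a random walk, the state prediction is simply x(t+1|t) = x(t|t), which is what makes the one-step predictor in (b) a direct by-product of the filter in (a).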