# Math Help - Stationarity of AR(1) time series

1. ## Stationarity of AR(1) time series

I managed to talk myself into a complete mess trying to answer this thread.

This leads me to some questions on the stationarity of time series.

Wikipedia claims that the following process is weakly stationary (which requires, among other things, that it have the same expected value at all times):

$x_t = a + b x_{t-1} + e_t$

where $a$ is constant, $e_t$ is white noise and $|b| < 1$.

Is this right? It seems to me that the expectation depends on $t$.

The only way I can see this being stationary is if $a = 0$, so that $E(x_t) = 0$; but the Wikipedia article claims that the mean is $a/(1-b)$ everywhere. Am I missing something?
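For what it's worth, the claimed mean is easy to sanity-check numerically. A small sketch simulating a long AR(1) path and comparing the sample mean to $a/(1-b)$ (the values $a = 2$, $b = 0.5$ and standard-normal noise are illustrative choices, not from the article):

```python
import numpy as np

# Simulate x_t = a + b*x_{t-1} + e_t with |b| < 1 and compare the
# long-run sample mean to the claimed stationary mean a / (1 - b).
rng = np.random.default_rng(0)
a, b, n = 2.0, 0.5, 200_000

x = np.empty(n)
x[0] = a + rng.standard_normal()          # finite history: x_0 = a + e_0
for t in range(1, n):
    x[t] = a + b * x[t - 1] + rng.standard_normal()

# Discard a burn-in so any transient from x_0 has died out
sample_mean = x[1_000:].mean()
print(sample_mean, a / (1 - b))           # both close to 4.0
```

The sample mean settles near $a/(1-b)$, which at least agrees with the Wikipedia claim for large $t$.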

2. Hello,

Actually, a process being stationary implies that the expectation is constant, not necessarily equal to 0.
I said $= 0$ in the other thread because I didn't see the 50 term.

But here it is indeed $a/(1-b)$.

3. edit: nevermind

I see the mistake in my first post, phew.

But I'm still not seeing how the series can be stationary with $a \neq 0$, since the expectation is different for all of the first (finitely many) terms. Derivation:

Assuming I made no algebra error, the only difference between this and the derivations claiming to show the series is stationary is that I gave my series a finite history. (With an infinite history, the geometric progression converges to its limit, which is independent of $t$, so the expectation becomes independent of $t$.)

Am I wrong to assume the series has a finite history? If so, why?

latex if i need it later:
Spoiler:

y_t = \left\{
\begin{array}{l l}
a + by_{t-1} + e_t & \text{if } t \geq 1 \\
a + e_t & \text{if } t = 0\\
\end{array} \right.

\\ . \\

y_t = a + by_{t-1} + e_t \\
y_t = a + e_t + b(a + by_{t-2} + e_{t-1})

y_t = a(1+b) + e_t + be_{t-1} + b^2y_{t-2}

$\text{ ... for any k less than t}$

\displaystyle
y_t = a\sum_{n=0}^k b^n + \sum_{n=0}^k b^n e_{t-n} + b^{k+1} y_{t-k-1} \\

\\

$\text {let k=t-1}$
\\

\displaystyle
y_t = a\sum_{n=0}^{t-1} b^n + \sum_{n=0}^{t-1} b^n e_{t-n} + b^{t} y_{0} \\

\displaystyle
y_t = a\sum_{n=0}^{t-1} b^n + \sum_{n=0}^{t-1} b^n e_{t-n} + b^{t}(a + e_0) \\

$\text{incorporate last terms into the existing sums}$

\displaystyle
y_t = a\sum_{n=0}^{t} b^n + \sum_{n=0}^{t} b^n e_{t-n} \\

$\text{first sum is a geometric progression}$

\displaystyle
y_t = a \frac{1-b^{t+1}}{1-b} + \sum_{n=0}^{t} b^n e_{t-n} \\

$\text{take expectation}$

\displaystyle
E(y_t) = a \frac{1-b^{t+1}}{1-b} + 0 \\

$\text{ which varies with t, so the series is not stationary unless a=0}$
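The $t$-dependence is easy to see numerically. A small sketch evaluating the derived mean $E(y_t) = a\frac{1-b^{t+1}}{1-b}$ at a few values of $t$ (the values $a = 2$, $b = 0.5$ are illustrative choices):

```python
# Evaluate the finite-history mean E(y_t) = a*(1 - b**(t+1)) / (1 - b)
# at several t: it varies with t, but converges geometrically to a/(1-b).
a, b = 2.0, 0.5
means = [a * (1 - b ** (t + 1)) / (1 - b) for t in (0, 1, 2, 10, 50)]
print(means)          # [2.0, 3.0, 3.5, ..., ~4.0]
print(a / (1 - b))    # limiting value 4.0
```

So with a finite history the mean is genuinely non-constant for small $t$; it only approaches $a/(1-b)$ in the limit.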

5. Hi again,

I'm sorry, I still can't find my lecture notes about time series. Anyway, I see a mistake here: why do you assume that $y_0 = a + e_0$? You write it in the beginning, but I haven't heard of such a thing. It could be another constant, or it could be a random variable. This makes the line before "incorporate last terms into the existing sums" false.

I'm quite pissed by the latex, I'll just write the raw code...
It gives :

y_t=a\cdot \frac{1-b^t}{1-b}+\sum_{n=0}^{t-1} b^n e_{t-n}+b^t y_0

By taking the expectations, and assuming E[y_t]=m, \forall t\geq 0 (stationarity), we get :

m=a\cdot \frac{1-b^t}{1-b}+b^t m

Under the condition that |b|<1 (\leq 1 because otherwise the sum diverges, and \neq 1 because otherwise we'd divide by 0), we finally have :

m=\frac{a}{1-b}
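A quick numeric check of this fixed-point argument: if the mean starts at $a/(1-b)$, the recursion $E[y_t] = a + b\,E[y_{t-1}]$ keeps it there for every $t$ (the values of $a$ and $b$ are arbitrary illustrative choices):

```python
# The stationary mean m = a/(1-b) is a fixed point of m -> a + b*m,
# so iterating the mean recursion from it never moves.
a, b = 2.0, 0.5
m = a / (1 - b)
for _ in range(100):
    m = a + b * m
print(m)   # stays at 4.0
```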

6. The reason I put that there was that I thought these series are only defined for $t \geq 0$, in which case you can't have terms with negative indices.

A longer, messier way of writing what you wrote... :P

The expectation also works for $y_0$, since $d = a/(1-b)$.

So... mystery solved, I think! Thanks.

PS:
I think that $y_0$ must be a constant $+\,e_0$, otherwise $\text{var}(y_0) \neq \text{var}(y_1)$.

7. Nah, now I'm pretty sure: $y_0$ can be a random variable. And if you check the calculations, it works for the expectation being constant.
For the variance, it would depend on the variance of the white noise.

AR(p) is not defined for negative indices, but it's just that you define $y_t = a + by_{t-1} + \dots + e_t$ for $t > 0$ and leave $y_0$ as it is, without expressing it in terms of $a$, $b$ or the white noise.

8. The latex bit was written on the assumption that $y_0$ is a constant (picked since it was one of the options you gave), although I reached the same conclusion as you while you were typing that (see the PS at the bottom).

9. Sorry, I did see your PS but didn't say it clearly. Read the line:
"For the variance, it would depend on the variance of the white noise."
If you still consider $y_0$ as a random variable and establish the equation the variance satisfies, you can obtain that the constant variance is $\sigma^2/(1-b^2)$.
So I really think that $y_0$ is a rv ^^
I studied time series a little last semester, and I don't remember us ever making the assumption that $y_0$ is a constant (while talking about ARMA processes in general).
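The same fixed-point argument can be sketched numerically for the variance: since the noise is independent of the past, $\text{var}(y_t) = b^2\,\text{var}(y_{t-1}) + \sigma^2$, and this recursion converges to $\sigma^2/(1-b^2)$ from any starting value (the parameters here are illustrative choices):

```python
# The variance recursion v -> b**2 * v + sigma2 has the fixed point
# sigma2 / (1 - b**2); iterating from any starting variance converges to it.
b, sigma2 = 0.5, 1.0
v = 10.0                          # arbitrary starting variance
for _ in range(200):
    v = b ** 2 * v + sigma2
print(v, sigma2 / (1 - b ** 2))   # both ~ 1.3333
```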