
Math Help - Stationarity of AR(1) time series

  1. #1
    SpringFan25

    Stationarity of AR(1) time series

    I managed to talk myself into a complete mess trying to answer another thread, which leads me to some questions on the stationarity of time series.

    Wikipedia claims that the following AR(1) process is weakly stationary (which requires, among other things, that it have the same expected value at all times):

    x_t = a + b x_{t-1} + e_t

    where a is a constant, e_t is white noise and |b| < 1.


    Is this right? It seems to me that E(x_t) depends on t, so the expectation is not constant.

    The only way I can see this being stationary is if a = 0, so that E(x_t) = 0; but the Wikipedia article claims that the mean is a/(1-b) everywhere. Am I missing something?
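    A quick way to see which claim holds is to simulate the recursion and look at the long-run sample mean. A minimal sketch (the parameter values a = 2, b = 0.5 are illustrative, not from the thread):

    ```python
    # Simulate x_t = a + b*x_{t-1} + e_t with |b| < 1 and standard normal
    # white noise, then compare the sample mean with a/(1-b).
    import random

    def simulate_ar1(a, b, n, seed=0):
        """Return n draws from the AR(1) recursion, started at a/(1-b)."""
        rng = random.Random(seed)
        x = a / (1 - b)  # start at the claimed stationary mean
        path = []
        for _ in range(n):
            x = a + b * x + rng.gauss(0.0, 1.0)
            path.append(x)
        return path

    a, b = 2.0, 0.5
    path = simulate_ar1(a, b, 100_000)
    sample_mean = sum(path) / len(path)
    print(sample_mean, a / (1 - b))  # sample mean should sit near a/(1-b) = 4.0
    ```

    The sample mean settles near a/(1-b) rather than 0, which matches the Wikipedia claim.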
    Last edited by SpringFan25; April 29th 2011 at 07:13 AM.

  2. #2
    Moo
    Hello,

    Actually, a process being stationary implies that the expectation is constant, not necessarily equal to 0.
    I said = 0 in the other thread because I didn't see the 50 term.

    But here it is indeed a/(1-b).

  3. #3
    SpringFan25
    edit: nevermind
    Last edited by SpringFan25; April 29th 2011 at 10:39 AM.

  4. #4
    SpringFan25
    I see the mistake in my first post, phew.

    But I'm still not seeing how the series can be stationary with a ≠ 0, since the expectation differs for each of the first (finitely many) terms.

    Assuming I made no algebra error, the only difference between this and derivations claiming to show the series is stationary is that I gave my series a finite history. (With an infinite history, the geometric progression converges to its limit, which is independent of t, so the expectation becomes independent of t.)

    Am I wrong to assume the series has a finite history? If so, why?

    Derivation:


    y_t = \left\{
    \begin{array}{ll}
    a + by_{t-1} + e_t & \text{if } t \geq 1 \\
    a + e_t & \text{if } t = 0
    \end{array} \right.

    y_t = a + by_{t-1} + e_t \\
    y_t = a + e_t + b(a + by_{t-2} + e_{t-1}) \\
    y_t = a(1+b) + e_t + be_{t-1} + b^2 y_{t-2}

    \text{...and, iterating, for any } k < t:

    \displaystyle
    y_t = a\sum_{n=0}^k b^n + \sum_{n=0}^k b^n e_{t-n} + b^{k+1} y_{t-k-1}

    \text{Let } k = t-1:

    \displaystyle
    y_t = a\sum_{n=0}^{t-1} b^n + \sum_{n=0}^{t-1} b^n e_{t-n} + b^{t} y_{0} \\
    y_t = a\sum_{n=0}^{t-1} b^n + \sum_{n=0}^{t-1} b^n e_{t-n} + b^{t}(a + e_0)

    \text{Incorporate the last terms into the existing sums:}

    \displaystyle
    y_t = a\sum_{n=0}^{t} b^n + \sum_{n=0}^{t} b^n e_{t-n}

    \text{The first sum is a geometric progression:}

    \displaystyle
    y_t = a \, \frac{1-b^{t+1}}{1-b} + \sum_{n=0}^{t} b^n e_{t-n}

    \text{Take expectations (the white noise has mean 0):}

    \displaystyle
    E(y_t) = a \, \frac{1-b^{t+1}}{1-b}

    \text{which varies with } t \text{, so the series is not stationary unless } a = 0.
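    The finite-history mean can be checked numerically: with y_0 = a + e_0 the mean obeys the recursion m_0 = a, m_t = a + b m_{t-1}, which should agree with the closed form above. A minimal sketch (a = 2, b = 0.5 are illustrative values, not from the thread):

    ```python
    # Check that the recursion m_t = a + b*m_{t-1}, m_0 = a, matches the
    # closed form m_t = a*(1 - b**(t+1))/(1 - b), and that it approaches
    # the limit a/(1-b) even though it varies with t.
    a, b = 2.0, 0.5

    m = a  # E(y_0) = a, since the white noise has mean 0
    for t in range(1, 11):
        m = a + b * m
        closed_form = a * (1 - b ** (t + 1)) / (1 - b)
        assert abs(m - closed_form) < 1e-12

    print(m, a / (1 - b))  # m_10 is already very close to a/(1-b) = 4.0
    ```

    So the mean does vary with t under this starting assumption, but it converges geometrically to a/(1-b).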

    Last edited by SpringFan25; April 29th 2011 at 12:09 PM.

  5. #5
    Moo
    Hi again,

    I'm sorry, I still can't find my lecture notes about time series. Anyway, I see a mistake here: why do you assume that y_0 = a + e_0? You write it at the beginning, but I haven't heard of such a thing. It could be another constant, or it could be a random variable. This makes the line before "incorporate the last terms..." false.

    The LaTeX is giving me trouble, so I'll just write the raw code.
    It gives:

    y_t=a\cdot \frac{1-b^t}{1-b}+\sum_{n=0}^{t-1} b^n e_{t-n}+b^t y_0

    By taking expectations, and assuming E[y_t]=m, \forall t\geq 0 (stationarity), we get:

    m=a\cdot \frac{1-b^t}{1-b}+b^t m

    Under the condition that |b|<1 (we need |b| \leq 1 because otherwise the sum diverges, and b \neq 1 because otherwise we would divide by 0), we finally have:

    m=\frac{a}{1-b}
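    This fixed-point argument can be sketched numerically: if E[y_0] = m = a/(1-b), then a(1-b^t)/(1-b) + b^t m should return m for every t, so the mean stays constant. A minimal check (a = 2, b = 0.5 are illustrative values):

    ```python
    # Verify that m = a/(1-b) is a fixed point of the mean equation
    # m = a*(1-b**t)/(1-b) + b**t * m for every t >= 1.
    a, b = 2.0, 0.5
    m = a / (1 - b)
    for t in range(1, 20):
        rhs = a * (1 - b ** t) / (1 - b) + b ** t * m
        assert abs(rhs - m) < 1e-12
    print(m)  # the mean stays at a/(1-b)
    ```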

  6. #6
    SpringFan25
    The reason I put that there was that I thought these series are only defined for t \geq 0, in which case you can't have y_{-1}.

    A longer, messier way of writing what you wrote... :P

    The expectation also works for y_0 (writing y_0 = d + e_0 for a constant d), since d = a/(1-b).

    So... mystery solved, I think! Thanks.

    PS:
    I think that y_0 must be a constant + e_0, since otherwise Var(y_0) \neq Var(y_1).
    Last edited by SpringFan25; May 1st 2011 at 11:40 AM.

  7. #7
    Moo
    Nah, now I'm pretty sure y_0 can be a random variable. And if you check the calculations, it works out: the expectation is constant.
    For the variance, it would depend on the variance of the white noise.

    AR(p) is not defined for negative indices, but that just means you define y_t=a+by_{t-1}+...+e_t for t>0 and leave y_0 as it is, without expressing it in terms of a, b or the white noise.

  8. #8
    SpringFan25
    The LaTeX bit was written on the assumption that y_0 is a constant (picked since it was one of the options you gave), although I reached the same conclusion as you while you were typing that (see the PS at the bottom).

  9. #9
    Moo
    Sorry, I did see your PS but didn't say so clearly. Read the line:
    "For the variance, it would depend on the variance of the white noise."
    If you still treat y_0 as a random variable and set up the equation the variance satisfies, you obtain that the constant variance is \sigma^2/(1-b^2).
    So I really think that y_0 is a random variable ^^
    I studied time series a little last semester, and I don't remember us ever assuming that y_0 is a constant (when talking about ARMA processes in general).
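    The variance claim can be sketched the same way as the mean: independence of e_t from the past gives Var(y_t) = b^2 Var(y_{t-1}) + \sigma^2, whose fixed point is \sigma^2/(1-b^2). A minimal numeric check (b = 0.5, \sigma^2 = 1 are illustrative values):

    ```python
    # Verify that sigma2/(1 - b**2) is a fixed point of the variance
    # recursion v_t = b**2 * v_{t-1} + sigma2 for the AR(1) process.
    b, sigma2 = 0.5, 1.0
    v = sigma2 / (1 - b ** 2)  # stationary variance
    for _ in range(20):
        v = b ** 2 * v + sigma2
    print(v, sigma2 / (1 - b ** 2))  # v stays at the fixed point
    ```

    Starting y_0 with this variance keeps the variance constant for all t, consistent with the stationarity claim.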
    Last edited by Moo; May 1st 2011 at 11:59 AM.

