Hello,
Actually, a process being stationary implies that the expectation is constant. Not necessarily equal to 0.
I said =0 in the other thread because I didn't see the 50 term.
But here it is indeed a/(1-b).
I managed to talk myself into a complete mess trying to answer this thread, which leads me to some questions on the stationarity of time series.
Wikipedia claims that the following process is weakly stationary (which requires, among other things, that it have the same expected value at all times):
x_t = a + b x_{t-1} + e_t
where a is a constant, e_t is white noise and |b| < 1.
Is this right? The only way I can see this being stationary is if a = 0, so that E(x_t) = 0; but the Wikipedia article claims that the mean is a/(1-b) everywhere. Am I missing something?
I see the mistake in my first post, phew.
But I'm still not seeing how the series can be stationary with a ≠ 0, since the expectation differs for all the first (finitely many) terms. Derivation: iterating y_t = a + b y_{t-1} + e_t back to t = 0 and assuming y_0 = 0 gives E(y_t) = a(1-b^t)/(1-b), which depends on t.
Assuming I made no algebra error, the only difference between this and the derivations claiming to show the series is stationary is that I gave my series a finite history. (With an infinite history, the geometric progression converges to its limit, which is independent of t, so the expectation becomes independent of t.)
Am I wrong to assume the series has a finite history? If so, why?
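To make the transient concrete, here is a small simulation sketch (the values a = 50, b = 0.8 and Gaussian white noise are my own illustrative assumptions, not from the thread): starting the series at y_0 = 0, the average over many paths tracks a(1-b^t)/(1-b), which depends on t early on but approaches a/(1-b) geometrically fast.

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, sigma = 50.0, 0.8, 1.0    # illustrative values; |b| < 1
T, reps = 30, 200_000

# Simulate many AR(1) paths with a finite history starting at y_0 = 0.
y = np.zeros((reps, T + 1))
for t in range(1, T + 1):
    y[:, t] = a + b * y[:, t - 1] + sigma * rng.standard_normal(reps)

emp_mean = y.mean(axis=0)                            # estimate of E[y_t]
theory = a * (1 - b ** np.arange(T + 1)) / (1 - b)   # a(1 - b^t)/(1 - b)

print(emp_mean[1], theory[1])    # ≈ 50: far from the limit
print(emp_mean[T], theory[T])    # ≈ a/(1-b) = 250: transient has died out
```

So with a finite history the expectation really does depend on t, but the t-dependence decays like b^t, which is why derivations assuming an infinite history get a mean free of t.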
Hi again,
I'm sorry, I still can't find my lecture notes about time series. Anyway, I see a mistake here: why do you assume that y_0 = 0? You write it at the beginning, but I haven't heard of such a thing. It could be another constant, or it could be a random variable. This makes the line before "incorporate the last terms..." false.
I'm quite fed up with the LaTeX, so I'll just write the raw code... It gives:
y_t=a\cdot \frac{1-b^t}{1-b}+\sum_{n=0}^{t-1} b^n e_{t-n}+b^t y_0
Taking expectations and assuming E[y_t]=m, \forall t\geq 0 (stationarity), we get:
m=a\cdot \frac{1-b^t}{1-b}+b^t m
Under the condition that |b|<1 (\leq 1 because otherwise the sum diverges, and \neq 1 because otherwise we'd divide by 0), we finally have:
m=\frac{a}{1-b}
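As a quick sanity check (a minimal sketch; a = 50 and b = 0.8 are arbitrary example values), m = a/(1-b) satisfies the fixed-point equation above for every t, so the candidate mean really is independent of t:

```python
# Check that m = a/(1-b) solves m = a*(1 - b**t)/(1 - b) + b**t * m
# for every t, so the stationary mean does not depend on t.
a, b = 50.0, 0.8           # arbitrary example values with |b| < 1
m = a / (1 - b)            # candidate stationary mean, ≈ 250
for t in range(100):
    rhs = a * (1 - b**t) / (1 - b) + b**t * m
    assert abs(rhs - m) < 1e-9
print(m)                   # ≈ 250
```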
The reason I put that there was that I thought these series are only defined for t \geq 0, in which case you can't have an infinite history.
A longer, messier way of writing what you wrote... :P
The expectation also works for y_0, since d = a/(1-b).
So... mystery solved, I think! Thanks.
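The same point in code (again with hypothetical example values a = 50, b = 0.8): if y_0 is fixed at d = a/(1-b), the expectation recursion keeps the mean at a/(1-b) for every t.

```python
# If y_0 is fixed at d = a/(1-b), the expectation recursion
# E[y_t] = a + b*E[y_{t-1}] stays at a/(1-b) for every t.
a, b = 50.0, 0.8          # example values with |b| < 1
d = a / (1 - b)
mean = d                  # E[y_0] = d
for t in range(1, 50):
    mean = a + b * mean   # E[y_t] = a + b*E[y_{t-1}] (white noise has mean 0)
    assert abs(mean - d) < 1e-9
print(mean)               # ≈ 250 at every step
```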
PS:
I think that y_0 must be a constant + e_0, otherwise var(y_0) \neq var(y_1).
Nah, now I'm pretty sure, y_0 can be a random variable. And if you check the calculations, it works for the expectation being constant.
For the variance, it would depend on the variance of the white noise.
AR(p) is not defined for negative indices, but it's just that you define y_t=a+by_{t-1}+...+e_t for t>0 and leave y_0 as it is, without expressing it in terms of a, b, or the white noise.
Sorry, I did see your PS but didn't address it clearly. Read the line "For the variance, it would depend on the variance of the white noise": if you still consider y_0 as a random variable and establish the equation the variance satisfies, you can obtain that the constant variance is \sigma^2/(1-b^2).
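To back this up numerically, here is a small simulation sketch (Gaussian white noise and the values a = 50, b = 0.8, sigma = 1 are my own assumptions): drawing y_0 as a random variable with mean a/(1-b) and variance \sigma^2/(1-b^2) keeps both moments constant along the whole path.

```python
import numpy as np

rng = np.random.default_rng(1)
a, b, sigma = 50.0, 0.8, 1.0     # illustrative values; |b| < 1
reps, T = 500_000, 10

# Draw y_0 as a random variable from the (assumed) stationary law:
# mean a/(1-b), variance sigma^2/(1-b^2).
y = a / (1 - b) + (sigma / np.sqrt(1 - b**2)) * rng.standard_normal(reps)
for t in range(T):
    y = a + b * y + sigma * rng.standard_normal(reps)

print(y.mean())                  # ≈ a/(1-b) = 250
print(y.var())                   # ≈ sigma^2/(1-b^2) ≈ 2.78
```

Starting y_0 at a fixed constant d = a/(1-b) instead would make E[y_t] constant but var(y_0) = 0 ≠ var(y_1), which is exactly the PS objection; the random y_0 fixes both moments at once.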
So I really think that y_0 is a rv ^^
I studied time series a little last semester, and I don't remember us ever assuming that y_0 is a constant (while talking about ARMA processes in general).