26 May 17

## Stationary processes 1

Along with examples of nonstationary processes, one needs to know a couple of examples of stationary processes.

Example 1. In the model with a time trend, suppose that there is no time trend, that is, $b=0$. The result is white noise shifted by a constant $a$, which is easily seen to be stationary.
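As a quick numerical illustration (not part of the original notes; the values $a=2$ and $\sigma=1$ are arbitrary), a Python sketch of Example 1: white noise shifted by a constant has mean $a$ and variance $\sigma^2$, neither depending on $t$:

```python
import random

random.seed(0)
a, sigma = 2.0, 1.0              # arbitrary illustration values
# Example 1: y_t = a + u_t, where u_t is Gaussian white noise
y = [a + random.gauss(0.0, sigma) for _ in range(100_000)]

sample_mean = sum(y) / len(y)
sample_var = sum((v - sample_mean) ** 2 for v in y) / len(y)
# sample_mean should be close to a and sample_var close to sigma**2
```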

Example 2. Let us change the random walk slightly, by introducing a coefficient $\beta$ for the first lag:

(1) $y_t=\beta y_{t-1}+u_t$

where $u_t$ is, as before, white noise:

(2) $Eu_t=0$, $Eu_t^2=\sigma^2$ for all $t$, and $Eu_tu_s=0$ for all $t\ne s$.

This is an autoregressive process of order 1, denoted AR(1).

Stability condition: $|\beta|<1$.
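A small simulation sketch (illustrative only; the values of $\beta$ are chosen arbitrarily) shows why the stability condition matters: with $|\beta|<1$ the path stays bounded around zero, while with $\beta=1$ (the random walk) it wanders far away:

```python
import random

def simulate_ar1(beta, n, sigma=1.0, seed=42):
    """Simulate y_t = beta * y_{t-1} + u_t with Gaussian white noise, y_0 = 0."""
    rng = random.Random(seed)
    y, path = 0.0, []
    for _ in range(n):
        y = beta * y + rng.gauss(0.0, sigma)
        path.append(y)
    return path

stable = simulate_ar1(0.5, 10_000)   # |beta| < 1: stationary
walk = simulate_ar1(1.0, 10_000)     # beta = 1: random walk, not stationary

spread_stable = max(abs(v) for v in stable)  # stays of the order of a few sigma
spread_walk = max(abs(v) for v in walk)      # grows like sqrt(t)
```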

By now you should be familiar with recursive substitution. Equation (1) for the previous period looks like this:

(3) $y_{t-1}=\beta y_{t-2}+u_{t-1}.$

Plugging (3) into (1) we get $y_t=\beta^2y_{t-2}+\beta u_{t-1}+u_t.$ Repeating this $k$ times we obtain

(4) $y_t=\beta^ky_{t-k}+\beta^{k-1}u_{t-k+1}+...+\beta u_{t-1}+u_t.$

To avoid errors in calculations like this, note that in each product, such as $\beta^{k-1}u_{t-k+1}$, the exponent of $\beta$ plus the subscript of $u$ always equals $t$.
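Identity (4) can also be verified numerically. A sketch (the starting value and all parameters are arbitrary): build $y$ by the recursion (1) and compare $y_t$ with $\beta^k y_{t-k}+\sum_{j=0}^{k-1}\beta^j u_{t-j}$:

```python
import random

rng = random.Random(0)
beta, t, k = 0.8, 50, 20                          # arbitrary illustration values
u = [rng.gauss(0.0, 1.0) for _ in range(t + 1)]   # u_0, ..., u_t

# build y_0, ..., y_t by the recursion (1), with an arbitrary start y_0 = u_0
y = [u[0]]
for s in range(1, t + 1):
    y.append(beta * y[-1] + u[s])

# right-hand side of (4): beta^k * y_{t-k} + sum_{j=0}^{k-1} beta^j * u_{t-j}
rhs = beta ** k * y[t - k] + sum(beta ** j * u[t - j] for j in range(k))
# y[t] and rhs agree up to floating-point error
```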

In the model with a time trend the range of time moments didn't matter because the model wasn't dynamic. In the other example (the random walk) we had to assume that $t$ takes all positive integer values. In the current situation we have to assume that $t$ takes all integer values or, put differently, that the process $y_t$ extends infinitely into both the past and the future. Then we can take advantage of the stability condition. Letting $k\rightarrow\infty$ (and therefore $t-k\rightarrow-\infty$) we see that the first term on the right-hand side of (4) tends to zero and the sum becomes an infinite series:

(5) $y_t=...+\beta^{k-1}u_{t-k+1}+...+\beta u_{t-1}+u_t=\sum_{j=0}^\infty\beta^ju_{t-j}.$

We have shown that this representation follows from (1). Conversely, one can show that (5) implies (1). (5) is an infinite moving average, denoted MA($\infty$).
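The equivalence of (1) and (5) can be seen numerically as well. In the sketch below (parameters arbitrary) the recursion is started at zero far in the "past", so the discarded term $\beta^n y_{t-n}$ is negligible and the recursion agrees with the truncated MA sum:

```python
import random

rng = random.Random(1)
beta, n = 0.7, 2_000                 # arbitrary; beta**n is numerically negligible
u = [rng.gauss(0.0, 1.0) for _ in range(n)]

# y at the end of the sample via the recursion (1), started at 0 in the remote past
y = 0.0
for innovation in u:
    y = beta * y + innovation

# the same value via the truncated MA(inf) representation (5)
ma = sum(beta ** j * u[n - 1 - j] for j in range(n))
```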

Representation (5) can be used to check that (1) is stationary. The first condition of a stationary process is clearly satisfied: $Ey_t=0$. For the second one we have, using (2):

(6) $Var(y_t)=Ey_t^2=E(...+\beta^{k-1}u_{t-k+1}+...+\beta u_{t-1}+u_t)^2$

$=...+\beta^{2k-2}Eu^2_{t-k+1}+...+\beta^2Eu^2_{t-1}+Eu^2_t=(...+\beta^{2k-2}+...+\beta^2+1)\sigma^2=\frac{\sigma^2}{1-\beta^2},$

which doesn't depend on $t$.
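Formula (6) is easy to confirm by simulation. A sketch (parameters arbitrary; the start-up segment is dropped so the influence of the initial value dies out):

```python
import random

rng = random.Random(2)
beta, sigma, n = 0.6, 1.0, 200_000   # arbitrary illustration values
y, path = 0.0, []
for _ in range(n):
    y = beta * y + rng.gauss(0.0, sigma)
    path.append(y)

path = path[1_000:]                  # drop the start-up so the effect of y_0 dies out
mean = sum(path) / len(path)
sample_var = sum((v - mean) ** 2 for v in path) / len(path)
theory_var = sigma ** 2 / (1 - beta ** 2)   # formula (6): 1/(1 - 0.36) = 1.5625
```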

Exercise. To make sure that you understand (6), similarly prove that

(7) $Cov(y_t,y_s)=\beta^{|t-s|}\frac{\sigma^2}{1-\beta^2}.$

Without loss of generality, you can assume that $t>s$. (7) is a function of the distance in time between $t$ and $s$, as required.
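The claimed autocovariance (7) can likewise be checked by simulation. A sketch (the lag $h=|t-s|$ and all other parameters are arbitrary):

```python
import random

rng = random.Random(3)
beta, sigma, n, h = 0.5, 1.0, 200_000, 3   # h plays the role of |t - s|
y, path = 0.0, []
for _ in range(n):
    y = beta * y + rng.gauss(0.0, sigma)
    path.append(y)

path = path[1_000:]                        # discard the start-up
mean = sum(path) / len(path)
cov = sum((path[i] - mean) * (path[i - h] - mean)
          for i in range(h, len(path))) / (len(path) - h)
theory = beta ** h * sigma ** 2 / (1 - beta ** 2)   # formula (7)
```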