16 Jun 17

## Moving average processes

Moving average processes: this time the intuition is mathematical and even geometric.

## Review and generalize

Science is a vertical structure, as I say in my book, and we have long passed the point after which looking back is as important as looking forward. So here are a few questions for the reader to review the past material.

Q1. What is a stochastic process? (Answer: imagine a real line with a random variable attached to each integer point.)

Q2. There are good (stationary) processes and bad (all other) processes. How do you define the good ones? (Hint: properties of means, variances and covariances are the bread and butter of professionals.)

Q3. White noise is the simplest (after a constant) type of stationary process. Give the definition, and don't hope for a hint. Do you realize that the elements of white noise don't interact with one another, in the sense that the covariance between any two of them is zero?
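The defining properties of white noise are easy to check by simulation. Here is a minimal sketch using Gaussian white noise as a concrete example (white noise need not be Gaussian; any zero-mean, constant-variance, uncorrelated sequence qualifies):

```python
import numpy as np

rng = np.random.default_rng(0)
# Gaussian white noise with mean 0 and variance sigma^2 = 1
u = rng.normal(loc=0.0, scale=1.0, size=100_000)

# Sample mean is close to 0 and sample variance close to sigma^2 = 1.
print(u.mean(), u.var())

# Sample covariance between u_t and u_{t-1} is close to 0:
# neighboring elements do not interact.
print(np.mean(u[1:] * u[:-1]))
```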

Idea. Define a class of stationary processes by forming linear combinations of elements of white noise.

Q4. A simple realization of this idea is given here. How do you generalize it?

(1) $y_t=u_t+\theta_1u_{t-1}+...+\theta_qu_{t-q}=u_t+\sum_{i=1}^q\theta_iu_{t-i},$

where $u_t$ is white noise, is called a moving average process of order $q$ and denoted MA(q).
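Definition (1) translates directly into a simulation. Below is a sketch for $q=2$ with arbitrarily chosen (hypothetical) values of the thetas:

```python
import numpy as np

rng = np.random.default_rng(1)
theta = np.array([0.5, -0.3])   # theta_1, theta_2; so q = 2 (illustrative values)
q = len(theta)
n = 10_000
# Generate q extra white-noise values so that y_1 has a full set of lags.
u = rng.normal(size=n + q)

# y_t = u_t + theta_1 * u_{t-1} + ... + theta_q * u_{t-q}
y = u[q:].copy()
for i, th in enumerate(theta, start=1):
    y += th * u[q - i : n + q - i]
```

Each `y[t]` is a linear combination of the current shock and the two previous shocks, exactly as in (1).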

Remarks. 1) The "moving average" name may be misleading. In Finance we use that name when the coefficients sum to one and are positive. Here the thetas do not necessarily sum to one and may change sign.

2) It would be better to say a "moving linear combination". The coefficients of the linear combination do not change but are applied to a moving segment of the white noise, starting from the element dated $t$ and going back to the element dated $t-q$. In this sense we say that (1) involves the segment $[t-q,t]$.

3) In Economics and Finance, the errors $u_t$ are treated as shocks. (1) tells us that the process is a result of the current shock and previous $q$ shocks.

## Moving average properties

First stationarity condition. The mean does not depend on time: $Ey_t=0$. This should be absolutely obvious by now.

Second stationarity condition. Variance does not depend on time:

$Var(y_t)=Ey_t^2=E(u_t+\sum_{i=1}^q\theta_iu_{t-i})(u_t+\sum_{i=1}^q\theta_iu_{t-i})=(1+\sum_{i=1}^q\theta_i^2)\sigma^2$

because only products $u_t^2, u_{t-1}^2,...$ have nonzero expectations.
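The variance formula can be checked by simulation. A sketch, with illustrative choices of the thetas and $\sigma$:

```python
import numpy as np

rng = np.random.default_rng(2)
theta = np.array([0.4, 0.25, -0.6])   # illustrative MA(3) coefficients
sigma = 2.0
q = len(theta)
n = 200_000
u = rng.normal(scale=sigma, size=n + q)

# Simulate y_t = u_t + sum_i theta_i * u_{t-i}
y = u[q:] + sum(th * u[q - i : n + q - i] for i, th in enumerate(theta, 1))

# Var(y_t) = (1 + sum_i theta_i^2) * sigma^2
theoretical = (1 + (theta**2).sum()) * sigma**2
print(theoretical, y.var())   # the two numbers should be close
```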

Third stationarity condition. Here is where geometry is useful. If one linear combination involves the segment $[t-q,t]$ and the other the segment $[s-q,s]$, under what condition do these segments not overlap? Answer: when the distance $|s-t|$ between the points $s$ and $t$ is larger than $q$. In this case the two linear combinations have no common elements, and $Cov(y_t,y_s)$ is zero.

Exercise (the tedious part is left to you). Calculate $Cov(y_t,y_s)$ for $|s-t|\le q$.
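If you want to check your answer to the exercise numerically, here is a sketch that estimates $Cov(y_t,y_s)$ at lags $k=|s-t|=0,1,2,\dots$ for an MA(2) process. For $k\le q$ the sample autocovariance should match the closed form $\gamma(k)=\sigma^2\sum_{i=0}^{q-k}\theta_i\theta_{i+k}$ with $\theta_0=1$ (the formula the exercise asks you to derive); beyond lag $q$ it should be near zero:

```python
import numpy as np

rng = np.random.default_rng(3)
theta = np.array([0.5, -0.3])     # illustrative MA(2) coefficients
q, n, sigma = len(theta), 500_000, 1.0
u = rng.normal(scale=sigma, size=n + q)
y = u[q:] + sum(th * u[q - i : n + q - i] for i, th in enumerate(theta, 1))

th_full = np.concatenate(([1.0], theta))   # prepend theta_0 = 1
for k in range(q + 3):
    # Sample autocovariance at lag k (process has mean zero).
    sample = np.mean(y[k:] * y[: n - k])
    # Theoretical gamma(k): zero once the segments stop overlapping (k > q).
    exact = sigma**2 * np.sum(th_full[: q + 1 - k] * th_full[k:]) if k <= q else 0.0
    print(k, round(sample, 3), round(exact, 3))
```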

Conclusion. MA(q) is stationary for any values of the thetas.
