## Solution to Question 1 from UoL exam 2020

The assessment was an open-book take-home online assessment with a 24-hour window. No attempt was made to prevent cheating beyond a warning, which was a realistic policy. Before an exam it's a good idea to look through my checklist.

**Question 1**. Consider the following ARMA(1,1) process:

(1) $y_t = \mu + \phi y_{t-1} + \varepsilon_t + \theta \varepsilon_{t-1},$

where $\varepsilon_t$ is a zero-mean white noise process with variance $\sigma^2$, and assume $|\phi| < 1$ and $|\theta| < 1$, which together make sure $y_t$ is covariance stationary.

(a) [20 marks] Calculate the conditional and unconditional means of $y_t$, that is, $E_{t-1}(y_t)$ and $E(y_t)$.

(b) [20 marks] Set $\phi = 0$. Derive the autocovariance and autocorrelation function of this process for all lags as functions of the parameters $\theta$ and $\sigma^2$.

(c) [30 marks] Assume now $\phi \neq 0$. Calculate the conditional and unconditional variances of $y_t$, that is, $\mathrm{Var}_{t-1}(y_t)$ and $\mathrm{Var}(y_t)$.

**Hint**: for the unconditional variance, you might want to start by deriving the unconditional covariance between the variable and the innovation term, i.e., $\mathrm{Cov}(y_t, \varepsilon_t)$.

(d) [30 marks] Derive the autocovariance and autocorrelation for lags of 1 and 2 as functions of the parameters of the model.

**Hint**: use the hint of part (c).

## Solution

### Part (a)

**Reminder**: the definition of a zero-mean white noise process is

(2) $E(\varepsilon_t) = 0$ and $E(\varepsilon_t^2) = \sigma^2$ for all $t$, and $E(\varepsilon_t \varepsilon_s) = 0$ for all $t \neq s$.

A variable indexed by $t-1$ is known at moment $t-1$ and at all later moments, and behaves like a constant for conditioning at such moments: $E_{t-1}(\varepsilon_{t-1}) = \varepsilon_{t-1}$ and $E_{t-1}(y_{t-1}) = y_{t-1}$.

Moment $t$ is future relative to $t-1$. The future is unpredictable, and the best guess about the future error is zero: $E_{t-1}(\varepsilon_t) = 0$.

The recurrence relation in (1) shows that

(3) $y_{t-1}$ is a function of $\varepsilon_{t-1}, \varepsilon_{t-2}, \dots$ only and does not depend on the information that arrives at time $t$ and later.

Hence, using also linearity of conditional means,

(4) $E_{t-1}(y_t) = \mu + \phi y_{t-1} + \theta \varepsilon_{t-1} + E_{t-1}(\varepsilon_t) = \mu + \phi y_{t-1} + \theta \varepsilon_{t-1}.$

The law of iterated expectations (**LIE**): application of $E_{t-1}$ (based on information available at time $t-1$) and subsequent application of $E$ (based on no information) gives the same result as application of $E$ alone: $E\big[E_{t-1}(y_t)\big] = E(y_t)$.
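The LIE can be illustrated with a tiny discrete example (the fair-die setup below is mine, not part of the exam): conditioning on a coarse piece of information, the parity of the outcome, and then averaging the conditional means reproduces the plain mean.

```python
# LIE illustration on a fair die (example values are mine, not the exam's):
# average the conditional means E[X | even] and E[X | odd] with their
# probabilities; the result equals the unconditional mean E[X].
outcomes = [1, 2, 3, 4, 5, 6]
plain_mean = sum(outcomes) / len(outcomes)

evens = [x for x in outcomes if x % 2 == 0]
odds = [x for x in outcomes if x % 2 == 1]
# P(even) = P(odd) = 1/2 for a fair die
iterated_mean = 0.5 * (sum(evens) / len(evens)) + 0.5 * (sum(odds) / len(odds))

print(plain_mean, iterated_mean)  # 3.5 3.5
```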

Since $y_t$ is covariance stationary, its means across times are the same, so applying $E$ to (4) via the LIE gives $E(y_t) = \mu + \phi E(y_t)$ and $E(y_t) = \dfrac{\mu}{1 - \phi}$.
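As a quick numerical sanity check of part (a), one can simulate the process and compare the sample mean with $\mu/(1-\phi)$. This is a sketch using only the standard library; the parameter values are illustrative, not from the exam.

```python
import random

# Sanity check of part (a): simulate the ARMA(1,1) process
# y_t = mu + phi * y_{t-1} + eps_t + theta * eps_{t-1}
# and compare the sample mean with mu / (1 - phi).
# Parameter values are illustrative, not from the exam.
mu, phi, theta, sigma = 0.5, 0.6, 0.4, 1.0

random.seed(42)
n = 200_000
y = mu / (1 - phi)   # start at the unconditional mean to reduce burn-in
eps_prev = 0.0
total = 0.0
for _ in range(n):
    eps = random.gauss(0.0, sigma)
    y = mu + phi * y + eps + theta * eps_prev
    eps_prev = eps
    total += y

sample_mean = total / n
theoretical_mean = mu / (1 - phi)
print(sample_mean, theoretical_mean)  # the two values should be close
```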

### Part (b)

With $\phi = 0$, we get $y_t = \mu + \varepsilon_t + \theta \varepsilon_{t-1}$, and from part (a) $E(y_t) = \mu$. Using (2), we find the variance

$\gamma_0 = \mathrm{Var}(y_t) = E\big[(\varepsilon_t + \theta \varepsilon_{t-1})^2\big] = (1 + \theta^2)\sigma^2$

and the first autocovariance

(5) $\gamma_1 = \mathrm{Cov}(y_t, y_{t-1}) = E\big[(\varepsilon_t + \theta \varepsilon_{t-1})(\varepsilon_{t-1} + \theta \varepsilon_{t-2})\big] = \theta \sigma^2.$

Second and higher autocovariances are zero because the subscripts of the epsilons don't overlap: $\gamma_k = 0$ for $k \ge 2$.

Autocorrelation function: $\rho_0 = 1$ (this is always true),

$\rho_1 = \frac{\gamma_1}{\gamma_0} = \frac{\theta}{1 + \theta^2}, \qquad \rho_k = 0 \text{ for } k \ge 2.$

This is characteristic of MA processes: their autocorrelations are zero for all lags beyond the MA order (here, beyond lag 1).
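The MA(1) results can be checked by simulation: sample autocorrelations should be close to $\theta/(1+\theta^2)$ at lag 1 and close to zero at lag 2. A sketch with illustrative parameter values of my own choosing, standard library only:

```python
import random

# Sanity check of part (b): with phi = 0 the process is MA(1), so
# rho_1 = theta / (1 + theta^2) and rho_k = 0 for k >= 2.
# Parameter values are illustrative, not from the exam.
mu, theta, sigma = 0.5, 0.4, 1.0

random.seed(0)
n = 200_000
eps = [random.gauss(0.0, sigma) for _ in range(n + 1)]
y = [mu + eps[t] + theta * eps[t - 1] for t in range(1, n + 1)]

ybar = sum(y) / n

def acov(k):
    # Sample autocovariance at lag k
    return sum((y[t] - ybar) * (y[t - k] - ybar) for t in range(k, n)) / n

rho1, rho2 = acov(1) / acov(0), acov(2) / acov(0)
print(rho1, theta / (1 + theta**2))  # close to each other
print(rho2)                          # close to zero
```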

### Part (c)

If we replace all expectations in the definition of variance with conditional expectations, we obtain the definition of conditional variance. From (1) and (4), $y_t - E_{t-1}(y_t) = \varepsilon_t$, so

$\mathrm{Var}_{t-1}(y_t) = E_{t-1}\big[(y_t - E_{t-1}(y_t))^2\big] = E_{t-1}(\varepsilon_t^2) = \sigma^2.$

By the law of total variance,

(6) $\mathrm{Var}(y_t) = E\big[\mathrm{Var}_{t-1}(y_t)\big] + \mathrm{Var}\big[E_{t-1}(y_t)\big] = \sigma^2 + \mathrm{Var}(\phi y_{t-1} + \theta \varepsilon_{t-1})$

(an additive constant does not affect variance).

Following the hint, expand $\mathrm{Cov}(y_t, \varepsilon_t)$ using (1):

$\mathrm{Cov}(y_t, \varepsilon_t) = \phi E(y_{t-1}\varepsilon_t) + E(\varepsilon_t^2) + \theta E(\varepsilon_{t-1}\varepsilon_t).$

By the LIE and (3), $E(y_{t-1}\varepsilon_t) = E\big[y_{t-1} E_{t-1}(\varepsilon_t)\big] = 0$. Here $E(\varepsilon_{t-1}\varepsilon_t) = 0$ by (2), so

(7) $\mathrm{Cov}(y_t, \varepsilon_t) = \sigma^2.$

This equation, together with stationarity, leads to

$\mathrm{Var}(\phi y_{t-1} + \theta \varepsilon_{t-1}) = \phi^2 \mathrm{Var}(y_t) + \theta^2 \sigma^2 + 2\phi\theta\,\mathrm{Cov}(y_{t-1}, \varepsilon_{t-1}) = \phi^2 \mathrm{Var}(y_t) + (\theta^2 + 2\phi\theta)\sigma^2$

and, finally, substituting into (6) and solving for $\mathrm{Var}(y_t)$,

(8) $\mathrm{Var}(y_t) = \gamma_0 = \frac{(1 + 2\phi\theta + \theta^2)\sigma^2}{1 - \phi^2}.$
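Formula (8) can be cross-checked against the MA($\infty$) representation of the process: for $|\phi| < 1$, repeated substitution of (1) into itself gives weights $\psi_0 = 1$ and $\psi_j = \phi^{j-1}(\phi + \theta)$ for $j \ge 1$, so the variance is $\sigma^2 \sum_j \psi_j^2$. A short deterministic sketch with illustrative parameters of my own choosing:

```python
# Cross-check of part (c): for |phi| < 1 the ARMA(1,1) process has the
# MA(infinity) representation y_t - E(y_t) = sum_j psi_j * eps_{t-j} with
# psi_0 = 1 and psi_j = phi**(j-1) * (phi + theta) for j >= 1, so
# Var(y_t) = sigma^2 * sum_j psi_j**2 should match formula (8).
# Parameter values are illustrative, not from the exam.
phi, theta, sigma = 0.6, 0.4, 1.0

psi = [1.0] + [phi ** (j - 1) * (phi + theta) for j in range(1, 300)]
var_series = sigma**2 * sum(p * p for p in psi)  # truncated infinite sum
var_formula = (1 + 2 * phi * theta + theta**2) * sigma**2 / (1 - phi**2)
print(var_series, var_formula)  # agree up to the tiny truncation error
```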

### Part (d)

From (7) and stationarity of $y_t$,

(9) $\mathrm{Cov}(y_{t-1}, \varepsilon_{t-1}) = \mathrm{Cov}(y_t, \varepsilon_t) = \sigma^2.$

It follows from (1) that

$\gamma_1 = \mathrm{Cov}(y_t, y_{t-1}) = \phi\,\mathrm{Var}(y_{t-1}) + \mathrm{Cov}(\varepsilon_t, y_{t-1}) + \theta\,\mathrm{Cov}(\varepsilon_{t-1}, y_{t-1})$

(a constant is not correlated with anything).

From (3) and the LIE, $\mathrm{Cov}(\varepsilon_t, y_{t-1}) = 0$.

Using also the white noise properties, stationarity of $y_t$, and (9),

we are left with

$\gamma_1 = \phi \gamma_0 + \theta \sigma^2.$

Hence, using (8),

$\gamma_1 = \phi\,\frac{(1 + 2\phi\theta + \theta^2)\sigma^2}{1 - \phi^2} + \theta \sigma^2.$

The finish is close. This simplifies to

(10) $\gamma_1 = \frac{(\phi + \theta)(1 + \phi\theta)\sigma^2}{1 - \phi^2}.$

By (1) and (3),

$\gamma_2 = \mathrm{Cov}(y_t, y_{t-2}) = \phi\,\mathrm{Cov}(y_{t-1}, y_{t-2}) + \mathrm{Cov}(\varepsilon_t + \theta\varepsilon_{t-1}, y_{t-2}) = \phi \gamma_1.$

Finally, using (10) and (8), the autocorrelations are

$\rho_1 = \frac{\gamma_1}{\gamma_0} = \frac{(\phi + \theta)(1 + \phi\theta)}{1 + 2\phi\theta + \theta^2}, \qquad \rho_2 = \frac{\gamma_2}{\gamma_0} = \phi \rho_1.$
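The lag-1 and lag-2 results can be cross-checked with the same MA($\infty$) weights as in part (c), since $\gamma_k = \sigma^2 \sum_j \psi_j \psi_{j+k}$. A deterministic sketch with illustrative parameters of my own choosing:

```python
# Cross-check of part (d) via the MA(infinity) representation:
# gamma_k = sigma^2 * sum_j psi_j * psi_{j+k}. The lag-1 value should
# match formula (10) and the lag-2 value should equal phi * gamma_1.
# Parameter values are illustrative, not from the exam.
phi, theta, sigma = 0.6, 0.4, 1.0

psi = [1.0] + [phi ** (j - 1) * (phi + theta) for j in range(1, 300)]

def gamma(k):
    # Truncated autocovariance at lag k from the MA weights
    return sigma**2 * sum(psi[j] * psi[j + k] for j in range(len(psi) - k))

g0 = (1 + 2 * phi * theta + theta**2) * sigma**2 / (1 - phi**2)
g1 = (phi + theta) * (1 + phi * theta) * sigma**2 / (1 - phi**2)
print(gamma(1), g1)                  # agree
print(gamma(2), phi * g1)            # agree
print(gamma(1) / gamma(0), g1 / g0)  # rho_1 computed both ways
```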

A couple of errors have been corrected on June 22, 2021. Hope this is final.