## Vector autoregression (VAR)

Suppose we are observing two stocks whose returns at time $t$ are $y_{1,t}$ and $y_{2,t}$. To take into account their interdependence, we consider a **vector autoregression**

$$y_{1,t}=a_{11}y_{1,t-1}+a_{12}y_{2,t-1}+u_{1,t},\qquad y_{2,t}=a_{21}y_{1,t-1}+a_{22}y_{2,t-1}+u_{2,t}. \tag{1}$$
Try to repeat for this system the analysis from Section 3.5 (Application to an AR(1) process) of the Guide by A. Patton and you will see that, done scalar by scalar, the difficulties are insurmountable. However, matrix algebra allows one to overcome them, with proper adjustments.

### Problem

A) Write this system in a vector format

$$Y_t=AY_{t-1}+U_t. \tag{2}$$

What should $Y_t$, $A$ and $U_t$ be in this representation?

B) Assume that the error $U_t$ in (1) satisfies

$$E_{t-1}U_t=0,\qquad E(U_tU_t')=\Sigma,\qquad E(U_tU_s')=0\ \text{for } t\neq s, \tag{3}$$

for all $t,s$, with some symmetric matrix $\Sigma$.

What does this assumption mean in terms of the components of $U_t$ from (2)? What is $\Sigma$ if the errors in (1) satisfy

$$Eu_{i,t}=0,\qquad Eu_{i,t}^2=\sigma^2,\qquad Eu_{i,t}u_{j,s}=0\ \text{for } (i,t)\neq(j,s), \tag{4}$$

for all $i,j,t,s$?

C) Suppose (1) is stationary. The stationarity condition is expressed in terms of eigenvalues of $A$, but we don't need it. However, we need its implication:

$$\det(I-A)\neq 0. \tag{5}$$

Find $EY_t$.

D) Find $E_{t-1}Y_t$.

E) Find $\operatorname{Var}(Y_t)$.

F) Find $\operatorname{Cov}(Y_t,Y_{t-1})$.

G) Find $\operatorname{Cov}(Y_t,Y_{t-2})$.

**Solution**

A) It takes some practice to see that with the notation

$$Y_t=\begin{pmatrix}y_{1,t}\\ y_{2,t}\end{pmatrix},\qquad A=\begin{pmatrix}a_{11}&a_{12}\\ a_{21}&a_{22}\end{pmatrix},\qquad U_t=\begin{pmatrix}u_{1,t}\\ u_{2,t}\end{pmatrix},$$

the system (1) becomes (2).
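As a quick sanity check, one can verify numerically that the scalar system (1) and the vector recursion (2) produce the same values. This is a minimal sketch with hypothetical coefficient values (NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical coefficient matrix A (values chosen for illustration only)
A = np.array([[0.5, 0.1],
              [0.2, 0.4]])

y_prev = np.array([1.0, -1.0])   # Y_{t-1}
u = rng.normal(size=2)           # U_t

# Scalar form (1)
y1 = A[0, 0] * y_prev[0] + A[0, 1] * y_prev[1] + u[0]
y2 = A[1, 0] * y_prev[0] + A[1, 1] * y_prev[1] + u[1]

# Vector form (2): Y_t = A Y_{t-1} + U_t
y_vec = A @ y_prev + u

print(np.allclose([y1, y2], y_vec))  # True
```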

B) The equations in (3) look like this:

$$E(U_tU_t')=\begin{pmatrix}Eu_{1,t}^2 & Eu_{1,t}u_{2,t}\\ Eu_{2,t}u_{1,t} & Eu_{2,t}^2\end{pmatrix}=\begin{pmatrix}\sigma_{11}&\sigma_{12}\\ \sigma_{21}&\sigma_{22}\end{pmatrix},\qquad E(U_tU_s')=0,\ t\neq s.$$

Equalities of matrices are understood element-wise, so we get a series of scalar equations: $Eu_{i,t}u_{j,t}=\sigma_{ij}$ and $Eu_{i,t}u_{j,s}=0$ for $t\neq s$, $i,j=1,2$.

Conversely, the scalar equations from (4) give

$$\Sigma=\sigma^2 I.$$
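The conclusion $\Sigma=\sigma^2 I$ can be seen in a simulation: for errors satisfying (4), the sample estimate of $E(U_tU_t')$ should be close to $\sigma^2 I$. A sketch with hypothetical values of $\sigma$ and the sample size (NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 0.5        # hypothetical common standard deviation
T = 200_000        # hypothetical sample size

# Errors satisfying (4): zero mean, common variance sigma^2,
# no cross-sectional or serial correlation
U = rng.normal(scale=sigma, size=(T, 2))

Sigma_hat = U.T @ U / T          # sample estimate of E(U_t U_t')
print(np.round(Sigma_hat, 3))    # approximately sigma^2 * I = 0.25 * I
```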

C) (2) implies $EY_t=AEY_{t-1}+EU_t=AEY_{t-1}$, or by stationarity $EY_t=AEY_t$, or $(I-A)EY_t=0$. Hence (5) implies $EY_t=0$.

D) From (2) we see that $AY_{t-1}$ depends only on $I_{t-1}$ (information set at time $t-1$). Therefore by the LIE

$$E_{t-1}Y_t=E_{t-1}(AY_{t-1}+U_t)=AY_{t-1}+E_{t-1}U_t=AY_{t-1}.$$

E) Using the previous result,

$$V\equiv\operatorname{Var}(Y_t)=E(Y_tY_t')=E\left[(AY_{t-1}+U_t)(AY_{t-1}+U_t)'\right]=AE(Y_{t-1}Y_{t-1}')A'+E(U_tU_t')$$

(the cross terms vanish by stationarity and (3)). Thus, $V=AVA'+\Sigma$ and $\operatorname{vec}(V)=(I-A\otimes A)^{-1}\operatorname{vec}(\Sigma)$ (see previous post).
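The equation $V=AVA'+\Sigma$ can be solved numerically by the vectorization trick: $\operatorname{vec}(AVA')=(A\otimes A)\operatorname{vec}(V)$. A sketch with hypothetical parameter values (NumPy assumed):

```python
import numpy as np

A = np.array([[0.5, 0.1],
              [0.2, 0.4]])       # hypothetical stable coefficient matrix
Sigma = 0.25 * np.eye(2)         # hypothetical error covariance, sigma^2 = 0.25

# vec(V) = (I - A kron A)^{-1} vec(Sigma); order="F" stacks columns,
# matching the vec convention vec(AXB') = (B kron A) vec(X)
n = A.shape[0]
vecV = np.linalg.solve(np.eye(n**2) - np.kron(A, A),
                       Sigma.flatten(order="F"))
V = vecV.reshape((n, n), order="F")

# Check that V satisfies the stationarity equation V = A V A' + Sigma
print(np.allclose(V, A @ V @ A.T + Sigma))  # True
```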

F) Using the previous result we have

$$\operatorname{Cov}(Y_t,Y_{t-1})=E(Y_tY_{t-1}')=E\left[(AY_{t-1}+U_t)Y_{t-1}'\right]=AE(Y_{t-1}Y_{t-1}')=AV.$$

G) Similarly,

$$\operatorname{Cov}(Y_t,Y_{t-2})=AE(Y_{t-1}Y_{t-2}')=A\operatorname{Cov}(Y_{t-1},Y_{t-2})=A^2V.$$
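The covariance formulas $AV$ and $A^2V$ can be checked against sample moments of a simulated VAR(1). A sketch with hypothetical parameter values (NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(2)
A = np.array([[0.5, 0.1],
              [0.2, 0.4]])       # hypothetical stable coefficient matrix
sigma = 0.5                      # hypothetical error standard deviation
T = 200_000

# Simulate the VAR(1), discarding a burn-in so the sample is close to stationary
Y = np.zeros((T, 2))
for t in range(1, T):
    Y[t] = A @ Y[t - 1] + rng.normal(scale=sigma, size=2)
Y = Y[1000:]

V_hat = Y.T @ Y / len(Y)                    # sample Var(Y_t)
C1_hat = Y[1:].T @ Y[:-1] / (len(Y) - 1)    # sample Cov(Y_t, Y_{t-1})
C2_hat = Y[2:].T @ Y[:-2] / (len(Y) - 2)    # sample Cov(Y_t, Y_{t-2})

print(np.max(np.abs(C1_hat - A @ V_hat)))      # small: C1 is close to A V
print(np.max(np.abs(C2_hat - A @ A @ V_hat)))  # small: C2 is close to A^2 V
```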

Autocorrelations require a little more effort and I leave them out.