## Different faces of vector variance: again visualization helps

In the previous post we defined the variance of a column vector $X$ with components $X_1,\dots,X_n$ by

$$V(X)=E(X-EX)(X-EX)^T.$$

In terms of elements this is the same as:

$$V(X)=\begin{pmatrix}V(X_1) & Cov(X_1,X_2) & \dots & Cov(X_1,X_n)\\ Cov(X_2,X_1) & V(X_2) & \dots & Cov(X_2,X_n)\\ \vdots & \vdots & \ddots & \vdots\\ Cov(X_n,X_1) & Cov(X_n,X_2) & \dots & V(X_n)\end{pmatrix}.\qquad (1)$$
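The structure of (1) can be verified numerically: build the matrix element by element as $Cov(X_i,X_j)$ and compare with a library covariance routine. This is only a sketch; the particular covariance matrix and sample size below are illustrative assumptions, not from the post.

```python
import numpy as np

rng = np.random.default_rng(0)
# Three correlated random variables observed in 10,000 draws (illustrative data).
X = rng.multivariate_normal(mean=[0, 0, 0],
                            cov=[[2.0, 0.5, 0.1],
                                 [0.5, 1.0, 0.3],
                                 [0.1, 0.3, 1.5]],
                            size=10_000).T  # shape (3, n_draws)

n_vars = X.shape[0]
V = np.empty((n_vars, n_vars))
for i in range(n_vars):
    for j in range(n_vars):
        # Element (i, j) of the variance matrix is Cov(X_i, X_j);
        # on the diagonal (i == j) this reduces to Var(X_i).
        V[i, j] = np.mean((X[i] - X[i].mean()) * (X[j] - X[j].mean()))

# Sanity check against NumPy's covariance routine (same biased estimator).
assert np.allclose(V, np.cov(X, bias=True))
print(np.round(V, 2))
```

The printed matrix is symmetric, with sample variances on the diagonal and sample covariances off it, exactly as in (1).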

## So why is knowing the structure of this matrix so important?

Let $X_1,\dots,X_n$ be random variables and let $a_1,\dots,a_n$ be numbers. In the derivation of the variance of the slope estimator for simple regression we have to deal with an expression of the type

$$V\left(\sum_{i=1}^n a_iX_i\right).\qquad (2)$$

**Question 1**. How do you multiply a sum by a sum? I mean, how do you use summation signs to find the product $\left(\sum_{i=1}^n a_iX_i\right)\left(\sum_{j=1}^n a_jX_j\right)$?

**Answer 1**. Whenever you have problems with summation signs, try to do without them. The product

$$(a_1X_1+\dots+a_nX_n)(a_1X_1+\dots+a_nX_n)$$

should contain ALL products $a_iX_ia_jX_j$. Again, a matrix visualization will help:

$$\begin{pmatrix}a_1X_1a_1X_1 & a_1X_1a_2X_2 & \dots & a_1X_1a_nX_n\\ a_2X_2a_1X_1 & a_2X_2a_2X_2 & \dots & a_2X_2a_nX_n\\ \vdots & \vdots & \ddots & \vdots\\ a_nX_na_1X_1 & a_nX_na_2X_2 & \dots & a_nX_na_nX_n\end{pmatrix}.$$

The product we are looking for should contain all elements of this matrix. So the answer is

$$\left(\sum_{i=1}^n a_iX_i\right)\left(\sum_{j=1}^n a_jX_j\right)=\sum_{i=1}^n\sum_{j=1}^n a_iX_ia_jX_j.\qquad (3)$$

Formally, we can write $\sum_{i=1}^n a_iX_i=\sum_{j=1}^n a_jX_j$ (the sum does not depend on the index of summation; this is another point many students don't understand) and then perform the multiplication in (3).
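Identity (3) is easy to check numerically for one realization of the variables: square the single sum and compare it with the double sum over all $n^2$ products. A minimal sketch, with randomly drawn numbers standing in for the $a_i$ and $X_i$:

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.normal(size=4)   # the numbers a_1, ..., a_n
x = rng.normal(size=4)   # one realization of X_1, ..., X_n

# Left side of (3): square of the single sum.
lhs = np.sum(a * x) ** 2

# Right side of (3): the double sum over all n^2 products a_i x_i a_j x_j,
# i.e. the sum of all elements of the n x n product matrix.
rhs = sum(a[i] * x[i] * a[j] * x[j] for i in range(4) for j in range(4))

assert np.isclose(lhs, rhs)
```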

**Question 2**. What is the expression for (2) in terms of covariances of the components?

**Answer 2**. If you understand Answer 1 and know the relationship between variances and covariances, it should be clear that

$$V\left(\sum_{i=1}^n a_iX_i\right)=\sum_{i=1}^n\sum_{j=1}^n a_ia_j\,Cov(X_i,X_j).\qquad (4)$$
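Formula (4) can be checked by simulation: compute the double sum from a known covariance matrix and compare it with the sample variance of the linear combination. A sketch, assuming an illustrative covariance matrix and coefficient vector (neither is from the post):

```python
import numpy as np

rng = np.random.default_rng(2)
Sigma = np.array([[2.0, 0.5, 0.1],
                  [0.5, 1.0, 0.3],
                  [0.1, 0.3, 1.5]])   # true covariance matrix (assumed for illustration)
a = np.array([1.0, -2.0, 0.5])

# Theoretical value from (4): the double sum of a_i a_j Cov(X_i, X_j),
# which in matrix form is a' Sigma a.
theory = sum(a[i] * a[j] * Sigma[i, j] for i in range(3) for j in range(3))
assert np.isclose(theory, a @ Sigma @ a)

# Simulated variance of the linear combination sum(a_i X_i).
X = rng.multivariate_normal([0, 0, 0], Sigma, size=200_000)
sim = np.var(X @ a)
print(theory, sim)  # the two numbers should be close
```

The agreement of the double sum with the quadratic form $a^T V(X)\, a$ is exactly the point of knowing the structure of matrix (1).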

**Question 3**. In light of (1), separate variances from covariances in (4).

**Answer 3**. When $i=j$ we have the terms $a_i^2V(X_i)$, which involve the diagonal elements of (1). Otherwise, for $i\neq j$, we get the off-diagonal elements of (1). So the answer is

$$V\left(\sum_{i=1}^n a_iX_i\right)=\sum_{i=1}^n a_i^2V(X_i)+\sum_{i\neq j} a_ia_j\,Cov(X_i,X_j).\qquad (5)$$

Once again, in the first sum on the right we have only variances. In the second sum, the indices $i,j$ are assumed to run from $1$ to $n$, excluding the diagonal $i=j$.

**Corollary**. If $X_1,\dots,X_n$ are *uncorrelated*, then the second sum in (5) disappears:

$$V\left(\sum_{i=1}^n a_iX_i\right)=\sum_{i=1}^n a_i^2V(X_i).\qquad (6)$$
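The uncorrelated case (6) is worth a quick simulation of its own: with independent components, the variance of the combination reduces to the squared coefficients times the individual variances. A sketch with assumed variances and coefficients:

```python
import numpy as np

rng = np.random.default_rng(3)
a = np.array([1.0, -2.0, 0.5])
variances = np.array([2.0, 1.0, 1.5])   # Var(X_1), Var(X_2), Var(X_3), assumed

# Independent (hence uncorrelated) components with the given variances.
X = rng.normal(scale=np.sqrt(variances), size=(200_000, 3))

# Formula (6): only the squared coefficients times the variances survive.
theory = np.sum(a ** 2 * variances)
sim = np.var(X @ a)
print(theory, sim)  # the two numbers should be close
```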

This fact has been used (with a slightly different explanation) in the derivation of the variance of the slope estimator for simple regression.

**Question 4**. Note that the matrix (1) is symmetric (elements above the main diagonal equal their mirror siblings below that diagonal). This means that some terms in the second sum on the right of (5) are repeated twice. If you group equal terms in (5), what do you get?

**Answer 4**. The idea is to write

$$a_ia_j\,Cov(X_i,X_j)+a_ja_i\,Cov(X_j,X_i)=2a_ia_j\,Cov(X_i,X_j),$$

that is, to join equal elements above and below the main diagonal in (1). For this, you need to figure out how to write the sum of the elements that are above the main diagonal. Make a bigger version of (1) (with more off-diagonal elements) to see that the elements above the main diagonal are listed in the sum $\sum_{i=1}^{n-1}\sum_{j=i+1}^n$. This sum can also be written as $\sum_{1\le i<j\le n}$. Hence, (5) is the same as

$$V\left(\sum_{i=1}^n a_iX_i\right)=\sum_{i=1}^n a_i^2V(X_i)+2\sum_{1\le i<j\le n} a_ia_j\,Cov(X_i,X_j).\qquad (7)$$
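The grouping in (7) can be verified directly: the full double sum over all $n^2$ index pairs must equal the diagonal terms plus twice the above-diagonal terms. A sketch using an illustrative symmetric covariance matrix:

```python
import numpy as np

Sigma = np.array([[2.0, 0.5, 0.1],
                  [0.5, 1.0, 0.3],
                  [0.1, 0.3, 1.5]])   # an illustrative covariance matrix (symmetric)
a = np.array([1.0, -2.0, 0.5])
n = len(a)

# Full double sum from (4): all n^2 terms a_i a_j Cov(X_i, X_j).
full = sum(a[i] * a[j] * Sigma[i, j] for i in range(n) for j in range(n))

# Grouped form (7): variances on the diagonal, plus twice the
# above-diagonal covariances (indices i < j).
grouped = (sum(a[i] ** 2 * Sigma[i, i] for i in range(n))
           + 2 * sum(a[i] * a[j] * Sigma[i, j]
                     for i in range(n) for j in range(i + 1, n)))

assert np.isclose(full, grouped)
```

The assertion holds precisely because (1) is symmetric: each off-diagonal term appears once above and once below the main diagonal.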

Unlike (6), this equation is applicable *when there is autocorrelation*.