### Useful facts about independence

In the one-dimensional case the economical way to define normal variables is this: define a standard normal variable, and then define a general normal variable as its linear transformation.

In the case of many dimensions, we follow the same idea. Before doing that, we state without proof two useful facts about independence of random variables (real-valued, not vectors).

**Theorem 1**. Suppose the variables $X_1,\dots,X_n$ have densities $p_1,\dots,p_n$. Then they are independent if and only if their joint density is the product of the individual densities:

$$p(x_1,\dots,x_n)=p_1(x_1)\cdots p_n(x_n).$$
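
As a quick numerical sanity check of the "if" direction (not part of the original text; SciPy is assumed available and the test point is an arbitrary illustrative choice), the product of two standard normal marginal densities should match the bivariate normal density with identity covariance:

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

# Arbitrary test point (illustrative choice).
x, y = 0.7, -1.3

# Product of the two marginal densities.
product_of_marginals = norm.pdf(x) * norm.pdf(y)

# Joint density of two independent standard normals:
# bivariate normal with zero mean and identity covariance.
joint = multivariate_normal.pdf([x, y], mean=[0, 0], cov=np.eye(2))

print(abs(product_of_marginals - joint))  # should be numerically negligible
```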

**Theorem 2**. If the variables $X_1,\dots,X_n$ are jointly normal, then they are independent if and only if they are uncorrelated:

$$\operatorname{Cov}(X_i,X_j)=0 \quad\text{for all } i\neq j.$$

The necessity part (independence implies uncorrelatedness) is trivial.
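
For completeness, the necessity step can be written in one line (this sketch assumes finite second moments; it holds for any variables, not just normal ones):

```latex
X,\;Y \text{ independent} \;\Rightarrow\; E(XY)=E(X)\,E(Y)
\;\Rightarrow\; \operatorname{Cov}(X,Y)=E(XY)-E(X)E(Y)=0.
```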

### Normal vectors

Let $z_1,\dots,z_n$ be independent standard normal variables. A standard normal variable is defined by its density, so all of $z_1,\dots,z_n$ have the same density

$$p(t)=\frac{1}{\sqrt{2\pi}}e^{-t^2/2}.$$

We achieve independence, according to Theorem 1, by defining their joint density to be the product of the individual densities.

**Definition 1**. A **standard normal vector** $z=(z_1,\dots,z_n)^T$ of dimension $n$ is defined by the joint density

$$p(t_1,\dots,t_n)=\prod_{i=1}^n\frac{1}{\sqrt{2\pi}}e^{-t_i^2/2}=(2\pi)^{-n/2}e^{-\|t\|^2/2}.$$

**Properties**. $Ez=0$ because all of $z_1,\dots,z_n$ have mean zero. Further, $\operatorname{Cov}(z_i,z_j)=0$ for $i\neq j$ by Theorem 2, and the variance of a standard normal is 1. Therefore, from the expression for the variance of a vector we see that

$$V(z)=E(z-Ez)(z-Ez)^T=Ezz^T=I.$$
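
These two properties are easy to check by simulation (a sketch, not part of the text; the dimension, seed, and sample size are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, draws = 3, 200_000  # illustrative dimension and sample size

# Each row of Z is one draw of a standard normal vector of dimension n.
Z = rng.standard_normal((draws, n))

# Sample mean should be close to the zero vector,
# and the sample covariance matrix close to the identity.
print(Z.mean(axis=0))
print(np.cov(Z, rowvar=False))
```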

**Definition 2**. For a matrix $A$ and vector $\mu$ of compatible dimensions, a **normal vector** is defined by

$$X=\mu+Az.$$

**Properties**. $EX=\mu$ and

$$V(X)=V(Az)=AV(z)A^T=AA^T$$

(recall that variance of a vector is always nonnegative).

### Distributions derived from normal variables

In the definitions of the standard distributions (chi-square, t distribution and F distribution) there is no reference to any sample data. Unlike statistics, which by definition are functions of sample data, these and other standard distributions are theoretical constructs. Statistics are developed in such a way as to have a distribution equal, or asymptotically equal, to one of the standard distributions. This allows practitioners to use tables developed for the standard distributions.

**Exercise 1**. Prove that $\dfrac{1}{n}\sum_{i=1}^n z_i^2$ converges to 1 in probability, where $z_1,\dots,z_n$ are independent standard normal.

**Proof**. For a standard normal $z$ we have $Ez^2=1$ and $Ez^4=3$ (both properties can be verified in Mathematica). Hence,

$$Ez_i^2=1 \quad\text{and}\quad \operatorname{Var}(z_i^2)=Ez_i^4-\left(Ez_i^2\right)^2=3-1=2.$$

Now the statement follows from the simple form of the law of large numbers.
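
The convergence in Exercise 1 is easy to see numerically (a sketch; the seed and the sequence of sample sizes are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)

# The average of z_1^2, ..., z_n^2 should settle near 1 as n grows
# (law of large numbers applied to the squared draws).
for n in (100, 10_000, 1_000_000):
    z = rng.standard_normal(n)
    print(n, (z ** 2).mean())
```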

Exercise 1 implies that for large $n$ the t distribution is close to a standard normal: the denominator of the t ratio involves $\sqrt{\frac{1}{n}\sum_{i=1}^n z_i^2}$, which tends to 1 in probability.
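
This closeness shows up directly in critical values (a sketch, assuming SciPy is available; the quantile level and degrees of freedom are illustrative choices):

```python
from scipy.stats import t, norm

# The 97.5% critical value of the t distribution approaches the
# standard normal critical value as the degrees of freedom grow.
for df in (5, 30, 1000):
    print(df, t.ppf(0.975, df))
print("normal", norm.ppf(0.975))
```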