Definitions of the standard normal distribution and independence can be combined to produce definitions of the chi-square, t and F distributions. The similarity of the definitions makes them easier to study.
Independence of continuous random variables
The definition of independent discrete random variables easily modifies for the continuous case. Let $X, Y$ be two continuous random variables with densities $p_X, p_Y$, respectively. We say that these variables are independent if the density $p_{X,Y}$ of the pair $(X,Y)$ is a product of the individual densities:

(1) $p_{X,Y}(x,y) = p_X(x)\,p_Y(y)$ for all $x, y$.
As in this post, equation (1) can be understood in two ways. If (1) is given, then $X$ and $Y$ are independent. Conversely, if we want them to be independent, we can define the density of the pair by equation (1). This definition readily generalizes to the case of many variables. In particular, if we want variables $z_1, \dots, z_n$ to be standard normal and independent, we say that each of them has the density defined here and the joint density $p_{z_1,\dots,z_n}$ is a product of the individual densities.
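As a minimal numerical sketch of equation (1), the joint density of two independent standard normals, $e^{-(x^2+y^2)/2}/(2\pi)$, factors exactly into the product of the two marginal densities (the function names below are illustrative, not from the post):

```python
import math

def phi(x):
    # standard normal density
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def joint(x, y):
    # joint density of two independent standard normals,
    # written directly: exp(-(x^2 + y^2)/2) / (2*pi)
    return math.exp(-(x * x + y * y) / 2) / (2 * math.pi)

# equation (1): the joint density is the product of the marginals
for x, y in [(0.0, 0.0), (1.0, -0.5), (2.3, 1.7)]:
    assert abs(joint(x, y) - phi(x) * phi(y)) < 1e-12
```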
Definition of chi-square variable

Figure 1. chi-square with 1 degree of freedom
Let $z_1, \dots, z_n$ be standard normal and independent. Then the variable

$\chi^2_n = z_1^2 + \dots + z_n^2$

is called a chi-square variable with $n$ degrees of freedom. Obviously, $\chi^2_n \ge 0$, which means that its density is zero to the left of the origin. For low values of the degrees of freedom, the density is unbounded near the origin, see Figure 1.
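A minimal simulation sketch of this definition: draw $n$ independent standard normals, sum their squares, and check that the sample is nonnegative with mean near $n$ (a standard property of the chi-square distribution; the sample size and seed below are arbitrary choices):

```python
import random

random.seed(0)

def chi_square_sample(n, size):
    # chi-square with n degrees of freedom: sum of squares
    # of n independent standard normal draws
    return [sum(random.gauss(0, 1) ** 2 for _ in range(n))
            for _ in range(size)]

n, size = 3, 100_000
sample = chi_square_sample(n, size)
assert min(sample) >= 0          # density is zero left of the origin
mean = sum(sample) / size        # should be close to n
```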
Definition of t distribution

Figure 2. t distribution and standard normal compared
Let $z_0, z_1, \dots, z_n$ be standard normal and independent. Then the variable

$t_n = \dfrac{z_0}{\sqrt{(z_1^2 + \dots + z_n^2)/n}}$

is said to have a t distribution with $n$ degrees of freedom. The density of the t distribution is bell-shaped and for low $n$ has fatter tails than the standard normal. For high $n$, it approaches the density of the standard normal, see Figure 2.
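The fat tails can be seen in a simulation sketch: for a t variable with few degrees of freedom, the probability of landing more than three units from zero is an order of magnitude larger than the standard normal's $P(|Z| > 3) \approx 0.0027$ (degrees of freedom, sample size and seed below are arbitrary):

```python
import math
import random

random.seed(1)

def t_sample(n, size):
    # t with n degrees of freedom: z0 / sqrt((z1^2+...+zn^2)/n),
    # all z's independent standard normal
    out = []
    for _ in range(size):
        z0 = random.gauss(0, 1)
        denom = math.sqrt(sum(random.gauss(0, 1) ** 2 for _ in range(n)) / n)
        out.append(z0 / denom)
    return out

sample = t_sample(5, 100_000)
mean = sum(sample) / len(sample)              # symmetric around zero
tail = sum(1 for v in sample if abs(v) > 3) / len(sample)
# tail is well above the normal's P(|Z| > 3) ~ 0.0027
```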
Definition of F distribution

Figure 3. F distribution with (1,m) degrees of freedom
Let $z_1, \dots, z_n, w_1, \dots, w_m$ be standard normal and independent. Then the variable

$F_{n,m} = \dfrac{(z_1^2 + \dots + z_n^2)/n}{(w_1^2 + \dots + w_m^2)/m}$

is said to have an F distribution with $(n, m)$ degrees of freedom. It is nonnegative and its density is zero to the left of the origin. When $n$ is low, the density is unbounded in the neighborhood of zero, see Figure 3.
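This definition can also be sketched by simulation: form the ratio of two independent chi-squares, each divided by its degrees of freedom, and check nonnegativity and the known mean $m/(m-2)$ for $m > 2$ (the degrees of freedom, sample size and seed below are arbitrary):

```python
import random

random.seed(2)

def f_sample(n, m, size):
    # F with (n, m) degrees of freedom: ratio of two independent
    # chi-squares, each divided by its degrees of freedom
    out = []
    for _ in range(size):
        num = sum(random.gauss(0, 1) ** 2 for _ in range(n)) / n
        den = sum(random.gauss(0, 1) ** 2 for _ in range(m)) / m
        out.append(num / den)
    return out

sample = f_sample(3, 10, 100_000)
assert min(sample) >= 0          # density is zero left of the origin
mean = sum(sample) / len(sample) # should be close to m/(m-2) = 1.25
```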
The Mathematica file and video illustrate the densities of these three variables in more detail.
Consequences
- If $\chi^2_n$ and $\chi^2_m$ are independent, then $\chi^2_n + \chi^2_m$ is $\chi^2_{n+m}$ (addition rule). This rule is applied in the theory of ANOVA models.
- $t_n^2 = F_{1,n}$. This is an easy proof of equation (2.71) from Introduction to Econometrics by Christopher Dougherty, published by Oxford University Press, UK, in 2016.
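Both consequences can be checked with a simulation sketch built from the definitions above: the sum of independent chi-squares with 3 and 4 degrees of freedom has mean 7 and variance 14, matching $\chi^2_7$, and a squared $t_{10}$ variable has the same mean as $F_{1,10}$ (all parameters and the seed are arbitrary illustration choices):

```python
import math
import random

random.seed(3)

def chi2(n):
    # sum of squares of n independent standard normals
    return sum(random.gauss(0, 1) ** 2 for _ in range(n))

size = 100_000

# addition rule: chi2(3) + chi2(4) behaves like chi2(7)
s = [chi2(3) + chi2(4) for _ in range(size)]
mean = sum(s) / size                           # E = 7 for chi-square(7)
var = sum((v - mean) ** 2 for v in s) / size   # Var = 14 for chi-square(7)

# t_n squared is F(1, n): compare sample means for n = 10
def t(n):
    return random.gauss(0, 1) / math.sqrt(chi2(n) / n)

t_sq = [t(10) ** 2 for _ in range(size)]
f_1_10 = [(chi2(1) / 1) / (chi2(10) / 10) for _ in range(size)]
m1 = sum(t_sq) / size     # both means should be near
m2 = sum(f_1_10) / size   # E(F(1,10)) = 10/8 = 1.25
```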