### From independence of events to independence of random variables

One way to avoid complex math is to show students simplified, plausible derivations that create an appearance of rigor and provide enough ground for intuition. This is what I try to do here.

### Independence of random variables

Let $X,Y$ be two random variables. Suppose $X$ takes values $X_1,X_2$ with probabilities $p_1,p_2$. Similarly, $Y$ takes values $Y_1,Y_2$ with probabilities $q_1,q_2$. Now we want to consider the pair $(X,Y)$. The pair can take values $(X_i,Y_j)$, where $i,j$ take values $1,2$. These are joint events with probabilities denoted $p_{i,j}=P(X=X_i,\,Y=Y_j)$.

**Definition**. $X,Y$ are called **independent** if for all $i,j$ one has

(1) $p_{i,j}=p_iq_j$.

Thus, in the case of two-valued variables, their independence means independence of 4 pairs of events $\{X=X_i\}$, $\{Y=Y_j\}$. Independence of variables is a more complex condition than independence of events.
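For a concrete two-valued example, the definition can be checked cell by cell against the product of marginals. Below is a minimal sketch in Python; the marginal probabilities and the dependent counterexample table are hypothetical, chosen only for illustration.

```python
import itertools

p = [0.3, 0.7]  # marginal probabilities of X (illustrative)
q = [0.6, 0.4]  # marginal probabilities of Y (illustrative)

def is_independent(joint, p, q, tol=1e-12):
    """Check equation (1), joint cell = product of marginals, for every cell."""
    return all(abs(joint[i][j] - p[i] * q[j]) < tol
               for i, j in itertools.product(range(2), range(2)))

# Under independence the joint table is the outer product of the marginals.
indep_joint = [[p[i] * q[j] for j in range(2)] for i in range(2)]

# A table with the SAME marginals (row sums 0.3, 0.7; column sums 0.6, 0.4)
# whose cells are not products of marginals.
dep_joint = [[0.3, 0.0],
             [0.3, 0.4]]

print(is_independent(indep_joint, p, q))  # True
print(is_independent(dep_joint, p, q))    # False
```

The dependent table has the same marginals as the independent one, which also illustrates that marginal distributions alone do not determine the joint distribution.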

### Properties of independent variables

**Property 1**. For independent variables, we have $E(XY)=E(X)E(Y)$ (**multiplicativity**). Indeed, by definition of the expected value and equation (1)

$E(XY)=\sum_{i,j}X_iY_jp_{i,j}=\sum_{i,j}X_iY_jp_iq_j=\left(\sum_iX_ip_i\right)\left(\sum_jY_jq_j\right)=E(X)E(Y)$.

**Remark**. This proof is a good exercise to check how well students understand the definitions of the product $XY$ and of the expectation operator. Note also that multiplicativity is guaranteed only under independence, unlike linearity $E(aX+bY)=aE(X)+bE(Y)$, which is always true.
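The double-sum argument behind multiplicativity is easy to verify numerically. A small sketch, with made-up values and probabilities for two-valued X and Y:

```python
# Hypothetical two-valued variables: values and their probabilities.
X, p = [1.0, 5.0], [0.3, 0.7]
Y, q = [2.0, 4.0], [0.6, 0.4]

EX = sum(x * px for x, px in zip(X, p))
EY = sum(y * qy for y, qy in zip(Y, q))

# Under independence the joint probability of (X_i, Y_j) is p_i * q_j,
# so E(XY) is the double sum over all four value pairs.
EXY = sum(X[i] * Y[j] * p[i] * q[j] for i in range(2) for j in range(2))

assert abs(EXY - EX * EY) < 1e-12  # multiplicativity holds exactly here
```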

**Property 2**. Independent variables are uncorrelated: $Cov(X,Y)=0$. This follows immediately from multiplicativity and the shortcut for covariance:

(2) $Cov(X,Y)=E(XY)-E(X)E(Y)=0$.

**Remark**. Independence is stronger than uncorrelatedness: variables can be uncorrelated but not independent.

**Property 3**. For independent variables, variance is **additive**: $Var(X+Y)=Var(X)+Var(Y)$. This easily follows from the general formula $Var(X+Y)=Var(X)+2Cov(X,Y)+Var(Y)$ and equation (2).
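Additivity of variance is easy to demonstrate by simulation. A sketch using only the standard library; the distributions and sample size are arbitrary choices for illustration:

```python
import random
import statistics

random.seed(0)
n = 200_000
xs = [random.gauss(0, 1) for _ in range(n)]  # X with variance 1
ys = [random.gauss(0, 2) for _ in range(n)]  # Y with variance 4, drawn independently

# Sample variance of the sum should be close to 1 + 4 = 5.
var_sum = statistics.pvariance([x + y for x, y in zip(xs, ys)])
print(var_sum)  # close to 5, up to sampling noise
```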

**Property 4**. Independence is such a strong property that it is preserved under *nonlinear transformations*. This means the following. Take two deterministic functions $f,g$; apply one to $X$ and the other to $Y$. The resulting random variables $f(X),g(Y)$ will be independent. Instead of the proof, I provide an application. If $z_1,z_2$ are two independent standard normals, then $z_1^2,z_2^2$ are two independent chi-square variables with 1 degree of freedom.

**Remark**. Normality is preserved only under linear transformations.

This post is an antithesis of the following definition from Agresti and Franklin (p. 540): "Two categorical variables are independent if the population conditional distributions for one of them are identical at each category of the other. The variables are dependent (or associated) if the conditional distributions are not identical."