Examples of distribution functions: the first two are used in binary choice models, the third one is applied in maximum likelihood.
Example 1. Distribution function of a normal variable
The standard normal distribution is defined by its probability density

$$p(t)=\frac{1}{\sqrt{2\pi}}e^{-t^{2}/2}.$$

It is nonnegative and integrates to 1 (the proof of this fact is not elementary). Going from the density to the distribution function gives us the distribution function (cdf) of the standard normal:

$$F(x)=\int_{-\infty}^{x}\frac{1}{\sqrt{2\pi}}e^{-t^{2}/2}\,dt$$

for all real $x$.
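As a quick numerical illustration (not part of the derivation), one can approximate this integral directly and compare the result with a library implementation of the normal cdf. Below is a minimal sketch, assuming NumPy and SciPy are available; the function name is made up for this example.

```python
# Integrate the standard normal density from -infinity to x and compare
# the result with the library cdf (NumPy and SciPy assumed).
import numpy as np
from scipy import integrate, stats

def normal_cdf_by_integration(x: float) -> float:
    """Approximate F(x) by integrating the density over (-inf, x]."""
    density = lambda t: np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)
    value, _ = integrate.quad(density, -np.inf, x)
    return value

for x in (-1.96, 0.0, 1.0, 1.96):
    print(x, normal_cdf_by_integration(x), stats.norm.cdf(x))
# The two columns agree to many decimal places; F(1.96) is about 0.975.
```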
Example 2. The logistic distribution
Here we go from distribution function to density function.
Consider the function

$$F(x)=\frac{1}{1+e^{-x}}.$$

It's easy to check that it has the three characteristic properties of a distribution function: the limits at plus and minus infinity and monotonicity.

1. When $x\rightarrow+\infty$, $e^{-x}$ goes to 0, so $F(x)$ tends to 1.

2. If $x\rightarrow-\infty$, $e^{-x}$ goes to $+\infty$, and $F(x)$ tends to 0.

3. Finally, to check monotonicity, we can use the following sufficient condition: a function is increasing where its derivative is positive. (From the Newton-Leibniz formula $F(b)-F(a)=\int_{a}^{b}F'(t)\,dt$ we see that positivity of the derivative and $a<b$ imply $F(a)<F(b)$.) The derivative

$$F'(x)=\frac{e^{-x}}{(1+e^{-x})^{2}} \qquad (1)$$

is positive, so $F$ is increasing.

Thus, $F$ is a distribution function, and the density it generates is given by (1).
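The three properties are also easy to confirm numerically. Here is a minimal sketch, assuming NumPy; it evaluates $F$ at large $|x|$ for the limit properties and compares the analytic derivative (1) with a finite-difference approximation.

```python
# Numerical check of the logistic F: limits and positive derivative (NumPy assumed).
import numpy as np

def F(x):
    """Logistic distribution function F(x) = 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + np.exp(-x))

def F_prime(x):
    """Density from equation (1): exp(-x) / (1 + exp(-x))^2."""
    return np.exp(-x) / (1.0 + np.exp(-x)) ** 2

print(F(40.0), F(-40.0))            # close to 1 and 0: the limit properties
x = np.linspace(-10, 10, 2001)
print(np.all(F_prime(x) > 0))       # True: the derivative is positive everywhere
# Central finite differences of F agree with the analytic derivative (1):
h = 1e-6
print(np.max(np.abs((F(x + h) - F(x - h)) / (2 * h) - F_prime(x))) < 1e-6)
```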
Example 3. Distribution function and density of a discrete variable
The distribution function concept applies to all random variables, both discrete and continuous. For a discrete variable, the distribution function is not continuous: as Figure 1 below shows, it has jumps at the points that carry positive probability. We illustrate this using a Bernoulli variable $X$ such that

$$P(X=1)=p \quad\text{and}\quad P(X=0)=1-p, \qquad 0<p<1.$$
- For $x<0$ we have $F(x)=P(X\le x)=0$.
- For $0\le x<1$ we have $F(x)=P(X\le x)=P(X=0)=1-p$.
- Finally, $F(x)=P(X\le x)=1$ for $x\ge 1$.
This leads us to Figure 1.

Figure 1. Distribution function of the Bernoulli variable
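For concreteness, the three cases above translate into a short piece of code. This is only a sketch; the value $p=0.7$ is an arbitrary choice for illustration.

```python
# Bernoulli distribution function built from the three cases above
# (p = 0.7 is an assumed value, chosen only for illustration).
p = 0.7

def bernoulli_cdf(x: float, p: float) -> float:
    """F(x) = P(X <= x) for a Bernoulli(p) variable."""
    if x < 0:
        return 0.0          # no outcomes to the left of 0
    elif x < 1:
        return 1.0 - p      # only the outcome X = 0 is included
    else:
        return 1.0          # both outcomes are included

for x in (-0.5, 0.0, 0.5, 1.0, 2.0):
    print(x, bernoulli_cdf(x, p))
# The jump at 0 equals P(X = 0) = 1 - p, and the jump at 1 equals P(X = 1) = p.
```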
Now consider

$$f(x)=P(X=x). \qquad (2)$$

To understand this equation, check that for the Bernoulli variable above $f(0)=1-p$, $f(1)=p$, and $f(x)=0$ at all other points, so (2) plays the role of a density for a discrete variable.
Remark. For continuous random variables, the value of the density at a fixed point means nothing by itself (in particular, it can be larger than 1); it is its integral that has probabilistic meaning. For (2), the value of the density at a fixed point IS a probability.
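A numerical illustration of the remark, assuming SciPy is available (the values $\sigma=0.1$ and $p=0.7$ are arbitrary):

```python
# Continuous vs. discrete densities (SciPy assumed).
from scipy import stats

# Continuous case: a normal density with small standard deviation exceeds 1
# at its mode, yet it still integrates to 1 over the real line.
narrow_normal = stats.norm(loc=0, scale=0.1)
print(narrow_normal.pdf(0))                          # ~3.99, larger than 1
print(narrow_normal.cdf(10) - narrow_normal.cdf(-10))  # ~1.0

# Discrete case: the value of the density (2) at a point is itself a probability.
p = 0.7
print(stats.bernoulli(p).pmf(1), stats.bernoulli(p).pmf(0))  # p and 1 - p
```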