\(\renewcommand{\P}{\mathbb{P}}\) \(\newcommand{\R}{\mathbb{R}}\) \(\newcommand{\N}{\mathbb{N}}\) \(\newcommand{\Q}{\mathbb{Q}}\) \( \newcommand{\E}{\mathbb{E}} \)

9. General Distribution Functions

Our goal in this section is to define and study functions that play the same role for positive measures on \( \R \) that (cumulative) distribution functions do for probability measures on \( \R \). Of course probability measures on \( \R \) are usually associated with real-valued random variables. These general distribution functions are useful for constructing measures on \( \R \) and will appear in our study of integrals with respect to a measure in the next section, as well as non-homogeneous Poisson processes and general renewal processes.

Basic Theory

Throughout this section, our basic measure space is \( (\R, \mathscr{R}) \), where \( \mathscr{R} \) is the \( \sigma \)-algebra of Borel measurable subsets of \( \R \), and as usual, we will let \( \lambda \) denote Lebesgue measure on \( (\R, \mathscr{R}) \).

Distribution Functions and Their Measures

A function \( F: \R \to \R \) that satisfies the following properties is a distribution function on \( \R \):

  1. \( F \) is increasing: if \( x \le y \) then \( F(x) \le F(y) \).
  2. \( F \) is continuous from the right: \( \lim_{t \downarrow x} F(t) = F(x) \) for all \( x \in \R \).

Since \( F \) is increasing, the limit from the left at \( x \in \R \) exists in \( \R \) and is denoted \( F(x^-) = \lim_{t \uparrow x} F(t) \). Similarly \(F(\infty) = \lim_{x \to \infty} F(x) \) exists, as a real number or \( \infty \), and \(F(-\infty) = \lim_{x \to -\infty} F(x) \) exists, as a real number or \( -\infty \).

If \( F \) is a distribution function on \( \R \), then there exists a unique measure \( \mu \) on \( \mathscr{R} \) that satisfies \[ \mu(a, b] = F(b) - F(a), \quad a, \, b \in \R; a \le b \]

Proof:

Let \( \mathscr{I} \) denote the collection of subsets of \( \R \) consisting of intervals of the form \( (a, b] \) where \( a, \, b \in \R \) with \( a \le b \), and intervals of the form \((-\infty, a]\) and \( (a, \infty) \) where \( a \in \R \). Then \( \mathscr{I} \) is a semi-algebra. That is, if \( A, \, B \in \mathscr{I} \) then \( A \cap B \in \mathscr{I} \), and if \( A \in \mathscr{I} \) then \( A^c \) is the union of a finite number (actually one or two) of disjoint sets in \( \mathscr{I} \). We define \( \mu \) on \(\mathscr{I}\) by \( \mu(a, b] = F(b) - F(a) \), \( \mu(-\infty, a] = F(a) - F(-\infty) \), and \( \mu(a, \infty) = F(\infty) - F(a) \). Note that \( \mathscr{I} \) contains the empty set via intervals of the form \( (a, a] \) where \( a \in \R \), but the definition gives \( \mu(\emptyset) = 0 \) in every case. Next, \( \mu \) is finitely additive on \( \mathscr{I} \). That is, if \( \{A_i: i \in I\} \) is a finite, disjoint collection of sets in \( \mathscr{I} \) and \( \bigcup_{i \in I} A_i \in \mathscr{I} \), then \[ \mu\left(\bigcup_{i \in I} A_i\right) = \sum_{i \in I} \mu(A_i) \] Next, \( \mu \) is countably subadditive on \( \mathscr{I} \). That is, if \( A \in \mathscr{I} \) and \( A \subseteq \bigcup_{i \in I} A_i \) where \( \{A_i: i \in I\} \) is a countable collection of sets in \( \mathscr{I} \) then \[ \mu(A) \le \sum_{i \in I} \mu(A_i) \] Finally, \( \mu \) is clearly \( \sigma \)-finite on \( \mathscr{I} \) since \( \mu(a, b] \lt \infty \) for \( a, \, b \in \R \) with \( a \lt b \), and \( \R \) is a countable, disjoint union of intervals of this form. Hence it follows from the basic extension and uniqueness theorems that \( \mu \) can be extended uniquely to a measure on \( \mathscr{R} = \sigma(\mathscr{I}) \).

For the final uniqueness part, suppose that \( \mu \) is a measure on \( \mathscr{R} \) satisfying \( \mu(a, b] = F(b) - F(a) \) for \( a, \, b \in \R \) with \( a \lt b \). Then by the continuity theorem for increasing sets, \( \mu(-\infty, a] = F(a) - F(-\infty) \) and \( \mu(a, \infty) = F(\infty) - F(a) \) for \( a \in \R \). Hence \( \mu \) agrees on \( \mathscr{I} \) with the measure constructed above, and so by the uniqueness theorem, \( \mu \) is that measure.
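
For a simple illustration of the theorem, consider the function \( F \) defined by \( F(x) = \lfloor x \rfloor \) for \( x \in \R \), the floor function, which is increasing and continuous from the right. The corresponding measure satisfies \[ \mu(a, b] = \lfloor b \rfloor - \lfloor a \rfloor = \#\{n \in \mathbb{Z}: a \lt n \le b\}, \quad a, \, b \in \R; \; a \le b \] Since counting measure on \( \mathbb{Z} \) assigns the same value to each such interval, the uniqueness part of the theorem shows that \( \mu \) is counting measure on \( \mathbb{Z} \).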

The measure \( \mu \) is called the Lebesgue-Stieltjes measure associated with \( F \), named for Henri Lebesgue and Thomas Joannes Stieltjes. A very rich variety of measures on \( \R \) can be constructed in this way. In particular, when the function \( F \) takes values in \( [0, 1] \), the associated measure \( \P \) is a probability measure. Another special case of interest is the distribution function defined by \( F(x) = x \) for \( x \in \R \), in which case \( \mu(a, b] \) is the length of the interval \( (a, b] \) and therefore \( \mu = \lambda \), Lebesgue measure on \( \mathscr{R} \). But although the measure associated with a distribution function is unique, the distribution function itself is not. Note that if \( c \in \R \) then the distribution function defined by \( F(x) = x + c\) for \( x \in \R \) also generates Lebesgue measure. This example captures the general situation.

Suppose that \( F \) and \( G \) are distribution functions that generate the same measure \( \mu \) on \( \R \). Then there exists \( c \in \R \) such that \( G = F + c \).

Proof:

For \( x \in \R \), note that \( F(x) - F(0) = G(x) - G(0) \). The common value is \( \mu(0, x] \) if \( x \ge 0 \) and \( -\mu(x, 0] \) if \( x \lt 0 \). Thus \( G(x) = F(x) - F(0) + G(0) \) for \( x \in \R \).

Returning to the case of a probability measure \( \P \) on \( \R \), the cumulative distribution function \( F \) that we studied in this chapter is the unique distribution function that generates \( \P \) and satisfies \( F(-\infty) = 0 \). More generally, having constructed a measure from a distribution function, let's now consider the complementary problem of finding a distribution function for a given measure. The proof of the last theorem points the way.

Suppose that \( \mu \) is a positive measure on \( (\R, \mathscr{R}) \) with the property that \( \mu(A) \lt \infty \) if \( A \) is bounded. Then there exists a distribution function that generates \( \mu \).

Proof:

Define \( F \) on \( \R \) by \[ F(x) = \begin{cases} \mu(0, x], & x \ge 0 \\ -\mu(x, 0], & x \lt 0 \end{cases} \] Then \( F \) maps \( \R \) into \( \R \) (that is, \( F \) is finite) by the assumption on \( \mu \). Also \( F \) is increasing: if \( 0 \le x \le y \) then \( \mu(0, x] \le \mu(0, y] \) by the increasing property of a positive measure. Similarly, if \( x \le y \le 0 \), then \( \mu(x, 0] \ge \mu(y, 0] \), so \( -\mu(x, 0] \le -\mu(y, 0] \). Finally, if \( x \le 0 \le y \), then \(-\mu(x, 0] \le 0\) and \( \mu(0, y] \ge 0 \). Next, \( F \) is continuous from the right: suppose that \( x_n \in \R \) for \( n \in \N_+ \) and \( x_n \downarrow x \) as \( n \to \infty \). If \( x \ge 0 \) then \( \mu(0, x_n] \downarrow \mu(0, x] \) by the continuity theorem for decreasing sets, which applies since the measures are finite. If \( x \lt 0 \) then \( \mu(x_n, 0] \uparrow \mu(x, 0] \) by the continuity theorem for increasing sets. So in both cases, \( F(x_n) \downarrow F(x) \) as \( n \to \infty \). So \( F \) is a distribution function, and it remains to show that it generates \( \mu \). Let \( a, \, b \in \R \) with \( a \le b \). If \( a \ge 0 \) then \( \mu(a, b] = \mu(0, b] - \mu(0, a] = F(b) - F(a) \) by the difference property of a positive measure. Similarly, if \( b \le 0 \) then \( \mu(a, b] = \mu(a, 0] - \mu(b, 0] = -F(a) + F(b) \). Finally, if \( a \le 0 \) and \( b \ge 0 \), then \( \mu(a, b] = \mu(a, 0] + \mu(0, b] = -F(a) + F(b) \).

In the proof of the last theorem, the use of 0 as a reference point is arbitrary, of course. Any other point in \( \R \) would do as well, and would produce a distribution function that differs from the one in the proof by a constant. If \( \mu \) has the property that \( \mu(-\infty, x] \lt \infty \) for \( x \in \R \), then it's easy to see that \( F \) defined by \( F(x) = \mu(-\infty, x] \) for \( x \in \R \) is a distribution function that generates \( \mu \), and is the unique distribution function generating \( \mu \) with \( F(-\infty) = 0 \). Of course, in the case of a probability measure, this is the cumulative distribution function, as noted above.
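
For a simple worked example of this construction, let \( \mu \) be counting measure on \( \N \), so that \( \mu(A) = \#(A \cap \N) \) for \( A \in \mathscr{R} \). Then \( \mu(A) \lt \infty \) for bounded \( A \), and in fact \( \mu(-\infty, x] \lt \infty \) for all \( x \in \R \). Hence \( F \) defined by \( F(x) = \mu(-\infty, x] \) for \( x \in \R \), which reduces to \[ F(x) = \begin{cases} \lfloor x \rfloor + 1, & x \ge 0 \\ 0, & x \lt 0 \end{cases} \] is the unique distribution function generating \( \mu \) with \( F(-\infty) = 0 \).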

Properties

General distribution functions enjoy many of the same properties as the cumulative distribution function (but not all because of the lack of uniqueness). In particular, we can easily compute the measure of any interval from the distribution function.

Suppose that \( F \) is a distribution function and \( \mu \) is the positive measure on \( (\R, \mathscr{R}) \) associated with \( F \). For \( a, \; b \in \R \) with \( a \lt b \),

  1. \( \mu[a, b] = F(b) - F(a^-) \)
  2. \( \mu\{a\} = F(a) - F(a^-) \)
  3. \( \mu(a, b) = F(b^-) - F(a) \)
  4. \( \mu[a, b) = F(b^-) - F(a^-) \)
Proof:

All of these results follow from the continuity theorems for a positive measure. Suppose that \( (x_1, x_2, \ldots) \) is a sequence of distinct points in \( \R \).

  1. If \( x_n \uparrow a \) as \( n \to \infty \) then \( (x_n, b] \downarrow [a, b] \), so \( \mu(x_n, b] \downarrow \mu[a, b] \) as \( n \to \infty \) by the continuity theorem for decreasing sets, which applies since \( \mu(x_1, b] \lt \infty \). But also \( \mu(x_n, b] = F(b) - F(x_n) \to F(b) - F(a^-) \) as \( n \to \infty \).
  2. This follows from (a) by taking \( a = b \).
  3. If \( x_n \uparrow b \) as \( n \to \infty \) then \( (a, x_n] \uparrow (a, b) \) so \( \mu(a, x_n] \uparrow \mu(a, b) \) as \( n \to \infty \). But also \( \mu(a, x_n] = F(x_n) - F(a) \to F(b^-) - F(a) \) as \( n \to \infty \).
  4. From (a) and (b) and the difference rule, \[ \mu[a, b) = \mu[a, b] - \mu\{b\} = F(b) - F(a^-) - \left[F(b) - F(b^-)\right] = F(b^-) - F(a^-) \]

Note that \( F \) is continuous at \( x \in \R \) if and only if \( \mu\{x\} = 0 \). In particular, \( \mu \) is a continuous measure (recall that this means that \( \mu\{x\} = 0 \) for all \( x \in \R \)) if and only if \( F \) is continuous on \( \R \). On the other hand, \( F \) is discontinuous at \( x \in \R \) if and only if \( \mu\{x\} \gt 0 \), so that \( \mu \) has an atom at \( x \). So \( \mu \) is a discrete measure (recall that this means that \( \mu \) has countable support) if and only if \( F \) is a step function.
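
For a simple worked example of these formulas, suppose that \( F(x) = 0 \) for \( x \lt 0 \) and \( F(x) = x + 1 \) for \( x \ge 0 \), so that \( F \) is a distribution function with a single jump at 0. Then \[ \mu\{0\} = F(0) - F(0^-) = 1, \quad \mu[0, 2] = F(2) - F(0^-) = 3, \quad \mu(0, 2] = F(2) - F(0) = 2 \] \[ \mu(0, 2) = F(2^-) - F(0) = 2, \quad \mu[0, 2) = F(2^-) - F(0^-) = 3 \] so \( \mu \) has a single atom at 0, of mass 1, and agrees with Lebesgue measure on Borel subsets of \( (0, \infty) \).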

Suppose again that \( F \) is a distribution function and \( \mu \) is the positive measure on \( (\R, \mathscr{R}) \) associated with \( F \). If \( a \in \R \) then

  1. \( \mu(a, \infty) = F(\infty) - F(a) \)
  2. \( \mu[a, \infty) = F(\infty) - F(a^-) \)
  3. \( \mu(-\infty, a] = F(a) - F(-\infty) \)
  4. \( \mu(-\infty, a) = F(a^-) - F(-\infty) \)
  5. \( \mu(\R) = F(\infty) - F(-\infty) \)
Proof:

The proofs, as before, just use the continuity theorems. Suppose that \( (x_1, x_2, \ldots) \) is a sequence of distinct points in \( \R \).

  1. If \( x_n \uparrow \infty \) as \( n \to \infty \) then \( (a, x_n] \uparrow (a, \infty) \) so \( \mu(a, x_n] \uparrow \mu(a, \infty) \) as \( n \to \infty \). But also \( \mu(a, x_n] = F(x_n) - F(a) \to F(\infty) - F(a) \) as \( n \to \infty \).
  2. Similarly, if \( x_n \uparrow \infty \) as \( n \to \infty \) then \( [a, x_n] \uparrow [a, \infty) \) so \( \mu[a, x_n] \uparrow \mu[a, \infty) \) as \( n \to \infty \). But also \( \mu[a, x_n] = F(x_n) - F(a^-) \to F(\infty) - F(a^-) \) as \( n \to \infty \).
  3. If \( x_n \downarrow -\infty \) as \( n \to \infty \) then \( (x_n, a] \uparrow (-\infty, a] \) so \( \mu(x_n, a] \uparrow \mu(-\infty, a] \) as \( n \to \infty \). But also \( \mu(x_n, a] = F(a) - F(x_n) \to F(a) - F(-\infty) \) as \( n \to \infty \).
  4. Similarly, if \( x_n \downarrow -\infty \) as \( n \to \infty \) then \( (x_n, a) \uparrow (-\infty, a) \) so \( \mu(x_n, a) \uparrow \mu(-\infty, a) \) as \( n \to \infty \). But also \( \mu(x_n, a) = F(a^-) - F(x_n) \to F(a^-) - F(-\infty) \) as \( n \to \infty \).
  5. \( \mu(\R) = \mu(-\infty, 0] + \mu(0, \infty) = \left[F(0) - F(-\infty)\right] + \left[F(\infty) - F(0)\right] = F(\infty) - F(-\infty) \).
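
For a simple worked example, let \( F(x) = \arctan x \) for \( x \in \R \), so that \( F(-\infty) = -\pi/2 \) and \( F(\infty) = \pi/2 \). The associated measure \( \mu \) is finite, but is not a probability measure: \[ \mu(\R) = F(\infty) - F(-\infty) = \pi, \quad \mu(-\infty, 0] = F(0) - F(-\infty) = \frac{\pi}{2}, \quad \mu(1, \infty) = F(\infty) - F(1) = \frac{\pi}{2} - \frac{\pi}{4} = \frac{\pi}{4} \]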

Distribution Functions on \( [0, \infty) \)

In the remainder of this section, \( G \) denotes a positive measure on \( [0, \infty) \) that is finite on bounded sets, so that \( G \) is the Lebesgue-Stieltjes measure of a distribution function on \( [0, \infty) \). Integrals of the form \( \int_0^t u(s) \, dG(s) \), for locally bounded \( u \), arise in the study of non-homogeneous Poisson processes and renewal processes; we consider three special cases.

The discrete case. Suppose that \( G \) is discrete, so that there exists a countable set \( C \subset [0, \infty) \) with \( G\left(C^c\right) = 0 \). Let \( g(t) = G\{t\} \) for \( t \in C \), so that \( g \) is the density function of \( G \) with respect to counting measure on \( C \). If \( u: [0, \infty) \to \R \) is locally bounded then \[ \int_0^t u(s) \, dG(s) = \sum_{s \in C \cap [0, t]} u(s) g(s) \]

Figure: a discrete measure
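
For example, suppose that \( G \) is counting measure on \( \N \), so that \( C = \N \) and \( g(n) = 1 \) for \( n \in \N \), and let \( u(s) = s \) for \( s \in [0, \infty) \). Then \[ \int_0^t u(s) \, dG(s) = \sum_{n = 0}^{\lfloor t \rfloor} n = \frac{\lfloor t \rfloor \left(\lfloor t \rfloor + 1\right)}{2}, \quad t \in [0, \infty) \]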

In the discrete case, the distribution is often arithmetic. Recall that this means that the countable set \( C \) is of the form \( \{n d: n \in \N\} \) for some \( d \in (0, \infty) \).

The continuous case. Suppose that \( G \) is absolutely continuous with respect to Lebesgue measure on \( [0, \infty) \) with density function \( g: [0, \infty) \to [0, \infty) \). If \( u: [0, \infty) \to \R \) is locally bounded then \[ \int_0^t u(s) \, dG(s) = \int_0^t u(s) g(s) \, ds \]

Figure: a continuous measure
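
For example, suppose that \( G \) has density \( g(s) = e^{-s} \) for \( s \in [0, \infty) \) (so that \( G \) is the exponential distribution with rate parameter 1), and let \( u(s) = s \). Integrating by parts gives \[ \int_0^t u(s) \, dG(s) = \int_0^t s e^{-s} \, ds = 1 - (t + 1) e^{-t}, \quad t \in [0, \infty) \]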

The mixed case. Suppose that there exists a countable set \( C \subset [0, \infty) \) with \( G(C) \gt 0 \) and \( G\left(C^c\right) \gt 0 \), and that \( G \) restricted to subsets of \( C^c \) is absolutely continuous with respect to Lebesgue measure. Let \( g(t) = G\{t\} \) for \( t \in C \), and let \( h \) be a density of \( G \) restricted to subsets of \( C^c \) with respect to Lebesgue measure. If \( u: [0, \infty) \to \R \) is locally bounded then \[ \int_0^t u(s) \, dG(s) = \sum_{s \in C \cap [0, t]} u(s) g(s) + \int_0^t u(s) h(s) \, ds \]

Figure: a mixed measure
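
For example, suppose that \( C = \{0\} \) with \( g(0) = G\{0\} = 1 \), that \( G \) restricted to subsets of \( (0, \infty) \) has density \( h(s) = e^{-s} \) with respect to Lebesgue measure, and let \( u(s) = e^s \). Then \[ \int_0^t u(s) \, dG(s) = u(0) g(0) + \int_0^t e^s e^{-s} \, ds = 1 + t, \quad t \in [0, \infty) \]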

The three special cases do not exhaust the possibilities, but are by far the most common cases in applied problems.