\(\newcommand{\P}{\mathbb{P}}\) \(\newcommand{\E}{\mathbb{E}}\) \(\newcommand{\R}{\mathbb{R}}\) \(\newcommand{\N}{\mathbb{N}}\) \(\newcommand{\Z}{\mathbb{Z}}\) \(\newcommand{\bs}{\boldsymbol}\) \( \newcommand{\cov}{\text{cov}} \) \( \newcommand{\var}{\text{var}} \) \( \newcommand{\sd}{\text{sd}} \)

2. Processes with Stationary, Independent Increments

Basic Theory

Definitions

Suppose again that \( \bs{X} = \{X_t: t \in T\} \) is a random process with state space \( S \subseteq \R \) and where the index set \( T \) is either \( \N \) or \( [0, \infty) \). As usual, we interpret \( T \) as the time space, so that in the first case we have discrete time and in the second case we have continuous time. Once again, the state space \( S \) is either countable (often \(\N\) or \(\Z\)), so that \( X_t \) has a discrete distribution for \( t \in T \), or an interval of \( \R \) (often \([0, \infty)\) or \(\R\)) with \( X_t \) having a continuous distribution for \( t \in T \). Thus, in terms of the discrete/continuous dichotomy, we have all four cases: discrete time, discrete space; discrete time, continuous space; continuous time, discrete space; continuous time, continuous space. We have seen the following definitions before.

Suppose that \( \bs{X} = \{X_t: t \in T\} \) is a random process as above.

  1. \( \bs{X} \) has stationary increments if for \( s, t \in T \) with \( s \le t \), the increment \( X_t - X_s \) has the same distribution as \( X_{t-s} \).
  2. \( \bs{X} \) has independent increments if for \( t_1, t_2, \ldots, t_n \in T \) with \( t_1 \lt t_2 \lt \cdots \lt t_n \), the increments \( X_{t_1}, X_{t_2} - X_{t_1}, \ldots, X_{t_n} - X_{t_{n-1}} \) are independent.

In discrete time when \( T = \N \), the process \( \bs{X} \) has stationary, independent increments if and only if \( \bs{X} \) is the partial sum process associated with a sequence of independent, identically distributed variables.

The process \( \bs{X} = (X_0, X_1, X_2, \ldots) \) has stationary, independent increments if and only if there exists an independent, identically distributed process \( \bs{U} = (U_1, U_2, \ldots) \) such that \[ X_n = \sum_{i=1}^n U_i, \quad n \in \N \]

Proof:

Suppose first that \((U_1, U_2, \ldots)\) is a sequence of independent, identically distributed random variables taking values in \(S \subseteq \R\), and let \(X_n = \sum_{i=1}^n U_i\) for \(n \in \N\). If \(m \le n\) then \(X_n - X_m = \sum_{i=m+1}^n U_i\). Hence \(X_n - X_m\) has the same distribution as \(X_{n-m} = \sum_{i=1}^{n-m} U_i\) and is independent of \((U_1, U_2, \ldots, U_m)\), and hence also of \((X_0, X_1, \ldots, X_m)\). Conversely, suppose that \((X_0, X_1, \ldots)\) has stationary, independent increments. Let \(U_i = X_i - X_{i-1}\) for \(i \in \N_+\). Then \((U_1, U_2, \ldots)\) is a sequence of independent, identically distributed variables, and \(X_n = \sum_{i=1}^n U_i\) for \(n \in \N\).
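
To make the construction concrete, here is a minimal simulation sketch (assuming NumPy is available; the Poisson increment distribution and all parameter values are arbitrary choices for illustration) that builds a partial sum process from independent, identically distributed increments and checks the stationary and independent increments properties empirically.

```python
import numpy as np

# Minimal sketch: build X_n = U_1 + ... + U_n from iid increments and check the
# stationary, independent increments properties by simulation. The increment
# distribution (Poisson with mean 2) is an arbitrary illustrative choice.
rng = np.random.default_rng(0)
paths, n = 100_000, 20
U = rng.poisson(2.0, size=(paths, n))                      # iid increments U_1, ..., U_20
X = np.hstack([np.zeros((paths, 1), dtype=int), U.cumsum(axis=1)])  # columns are X_0, ..., X_20

# Stationary increments: X_15 - X_5 should have the same distribution as X_10
print(np.mean(X[:, 15] - X[:, 5]), np.mean(X[:, 10]))      # both near 20
print(np.var(X[:, 15] - X[:, 5]), np.var(X[:, 10]))        # both near 20

# Independent increments: X_5 and X_15 - X_5 should be uncorrelated
print(np.corrcoef(X[:, 5], X[:, 15] - X[:, 5])[0, 1])      # near 0
```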

It's much harder to characterize processes in continuous time with stationary, independent increments. As we have seen before, random processes indexed by an uncountable set are much more complicated in a technical sense than random processes indexed by a countable set. In spite of the technical difficulties, however, many of the underlying ideas are the same. Moreover, we do have important examples. The non-homogeneous Poisson counting process \( \bs{N} = \{N_t: t \in [0, \infty)\} \) has independent increments. If the process is in fact homogeneous, then it has stationary increments as well.

Distributions and Moments

For a process with stationary, independent increments, if we know the distribution of \( X_t \) on \( S \) for each \( t \in T \), then we can compute all of the finite-dimensional distributions. To state the theorem, suppose that \( X_t \) has probability density function \( f_t \) on \( S \) for \( t \in T \). Thus, \( f_t \) is the probability density function for a discrete distribution if \( S \) is countable, or a probability density function for a continuous distribution if \( S \) is an interval.

Suppose that \( \bs{X} = \{X_t: t \in T\} \) has stationary, independent increments. If \( t_1, t_2, \ldots, t_n \in T\) with \( t_1 \lt t_2 \lt \cdots \lt t_n \) then \( \left(X_{t_1}, X_{t_2}, \ldots, X_{t_n}\right) \) has probability density function \[ f_{t_1, t_2, \ldots, t_n}(x_1, x_2, \ldots, x_n) = f_{t_1}(x_1) f_{t_2 - t_1}(x_2 - x_1) \cdots f_{t_n - t_{n-1}}(x_n - x_{n-1}), \quad (x_1, x_2, \ldots, x_n) \in S^n \]

Proof:

Let \( t_0 = 0 \) and define \( U_i = X_{t_i} - X_{t_{i-1}} \) for \( i = 1, 2, \ldots, n \). Then by the assumption of stationary independent increments, \( \bs{U} = (U_1, U_2, \ldots, U_n) \) is a sequence of independent variables and \( U_i \) has PDF \( u \mapsto f_{t_i - t_{i-1}}(u) \). Let \( V_i = X_{t_i} \) and \( \bs{V} = (V_1, V_2, \ldots, V_n) \). Then \( V_i = U_1 + \cdots + U_i \) for each \( i \), or in matrix form, \( \bs{V} = A \bs{U} \) where \( A \) is the \( n \times n \) matrix with \( A_{ij} = 1 \) if \( i \ge j \) and 0 otherwise. Since \( A \) is lower triangular with 1s on the diagonal, \( \det(A) = 1 \), and the result follows from the change of variables theorem.
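
As a quick numerical illustration of the factorization (a sketch assuming NumPy; the Poisson counting process with rate 1 is used only because its distributions \( f_t \) are explicit), we can compare an empirical joint probability with the product formula:

```python
import math
import numpy as np

# Check f_{t1, t2}(x1, x2) = f_{t1}(x1) f_{t2 - t1}(x2 - x1) for a Poisson counting
# process with rate 1, so that f_t is the Poisson pmf with mean t.
rng = np.random.default_rng(0)
paths = 1_000_000
t1, t2 = 1.0, 3.0
X1 = rng.poisson(t1, size=paths)                      # X_{t1}
X2 = X1 + rng.poisson(t2 - t1, size=paths)            # X_{t2} = X_{t1} + independent increment

def f(t, k):
    return math.exp(-t) * t ** k / math.factorial(k)  # Poisson pmf with mean t

x1, x2 = 2, 5
empirical = np.mean((X1 == x1) & (X2 == x2))
print(empirical, f(t1, x1) * f(t2 - t1, x2 - x1))     # should agree up to simulation error
```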

Recall that \( \bs{X} = \{X_t: t \in T\} \) is a second order process if \( \E\left(X_t^2\right) \lt \infty \) for each \( t \in T \). For a second order process, the mean and covariance functions are usually of basic importance.

Suppose that \( \bs{X} = \{X_t: t \in T\} \) is a second order process with stationary, independent increments. Then there exist constants \( \mu \in \R \) and \( \sigma^2 \in [0, \infty) \) such that the mean and covariance functions of \( \bs{X} \) are

  1. \( m(t) = \mu t \) for \( t \in T \)
  2. \( c(s, t) = \sigma^2 \min\{s, t\} \) for \( (s, t) \in T^2 \)

Proof:

Let \( v(t) = \var(X_t) \) for \( t \in T \). Now for \( s, t \in T \) note that \( X_{s+t} = X_s + (X_{s+t} - X_s) \). The second term is independent of the first and has the same distribution as \( X_t \). Taking expected values gives \( m(s + t) = m(s) + m(t) \) and taking variances gives \( v(s + t) = v(s) + v(t) \). That is, \(m\) and \(v\) both satisfy Cauchy's functional equation, named, of course, for the ubiquitous Augustin Cauchy. Since the functions \(m\) and \(v\) are bounded on bounded subsets of \(T\), there exist \( \mu \in \R \) and \( \sigma^2 \in [0, \infty) \) such that \( m(t) = \mu t \) and \( v(t) = \sigma^2 t \) for \( t \in T \). Finally, let \( s, t \in T \) with \( s \le t \). Then \( X_t - X_s \) is independent of \( X_s \) and therefore \[ \cov(X_s, X_t) = \cov\left[X_s, X_s + (X_t - X_s)\right] = \cov(X_s, X_s) + \cov(X_s, X_t - X_s) = \var(X_s) + 0 = \sigma^2 s \]

Note from the proof of the last result that \(\mu = \E(X_1)\) and \(\sigma^2 = \var(X_1)\).
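
Here is a quick Monte Carlo check of the two formulas in discrete time (a sketch assuming NumPy; the uniform increment distribution is an arbitrary choice, giving \( \mu = 1/2 \) and \( \sigma^2 = 1/12 \)):

```python
import numpy as np

# Check m(n) = mu * n and c(m, n) = sigma^2 * min{m, n} for a partial sum process
# with iid uniform(0, 1) increments, so mu = 1/2 and sigma^2 = 1/12.
rng = np.random.default_rng(0)
paths, n = 200_000, 12
U = rng.random((paths, n))                 # iid uniform(0, 1) increments
X = U.cumsum(axis=1)                       # columns are X_1, ..., X_12 (and X_0 = 0)

mu, sigma2 = 1 / 2, 1 / 12
print(X[:, 11].mean(), mu * 12)                        # m(12) = mu * 12
print(np.cov(X[:, 4], X[:, 11])[0, 1], sigma2 * 5)     # c(5, 12) = sigma^2 * min{5, 12}
```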

Recall that two stochastic processes (with the same time and state spaces) are equivalent in distribution if they have the same finite-dimensional distributions. Equivalence in distribution really is an equivalence relation on the class of stochastic processes with given state and time spaces. If a process with stationary independent increments is shifted forward in time and then centered in space, the new process is equivalent to the original.

Suppose that \(\bs{X} = \{X_t: t \in T\}\) has stationary, independent increments. Fix \(t_0 \in T\) and define \(Y_t = X_{t_0+t} - X_{t_0}\) for \(t \in T\). Then \(\bs{Y} = \{Y_t: t \in T\}\) is equivalent in distribution to \(\bs{X}\).

Proof:

Note that for \(s, t \in T\) with \(s \le t\), \(Y_t - Y_s = X_{t_0 + t} - X_{t_0 + s}\). It follows that \(\bs{Y}\) also has stationary, independent increments and the distribution of \(Y_t\) is the same as the distribution of \(X_t\) for each \(t \in T\). From the PDF result above, \(\bs{Y}\) has the same finite-dimensional distributions as \(\bs{X}\).

The Markov Property

The last result can be generalized to show that a process with stationary, independent increments is a Markov process. Let \(\mathscr{F}_t = \sigma\{X_s: s \in T, s \le t\}\) denote the \(\sigma\)-algebra generated by the process up to time \(t\). Roughly speaking, we can determine if an event \(A \in \mathscr{F}_t\) occurs by observing the process up to time \(t\). The collection of \(\sigma\)-algebras \(\mathfrak{F} = \{\mathscr{F}_t: t \in T\}\) is known as a filtration.

Suppose that \(\bs{X} = \{X_t: t \in T\}\) is a process with stationary independent increments, and that \(X_t\) has probability density function \(f_t\) (either in the discrete or continuous cases) for \(t \in T\). Then \(\bs{X}\) is a (time homogeneous) Markov process with transition probability density function \(p\) given by

\[ p_t(x, y) = f_t(y - x), \quad t \in T, \; x, y \in S \]

Proof:

Suppose that \(s, t \in T\). Note that \(X_{s+t} = X_s + (X_{s+t} - X_s)\), and the second term is independent of \(\mathscr{F}_s\) and has the same distribution as \(X_t\). Hence the conditional distribution of \(X_{s+t}\) given \(\mathscr{F}_s\) depends only on \(X_s\); given \(X_s = x \in S\), \(X_{s+t}\) has the same distribution as \(x + X_t\), which has PDF \(y \mapsto f_t(y - x)\). As usual, we have glossed over some of the technical details that arise when \(T = [0, \infty)\).
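
As an illustration of the transition density formula (a sketch assuming NumPy; the Poisson counting process with rate 1 is used only so that \( f_t \) is explicit), we can compare an empirical conditional probability with \( p_t(x, y) = f_t(y - x) \):

```python
import math
import numpy as np

# Estimate P(X_{s+t} = y | X_s = x) by simulation and compare with p_t(x, y) = f_t(y - x),
# using a Poisson counting process with rate 1, so f_t is the Poisson pmf with mean t.
rng = np.random.default_rng(0)
paths = 1_000_000
s, t = 2.0, 1.5
X_s = rng.poisson(s, size=paths)
X_st = X_s + rng.poisson(t, size=paths)           # stationary increment, independent of X_s

def f(t, k):
    return math.exp(-t) * t ** k / math.factorial(k)

x, y = 2, 4
print(np.mean(X_st[X_s == x] == y), f(t, y - x))  # conditional probability vs. f_t(y - x)
```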

Recall that a stopping time is a random time \(\tau\) with values in \(T \cup \{\infty\}\) that satisfies \(\{\tau \le t\} \in \mathscr{F}_t\) for \(t \in T\). So roughly speaking, we can determine whether \(\tau \le t\) by observing the process up to time \(t\). The \(\sigma\)-algebra associated with \(\tau\) encodes the information from the random process up to the random time \(\tau\), analogous to \(\mathscr{F}_t\) that encodes the information from the process up to the deterministic time \(t\). The appropriate definition is

\[ \mathscr{F}_\tau = \{A \in \mathscr{F}: A \cap \{\tau \le t\} \in \mathscr{F}_t \text{ for all } t \in T\} \]

The strong Markov property is the Markov property applied to stopping times in addition to deterministic times. A discrete time process with stationary, independent increments is also a strong Markov process. The same is true in continuous time, with the addition of appropriate technical assumptions. Basically, these assumptions require that \(X_t(\omega)\) be sufficiently nice (in terms of measurability in \((t, \omega)\) and continuity in \(t\)) and that we use a filtration that is a slight variation on \(\mathfrak{F}\). A nice statement of the strong Markov property is a generalization of the shift result above:

Suppose that \(\bs{X} = \{X_t: t \in T\}\) is a process with stationary, independent increments, and \(\tau\) is a stopping time. Then for \(t \in T\), \(X_{\tau + t} - X_\tau\) is independent of \(\mathscr{F}_\tau\) and has the same distribution as \(X_t\).

Examples

We have seen several examples of random processes with stationary, independent increments. The following exercises give a quick review.

Bernoulli Trials

Let \( \bs{X} = (X_1, X_2, \ldots) \) be a sequence of Bernoulli trials with success parameter \( p \in (0, 1) \), so that \( X_i = 1 \) if trial \( i \) is a success, and 0 otherwise. Let \( Y_n = \sum_{i=1}^n X_i \) for \( n \in \N \), so \( Y_n \) is the number of successes in the first \( n \) trials, and hence has the binomial distribution with parameters \(n\) and \(p\).

The sequence \( \bs{Y} = (Y_0, Y_1, \ldots) \) is a second order process with stationary independent increments. The mean and covariance functions are given by

  1. \( m(n) = p n \) for \( n \in \N \)
  2. \( c(m, n) = p (1 - p) \min\{m, n\} \) for \( (m, n) \in \N^2 \)

Now let \( U_i \) be the number of trials needed to go from the \( (i - 1) \)st success to the \( i \)th success. Thus, \( \bs{U} = (U_1, U_2, \ldots) \) is a sequence of independent variables, each with the geometric distribution with parameter \( p \). Let \( V_n = \sum_{i=1}^n U_i \) for \( n \in \N \), so that \( V_n \) is the trial number of the \( n \)th success and hence has the negative binomial distribution with parameters \(n\) and \(p\).

The sequence \( \bs{V} = (V_0, V_1, \ldots) \) is a second order process with stationary, independent increments. The mean and covariance functions are given by

  1. \( m(n) = \frac{1}{p} n \) for \( n \in \N \)
  2. \( c(m, n) = \frac{1 - p}{p^2} \min\{m, n\} \) for \( (m, n) \in \N^2 \)
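
The following short simulation (a sketch assuming NumPy, with the arbitrary choice \( p = 0.3 \)) checks one moment formula for each of the two processes above, the success count process \( \bs{Y} \) and the trial-number process \( \bs{V} \):

```python
import numpy as np

# Check c(5, 15) for the success count process Y, and m(10), c(3, 10) for the
# trial-number process V, with success probability p = 0.3 (arbitrary choice).
rng = np.random.default_rng(0)
p, paths = 0.3, 200_000

# Y_n = number of successes in the first n trials
trials = (rng.random((paths, 20)) < p).astype(int)
Y = trials.cumsum(axis=1)                                   # columns are Y_1, ..., Y_20
print(np.cov(Y[:, 4], Y[:, 14])[0, 1], p * (1 - p) * 5)     # c(5, 15) = p(1 - p) min{5, 15}

# V_n = trial number of the nth success, a sum of iid geometric(p) variables
U = rng.geometric(p, size=(paths, 10))
V = U.cumsum(axis=1)                                        # columns are V_1, ..., V_10
print(V[:, 9].mean(), 10 / p)                               # m(10) = 10 / p
print(np.cov(V[:, 2], V[:, 9])[0, 1], (1 - p) / p ** 2 * 3) # c(3, 10) = (1 - p)/p^2 min{3, 10}
```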

Simple Random Walk

Closely related to the Bernoulli trials process is the simple random walk. Suppose that \( \bs{U} = (U_1, U_2, \ldots) \) is a sequence of independent random variables with \( \P(U_i = 1) = p \) and \( \P(U_i = -1) = 1 - p \) for each \( i \in \N_+ \), where \( p \in (0, 1) \) is a parameter. Let \( X_n = \sum_{i=1}^n U_i \) for \( n \in \N \), the position of the walker at time \( n \).

The simple random walk \( \bs{X} = (X_0, X_1, X_2, \ldots) \) is a second order process with stationary, independent increments. The mean and covariance functions are given by

  1. \( m(n) = (2 p - 1)n \) for \( n \in \N \)
  2. \( c(m, n) = 4 p (1 - p) \min\{m, n\} \) for \( (m, n) \in \N^2 \)
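
A quick simulation check of these formulas (a sketch assuming NumPy, with the arbitrary choice \( p = 0.7 \)):

```python
import numpy as np

# Check m(10) = (2p - 1) * 10 and c(4, 10) = 4 p (1 - p) * 4 for the simple random walk.
rng = np.random.default_rng(0)
p, paths, n = 0.7, 200_000, 10
U = np.where(rng.random((paths, n)) < p, 1, -1)             # steps: +1 with prob p, -1 otherwise
X = U.cumsum(axis=1)                                        # columns are X_1, ..., X_10

print(X[:, 9].mean(), (2 * p - 1) * 10)                     # m(10)
print(np.cov(X[:, 3], X[:, 9])[0, 1], 4 * p * (1 - p) * 4)  # c(4, 10)
```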

Renewal Processes

Consider a renewal process with interarrival times \(\bs{X} = (X_1, X_2, \ldots) \). By definition, this is a sequence of independent, identically distributed variables with values in \( [0, \infty) \). Let \( \mu \) and \( \sigma^2 \) denote the mean and variance of the interarrival times (assumed finite). Let \( T_n = \sum_{i=1}^n X_i \) for \( n \in \N \), so that \( T_n \) is the time of the \( n \)th arrival.

The sequence of arrival times \( \bs{T} = (T_0, T_1, \ldots) \) is a second order process with stationary, independent increments. The mean and covariance functions are given by

  1. \( m(n) = \mu n \) for \( n \in \N \)
  2. \( c(m, n) = \sigma^2 \min\{m, n\} \) for \( (m, n) \in \N^2 \)

Poisson Processes

Consider the Poisson model of random points in \( [0, \infty) \), with rate parameter \( r \in (0, \infty) \). This is a special renewal process with interarrival times \(\bs{X} = (X_1, X_2, \ldots)\) that have the exponential distribution with rate parameter \( r \). The time of the \(n\)th arrival \(T_n = \sum_{i=1}^n X_i\) has the gamma distribution with parameters \(n\) and \(r\). The results of the previous exercise apply with \(\mu = 1 / r\) and \(\sigma^2 = 1 / r^2\).

The sequence of arrival times \(\bs{T} = (T_0, T_1, \ldots)\) is a second order process with stationary, independent increments. The mean and covariance functions are given by

  1. \( m(n) = \frac{1}{r} n \) for \( n \in \N \)
  2. \( c(m, n) = \frac{1}{r^2} \min\{m, n\} \) for \( (m, n) \in \N^2 \)

Now let \( \bs{N} = \{N_t: t \in [0, \infty)\} \) denote the counting process. Thus \( N_t \) is the number of random points in \( [0, t] \), which has the Poisson distribution with parameter \(r t\).

The Poisson counting process \( \bs{N} \) is a second order process with stationary, independent increments. The mean and covariance functions are given by

  1. \( m(t) = r t \) for \( t \in [0, \infty) \)
  2. \( c(s, t) = r \min\{s, t\} \) for \( (s, t) \in [0, \infty)^2 \)
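
The following sketch (assuming NumPy, with the arbitrary rate \( r = 2 \)) builds the counting process from its exponential interarrival times and checks the moment formulas above:

```python
import numpy as np

# Build N_t from exponential interarrival times with rate r = 2 and check
# m(t) = r t and c(s, t) = r min{s, t}.
rng = np.random.default_rng(0)
r, paths = 2.0, 100_000
# 40 interarrival times per path is enough to cover [0, 3] with overwhelming probability
T = rng.exponential(1 / r, size=(paths, 40)).cumsum(axis=1)  # arrival times T_1, T_2, ...

def counts(t):
    return (T <= t).sum(axis=1)                              # N_t = number of arrivals in [0, t]

N1, N3 = counts(1.0), counts(3.0)
print(N1.mean(), r * 1.0)                                    # m(1) = r
print(N3.var(), r * 3.0)                                     # var(N_3) = r * 3
print(np.cov(N1, N3)[0, 1], r * min(1.0, 3.0))               # c(1, 3) = r * min{1, 3}
```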

Suppose now that \( r: [0, \infty) \to (0, \infty) \) is measurable and let \[ m(t) = \int_{(0, t]} r(s) \, d\lambda(s), \quad t \in [0, \infty) \] where as usual \( \lambda \) denotes Lebesgue measure on \( \R \). Consider the non-homogeneous Poisson process with rate function \( r \) and hence mean function \( m \). As before, let \( T_n \) denote the time of the \( n \)th arrival for \( n \in \N \). Unless \( r \) is constant (so that the Poisson process is in fact homogeneous), the arrival time sequence \( \bs{T} = (T_0, T_1, \ldots) \) has neither stationary nor independent increments, since the distribution of the time until the next arrival depends on the current time. On the other hand, if \( N_t \) denotes the number of arrivals in \( (0, t] \) for \( t \in [0, \infty) \), then the counting process \( \bs{N} = \{N_t: t \in [0, \infty)\}\) has independent, but not stationary increments.
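
To illustrate, here is a simulation sketch (assuming NumPy, with the hypothetical rate function \( r(s) = 2 s \), so that \( m(t) = t^2 \)). The arrival times can be obtained from those of a standard, rate 1 Poisson process via the time change \( T_n = m^{-1}(S_n) \), and the simulation shows that the increments of the arrival sequence are no longer independent.

```python
import numpy as np

# Non-homogeneous Poisson process with the hypothetical rate r(s) = 2s, so m(t) = t^2
# and m^{-1}(u) = sqrt(u). Arrival times: T_n = m^{-1}(S_n), where S_1, S_2, ... are the
# arrival times of a standard rate-1 Poisson process (partial sums of iid exponentials).
rng = np.random.default_rng(0)
paths = 500_000
S = rng.exponential(1.0, size=(paths, 2)).cumsum(axis=1)   # S_1, S_2 for each path
T = np.sqrt(S)                                             # T_1, T_2

# Check P(T_1 > t) = exp(-m(t)) at t = 1
print(np.mean(T[:, 0] > 1.0), np.exp(-1.0))

# The increments T_1 and T_2 - T_1 of the arrival sequence are correlated, so the
# arrival time sequence loses independent (and stationary) increments in this case.
print(np.corrcoef(T[:, 0], T[:, 1] - T[:, 0])[0, 1])       # clearly nonzero (negative)
```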

Summary

The examples involving Bernoulli trials and the simple random walk have discrete time and state spaces. In the Poisson model, the arrival time process has a discrete time space and a continuous state space, while the counting process has a continuous time space and a discrete state space. We are missing an example of a process with stationary, independent increments and with continuous time and state spaces. Brownian motion, one of the most important random processes, will fill the gap.