\(\newcommand{\P}{\mathbb{P}}\)
\(\newcommand{\E}{\mathbb{E}}\)
\(\newcommand{\R}{\mathbb{R}}\)
\(\newcommand{\N}{\mathbb{N}}\)
\(\newcommand{\Z}{\mathbb{Z}}\)
\(\newcommand{\bs}{\boldsymbol}\)

A continuous-time birth-death chain is a simple class of Markov chains on a subset of \( \Z \) with the property that the only possible transitions are to increase the state by 1 (birth) or decrease the state by 1 (death). It's easiest to define the birth-death process in terms of the exponential transition rates, part of the basic structure of continuous-time Markov chains.

Suppose that \( S \) is an integer interval (that is, a set of consecutive integers), either finite or infinite. The birth-death chain with birth rate function \( \alpha: S \to [0, \infty) \) and death rate function \( \beta: S \to [0, \infty) \) is the Markov chain \( \bs{X} = \{X_t: t \in [0, \infty)\} \) on \( S \) with transition rate \( \alpha(x) \) from \( x \) to \( x + 1 \) and transition rate \( \beta(x) \) from \( x \) to \( x - 1 \), for \( x \in S \).

If \( S \) has a minimum element \( m \), then of course we must have \( \beta(m) = 0 \). If \( \alpha(m) = 0 \) also, then the boundary point \( m \) is absorbing. Similarly, if \( S \) has a maximum element \( n \) then we must have \( \alpha(n) = 0 \). If \( \beta(n) = 0 \) also then the boundary point \( n \) is absorbing. If \( x \in S \) is not a boundary point, then typically we have \( \alpha(x) + \beta(x) \gt 0 \), so that \( x \) is stable. If \( \beta(x) = 0 \) for all \( x \in S \), then \( \bs{X} \) is a pure birth process, and similarly if \( \alpha(x) = 0 \) for all \( x \in S \) then \( \bs{X} \) is a pure death process. From the transition rates, it's easy to compute the parameters of the exponential holding times in a state and the transition matrix of the embedded, discrete-time jump chain.

Consider again the birth-death chain \( \bs{X} \) on \( S \) with birth rate function \( \alpha \) and death rate function \( \beta \). As usual, let \( \lambda \) denote the exponential parameter function and \( Q \) the transition matrix for the jump chain.

- \( \lambda(x) = \alpha(x) + \beta(x) \) for \( x \in S \)
- If \( x \in S \) is stable, so that \( \alpha(x) + \beta(x) \gt 0 \), then \[ Q(x, x + 1) = \frac{\alpha(x)}{\alpha(x) + \beta(x)}, \quad Q(x, x - 1) = \frac{\beta(x)}{\alpha(x) + \beta(x)} \]
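For a finite chain, these formulas are easy to compute directly. Here is a minimal sketch in Python; the particular rate values are arbitrary illustrative assumptions, not from the text:

```python
# Sketch: holding-time parameters lambda(x) and jump-chain transition
# matrix Q for a birth-death chain on {0, 1, ..., n}.
# The rate values below are arbitrary illustrative choices.
n = 5
alpha = [2.0, 2.0, 2.0, 2.0, 2.0, 0.0]   # alpha(n) = 0 at the right endpoint
beta  = [0.0, 1.0, 1.0, 1.0, 1.0, 1.0]   # beta(0) = 0 at the left endpoint

lam = [alpha[x] + beta[x] for x in range(n + 1)]   # exponential parameters

# Q(x, x+1) = alpha(x)/lambda(x) and Q(x, x-1) = beta(x)/lambda(x) at stable states
Q = [[0.0] * (n + 1) for _ in range(n + 1)]
for x in range(n + 1):
    if lam[x] > 0:                  # stable state
        if x < n:
            Q[x][x + 1] = alpha[x] / lam[x]
        if x > 0:
            Q[x][x - 1] = beta[x] / lam[x]
    else:                           # absorbing state
        Q[x][x] = 1.0

print(lam[2], Q[2][3], Q[2][1])    # lambda(2) and the two jump probabilities
```

Each row of \( Q \) sums to 1, as it must for a transition matrix.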

Note that the jump chain \( \bs{Y} = (Y_0, Y_1, \ldots) \) is a discrete-time birth-death chain. The probability functions \( p \), \( q \), and \( r \) of \( \bs Y \) are given as follows: If \( x \in S \) is stable then \begin{align*} p(x) & = Q(x, x + 1) = \frac{\alpha(x)}{\alpha(x) + \beta(x)}\\ q(x) & = Q(x, x - 1) = \frac{\beta(x)}{\alpha(x) + \beta(x)}\\ r(x) & = Q(x, x) = 0 \end{align*} If \( x \) is absorbing then of course \( p(x) = q(x) = 0 \) and \( r(x) = 1 \). Except for the initial state, the jump chain \( \bs{Y} \) is deterministic for a pure birth process, with \( Q(x, x) = 1 \) if \( x \) is absorbing and \( Q(x, x + 1) = 1 \) if \( x \) is stable. Similarly, except for the initial state, \( \bs{Y} \) is deterministic for a pure death process, with \( Q(x, x) = 1 \) if \( x \) is absorbing and \( Q(x, x - 1) = 1 \) if \( x \) is stable. Note that the Poisson process with rate parameter \( r \in (0, \infty) \), viewed as a continuous-time Markov chain, is a pure birth process on \( \N \) with birth rate function \( \alpha(x) = r \) for each \( x \in \N \). More generally, a birth-death chain with \( \lambda(x) = \alpha(x) + \beta(x) = r \) for all \( x \in S \) is also subordinate to the Poisson process with rate \( r \).

Note that \( \lambda \) is bounded if and only if \( \alpha \) and \( \beta \) are bounded (always the case if \( S \) is finite), and in this case the birth-death chain \( \bs X = \{X_t: t \in [0, \infty)\} \) is uniform. If \( \lambda \) is unbounded, then \( \bs X \) may not even be regular, as an example below shows. Recall that a sufficient condition for \( \bs X \) to be regular when \( S \) is infinite is \[ \sum_{x \in S_+} \frac{1}{\lambda(x)} = \sum_{x \in S_+} \frac{1}{\alpha(x) + \beta(x)} = \infty \] where \( S_+ = \{x \in S: \lambda(x) = \alpha(x) + \beta(x) \gt 0\} \) is the set of stable states. Except for the aforementioned example, we will restrict our study to regular birth-death chains.

Suppose again that \( \bs X = \{X_t: t \in [0, \infty)\} \) is a continuous-time birth-death chain on an interval \( S \subseteq \Z \) with birth rate function \( \alpha \) and death rate function \( \beta \). As usual, we will let \( P_t \) denote the transition matrix at time \( t \in [0, \infty) \) and \( G \) the infinitesimal generator. As always, the infinitesimal generator gives the same information as the exponential parameter function and the jump transition matrix, but in a more compact and useful form.

The generator matrix \( G \) is given by \[ G(x, x) = -[\alpha(x) + \beta(x)], \; G(x, x + 1) = \alpha(x), \; G(x, x - 1) = \beta(x), \quad x \in S \]

This follows from the general theory, since \( G(x, x) = -\lambda(x) \) for \( x \in S \) and \( G(x, y) = \lambda(x) Q(x, y) \) for \( (x, y) \in S^2 \) with \( x \ne y \).

The Kolmogorov backward and forward equations are

- \( \frac{d}{dt} P_t(x, y) = -[\alpha(x) + \beta(x)] P_t(x, y) + \alpha(x) P_t(x + 1, y) + \beta(x) P_t(x - 1, y) \) for \( (x, y) \in S^2 \).
- \( \frac{d}{dt} P_t(x, y) = -[\alpha(y) + \beta(y)] P_t(x, y) + \alpha(y - 1) P_t(x, y - 1) + \beta(y + 1) P_t(x, y + 1) \) for \( (x, y) \in S^2 \)

These results follow from the generator matrix \( G \) above.

- The backward equation is \( \frac{d}{dt} P_t = G P_t \).
- The forward equation is \( \frac{d}{dt} P_t = P_t G \).
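Both equations can be checked numerically for a small chain. The sketch below (rates are illustrative assumptions) approximates \( P_t = e^{tG} \) with a truncated Taylor series and compares a central difference of \( P_t \) with \( G P_t \) and \( P_t G \):

```python
import numpy as np

# Sketch: numerical check of the Kolmogorov equations dP/dt = G P = P G
# for a small birth-death chain; P_t = e^{tG} is computed with a
# truncated Taylor series (adequate for small ||tG||, not production code).
n = 4
alpha = np.array([1.5, 1.5, 1.5, 1.5, 0.0])   # illustrative rates
beta  = np.array([0.0, 1.0, 1.0, 1.0, 1.0])

G = np.zeros((n + 1, n + 1))
for x in range(n + 1):
    G[x, x] = -(alpha[x] + beta[x])
    if x < n:
        G[x, x + 1] = alpha[x]
    if x > 0:
        G[x, x - 1] = beta[x]

def expm(A, terms=60):
    """Taylor-series matrix exponential; fine for small ||A||."""
    out, term = np.eye(A.shape[0]), np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        out = out + term
    return out

t, h = 0.7, 1e-6
Pt = expm(t * G)
dP = (expm((t + h) * G) - expm((t - h) * G)) / (2 * h)   # central difference

print(np.max(np.abs(dP - G @ Pt)))   # backward equation residual, ~0
print(np.max(np.abs(dP - Pt @ G)))   # forward equation residual, ~0
```

The rows of \( P_t \) also sum to 1, since the rows of \( G \) sum to 0.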

For our discussion of limiting behavior, we will consider first the important special case of a continuous-time birth-death chain \( \bs{X} = \{X_t: t \in [0, \infty)\} \) on \( S = \N \) and with \( \alpha(x) \gt 0 \) for all \( x \in \N \) and \( \beta(x) \gt 0 \) for all \( x \in \N_+ \). For the jump chain \( \bs{Y} = \{Y_n: n \in \N\} \), recall that \[ p(x) = Q(x, x + 1) = \frac{\alpha(x)}{\alpha(x) + \beta(x)}, \; q(x) = Q(x, x - 1) = \frac{\beta(x)}{\alpha(x) + \beta(x)}, \quad x \in \N\] The jump chain \( \bs{Y} \) is a discrete-time birth-death chain, and our notation here is consistent with the notation that we used in that section. Note that \( \bs{X} \) and \( \bs{Y} \) are irreducible. We first consider transience and recurrence.

The chains \( \bs{X} \) and \( \bs{Y} \) are recurrent if and only if \[ \sum_{x=0}^\infty \frac{\beta(1) \cdots \beta(x)}{\alpha(1) \cdots \alpha(x)} = \infty \]

Recall that \( \bs{X} \) is recurrent if and only if \( \bs{Y} \) is recurrent. In our study of discrete-time birth-death chains we saw that \( \bs{Y} \) is recurrent if and only if \[ \sum_{x=0}^\infty \frac{q(1) \cdots q(x)}{p(1) \cdots p(x)} = \infty \] But trivially, \[ \frac{q(1) \cdots q(x)}{p(1) \cdots p(x)} = \frac{\beta(1) \cdots \beta(x)}{\alpha(1) \cdots \alpha(x)} \]

Next we consider positive recurrence and invariant distributions. It's nice to look at this from different points of view.

The function \( g: \N \to (0, \infty) \) defined by \[ g(x) = \frac{\alpha(0) \cdots \alpha(x - 1)}{\beta(1) \cdots \beta(x)}, \quad x \in \N \] is invariant for \( \bs{X} \), and is the only invariant function, up to multiplication by constants. Hence \( \bs{X} \) is positive recurrent if and only if \( B = \sum_{x = 0}^\infty g(x) \lt \infty \), in which case the (unique) invariant probability density function \( f \) is given by \( f(x) = \frac{1}{B} g(x) \) for \( x \in \N \). Moreover, \( P_t(x, y) \to f(y) \) as \( t \to \infty \) for every \( x, \, y \in \N \).

From our study of discrete-time birth-death chains, we know that the function \( h: \N \to (0, \infty) \) defined by \[ h(x) = \frac{p(0) \cdots p(x - 1)}{q(1) \cdots q(x)}, \quad x \in \N \] is invariant for \( \bs{Y} \), and is the only positive invariant function up to multiplication by positive constants. It then follows from our study of invariant functions for continuous-time chains that the function \( h / \lambda \) is invariant for \( \bs{X} \), and again is the only positive invariant function up to multiplication by positive constants. But it's simple to see that \[ \frac{h(x)}{\lambda(x)} = \frac{h(x)}{\alpha(x) + \beta(x)} = \frac{\alpha(1) \cdots \alpha(x - 1)}{\beta(1) \cdots \beta(x)} = \frac{1}{\alpha(0)} g(x) \] where \( g \) is the function given in the theorem. The remaining parts of the theorem follow from the general theory.

A function \( g: \N \to (0, \infty) \) is invariant for \( \bs{X} \) if and only if it satisfies the balance equation \( g G = 0 \). For our birth-death chain, this reduces to \begin{align*} \alpha(0) g(0) & = \beta(1) g(1) \\ [\alpha(x) + \beta(x)] g(x) & = \alpha(x - 1) g(x - 1) + \beta(x + 1) g(x + 1), \quad x \in \N_+ \end{align*} Substituting the equation with \( x = 0 \) on the left into the equation with \( x = 1 \) on the left gives \( \alpha(1) g(1) = \beta(2) g(2) \). Substituting this into the equation with \( x = 2 \) on the left gives \( \alpha(2) g(2) = \beta(3) g(3) \). In general, the balance equations imply \[ \alpha(x) g(x) = \beta(x + 1) g(x + 1), \quad x \in \N \] Solving these new balance equations recursively gives \[ g(x) = g(0) \frac{\alpha(0) \cdots \alpha(x - 1)}{\beta(1) \cdots \beta(x)} \] Letting \( g(0) = 1 \) gives the particular invariant function in the theorem. Again, the remaining parts follow from the general theory.
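The recursion in this proof is easy to run numerically. A minimal sketch (the rate functions here are arbitrary illustrative choices) solves \( \alpha(x) g(x) = \beta(x + 1) g(x + 1) \) step by step and compares the result with the product formula:

```python
# Sketch: solve the reduced balance equations
# alpha(x) g(x) = beta(x+1) g(x+1) recursively, with g(0) = 1,
# and compare with the closed-form product. Illustrative rates.
alpha = lambda x: 2.0 + x        # birth rates, positive on N
beta  = lambda x: 3.0 * x        # death rates, positive on N_+

N = 20
g = [1.0]                        # g(0) = 1
for x in range(N):
    g.append(alpha(x) * g[x] / beta(x + 1))   # balance-equation recursion

def g_closed(x):
    # g(x) = alpha(0)...alpha(x-1) / (beta(1)...beta(x))
    out = 1.0
    for i in range(x):
        out *= alpha(i) / beta(i + 1)
    return out

print(all(abs(g[x] - g_closed(x)) < 1e-12 * g_closed(x) for x in range(N + 1)))  # True
```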

Here is a summary of the classification:

For the continuous-time birth-death chain \( \bs X \), let \[ A = \sum_{x = 0}^\infty \frac{\beta(1) \cdots \beta(x)}{\alpha(1) \cdots \alpha(x)}, \; B = \sum_{x = 0}^\infty \frac{\alpha(0) \cdots \alpha(x - 1)}{\beta(1) \cdots \beta(x)} \]

- \( \bs X \) is transient if \( A \lt \infty \).
- \( \bs X \) is null recurrent if \( A = \infty \) and \( B = \infty \).
- \( \bs X \) is positive recurrent if \( B \lt \infty \).
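Numerical partial sums cannot establish divergence, but for constant rates the terms of \( A \) and \( B \) are geometric, so the classification reduces to comparing \( \alpha \) and \( \beta \) (this agrees with the constant-rate example later in the section). A hypothetical helper using that closed form:

```python
# Sketch: classification of the birth-death chain with constant rates.
# A has terms (beta/alpha)^x and B has terms (alpha/beta)^x, so each
# series is geometric and convergence is decided by the ratio.
def classify_constant_rates(alpha, beta):
    if beta < alpha:
        return "transient"          # A < infinity
    if beta == alpha:
        return "null recurrent"     # A = infinity and B = infinity
    return "positive recurrent"     # B < infinity

print(classify_constant_rates(2.0, 1.0))   # transient
print(classify_constant_rates(1.0, 1.0))   # null recurrent
print(classify_constant_rates(1.0, 2.0))   # positive recurrent
```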

Suppose now that \( n \in \N_+ \) and that \( \bs X = \{X_t: t \in [0, \infty)\}\) is a continuous-time birth-death chain on the integer interval \( \N_n = \{0, 1, \ldots, n\} \). We assume that \( \alpha(x) \gt 0 \) for \( x \in \{0, 1, \ldots, n - 1\} \) while \( \beta(x) \gt 0 \) for \( x \in \{1, 2, \ldots, n\} \). Of course, we must have \( \beta(0) = \alpha(n) = 0 \). With these assumptions, \( \bs X \) is irreducible, and since the state space is finite, positive recurrent. So all that remains is to find the invariant distribution. The result is essentially the same as when the state space is \( \N \).

The invariant probability density function \( f_n \) is given by \[ f_n(x) = \frac{1}{B_n} \frac{\alpha(0) \cdots \alpha(x - 1)}{\beta(1) \cdots \beta(x)} \text{ for } x \in \N_n \text{ where } B_n = \sum_{x=0}^n \frac{\alpha(0) \cdots \alpha(x - 1)}{\beta(1) \cdots \beta(x)} \]

Define \[ g_n(x) = \frac{\alpha(0) \cdots \alpha(x - 1)}{\beta(1) \cdots \beta(x)}, \quad x \in \N_n \] The proof that \( g_n \) is invariant for \( \bs X \) is the same as before. The constant \( B_n \) is the normalizing constant.

Note that \( B_n \to B \) as \( n \to \infty \), and if \( B \lt \infty \), \( f_n(x) \to f(x) \) as \( n \to \infty \) for \( x \in \N \). We will see this type of behavior again. Results for the birth-death chain on \( \N_n \) often converge to the corresponding results for the birth-death chain on \( \N \) as \( n \to \infty \).
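For the finite chain we can verify the result directly with linear algebra. This sketch (rates are illustrative assumptions) builds the generator on \( \N_n \), computes \( f_n \) from the product formula, and checks the balance equation \( f_n G = 0 \):

```python
import numpy as np

# Sketch: invariant pdf of a birth-death chain on {0, ..., n} from the
# product formula, checked against f G = 0. Illustrative rates.
n = 6
alpha = np.array([1.0 + 0.5 * x for x in range(n)] + [0.0])  # alpha(n) = 0
beta  = np.array([0.0] + [2.0] * n)                          # beta(0) = 0

G = np.zeros((n + 1, n + 1))
for x in range(n + 1):
    G[x, x] = -(alpha[x] + beta[x])
    if x < n:
        G[x, x + 1] = alpha[x]
    if x > 0:
        G[x, x - 1] = beta[x]

g = np.ones(n + 1)
for x in range(1, n + 1):
    g[x] = g[x - 1] * alpha[x - 1] / beta[x]  # alpha(0)..alpha(x-1) / beta(1)..beta(x)
f = g / g.sum()                               # normalize by B_n

print(np.max(np.abs(f @ G)))    # ~0: f_n is invariant
```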

Often when the state space \( S = \N \), the state of a birth-death chain represents a population of individuals of some sort (and so the terms birth and death have their usual meanings). In this case state 0 is absorbing and means that the population is extinct. Specifically, suppose that \( \bs X = \{X_t: t \in [0, \infty)\} \) is a regular birth-death chain on \( \N \) with \( \alpha(0) = \beta(0) = 0 \) and with \( \alpha(x), \, \beta(x) \gt 0 \) for \( x \in \N_+ \). Thus, state 0 is absorbing and all positive states lead to each other and to 0. Let \( T = \min\{t \in [0, \infty): X_t = 0\} \) denote the time until absorption, where as usual, \( \min \emptyset = \infty \). Many of the results concerning extinction of the continuous-time birth-death chain follow easily from corresponding results for the discrete-time birth-death jump chain.

One of the following events will occur:

- Population extinction: \( T \lt \infty \) or equivalently, \( X_s = 0 \) for some \( s \in [0, \infty) \) and hence \( X_t = 0 \) for all \( t \in [s, \infty)\).
- Population explosion: \( T = \infty \) or equivalently \( X_t \to \infty \) as \( t \to \infty \).

Part (b) follows from the general theory, since 0 is absorbing, and all positive states lead to each other and to 0. Thus the positive states are transient and we know that with probability 1, the jump chain will visit a transient state only finitely often. Thus \( T = \infty \) is equivalent to \( X_t \to \infty \) as \( t \to \infty \). Without the assumption that the chain is regular, population explosion could occur in finite time.

Naturally we would like to find the probability of these complementary events, and happily we have already done so in our study of discrete-time birth-death chains. The absorption probability function \( v \) is defined by \[v(x) = \P(T \lt \infty \mid X_0 = x) = \P(X_t = 0 \text{ for some } t \in [0, \infty) \mid X_0 = x), \quad x \in \N \]

As before, let \[A = \sum_{i=0}^\infty \frac{\beta(1) \cdots \beta(i)}{\alpha(1) \cdots \alpha(i)}\]

- If \( A = \infty \) then \( v(x) = 1 \) for all \( x \in \N \).
- If \( A \lt \infty \) then \[ v(x) = \frac{1}{A} \sum_{i=x}^\infty \frac{\beta(1) \cdots \beta(i)}{\alpha(1) \cdots \alpha(i)}, \quad x \in \N \]

The continuous-time chain is absorbed into 0 if and only if the discrete-time jump chain is absorbed into 0. So the result follows from the corresponding result for discrete-time birth-death chains. Recall again that \( q(x) / p(x) = \beta(x) / \alpha(x) \) for \( x \in \N_+ \)
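The formula is easy to evaluate with truncated sums. As an illustration (the rates here are assumptions chosen so that \( A \lt \infty \)), with constant rates the terms are geometric and \( v(x) \) reduces to \( (\beta/\alpha)^x \), in agreement with the constant-rate example later in the section:

```python
# Sketch: the absorption-probability formula with truncated sums,
# for constant rates alpha = 2, beta = 1 (so A < infinity).
alpha, beta = 2.0, 1.0
r = beta / alpha                     # q(x)/p(x) = beta/alpha
K = 200                              # truncation point; r**K is negligible

terms = [r ** i for i in range(K)]
A = sum(terms)
v = lambda x: sum(terms[x:]) / A     # v(x) = (1/A) sum_{i >= x} (beta/alpha)^i

print(abs(v(3) - r ** 3) < 1e-12)    # True: matches (beta/alpha)^x
```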

The mean time to extinction is considered next, so let \( m(x) = \E(T \mid X_0 = x) \) for \( x \in \N \). Unlike the probability of extinction, computing the mean time to extinction cannot be easily reduced to the corresponding discrete-time computation. However, the *method* of computation does extend.

The mean absorption function is given by \[ m(x) = \sum_{j=1}^x \sum_{k=j-1}^\infty \frac{\alpha(j) \cdots \alpha(k)}{\beta(j) \cdots \beta(k+1)}, \quad x \in \N \]

The time required to go from state \( x \in \N_+ \) to \( x - 1 \) has the same distribution as the time required to go from state 1 to 0, except with parameters \( \alpha(y), \, \beta(y) \) for \( y \in \{x, x + 1, \ldots\} \) instead of parameters \( \alpha(y), \, \beta(y) \) for \( y \in \{1, 2, \ldots\} \). So by the additivity of expected value, we just need to compute \( m(1) \) as a function of the parameters. Starting in state 1, the chain will be absorbed in state 0 after a random number of returns to state 1 without absorption. Whenever the chain is in state 1, absorption occurs at the next transition with probability \( q(1) \), so it follows that the number of times that the chain is in state 1 before absorption has the geometric distribution on \( \N_+ \) with success parameter \( q(1) \). The mean of this distribution is \( 1 / q(1) = [\alpha(1) + \beta(1)] / \beta(1) \). On the other hand, starting in state 1, the time until the chain is in state 1 again (without absorption) has the same distribution as the return time to state 0, starting in state 0, for the irreducible birth-death chain \( \bs{X}^\prime \) on \( \N \) with birth and death rate functions \( \alpha^\prime \) and \( \beta^\prime \) given by \( \alpha^\prime(x) = \alpha(x + 1) \) for \( x \in \N \) and \( \beta^\prime(x) = \beta(x + 1) \) for \( x \in \N_+ \). Thus, let \[ \mu = \frac{1}{\alpha(1) + \beta(1)}\sum_{k=0}^\infty \frac{\alpha(1) \cdots \alpha(k)}{\beta(2) \cdots \beta(k+1)} \] Then \( \mu \) is the mean return time to state 0 for the chain \( \bs{X}^\prime \). Specifically, note that if \( \mu = \infty \) then \( \bs{X}^\prime \) is either transient or null recurrent. If \( \mu \lt \infty \) then \( \bs{X}^\prime \) is positive recurrent and \( 1 / \{[\alpha(1) + \beta(1)] \mu\} \) is the invariant PDF at 0.
So, it follows that \[ m(1) = \frac{1}{q(1)} \mu = \sum_{k=0}^\infty \frac{\alpha(1) \cdots \alpha(k)}{\beta(1) \cdots \beta(k + 1)} \] By our argument above, the mean time to go from state \( x \) to \( x - 1 \) is \[ \sum_{k=x-1}^\infty \frac{\alpha(x) \cdots \alpha(k)}{\beta(x) \cdots \beta(k + 1)} \]

In particular, note that \[ m(1) = \sum_{k=0}^\infty \frac{\alpha(1) \cdots \alpha(k)}{\beta(1) \cdots \beta(k + 1)} \] If \( m(1) = \infty \) then \( m(x) = \infty \) for all \( x \in \N_+ \). If \( m(1) \lt \infty \) then \( m(x) \lt \infty \) for all \( x \in \N \).
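The double sum is straightforward to evaluate numerically with a truncated inner sum. As an illustration (the rates here are assumptions), with constant rates \( \alpha = 1 \), \( \beta = 2 \), each step from \( j \) to \( j - 1 \) contributes \( 1/(\beta - \alpha) \), consistent with the formula \( m(x) = x/(\beta - \alpha) \) in the constant-rate example later in the section:

```python
# Sketch: the double sum for m(x), with a truncated inner sum,
# for constant rates alpha = 1, beta = 2.
alpha, beta = 1.0, 2.0
K = 200                                  # inner-sum truncation

def m(x):
    total = 0.0
    for j in range(1, x + 1):            # mean time to go from j to j - 1
        for k in range(j - 1, j - 1 + K):
            # alpha(j)...alpha(k) / beta(j)...beta(k+1), constant rates
            total += alpha ** (k - j + 1) / beta ** (k - j + 2)
    return total

print(abs(m(3) - 3 / (beta - alpha)) < 1e-9)   # True
```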

Next we will consider a birth-death chain on a finite integer interval with both endpoints absorbing. Our interest is in the probability of absorption in one endpoint rather than the other, and in the mean time to absorption. Thus suppose that \( n \in \N_+ \) and that \( \bs X = \{X_t: t \in [0, \infty)\} \) is a continuous-time birth-death chain on \( \N_n = \{0, 1, \ldots, n\} \) with \( \alpha(0) = \beta(0) = 0 \), \( \alpha(n) = \beta(n) = 0 \), and \( \alpha(x) \gt 0 \), \( \beta(x) \gt 0 \) for \( x \in \{1, 2, \ldots, n - 1\} \). So the endpoints 0 and \( n \) are absorbing, and all other states lead to each other and to the endpoints. Let \( T = \inf\{t \in [0, \infty): X_t \in \{0, n\}\} \), the time until absorption, and for \( x \in S \) let \( v_n(x) = \P(X_T = 0 \mid X_0 = x) \) and \( m_n(x) = \E(T \mid X_0 = x) \). The definitions make sense since \( T \) is finite with probability 1.

The absorption probability function for state 0 is given by \[ v_n(x) = \frac{1}{A_n} \sum_{i=x}^{n-1} \frac{\beta(1) \cdots \beta(i)}{\alpha(1) \cdots \alpha(i)} \text{ for } x \in \N_n \text{ where } A_n = \sum_{i=0}^{n-1} \frac{\beta(1) \cdots \beta(i)}{\alpha(1) \cdots \alpha(i)} \]

The jump chain \( \bs Y = \{Y_n: n \in \N\} \) is a discrete-time birth-death chain on \( \N_n \) with \( 0 \) and \( n \) absorbing. Also, \( \bs X \) is absorbed into 0 or \( n \) if and only if \( \bs Y \) is absorbed into 0 or \( n \), respectively. So the result follows from the corresponding result for \( \bs Y \), since \( q(x) / p(x) = \beta(x) / \alpha(x) \) for \( x \in \{1, 2, \ldots, n - 1\} \).

Note that \( A_n \to A \) as \( n \to \infty \) where \( A \) is the constant above for the absorption probability at 0 with the infinite state space \( \N \). If \( A \lt \infty \) then \( v_n(x) \to v(x) \) as \( n \to \infty \) for \( x \in \N \).

Essentially, every irreducible continuous-time birth-death chain is reversible.

Suppose that \( \bs X = \{X_t: t \in [0, \infty)\} \) is a positive recurrent birth-death chain on an integer interval \( S \subseteq \Z \) with birth rate function \( \alpha: S \to [0, \infty) \) and death rate function \( \beta: S \to [0, \infty) \). Assume that \( \alpha(x) \gt 0 \), except at the maximum value of \( S \), if there is one, and similarly that \( \beta(x) \gt 0 \), except at the minimum value of \( S \), if there is one. Then \( \bs X \) is reversible.

Note that \( \bs X \) is irreducible. As usual, let \( G \) denote the generator matrix. It's easy to see that under the assumptions, \( G(x, y) = 0 \) implies \( G(y, x) = 0 \) for \( (x, y) \in S^2 \), and that the Kolmogorov cycle condition is satisfied: For every \( n \in \N_+ \) and every sequence \( (x_1, x_2, \ldots, x_n) \in S^n \), \[ G(x_1, x_2) \cdots G(x_{n-1}, x_n) G(x_n, x_1) = G(x_1, x_n) G(x_n, x_{n-1}) \cdots G(x_2, x_1) \]

In the important special case of a birth-death chain on \( \N \), we can verify the balance equations directly.

Suppose that \( \bs{X} = \{X_t: t \in [0, \infty)\} \) is a continuous-time birth-death chain on \( S = \N \) and with birth rate \( \alpha(x) \gt 0 \) for all \( x \in \N \) and death rate \( \beta(x) \gt 0 \) for all \( x \in \N_+ \). Then \( \bs X \) is reversible.

We just need to show that the balance equation for a reversible chain holds, and this was actually done in the result above. As before, let \( g: \N \to (0, \infty) \) be the function given by \[ g(x) = \frac{\alpha(0) \cdots \alpha(x - 1)}{\beta(1) \cdots \beta(x)}, \quad x \in \N \] The only nontrivial case of the balance equation \( g(x) G(x, y) = g(y) G(y, x) \) for \( (x, y) \in S^2 \) is \[g(x) G(x, x + 1) = g(x + 1) G(x + 1, x) = \frac{\alpha(0) \cdots \alpha(x)}{\beta(1) \cdots \beta(x)}, \quad x \in \N \] It follows from the general theory that \( g \) is invariant for \( \bs X \) and that \( \bs X \) is reversible with respect to \( g \). Since we actually know from our work above that \( g \) is the only positive invariant function, up to multiplication by positive constants, we can simply say that \( \bs X \) is reversible.
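The balance equation can be checked numerically on a truncated state space. A minimal sketch, with arbitrary illustrative rate functions:

```python
# Sketch: numerical check of the balance condition
# g(x) G(x, x+1) = g(x+1) G(x+1, x), i.e. g(x) alpha(x) = g(x+1) beta(x+1),
# on a truncated copy of N. Rate functions are illustrative choices.
alpha = lambda x: 1.0 + x % 3          # arbitrary positive birth rates
beta  = lambda x: 2.0 + x % 2          # arbitrary positive death rates on N_+

N = 50
g = [1.0]
for x in range(N):
    g.append(g[x] * alpha(x) / beta(x + 1))

ok = all(abs(g[x] * alpha(x) - g[x + 1] * beta(x + 1)) < 1e-12 * g[x] * alpha(x)
         for x in range(N))
print(ok)   # True: the chain is reversible with respect to g
```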

In the positive recurrent case, it follows that the birth-death chain is stochastically the same, forward or backward in time, if the chain has the invariant distribution.

Our first exercise gives two pure birth chains, each with an unbounded exponential parameter function. One is regular and one is irregular.

Consider the pure birth process \( \bs{X} = \{X_t: t \in [0, \infty)\} \) on \( \N_+ \) with birth rate function \( \alpha \).

- If \( \alpha(x) = x^2 \) for \( x \in \N_+ \), then \( \bs{X} \) is not regular.
- If \( \alpha(x) = x \) for \( x \in \N_+ \), then \( \bs{X} \) is regular.

The jump chain \( \bs Y \) is deterministic, except for the initial state. Given \( Y_0 = x \in \N_+ \), we have \( Y_n = n + x \). Hence

- \( \sum_{n=0}^\infty \frac{1}{\lambda(Y_n)} = \sum_{n=0}^\infty \frac{1}{(n + x)^2} \lt \infty\)
- \( \sum_{n=0}^\infty \frac{1}{\lambda(Y_n)} = \sum_{n=0}^\infty \frac{1}{n + x} = \infty\)

So the results follow from the general theory.

Our next examples consider birth-death chains with constant birth and death rates, except perhaps at the endpoints. Note that such chains will be regular since the exponential parameter function \( \lambda \) is bounded.

Suppose that \( \bs X = \{X_t: t \in [0, \infty)\} \) is the birth-death chain on \( \N \), with constant birth rate \( \alpha \in (0, \infty) \) on \( \N \) and constant death rate \( \beta \in (0, \infty) \) on \( \N_+ \).

- \( \bs X \) is transient if \( \beta \lt \alpha \).
- \( \bs X \) is null recurrent if \( \beta = \alpha \).
- \( \bs X \) is positive recurrent if \( \beta \gt \alpha \). The invariant distribution is the geometric distribution on \( \N \) with parameter \( \alpha / \beta \) \[ f(x) = \left( 1 - \frac{\alpha }{\beta} \right) \left( \frac{\alpha}{\beta} \right)^x, \quad x \in \N \]

Note that \( \bs X \) is irreducible since the birth rate is positive on \( \N \) and the death rate is positive on \( \N_+ \). The series in the results above are geometric series: \[ \frac{\beta(1) \cdots \beta(x)}{\alpha(1) \cdots \alpha(x)} = \left(\frac{\beta}{\alpha}\right)^x, \; \frac{\alpha(0) \cdots \alpha(x - 1)}{\beta(1) \cdots \beta(x)} = \left(\frac{\alpha}{\beta}\right)^x, \quad x \in \N \]
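As a quick numerical sanity check (a sketch; the rates 1 and 3 are arbitrary choices with \( \beta \gt \alpha \)), the geometric pdf satisfies the balance relation \( \alpha f(x) = \beta f(x + 1) \) derived above and sums to 1:

```python
# Sketch: for constant rates with beta > alpha, check that the geometric
# pdf f(x) = (1 - r) r^x with r = alpha/beta satisfies
# alpha f(x) = beta f(x+1), and that it sums to 1 (truncated tail negligible).
alpha, beta = 1.0, 3.0
r = alpha / beta
f = lambda x: (1 - r) * r ** x

balance = all(abs(f(x) * alpha - f(x + 1) * beta) < 1e-12 for x in range(100))
total = sum(f(x) for x in range(200))

print(balance, abs(total - 1.0) < 1e-12)   # True True
```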

Next we consider the chain with \( 0 \) absorbing. As in the general discussion above, let \( v \) denote the function that gives the probability of absorption and \( m \) the function that gives the mean time to absorption.

Suppose that \( \bs X = \{X_t: t \in [0, \infty)\} \) is the birth-death chain on \( \N \) with constant birth rate \( \alpha \in (0, \infty) \) on \( \N_+ \), constant death rate \( \beta \in (0, \infty) \) on \( \N_+ \), and with 0 absorbing. Then

- If \( \beta \ge \alpha \) then \( v(x) = 1 \) for \( x \in \N \). If \( \beta \lt \alpha \) then \( v(x) = (\beta / \alpha)^x \) for \( x \in \N \).
- If \( \alpha \ge \beta \) then \( m(x) = \infty \) for \( x \in \N_+ \). If \( \alpha \lt \beta \) then \(m(x) = x / (\beta - \alpha)\) for \( x \in \N \).

Next let's look at chains on a finite state space. Let \( n \in \N_+ \) and define \( \N_n = \{0, 1, \ldots, n\} \).

Suppose that \( \bs X = \{X_t: t \in [0, \infty)\} \) is a continuous-time birth-death chain on \( \N_n \) with constant birth rate \( \alpha \in (0, \infty) \) on \( \{0, 1, \ldots, n - 1\} \) and constant death rate \( \beta \in (0, \infty) \) on \( \{1, 2, \ldots, n\} \). The invariant probability density function \( f_n \) is given as follows:

- If \( \alpha \ne \beta \) then \[ f_n(x) = \frac{(\alpha / \beta)^x (1 - \alpha / \beta)}{1 - (\alpha / \beta)^{n+1}}, \quad x \in \N_n \]
- If \( \alpha = \beta \) then \( f_n(x) = 1 / (n + 1) \) for \( x \in \N_n \)

Note that when \( \alpha = \beta \), the invariant distribution is uniform on \( \N_n \). Our final exercise considers the absorption probability at 0 when both endpoints are absorbing. Let \( v_n \) denote the function that gives the probability of absorption into 0, rather than \( n \).

Suppose that \( \bs X = \{X_t: t \in [0, \infty)\} \) is the birth-death chain on \( \N_n \) with constant birth rate \( \alpha \) and constant death rate \( \beta \) on \( \{1, 2, \ldots, n - 1\} \), and with 0 and \( n \) absorbing.

- If \( \alpha \ne \beta \) then \[ v_n(x) = \frac{(\beta / \alpha)^x - (\beta / \alpha)^n}{1 - (\beta / \alpha)^n}, \quad x \in \N_n \]
- If \( \alpha = \beta \) then \( v_n(x) = (n - x) / n \) for \( x \in \N_n \).
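The closed form can be verified by solving the absorption equations directly. This sketch (rates and \( n \) are illustrative assumptions) uses the first-step relations \( \beta v_n(x - 1) - (\alpha + \beta) v_n(x) + \alpha v_n(x + 1) = 0 \) for interior states, with the boundary conditions \( v_n(0) = 1 \) and \( v_n(n) = 0 \):

```python
import numpy as np

# Sketch: absorption probabilities at 0, solved from the linear system
# beta v(x-1) - (alpha + beta) v(x) + alpha v(x+1) = 0, v(0) = 1, v(n) = 0,
# and compared with the closed form above. Illustrative parameters.
alpha, beta, n = 2.0, 1.0, 8
rho = beta / alpha

M = np.zeros((n + 1, n + 1))
b = np.zeros(n + 1)
M[0, 0], b[0] = 1.0, 1.0          # v(0) = 1
M[n, n], b[n] = 1.0, 0.0          # v(n) = 0
for x in range(1, n):
    M[x, x - 1], M[x, x], M[x, x + 1] = beta, -(alpha + beta), alpha

v = np.linalg.solve(M, b)
closed = (rho ** np.arange(n + 1) - rho ** n) / (1 - rho ** n)

print(np.max(np.abs(v - closed)) < 1e-10)   # True
```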

For our next discussion, consider individuals that act identically and independently. Each individual splits into two at exponential rate \( a \in (0, \infty) \) and dies at exponential rate \( b \in (0, \infty) \).

Let \( X_t \) denote the population at time \( t \in [0, \infty) \). Then \( \bs X = \{X_t: t \in [0, \infty)\} \) is a regular, continuous-time birth-death chain with birth and death rate functions given by \( \alpha(x) = a x \) and \( \beta(x) = b x \) for \( x \in \N \).

The fact that \( \bs X \) is a continuous-time Markov chain follows from the assumptions. Moreover, since the individuals act independently, the overall birth and death rates when the population is \( x \in \N \) are simply \( x \) times the individual birth and death rates. The chain is regular since \[ \sum_{x=1}^\infty \frac{1}{(a + b) x} = \infty \]

Note that \( 0 \) is absorbing, since \( \alpha(0) = \beta(0) = 0 \), and absorption means that the population is extinct. So as usual, our interest is in the probability of absorption and the mean time to absorption as functions of the initial state. The probability of absorption is the same as for the chain with constant birth and death rates discussed above.

The absorption probability function \( v \) is given as follows:

- \( v(x) = 1 \) for all \( x \in \N \) if \( b \ge a \).
- \( v(x) = (b / a)^x \) for \( x \in \N \) if \( b \lt a \).

These results follow from the general results above since \( \beta(x) / \alpha(x) = b / a \) for \( x \in \N_+ \). Hence for \( x \in \N \), \[ \sum_{i=x}^\infty (b / a)^i = \begin{cases} \infty & b \ge a \\ \frac{(b/a)^x}{1 - b / a} & b \lt a \end{cases}\]

The mean time to absorption is more interesting.

The mean time to absorption function \( m \) is given as follows:

- If \( a \ge b \) then \( m(x) = \infty \) for \( x \in \N_+ \).
- If \( a \lt b \) then \[m(x) = \sum_{j=1}^x \frac{b^{j-1}}{a^j} \int_0^{a/b} \frac{u^{j-1}}{1 - u} du, \quad x \in \N\]

- From the general results above, note that \[ m(1) = \sum_{k=0}^\infty \frac{1}{(k + 1) b} \left(\frac{a}{b}\right)^k\] The sum is infinite if \( a \ge b \).
- If \( a \lt b \) then again from the general formula above, \[ m(x) = \sum_{j=1}^x \sum_{k=j-1}^\infty \frac{1}{(k + 1) b} \left(\frac{a}{b}\right)^{k - j + 1} = \sum_{j=1}^x \frac{1}{b} \left(\frac{b}{a}\right)^j \sum_{k=j-1}^\infty \frac{1}{k + 1}\left(\frac{a}{b}\right)^{k+1} \] The inner series converges absolutely. Moreover, for \( k \in \N \), \[ \frac{1}{k + 1} \left(\frac{a}{b}\right)^{k+1} = \int_0^{a/b} u^k du \] Substituting and interchanging the sum and integral gives \[ m(x) = \sum_{j=1}^x \frac{b^{j-1}}{a^j}\int_0^{a/b} \left( \sum_{k=j-1}^\infty u^k \right) du = \sum_{j=1}^x \frac{b^{j-1}}{a^j} \int_0^{a/b} \frac{u^{j-1}}{1 - u} du \]

For small values of \( x \in \N \), the integrals in the case \( a \lt b \) can be done by elementary methods. For example, \begin{align*} m(1) & = - \frac{1}{a} \ln\left(1 - \frac{a}{b}\right) \\ m(2) & = m(1) -\frac{1}{a} - \frac{b}{a^2} \ln\left(1 - \frac{a}{b}\right) \\ m(3) & = m(2) - \frac{1}{2 a} - \frac{b}{a^2} - \frac{b^2}{a^3} \ln\left(1 - \frac{a}{b}\right) \end{align*} However, a general formula requires the introduction of a special function that is not much more helpful than the integrals themselves. The Markov chain \( \bs X \) is actually an example of a branching chain. We will revisit this chain in that section.
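The elementary formula for \( m(1) \) is easy to confirm against the series it came from. A quick numeric check, with illustrative rates \( a \lt b \):

```python
import math

# Sketch: check m(1) = -(1/a) ln(1 - a/b) against the truncated series
# m(1) = sum_k (1/((k+1) b)) (a/b)^k, for illustrative rates a < b.
a, b = 1.0, 2.0
series = sum((a / b) ** k / ((k + 1) * b) for k in range(500))
closed = -math.log(1 - a / b) / a

print(abs(series - closed) < 1e-10)   # True
```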

We continue our previous discussion, but generalize a bit. Suppose again that we have individuals that act identically and independently. An individual splits into two at exponential rate \( a \in [0, \infty) \) and dies at exponential rate \( b \in [0, \infty) \). Additionally, new individuals enter the population at exponential rate \( c \in [0, \infty) \). This is the immigration effect, and when \( c = 0 \) we have the birth-death chain in the previous discussion.

Let \( X_t \) denote the population at time \( t \in [0, \infty) \). Then \( \bs X = \{X_t: t \in [0, \infty)\} \) is a regular, continuous-time birth-death chain with birth and death rate functions given by \( \alpha(x) = a x + c \) and \( \beta(x) = b x \) for \( x \in \N \).

The fact that \( \bs X \) is a continuous-time Markov chain follows from the assumptions. Moreover, since the individuals act independently, the overall birth rate when the population is \( x \in \N \) is \( a x + c \) while the death rate is \( b x \). The chain is regular since \[ \sum_{x=1}^\infty \frac{1}{(a + b) x + c} = \infty \]

The infinitesimal matrix \( G \) is given as follows, for \( x \in \N \):

- \( G(x, x) = -[(a + b) x + c] \)
- \( G(x, x + 1) = a x + c \)
- \( G(x, x - 1) = bx \)

The backward and forward equations are given as follows, for \( (x, y) \in \N^2 \) and \( t \in (0, \infty) \)

- \( \frac{d}{dt} P_t(x, y) = -[(a + b)x + c] P_t(x, y) + (a x + c) P_t(x + 1, y) + b x P_t(x - 1, y) \)
- \( \frac{d}{dt} P_t(x, y) = -[(a + b)y + c] P_t(x, y) + [a (y - 1) + c] P_t(x, y - 1) + b(y + 1) P_t(x, y + 1) \)

We can use the forward equation to find the expected population size. Let \( M_t(x) = \E(X_t \mid X_0 = x) \) for \( t \in [0, \infty) \) and \( x \in \N \).

For \( t \in [0, \infty) \) and \( x \in \N \), the mean population size \( M_t(x) \) is given as follows:

- If \( a = b \) then \( M_t(x) = c t + x \).
- If \( a \ne b \) then \[ M_t(x) = \frac{c}{a - b}\left[e^{(a - b)t} - 1 \right] + x e^{(a - b) t} \]

First note that \( M_t(x) = \sum_{y=0}^\infty y P_t(x, y) \) for \( x \in \N \). Multiplying the forward equation above by \( y \) and summing over \( y \in \N \) gives \begin{align*} \sum_{y=0}^\infty y \frac{d}{dt} P_t(x, y) = & a \sum_{y=2}^\infty y(y - 1) P_t(x, y - 1) + c \sum_{y=1}^\infty y P_t(x, y - 1) \\ & -(a + b) \sum_{y=0}^\infty y^2 P_t(x, y) - c \sum_{y=0}^\infty y P_t(x, y) + b \sum_{y=0}^\infty y (y + 1) P_t(x, y + 1) \end{align*} Re-indexing the sums and using some algebra gives the first-order differential equation \[ \frac{d}{dt} M_t(x) = c + (a - b) M_t(x), \quad x \in \N, \, t \in (0, \infty) \] with initial condition \( M_0(x) = x \). Solving the differential equation gives the result.

Note that if \( b \gt a \), so that the individual death rate exceeds the birth rate, then \( M_t(x) \to c / (b - a) \) as \( t \to \infty \) for \( x \in \N \). If \( a \ge b \), so that the birth rate equals or exceeds the death rate, then \( M_t(x) \to \infty \) as \( t \to \infty \) for \( x \in \N_+ \).
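The solution can be checked against the differential equation it satisfies. This sketch (the parameter values are illustrative assumptions) compares a central-difference derivative of the formula with \( c + (a - b) M_t(x) \):

```python
import math

# Sketch: finite-difference check that the formula for M_t(x) solves
# dM/dt = c + (a - b) M with M_0(x) = x. Illustrative parameters.
a, b, c, x = 0.5, 1.5, 2.0, 3

def M(t):
    return c / (a - b) * (math.exp((a - b) * t) - 1) + x * math.exp((a - b) * t)

t, h = 1.3, 1e-6
lhs = (M(t + h) - M(t - h)) / (2 * h)      # dM/dt via central difference
rhs = c + (a - b) * M(t)                   # right side of the forward ODE

print(M(0.0), abs(lhs - rhs) < 1e-6)       # 3.0 True
```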

Next we will consider the special case with no births, but only death and immigration. In this case, the invariant distribution is easy to compute, and is one of our favorites.

Suppose that \( a = 0 \) and that \( b, \, c \gt 0 \). Then \( \bs X \) is positive recurrent. The invariant distribution is Poisson with parameter \( c / b \): \[ f(x) = e^{-c/b} \frac{(c / b)^x}{x!}, \quad x \in \N \]

In terms of the general theory above, note that the invariant function \( g \), unique up to multiplication by positive constants, is given by \[ g(x) = \frac{\alpha(0) \cdots \alpha(x - 1)}{\beta(1) \cdots \beta(x)} = \frac{c^x}{b^x x!} = \frac{(c/b)^x}{x!}, \quad x \in \N \] Hence \( B = \sum_{x=0}^\infty g(x) = e^{c/b} \lt \infty \) and therefore the chain is positive recurrent with invariant PDF \[ f(x) = \frac{1}{B} g(x) = e^{-c/b} \frac{(c / b)^x}{x!}, \quad x \in \N \] This is the PDF of the Poisson distribution with parameter \( c / b \).
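The invariance can also be checked numerically through the detailed balance condition \( g(x) \alpha(x) = g(x + 1) \beta(x + 1) \), with \( \alpha(x) = c \) and \( \beta(x) = b x \). A quick sketch in Python (the values of \( b \) and \( c \) passed in are arbitrary illustrative choices):

```python
import math

def g(x, b, c):
    """Invariant function g(x) = (c/b)^x / x! for the death-immigration
    chain with alpha(x) = c and beta(x) = b * x."""
    return (c / b) ** x / math.factorial(x)

def detailed_balance_holds(b, c, xmax=50):
    """Check g(x) * alpha(x) == g(x+1) * beta(x+1) up to rounding error."""
    return all(
        math.isclose(g(x, b, c) * c, g(x + 1, b, c) * b * (x + 1))
        for x in range(xmax)
    )
```

Summing \( g \) numerically also recovers the normalizing constant \( B = e^{c/b} \).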

Consider a population that fluctuates between a minimum value \( m \in \N_+ \) and a maximum value \( n \in \N_+ \), where of course, \( m \lt n \). Given the population size, the individuals act independently and identically. Specifically, if the population is \( x \in \{m, m + 1, \ldots, n\} \) then an individual splits in two at exponential rate \( a (n - x) \) and dies at exponential rate \( b (x - m) \), where \( a, \, b \in (0, \infty) \). Thus, an individual's birth rate decreases linearly with the population size from \( a (n - m) \) to \( 0 \) while the death rate increases linearly with the population size from \( 0 \) to \( b (n - m) \). These assumptions lead to the following definition.

The continuous-time birth-death chain \( \bs X = \{X_t: t \in [0, \infty)\} \) on \( S = \{m, m + 1, \ldots, n\} \) with birth rate function \( \alpha \) and death rate function \( \beta \) given by \[ \alpha(x) = a x (n - x), \; \beta(x) = b x (x - m), \quad x \in S \] is the logistic chain on \( S \) with parameters \( a \) and \( b \).

Since the individuals act independently and identically, the overall birth and death rates when the population is \( x \in S \) are simply \( x \) times the individual birth and death rates.

Note that the logistic chain is a stochastic counterpart of the logistic differential equation, which typically has the form \[ \frac{dx}{dt} = c (x - m)(n - x) \] where \( m, \, n, \, c \in (0, \infty) \) and \( m \lt n \). Starting in \( x(0) \in (m, n) \), the solution remains in \( (m, n) \) for all \( t \in [0, \infty) \). Of course, the logistic differential equation models a system that is continuous in time and space, whereas the logistic Markov chain models a system that is continuous in time and discrete in space.
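For comparison, the logistic differential equation can be solved explicitly by separation of variables (a standard computation). Starting at \( x(0) = x_0 \in (m, n) \), \[ x(t) = \frac{m + n K e^{c (n - m) t}}{1 + K e^{c (n - m) t}}, \quad \text{where } K = \frac{x_0 - m}{n - x_0} \] Since the right side of the differential equation is positive on \( (m, n) \), the solution increases monotonically to \( n \) as \( t \to \infty \).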

For the logistic chain:

- The exponential parameter function \( \lambda \) is given by \[ \lambda(x) = a x (n - x) + b x (x - m), \quad x \in S \]
- The transition matrix \( Q \) of the jump chain is given by \[ Q(x, x - 1) = \frac{b (x - m)}{a (n - x) + b (x - m)}, \, Q(x, x + 1) = \frac{a (n - x)}{a (n - x) + b (x - m)}, \quad x \in S \]

In particular, \( m \) and \( n \) are reflecting boundary points, and so the chain is irreducible.
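These quantities are easy to tabulate directly from the birth and death rate functions. A small Python sketch (the function name and parameter values in the usage note are illustrative):

```python
def logistic_jump_chain(m, n, a, b):
    """Exponential parameters lam(x) and jump-chain transition
    probabilities Q(x, y) for the logistic chain on S = {m, ..., n}."""
    lam, Q = {}, {}
    for x in range(m, n + 1):
        alpha = a * x * (n - x)   # birth rate alpha(x)
        beta = b * x * (x - m)    # death rate beta(x)
        lam[x] = alpha + beta     # lam(x) > 0 since m >= 1 and m < n
        Q[(x, x + 1)] = alpha / lam[x]  # the factor x cancels in the ratio
        Q[(x, x - 1)] = beta / lam[x]
    return lam, Q
```

For example, `logistic_jump_chain(2, 5, 1.0, 1.0)` gives \( Q(3, 2) = 1/3 \), matching the displayed formula \( b(x - m) / [a(n - x) + b(x - m)] \).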

The generator matrix \( G \) for the logistics chain is given as follows, for \( x \in S \):

- \( G(x, x) = - x[a (n - x) + b (x - m)] \)
- \(G(x, x - 1) = b x (x - m)\)
- \( G(x, x + 1) = a x (n - x) \)
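Equivalently, the generator can be assembled as a matrix, with each row summing to zero. A minimal Python sketch (states indexed by \( x - m \)):

```python
def logistic_generator(m, n, a, b):
    """Generator matrix G of the logistic chain on S = {m, ..., n},
    returned as a nested list indexed by (x - m, y - m)."""
    size = n - m + 1
    G = [[0.0] * size for _ in range(size)]
    for x in range(m, n + 1):
        alpha = a * x * (n - x)   # G(x, x + 1)
        beta = b * x * (x - m)    # G(x, x - 1)
        i = x - m
        G[i][i] = -(alpha + beta)
        if x < n:
            G[i][i + 1] = alpha
        if x > m:
            G[i][i - 1] = beta
    return G
```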

Since \( S \) is finite, \( \bs X \) is positive recurrent. The invariant distribution is given next.

Define \( g: S \to (0, \infty) \) by \[ g(x) = \frac{1}{x} \binom{n - m}{x - m} \left(\frac{a}{b}\right)^{x - m}, \quad x \in S \] Then \( g \) is invariant for \( \bs X \).

Since we know that \( \bs X \) is reversible, we just need to show that \( g(x) G(x, y) = g(y) G(y, x) \) for \( (x, y) \in S^2 \). For the logistic chain, the only non-trivial equation is \( g(x) G(x, x + 1) = g(x + 1) G(x + 1, x) \) for \( x \in S \). Simple substitution and algebra show that both sides reduce to \[\frac{(n - m)!}{(x - m)! (n - x - 1)!} \frac{a^{x - m + 1}}{b^{x - m}}\]

Of course it now follows that the invariant probability density function \( f \) for \( \bs X \) is given by \( f(x) = g(x) / c \) for \( x \in S \) where \( c \) is the normalizing constant \[ c = \sum_{x=m}^n \frac{1}{x} \binom{n - m}{x - m} \left(\frac{a}{b}\right)^{x - m} \] The limiting distribution of \( \bs X \) has probability density function \( f \).
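The normalizing constant has no simple closed form in general, but \( f \) is easy to compute numerically, and the detailed balance condition \( f(x) G(x, x + 1) = f(x + 1) G(x + 1, x) \) gives an independent check. A sketch in Python (parameter values in any usage are illustrative):

```python
from math import comb

def logistic_invariant(m, n, a, b):
    """Invariant PDF f(x) = g(x) / c of the logistic chain, with
    g(x) = (1/x) * C(n - m, x - m) * (a/b)^(x - m)."""
    g = {x: comb(n - m, x - m) * (a / b) ** (x - m) / x for x in range(m, n + 1)}
    c = sum(g.values())
    return {x: gx / c for x, gx in g.items()}
```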

There are a number of special birth-death chains that are studied in other sections, because the models are important and lead to special insights and analytic tools. These include

- Queuing chains
- The pure death branching chain
- The Yule process, a pure birth branching chain
- The general birth-death branching chain