In this section and the next, a family of polynomials will play a central role.
Define the family of polynomials \(\{P_n: n \in \N\}\) by the following initial conditions and recursion relation: \begin{align} & P_0(t) = 1, \; P_1(t) = t, \quad t \in \R \\ & P_{n + 1}(t) = t P_n(t) - P_{n - 1}(t), \quad n \in \N_+, \, t \in \R \end{align}
The first few members of the family are \begin{align*} P_0(t) & = 1 \\ P_1(t) & = t \\ P_2(t) & = t^2 - 1 \\ P_3(t) & = t^3 - 2 t \\ P_4(t) & = t^4 - 3 t^2 + 1 \end{align*} This family of polynomials is closely related to a family of Chebyshev polynomials, and with this connection, the important properties follow easily.
\(P_n(t) = U_n(t / 2)\) for \(n \in \N\) and \(t \in \R\) where \(U_n\) is the Chebyshev polynomial of the second kind with degree \(n\).
This follows directly from the defining conditions for \(\{U_n: n \in \N\}\): \begin{align*} & U_0(t) = 1, \; U_1(t) = 2 t, \quad t \in \R \\ & U_{n + 1}(t) = 2 t U_n(t) - U_{n - 1}(t), \quad n \in \N_+, \, t \in \R \end{align*}
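As a sanity check, the relation \(P_n(t) = U_n(t / 2)\) can be verified numerically from the two recursions. The following is a minimal sketch in Python; both sequences are computed directly from their defining recursions above.

```python
# Numerical sketch: check P_n(t) = U_n(t / 2) from the two recursions.

def P(n, t):
    # P_0 = 1, P_1 = t, P_{n+1}(t) = t P_n(t) - P_{n-1}(t)
    p_prev, p_curr = 1.0, t
    if n == 0:
        return p_prev
    for _ in range(n - 1):
        p_prev, p_curr = p_curr, t * p_curr - p_prev
    return p_curr

def U(n, t):
    # U_0 = 1, U_1 = 2t, U_{n+1}(t) = 2t U_n(t) - U_{n-1}(t)
    u_prev, u_curr = 1.0, 2.0 * t
    if n == 0:
        return u_prev
    for _ in range(n - 1):
        u_prev, u_curr = u_curr, 2.0 * t * u_curr - u_prev
    return u_curr

for n in range(10):
    for t in [-1.5, -0.3, 0.0, 0.7, 1.9]:
        assert abs(P(n, t) - U(n, t / 2)) < 1e-9
```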
For \(n \in \N_+\), the roots of \(P_n\) are \[2 \cos\left(\frac{k}{n + 1} \pi\right), \quad k \in \{1, 2, \ldots, n\}\]
Note that \(P_n\) has \(n\) distinct (and hence simple) roots in the interval \((-2, 2)\).
For \(n \in \N\) and \(\theta \in (0, \pi)\), \[P_n(2 \cos \theta) = \frac{\sin[(n + 1) \theta]}{\sin \theta}\]
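Both the root formula and the trigonometric identity are easy to test numerically. The sketch below checks them for \(n = 7\) at a few sample values of \(\theta \in (0, \pi)\).

```python
import math

def P(n, t):
    # P_0 = 1, P_1 = t, P_{n+1}(t) = t P_n(t) - P_{n-1}(t)
    p_prev, p_curr = 1.0, t
    if n == 0:
        return p_prev
    for _ in range(n - 1):
        p_prev, p_curr = p_curr, t * p_curr - p_prev
    return p_curr

n = 7

# Roots: P_n vanishes at 2 cos(k pi / (n + 1)) for k = 1, ..., n
for k in range(1, n + 1):
    assert abs(P(n, 2 * math.cos(k * math.pi / (n + 1)))) < 1e-9

# Identity: P_n(2 cos theta) = sin((n + 1) theta) / sin(theta)
for theta in [0.3, 1.0, 2.5]:
    lhs = P(n, 2 * math.cos(theta))
    rhs = math.sin((n + 1) * theta) / math.sin(theta)
    assert abs(lhs - rhs) < 1e-9
```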
The following definition gives the graph of interest in this section.
Define the relation \(\lfrta\) on \(\N\) by \(x \lfrta x + 1\) for \(x \in \N\) and \(x \lfrta x - 1\) for \(x \in \N_+\). The graph \((\N, \lfrta)\) is the symmetric, undirected simple path \((0, 1, 2, \ldots)\) (with no loops). The graph is periodic with period 2.
Clearly a walk from \(x \in \N\) back to \(x\) must have even length, and there is such a walk of length 2.
The graph \((\N, \lfrta)\) is a basic and interesting discrete structure. Since the graph is discrete, we have our usual reference space \((\N, \ms P(\N), \#)\) in the background.
The \(\sigma\)-algebra associated with \((\N, \lfrta)\) is the reference \(\sigma\)-algebra \(\ms P(\N)\).
By definition, the \(\sigma\)-algebra associated with the graph is \(\ms A = \sigma\{A_x: x \in \N\}\) where \(A_x = \{y \in \N: x \lfrta y\}\) is the neighbor set of \(x \in \N\). Explicitly, \(A_0 = \{1\}\) and \(A_x = \{x - 1, x + 1\}\) for \(x \in \N_+\). Hence \(A_x \cap A_{x + 2} = \{x + 1\} \in \ms A\) for \(x \in \N\). It then follows that \(\N_+ \in \ms A\) and hence \(\{0\} \in \ms A\). So \(\{x\} \in \ms A\) for every \(x \in \N\), and therefore \(\ms A = \ms P(\N)\).
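The neighbor-set computations above can be spot-checked on a finite range of vertices. A small sketch:

```python
# Sketch: neighbor sets of the path graph and the identity
# A_x ∩ A_{x+2} = {x + 1}, checked on a finite range of vertices.

def A(x):
    # neighbor set of x in (N, <->)
    return {x + 1} if x == 0 else {x - 1, x + 1}

for x in range(20):
    assert A(x) & A(x + 2) == {x + 1}
```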
The walk functions for \((\N, \lfrta)\) are given recursively as follows.
For \(x \in \N\), the generating function \(U(x, t)\) of \((\N, \lfrta)\) converges for \(|t| \lt \frac 1 2\) and satisfies
\begin{align*}
U(0, t) &= 1 + t U(1, t) \\
U(x, t) &= 1 + t U(x + 1, t) + t U(x - 1, t), \quad x \in \N_+
\end{align*}
First, note that \(u_n(x) \le 2^n\) for \(n \in \N\) and \(x \in \N\), since each vertex has at most two neighbors; hence the generating function converges for \(|t| \lt \frac 1 2\). From part (b) of the proposition we have \(t^{n + 1} u_n(1) = t^{n + 1} u_{n + 1}(0)\) for \(|t| \lt \frac 1 2\) and \(n \in \N\). Summing over \(n \in \N\) gives
\[t U(1, t) = U(0, t) - u_0(0) = U(0, t) - 1\]
From part (c) of the proposition we have
\[t^{n + 1} u_n(x + 1) = t^{n + 1} u_{n + 1}(x) - t^{n + 1} u_n(x - 1), \quad n \in \N, \, x \in \N_+, \, |t| \lt \frac 1 2\]
Summing over \(n \in \N\) gives
\[t U(x + 1, t) = U(x, t) - u_0(x) - t U(x - 1, t) = U(x, t) - 1 - t U(x - 1, t), \quad x \in \N_+, \, |t| \lt \frac 1 2 \]
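The generating-function recursion can be checked numerically by computing the walk counts \(u_n(x)\) directly (each \(u_{n+1}(x)\) is the sum of \(u_n\) over the neighbors of \(x\)) and truncating the power series. A sketch, with the truncation level chosen so that boundary effects and series tails are negligible:

```python
# Truncated numerical check of U(0,t) = 1 + t U(1,t) and
# U(x,t) = 1 + t U(x+1,t) + t U(x-1,t), where u_n(x) counts
# walks of length n from x in the path graph.

N = 60          # truncation level for the power series
X = 5           # largest vertex checked
M = X + N + 2   # table width: boundary errors never reach x <= X + 1

u = [[1] * (M + 1)]                     # u_0(x) = 1
for n in range(N):
    prev = u[-1]
    row = [prev[1]] + [prev[x - 1] + prev[x + 1] for x in range(1, M)]
    row.append(0)                       # dummy value at the truncation edge
    u.append(row)

def U(x, t):
    return sum(t ** n * u[n][x] for n in range(N + 1))

t = 0.3
assert abs(U(0, t) - (1 + t * U(1, t))) < 1e-6
for x in range(1, X + 1):
    assert abs(U(x, t) - (1 + t * U(x + 1, t) + t * U(x - 1, t))) < 1e-6
```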
For the following result, \(L\) denotes the adjacency kernel of \((\N, \lfrta)\) and \(P_n\) denotes the polynomial of degree \(n \in \N\) defined at the beginning of this section.
For \(\beta \in \R\), the solution of the equation \(L g = \beta g\) is given by \(g(x) = g(0) P_x(\beta)\) for \(x \in \N\).
For \(\beta \in \R\), the equation \(L g = \beta g\) becomes \begin{align*} g(1) & = \beta g(0) \\ g(x + 1) & = \beta g(x) - g(x - 1), \quad x \in \N_+ \end{align*} Hence \(g(x) = g(0) P_x (\beta)\) for \(x \in \N\) by definition of the polynomials.
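The eigenfunction claim is easy to verify numerically: with \(g(0) = 1\), the sequence \(g(x) = P_x(\beta)\) should satisfy \((L g)(0) = g(1) = \beta g(0)\) and \((L g)(x) = g(x - 1) + g(x + 1) = \beta g(x)\) for \(x \in \N_+\). A sketch at the sample value \(\beta = 1.7\):

```python
# Check that g(x) = P_x(beta) solves L g = beta g, where
# (L g)(0) = g(1) and (L g)(x) = g(x - 1) + g(x + 1) for x >= 1.

def P(n, t):
    # P_0 = 1, P_1 = t, P_{n+1}(t) = t P_n(t) - P_{n-1}(t)
    p_prev, p_curr = 1.0, t
    if n == 0:
        return p_prev
    for _ in range(n - 1):
        p_prev, p_curr = p_curr, t * p_curr - p_prev
    return p_curr

beta = 1.7
g = [P(x, beta) for x in range(20)]     # take g(0) = 1
assert abs(g[1] - beta * g[0]) < 1e-9
for x in range(1, 19):
    assert abs(g[x - 1] + g[x + 1] - beta * g[x]) < 1e-9
```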
So on the vector space of all functions from \(\N\) to \(\R\), every \(\beta \in \R\) is an eigenvalue of \((\N, \lfrta)\), and a corresponding eigenfunction is given by \(x \mapsto P_x(\beta)\). But usually we are interested in eigenvalues and eigenfunctions of \((\N, \lfrta)\) for the space \(\ms L_1\), where probability density functions live.
Suppose that \(X\) is a random variable in \(\N\) with density function \(f\). The reliability function \(F\) of \(X\) for \((\N, \lfrta)\) is given by \[F(0) = f(1), \quad F(x) = f(x - 1) + f(x + 1), \; x \in \N_+\]
As usual, we have our basic support assumption that \(F(x) \gt 0\) for all \(x \in \N\). So in particular, \(f(1) \gt 0\), and for every \(x \in \N_+\), either \(f(x - 1) \gt 0\) or \(f(x + 1) \gt 0\). The reliability function does in fact determine the distribution, and we can characterize reliability functions:
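As a concrete illustration, the following sketch computes the reliability function of the geometric distribution, parameterized here (an illustrative choice) as \(f(x) = p(1 - p)^x\) for \(x \in \N\) with \(p = 0.4\), and checks numerically that \(\sum_{x=0}^\infty F(x) = 2 - f(0) \in (1, 2)\), consistent with the characterization below.

```python
# Sketch: reliability function of the geometric density f(x) = p (1-p)^x
# for the path graph, with a truncated check that sum_x F(x) = 2 - f(0).

p = 0.4
f = lambda x: p * (1 - p) ** x          # geometric density on N

def F(x):
    return f(1) if x == 0 else f(x - 1) + f(x + 1)

total = sum(F(x) for x in range(200))   # truncated sum; tail is negligible
assert abs(total - (2 - f(0))) < 1e-9
assert 1 < total < 2
```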
The graph \((\N, \lfrta)\) is stochastic. Suppose that \(f\) is a probability density function on \(\N\) and that \(F\) is the corresponding reliability function for \((\N, \lfrta)\). Then
Conversely, if \(F\) is a positive function on \(\N\) such that \(1 \lt \sum_{x = 0}^\infty F(x) \lt 2\) and such that the right side of (b) is positive for \(x \in \N\) and the right side of (c) is positive for \(x \in \N_+\), then \(F\) is the reliability function for \((\N, \lfrta)\) of a distribution on \(\N\) with probability density function given by (a), (b), and (c).
Conversely, suppose that \(F\) is a positive function on \(\N\) such that \(1 \lt \sum_{x=0}^\infty F(x) \lt 2\) and with the right sides of (b) and (c) positive. Define \(f\) by (a), (b), and (c). Then \(f(x) \ge 0\) for \(x \in \N\) and it's easy to see from the definitions that \(F(0) = f(1)\) and \(F(x) = f(x - 1) + f(x + 1)\) for \(x \in \N_+\). So we have \[\sum_{x = 0}^\infty F(x) = f(0) + 2 \sum_{x = 1}^\infty f(x)\] Therefore \[f(0) + \sum_{x = 0}^\infty F(x) = 2 \sum_{x = 0}^\infty f(x)\] But by definition of \(f(0)\), the left side is \(2\). It follows that \(f\) is a probability density function on \(\N\) and \(F\) is the reliability function of \(f\) for \((\N, \lfrta)\).
Here is our main result:
The graph \((\N, \lfrta)\) does not have a constant rate distribution. However, the geometric distribution on \(\N\) has constant rate on \(\N_+\) and is the only such distribution.
Suppose that \(X\) is a random variable with values in \(\N\) having constant rate \(\alpha\). From our support assumptions, \(\P(X = 0) \gt 0\) and \(\P(X \in \N_+) \gt 0\). It follows that \(\E[u(X)] \in (1, 2)\) where as usual, \(u\) is the walk function of order 1. But \(\E[u(X)] = 1 / \alpha\) from the basic moment result in Section 1.5, so \(\alpha \in \left(\frac 1 2, 1\right)\). Suppose next that \(X\) has probability density function \(f\). By the constant rate property, \(L f = \frac 1 \alpha f\), so that \(f(x) = f(0) P_x(1 / \alpha)\) for \(x \in \N\) by the eigenvalue result above.
The geometric distribution is almost the most random way to put points in the infinite path \((\N, \lfrta)\).
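The constant-rate claim for the geometric distribution can be checked directly. With the density parameterized as \(f(x) = p q^x\) where \(q = 1 - p\) (an illustrative choice with \(p = 0.4\)), the ratio \(f(x) / F(x)\) works out to the constant \(q / (1 + q^2)\) for \(x \in \N_+\), but takes a different value at \(x = 0\); this specific formula is a direct computation, not a statement from the text. A sketch:

```python
# Check: for the geometric density f(x) = p q^x, the rate f(x)/F(x)
# is constant on N_+ but differs at x = 0, so the distribution has
# constant rate on N_+ only.

p = 0.4
q = 1 - p
f = lambda x: p * q ** x
F = lambda x: f(1) if x == 0 else f(x - 1) + f(x + 1)

alpha = q / (1 + q * q)                 # computed constant rate on N_+
for x in range(1, 50):
    assert abs(f(x) / F(x) - alpha) < 1e-12
assert abs(f(0) / F(0) - alpha) > 1e-3  # the rate differs at 0
```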
Returning to the general case, we will consider random walk on \((\N, \lfrta)\) next.
Suppose that \(X\) is a random variable with probability density function \(f\) on \(\N\). Let \(\bs{Y} = (Y_1, Y_2, \ldots)\) denote the random walk on \((\N, \lfrta)\) associated with \(X\). Then \(Y_1\) has density function \(f\) and \(\bs Y\) has transition kernel \(P\) given by \(P(0, 1) = 1\) and \begin{align*} P(x, x - 1) & = \frac{f(x - 1)}{f(x -1) + f(x + 1)}, \quad x \in \N_+ \\ P(x, x + 1) & = \frac{f(x + 1)}{f(x - 1) + f(x + 1)}, \quad x \in \N_+ \end{align*}
This follows from the definition of the random walk: \(P(x, y) = f(y) / F(x)\) for \(x \lfrta y\) where \(F\) is the reliability function of \(X\) for \((\N, \lfrta)\).
Hence \(\bs Y\) is a birth-death chain on \(\N\) with reflecting boundary point 0.
The random walk \(\bs{Y}\) is periodic with period 2, and is positive recurrent and reversible. The invariant probability density function \(g\) is given by \(g(0) = f(0) f(1) / c\) and \(g(x) = f(x)[f(x - 1) + f(x + 1)] / c\) for \(x \in \N_+\) where \[c = 2 \sum_{x = 0}^\infty f(x) f(x + 1)\]
The random walk is periodic with period 2 since the underlying graph has this property. The other results follow from general results in Section 1.4. Recall that an invariant function is \( \varphi = f F\) where again, \(F\) is the reliability function of \(X\) for \((\N, \lfrta)\). The normalizing constant is \[c = \sum_{x = 0}^\infty \varphi(x) = 2 \sum_{x = 0}^\infty f(x) f(x + 1)\]
Of course, the general theory of birth-death chains would lead to the same conclusions.
Suppose that \(X\) has the geometric distribution on \(\N\) with success parameter \(p\) and that \(\bs Y\) is the random walk on \((\N, \lfrta)\) associated with \(X\).
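In this setting, the transition probabilities of the walk can be computed from the kernel formulas above. With the density parameterized as \(f(x) = p q^x\), \(q = 1 - p\), the ratios simplify to \(P(x, x - 1) = 1 / (1 + q^2)\) and \(P(x, x + 1) = q^2 / (1 + q^2)\), constant in \(x \in \N_+\); this simplification is a direct computation offered here as a sketch, with \(p = 0.4\) as a sample value.

```python
# Sketch: for the geometric density f(x) = p q^x, the random-walk
# transition probabilities P(x, x-1) and P(x, x+1) are constant in x.

p = 0.4
q = 1 - p
f = lambda x: p * q ** x

down = lambda x: f(x - 1) / (f(x - 1) + f(x + 1))   # P(x, x - 1)
up = lambda x: f(x + 1) / (f(x - 1) + f(x + 1))     # P(x, x + 1)

for x in range(1, 50):
    assert abs(down(x) - 1 / (1 + q * q)) < 1e-12
    assert abs(up(x) - q * q / (1 + q * q)) < 1e-12
```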