This section has the most important definitions and results of the chapter.
Our starting point is a \(\sigma\)-finite measure space \((S, \ms S, \lambda)\). For \(n \in \N_+\) the corresponding product space is \((S^n, \ms S^n, \lambda^n)\). Suppose now that \((S, \rta)\) is a measurable graph. If \(X\) is a random variable in \(S\), recall the definitions of the reliability function \(F\) and the cumulative rate function \(R_n\) of order \(n \in \N_+\) of \(X\) for \((S, \rta)\) given in Section 3. Recall also the path function \(\gamma_n\) of order \(n \in \N_+\) for \((S, \rta)\) defined in Section 1. Here are the main definitions, stated so that we avoid division by 0 problems.
Relative to the graph \((S, \rta)\),
Here is our first trivial result:
If \(X\) has constant rate \(\alpha \in (0, \infty)\) then \(X\) has constant average rate \(\alpha^n\) of order \(n\) for each \(n \in \N_+\).
If \(r(x) = \alpha\) for \(x \in S\) then from the definition, the cumulative rate function \(R_n\) of order \(n \in \N_+\) is given by \(R_n(x) = \alpha^n \gamma_n(x)\) for \(x \in S\).
So as the name suggests, the rate function \(r = f / F\) is the constant \(\alpha\). Recall that the reference measure \(\lambda\) is fixed in advance, and is usually a natural measure in some sense for the measurable space \((S, \ms S)\). For example, counting measure is always used for discrete graphs and Lebesgue measure for graphs on measurable subsets of \(\R^n\), or more generally, Hausdorff measure on Euclidean manifolds. Trivially, if \(X\) has constant rate \(\alpha \in (0, \infty)\) with respect to \(\lambda\) and if \(c \in (0, \infty)\) then \(X\) has constant rate \(\alpha / c\) with respect to the measure \(c \lambda\). But more importantly, if we can choose the measure a posteriori, then every distribution has constant rate.
Suppose that \(X\) is supported by \((S, \rta)\) and let \(\mu\) be the measure on \((S, \ms S)\) defined by \[\mu(A) = \E[1 / F(X), X \in A], \quad A \in \ms S\] Then \(X\) has constant rate \(1\) for \((S, \rta)\) with respect to \(\mu\).
Note first that since \(F\) is positive, \(\mu\) is a \(\sigma\)-finite positive measure. By definition, \(\mu\) has density function \(1 / F\) with respect to the distribution of \(X\), and hence the distribution of \(X\) has density function \(F\) with respect to \(\mu\). That is, \(\P(X \in A) = \int_A F(x) d \mu(x)\) for \(A \in \ms S\).
For the remainder of this section, we assume again that \(\lambda\) is a fixed reference measure. Suppose that \(X\) has density function \(f\) with respect to \(\lambda\), and let \(\mu\) denote the measure in Proposition and \(P\) the distribution of \(X\). In the notation of Radon-Nikodym derivatives, we have \(d \mu = (1 / F) \, d P\) and \(d P = f \, d \lambda\). It follows that \(d \mu = (f / F) \, d \lambda = r \, d \lambda\) where \(r\) is the rate function of \(X\) for \((S, \rta)\) with respect to \(\lambda\). So the theorem is hardly surprising: \(X\) has constant rate 1 with respect to the measure \(\mu\) whose density is the rate function of \(X\) with respect to \(\lambda\).
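The change of measure is easy to illustrate numerically. The following Python sketch uses a hypothetical three-point chain \(\{0, 1, 2\}\) with the order \(\le\) and counting measure (not an example from the text): with \(\mu\{x\} = f(x)/F(x)\), the density of \(X\) with respect to \(\mu\) is \(F\), so the rate relative to \(\mu\) is identically 1.

```python
# A finite sketch of the proposition above (hypothetical example): on the
# ordered set {0, 1, 2} with counting measure, mu{x} = f(x)/F(x), and the
# density of X with respect to mu is F, so the rate of X for the graph,
# relative to mu, is identically 1.

f = [0.5, 0.3, 0.2]                              # any supported density
F = [sum(f[y] for y in range(x, 3)) for x in range(3)]   # F(x) = P(x <= X)
mu = [f[x] / F[x] for x in range(3)]             # mu{x} = E[1/F(X); X = x]
dens_mu = [f[x] / mu[x] for x in range(3)]       # density of X w.r.t. mu
rate_mu = [dens_mu[x] / F[x] for x in range(3)]  # rate of X w.r.t. mu
assert all(abs(r - 1) < 1e-12 for r in rate_mu)
```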
Let \(L\) denote the adjacency kernel of the graph \((S, \rta)\), considered as an operator on the space \(\ms L_1\).
Recall that if \(f\) is a probability density function then the corresponding reliability function for \((S, \rta)\) is \(F = L f\).
In the finite case, we know the answer to the existence question.
Suppose that \((S, \rta)\) is a finite, strongly connected graph. Then there exists a unique constant rate distribution. The rate constant is the reciprocal of the largest eigenvalue and the probability density function is the normalized positive eigenfunction.
This is just the Perron-Frobenius theorem in disguise. The adjacency matrix \(L\) has a unique largest eigenvalue, which is positive and has multiplicity 1, and a corresponding eigenfunction that is strictly positive.
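The finite case lends itself to computation. The following Python sketch (using a hypothetical strongly connected graph on three vertices, not one from the text) finds the largest eigenvalue of the adjacency matrix by power iteration and checks the constant rate property \(f = \alpha F\) for the normalized eigenfunction.

```python
# A minimal numerical sketch of the theorem: for a small strongly connected
# graph, the rate constant is the reciprocal of the top eigenvalue of L and
# the density is the normalized positive eigenfunction. The 3-vertex graph
# below (edges 0->0, 0->1, 1->2, 2->0) is a hypothetical example.

def perron(L, iters=2000):
    """Power iteration: top eigenvalue and positive eigenvector of L."""
    n = len(L)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [sum(L[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w)
        v = [x / lam for x in w]
    return lam, v

L = [[1, 1, 0],
     [0, 0, 1],
     [1, 0, 0]]
lam, v = perron(L)
alpha = 1 / lam                      # rate constant
f = [x / sum(v) for x in v]          # normalized positive eigenfunction

# Check the constant rate property f = alpha * F, where F(x) = sum_{x->y} f(y).
F = [sum(L[i][j] * f[j] for j in range(3)) for i in range(3)]
assert all(abs(f[i] - alpha * F[i]) < 1e-9 for i in range(3))
```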
Suppose that \((S, \rta)\) is a discrete graph with the property that \(x \rta x\) for some \(x \in S\). If there exists a distribution with constant rate \(\alpha\) then \(\alpha \in (0, 1]\).
Suppose that \(x \rta x\) and that \(X\) has density function \(f\) and has reliability function \(F\) for \((S, \rta)\). If \(X\) has constant rate \(\alpha \in (0, \infty)\) then \[f(x) = \alpha F(x) = \alpha \left[f(x) + \sum_{x \rta y, \, y \ne x} f(y) \right]\] By the support assumption, \(F(x) \gt 0\) and hence \(f(x) \gt 0\). Rearranging the displayed equation we have \[(1 - \alpha) f(x) = \alpha \sum_{x \rta y, \, y \ne x} f(y)\] If \(\alpha \gt 1\) then the left side is negative while the right side is nonnegative, a contradiction. Hence \(\alpha \in (0, 1]\).
In particular, this result applies to any discrete reflexive graph, and in particular, to any discrete partial order graph. The following result shows that mixtures of distributions with the same constant rate also have the common constant rate. From the point of view of linear algebra, this corresponds to an eigenvalue with multiplicity greater than one.
Suppose that \(F\) and \(G\) are reliability functions for distributions supported by \((S, \rta)\), each with constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\). For \(p \in (0, 1)\), the function \(H = p F + (1 - p)G\) is also the reliability function for a distribution with constant rate \(\alpha\) for \((S, \rta)\)
Recall that if \(f\) and \(g\) are probability density functions with corresponding reliability functions \(F\) and \(G\) for \((S, \rta)\), respectively, and if \(p \in (0, 1)\), then the density function \(h = p f + (1 - p) g\) has reliability function \(H = p F + (1 - p) G\). In this case, \(f = \alpha F\) and \(g = \alpha G\) so \[h = p f + (1 - p) g = p \alpha F + (1 - p) \alpha G = \alpha [p F + (1 - p) G] = \alpha H\]
The following result gives a simple condition for the uniform distribution to have constant rate. Recall that the graph \((S, \rta)\) is right regular of degree \(k \in (0, \infty)\) if \(\lambda\{y \in S: x \rta y\} = k\) for all \(x \in S\).
Suppose \((S, \rta)\) is right regular of degree \(k \in (0, \infty)\). If \(\lambda(S) \lt \infty\) then the uniform distribution on \(S\) has constant rate \(1 / k\).
Suppose that \(\lambda(S) \lt \infty\) and let \(f(x) = 1 / \lambda(S)\) for \(x \in S\) so that \(f\) is the density of the uniform distribution on \(S\). Then the reliability function is \[F(x) = \int_{x \rta y} \frac{1}{\lambda(S)} \, d\lambda(y) = \frac{\lambda\{y \in S: x \rta y\}}{\lambda(S)} = \frac{k}{\lambda(S)}\] and so \(f = \frac{1}{k} F\).
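A quick numerical check of the proposition, on a hypothetical right regular graph (not one from the text): each vertex of the integers mod 5 points to its next two neighbors cyclically, so with counting measure the graph is right regular of degree \(k = 2\), and the uniform density should have constant rate \(1/2\).

```python
# Hypothetical example: each vertex of Z_5 points to its next two neighbors,
# so the graph is right regular of degree k = 2 under counting measure.
# The uniform distribution should then have constant rate 1/k.

n, k = 5, 2
f = [1 / n] * n                                  # uniform density
F = [sum(f[(i + d) % n] for d in range(1, k + 1)) for i in range(n)]
rate = [f[i] / F[i] for i in range(n)]
assert all(abs(r - 1 / k) < 1e-12 for r in rate)
```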
Combining with we see that if \((S, \rta)\) is a finite, strongly connected, right regular graph, then the uniform distribution on \(S\) is the unique constant rate distribution. More generally, if \((S, \rta)\) is right regular of degree \(k \in (0, \infty)\) and we know that the eigenspace of the eigenvalue \(1\) has dimension 1, then \((S, \rta)\) has a unique constant rate distribution (the uniform distribution) if \(\lambda(S) \lt \infty\) and has no constant rate distribution if \(\lambda(S) = \infty\). However, it's trivial to see that not every finite graph has a constant rate distribution.
Let \((S, \rta)\) be a finite graph with vertices \(x, \, y \in S\) satisfying the following properties:
Then \((S, \rta)\) does not support a constant rate distribution.
Note that there is a loop at \(x\) and this is the only edge from \(x\). There is a loop at \(y\) and at least one other edge from \(y\) (to \(w\)). Suppose \(f\) is a probability density function on \(S\) with constant rate, and let \(F\) denote the reliability function of \(f\) for \((S, \rta)\). Then \(F(x) = f(x)\) so the rate constant must be 1. But then \(f(y) = F(y)\) and also \(F(y) \ge f(y) + f(w)\). So we have \(f(w) = F(w) = 0\) and this contradicts the support assumption that \(F(t) \gt 0\) for all \(t \in S\).
For the following proposition, recall that if \((S, \preceq)\) is a discrete partial order graph with covering relation \(\upa\), then \(\upa^n\) denotes the \(n\)-fold composition power of \(\upa\) for \(n \in \N\), where by convention, \(\upa^0\) is the equality relation \(=\).
Suppose that \((S, \preceq)\) is a discrete, locally finite, uniform partial order graph with the property that if \(x \upa^n y\) then \(\#\{u \in S: x \upa u, \, u \upa^{n-1} y\} = a_n \in \N_+\) for \(n \in \N_+\), independent of \(x, \, y \in S\). If \(X\) has constant rate \(\beta \in (0, \infty)\) for \((S, \upa)\) then
Let \(f\) denote the density function of \(X\). Let \(F\) denote the reliability function of \(X\) for \((S, \preceq)\) and for \(n \in \N\), let \(G_n\) denote the reliability function of \(X\) for \((S, \upa^n)\).
Note that the series in is finite if \(\beta \gt 1\).
Suppose that \((S, \rta)\) is a transitive graph, and for \(x \in S\) let \(A_x = \{y \in S: x \rta y\}\) (the right neighbor set of \(x\)). If \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\) then the conditional distribution of \(X\) given \(x \rta X\) has constant rate \(\alpha\) for \((A_x, \rta)\) for each \(x \in S\).
This is an immediate consequence of the result in Section 3 since the rate function of the conditional distribution for \((A_x, \rta)\) is the restriction of the rate function of \(X\) to \(A_x\).
In particular, this proposition holds for a partial order graph \((S, \preceq)\).
The basic moment result in Section 3 simplifies significantly when \(X\) has constant rate. Once again, \(L\) denotes the adjacency kernel of \((S, \rta)\).
Suppose that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\) and that \(g: S \to [0, \infty)\) is measurable. Then \[\E[(g L^n)(X)] = \frac{1}{\alpha^n} \E[g(X)], \quad n \in \N\]
By assumption, \(f = \alpha F\) is a density of \(X\), so \[\int_S (g L^n)(x) F(x) \, d\lambda(x) = \frac{1}{\alpha} \int_S (g L^n)(x) f(x) \, d\lambda(x) = \frac{1}{\alpha} \E[(g L^n)(X)]\] Hence from the basic moment result \[\frac{1}{\alpha} \E[(g L^n)(X)] = \E[(g L^{n+1})(X)]\] The result then follows since \(L^0 = I\), the identity kernel.
The result also holds if \(g: S \to \R\) is measurable and \(\E[|g(X)|] \lt \infty\). Recall that the path function \(\gamma_n\) of order \(n \in \N\) is given by \[\gamma_n(x) = \bs 1 L^n(x) = \lambda^n\{(x_1, x_2, \ldots, x_n) \in S^n: x_1 \rta x_2 \rta \cdots \rta x_n \rta x\}, \quad x \in S\] the measure of the initial parts of the paths of length \(n\) that terminate in \(x\).
If \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\) then \[\E[\gamma_n(X)] = \frac{1}{\alpha^n}, \quad n \in \N\]
From note that \[\int_S \alpha^n \gamma_n(x) f(x) \, d\lambda(x) = 1\] so it follows that the function \(f_n\) given by \(f_n(x) = \alpha^n \gamma_n(x) f(x)\) for \(x \in S\) is a probability density function. We will return to this point shortly.
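The moment identity is easy to verify numerically in a concrete case. The sketch below uses the standard discrete graph \((\N, \le)\), where \(\gamma_n(x) = \binom{x + n}{n}\), together with the geometric density \(f(x) = a(1 - a)^x\), which has constant rate \(a\) for this graph; the truncated series for \(\E[\gamma_n(X)]\) should be close to \(a^{-n}\).

```python
# Numerical check on the standard discrete graph (N, <=), where
# gamma_n(x) = binom(x + n, n). The geometric density f(x) = a(1-a)^x has
# constant rate a for this graph, so E[gamma_n(X)] should equal a^(-n).
from math import comb

a = 0.5
for n in range(5):
    # truncate the series; the tail beyond x = 400 is negligible for a = 1/2
    m = sum(comb(x + n, n) * a * (1 - a) ** x for x in range(400))
    assert abs(m - a ** (-n)) < 1e-9
```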
Suppose that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\). Then
Recall that the graph \((S, \rta)\) supports the distribution of \(X\).
Part (a) means that if a graph has a constant rate distribution, then necessarily the graph is left finite. The assumption in part (b) means that \((S, \rta)\) is left regular of degree \(c\). The generating function result also simplifies when the distribution has constant rate.
Suppose that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\). Then the generating function \(\Lambda\) of \(X\) is given by \[\Lambda(t) = \frac{\alpha}{\alpha - t}, \quad |t| \lt \alpha\]
Recall that \(\Lambda(t) = \E[\Gamma(X, t)]\) where \(\Gamma(x, t) = \sum_{n=0}^\infty \gamma_n(x) t^n\) is the generating function of \((S, \rta)\). Hence using and Fubini's theorem we have \[\Lambda(t) = \E[\Gamma(X, t)] = \sum_{n=0}^\infty \E\left[\gamma_n(X)\right] t^n = \sum_{n=0}^\infty (t / \alpha)^n = \frac{1}{1 - t / \alpha}, \quad |t| \lt \alpha\]
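The corollary can also be checked numerically in a concrete setting. The sketch below again uses the standard discrete graph \((\N, \le)\), where the path generating function is \(\Gamma(x, t) = 1/(1 - t)^{x+1}\); with the geometric density \(f(x) = a(1 - a)^x\) of constant rate \(a\), the truncated series for \(\Lambda(t)\) should be close to \(a/(a - t)\) for \(|t| \lt a\).

```python
# Numerical check on the standard discrete graph (N, <=), where
# Gamma(x, t) = 1/(1-t)^(x+1). With the geometric density f(x) = a(1-a)^x
# (constant rate a), the generating function Lambda(t) = E[Gamma(X, t)]
# should reduce to a/(a - t) for |t| < a.
a = 0.5
for t in [-0.4, -0.1, 0.0, 0.2, 0.4]:
    Lam = sum(a * (1 - a) ** x / (1 - t) ** (x + 1) for x in range(400))
    assert abs(Lam - a / (a - t)) < 1e-9
```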
Constant rate distributions maximize entropy among distributions satisfying a certain moment condition. This is an indication that a constant rate distribution governs the most random way to put points in a graph, but we will see a stronger expression of that idea shortly.
Suppose that \(X\) has constant rate and reliability function \(F\) for the graph \((S, \rta)\). Then \(X\) maximizes entropy among all random variables \(Y\) on \(S\) satisfying \[\E\left[\ln F(Y) \right] = \E\left[\ln F(X)\right]\]
Suppose that \(X\) has constant rate \(\alpha\), so that \(f = \alpha F\) is a density of \(X\). Suppose that \(Y\) is a random variable taking values in \(S\), with density function \(g\). From the general entropy inequality in Section 3 in we have \begin{align*} H(Y) &= -\int_S g(x) \ln[g(x)] d \lambda(x) \le -\int_S g(x) \ln[f(x)] d \lambda(x) \\ &= -\int_S g(x) \left(\ln \alpha + \ln[F(x)]\right) d \lambda(x) \\ & = -\ln \alpha - \int_S g(x) \ln[F(x)] d \lambda(x) = -\ln \alpha - \E\left(\ln[F(Y)]\right) \\ & = -\ln \alpha - \E\left(\ln[F(X)]\right) \end{align*} Of course, the entropy of \(X\) achieves the upper bound.
Note that since the reliability function \(F\) typically has an exponential
form of some sort, \(\E(\ln[F(Y)])\) often reduces to a natural moment condition.
Results related to the random walk also simplify significantly when the underlying distribution has constant rate. For the next two results, suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on \((S, \rta)\) associated with random variable \(X\), where \(X\) has constant rate \(\alpha \in (0, \infty)\) and reliability function \(F\) (and hence density \(f = \alpha F\)). Recall also that \(\gamma_n\) denotes the path function of order \(n \in \N\).
The transition density \(P^n\) of order \(n \in \N_+\) for \(\bs Y\) is given by \[P^n(x, y) = \alpha^n \frac{F(y)}{F(x)} L^n(x, y), \quad (x, y) \in S^2\]
Recall from Section 4 that \[P^n(x, y) = \frac{f(y)}{F(x)} R^{(n)}(x, y), \quad (x, y) \in S^2\] where \[R^{(n)}(x, y) = \int_{x \rta x_1 \rta \cdots \rta x_{n-1} \rta y} r(x_1) \cdots r(x_{n-1}) \, d\lambda^{n-1}(x_1, \ldots, x_{n-1}), \quad (x, y) \in S^2\] Since \(X\) has constant rate \(\alpha\), \(f(x) = \alpha F(x)\) and \(r(x) = \alpha\) for \(x \in S\). Hence \[R^{(n)}(x, y) = \alpha^{n-1} \lambda^{n-1}\{(x_1, \ldots, x_{n-1}): x \rta x_1 \rta \cdots \rta x_{n-1} \rta y\} = \alpha^{n-1} L^n(x, y)\]
Let \(n \in \N_+\).
The results follow easily from the constant rate property.
For \(n \in \N_+\), the distribution of \(Y_n\) depends only on the rate parameter \(\alpha\), the path function \(\gamma_{n - 1}\) and the reliability function \(F\). Although a simple corollary, part (c) is one of our fundamental results, so we will restate it for emphasis:
The random walk \(\bs Y\) associated with a constant rate distribution (if such a distribution exists) is the most random way to put points in the graph \((S, \rta)\).
In the special case of a finite, strongly connected graph \((S, \rta)\), it follows from that the normalized eigenfunction corresponding to the largest eigenvalue of the adjacency matrix \(L\) defines a distribution whose random walk is the most random way to put points in the graph. Here is another simple result that follows from our general theory of random walks.
Suppose that \((S, \lfrta)\) is a symmetric graph and that \(X\) has constant rate \(\alpha \in (0, \infty)\) for \((S, \lfrta)\), with reliability function \(F\). Let \(\bs Y = (Y_1, Y_2, \ldots)\) denote the random walk on \((S, \lfrta)\) associated with \(X\). Then
From the general theory in Section 4, \(f F\) is invariant for \(\bs Y\), where \(f\) is the density function of \(X\). Since \(X\) has constant rate \(\alpha\) it follows that \(f = \alpha F\) and hence \(\alpha F^2\) is invariant for \(\bs Y\). But then \(F^2\) is also invariant for \(\bs Y\). The normalizing constant is \[c := \int_S F^2(x) \, d\lambda(x) = \frac{1}{\alpha} \E[F(X)]\] Note that \(c \le 1 / \alpha \lt \infty\) since \(F \le 1\), so \(h = F^2 / c\) is an invariant probability density function for \(\bs Y\).
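Here is a numerical sketch of the result on a small symmetric graph (a hypothetical example, not from the text: a triangle \(\{0, 1, 2\}\) with a pendant vertex 3 attached to 0). We find the constant rate distribution by power iteration, form the transition matrix \(P(x, y) = \alpha F(y) L(x, y) / F(x)\) of the random walk, and check that \(F^2\), normalized, is invariant.

```python
# Hypothetical symmetric graph: a triangle {0,1,2} with pendant vertex 3
# attached to 0. The constant rate distribution comes from the Perron
# eigenpair of the (symmetric) adjacency matrix; the normalized F^2 should
# then be invariant for the random walk.

L = [[0, 1, 1, 1],
     [1, 0, 1, 0],
     [1, 1, 0, 0],
     [1, 0, 0, 0]]
n = len(L)

v = [1.0] * n
for _ in range(5000):                    # power iteration for the top eigenpair
    w = [sum(L[i][j] * v[j] for j in range(n)) for i in range(n)]
    lam = max(w)
    v = [x / lam for x in w]
alpha = 1 / lam
f = [x / sum(v) for x in v]              # constant rate density
F = [sum(L[i][j] * f[j] for j in range(n)) for i in range(n)]

P = [[alpha * F[j] * L[i][j] / F[i] for j in range(n)] for i in range(n)]
c = sum(F[i] ** 2 for i in range(n))
h = [F[i] ** 2 / c for i in range(n)]    # candidate invariant density
hP = [sum(h[i] * P[i][j] for i in range(n)) for j in range(n)]
assert all(abs(hP[j] - h[j]) < 1e-9 for j in range(n))
```

Here \(\lambda\) is counting measure, so the normalizing constant is the sum of \(F^2\) over the vertices.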
The next result refers to the point process \(\bs N = \{N_A: A \in \ms S\}\) corresponding to the random walk \(\bs Y\). The expected number of points in a set has a simple expression in terms of the generating function when the underlying distribution has constant rate.
Suppose that \((S, \rta)\) has generating function \(\Gamma\) and that \(X\) has right constant rate \(\alpha \in (0, \infty)\) for \((S, \rta)\). Let \(\bs Y\) denote the random walk on \((S, \rta)\) associated with \(X\). For \(A \in \ms S\), \[\E(N_A) = \E[\Gamma(X, \alpha); X \in A]\]
Recall that \(\E(N_A) = \E[V(X); X \in A]\) where \(V(x) = \sum_{n = 0}^\infty R_n(x)\) for \(x \in S\). Since \(X\) has constant rate \(\alpha\) we have \[V(x) = \sum_{n=0}^\infty R_n(x) = \sum_{n = 0}^\infty \alpha^n \gamma_n(x) = \Gamma(x, \alpha), \quad x \in S\]
Our next topic is thinning the point process. As above, suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on \((S, \rta)\) associated with \(X\), where \(X\) has constant rate \(\alpha \in (0, \infty)\) and reliability function \(F\) (and hence density \(f = \alpha F\)). Let \(N\) have the geometric distribution on \(\N_+\) with success parameter \(p \in (0, 1)\) so that \[\P(N = n) = p(1 - p)^{n - 1}, \quad n \in \N_+\] As we will see in Chapter 4, \(N\) has constant rate \(p\) for the graph \((\N_+, \le)\). Moreover, we assume that \(N\) and \(\bs Y\) are independent. The basic idea is that we accept a point with probability \(p\) and reject the point with probability \(1 - p\), so that \(Y_N\) is the first point accepted. We are interested in the distribution of \(Y_N\). As before, let \(\Gamma\) denote the path generating function of \((S, \rta)\).
The density function \(g\) of \(Y_N\) is given by \[g(x) = p \Gamma[x, \alpha (1 - p)] f(x), \quad x \in S\]
As usual, let \(\gamma_n\) denote the path function of \((S, \rta)\) of order \(n \in \N\). Then \begin{align*} g(x) &= \E[f_N(x)] = \sum_{n = 1}^\infty p (1 - p)^{n - 1} f_n(x) = \sum_{n = 1}^\infty p (1 - p)^{n - 1} \alpha^n \gamma_{n - 1}(x) F(x) \\ &= \alpha p F(x) \sum_{n = 1}^\infty [\alpha (1 - p)]^{n - 1} \gamma_{n - 1}(x) = \alpha p F(x) \Gamma[x, \alpha (1 - p)] \\ &= p \Gamma[x, \alpha (1 - p)] f(x), \quad x \in S \end{align*}
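As a sanity check, on the standard continuous graph \(([0, \infty), \le)\) the density of \(Y_n\) is the gamma density with shape \(n\) and rate \(\alpha\), and since \(\Gamma(x, t) = e^{xt}\) there, the theorem gives \(g(x) = p e^{\alpha(1-p)x} \alpha e^{-\alpha x} = \alpha p e^{-\alpha p x}\): thinning the Poisson arrival times yields an exponential with rate \(\alpha p\). The sketch below sums the series numerically.

```python
# Check of the thinning result on the standard continuous graph
# ([0, inf), <=): the series p * sum_n (1-p)^(n-1) f_n(x), with f_n the
# gamma density of shape n and rate alpha, should collapse to
# alpha * p * exp(-alpha * p * x).
from math import exp

alpha, p = 2.0, 0.3
for x in [0.1, 0.5, 1.0, 2.0]:
    term = p * alpha * exp(-alpha * x)    # n = 1 term of the series
    series = 0.0
    for n in range(1, 200):
        series += term
        term *= (1 - p) * alpha * x / n   # ratio of term n+1 to term n
    assert abs(series - alpha * p * exp(-alpha * p * x)) < 1e-9
```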
In general, \(Y_N\) does not have constant rate for \((S, \rta)\).
For the standard continuous graph \(([0, \infty), \le)\), recall that the path function \(\gamma_n\) of order \(n \in \N\) is given by \(\gamma_n(x) = x^n / n!\) for \(x \in [0, \infty)\) and the path generating function \(\Gamma\) is given by \(\Gamma(x, t) = e^{x t}\) for \(x \in [0, \infty)\) and \(t \in \R\).
Of course, all of this is well known. The random walk \(\bs Y\) is the sequence of arrival times of the Poisson process with rate \(\alpha\). The standard continuous graph is studied in detail in Chapter 3.
For the standard discrete graph \((\N, \le)\) recall that the path function \(\gamma_n\) of order \(n \in \N\) is given by \(\gamma_n(x) = \binom{x + n}{n}\) for \(x \in \N\) and the path generating function \(\Gamma\) is given by \(\Gamma(x, t) = 1 / (1 - t)^{x+1}\) for \(x \in \N\) and \(t \in (-1, 1)\).
These results are also well known. For \(n \in \N_+\), random variable \(Y_n\) is the number of failures before the \(n\)th success in a Bernoulli trials sequence with success parameter \(\alpha\), so \(\bs Y\) can be thought of as the sequence of arrival times in a renewal process. The standard discrete graph is studied in detail in Chapter 4.
Return to the general setting of a \(\sigma\)-finite measure space \((S, \ms S, \lambda)\).
Let \((S, \equiv)\) denote the complete reflexive graph, so that \(x \equiv y\) for every \(x, \, y \in S\).
Recall that \((S, \ne)\) is the complete irreflexive graph. Suppose that \(\lambda\{x\} = c \in [0, \infty)\) for all \(x \in S\).
Suppose that \(f\) is a probability density function on \(S\). The corresponding reliability function \(F\) is given by \[F(x) = \int_{S - \{x\}} f(y) \, d\lambda(y) = 1 - \int_{\{x\}} f(y) \, d\lambda(y) = 1 - \lambda\{x\} f(x) = 1 - c f(x) \] So the condition for the distribution defined by \(f\) to have constant rate \(\alpha \in (0, \infty)\) is \(f(x) = \alpha [1 - c f(x)]\) for \(x \in S\) or equivalently, \(f(x) = \alpha / (1 + \alpha c)\). If \(\lambda(S) \lt \infty\) then \(\alpha = 1 / [\lambda(S) - c]\) and \(f\) is the density of the uniform distribution on \(S\). If \(\lambda(S) = \infty\) then no choice of \(\alpha \in (0, \infty)\) makes \(f\) a valid density function. Note also that \((S, \ne)\) is regular of degree \(\lambda(S) - c\), so applies.
In the context of , if \(S\) is uncountable and the reference measure \(\lambda\) is continuous, then \(c = 0\) and the rate constant in part (a) is \(1 / \lambda(S)\), just as for the complete reflexive graph in . If \(S\) is countable with counting measure \(\lambda = \#\), then \(c = 1\). If \(\#(S) = n \in \{2, 3, \ldots\}\), then the rate constant of the uniform distribution in part (a) is \(1 / (n - 1)\).
Suppose that \(S\) is countable, with counting measure \(\#\) as the reference measure. For the equality graph \((S, =)\), every distribution has constant rate 1.
If \(f\) is a probability density function on \(S\), then trivially, the corresponding reliability function for \((S, =)\) is \(F = f\).
Recall the diamond graph \((S, \lfrta)\) defined in Section 1.
Open the simulation of the constant rate distribution for the diamond graph. Note the shape of the density function. Run the simulation 1000 times and compare the empirical density function with the probability density function.
Suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on the diamond graph \((S, \lfrta)\) corresponding to the constant rate distribution.
Open the simulation of the random walk on the diamond graph. Run the simulation 1000 times and compare the empirical density function with the limiting probability density function.
Recall the directed version of the diamond graph \((S, \rta)\) described in Section 1.
Open the simulation of the constant rate distribution on the directed diamond graph. Note the shape of the density function. Run the simulation 1000 times and compare the empirical density function with the probability density function.
Suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on the directed diamond graph \((S, \rta)\) corresponding to the constant rate distribution.
Open the simulation of the random walk on the directed diamond graph. Run the simulation 1000 times and compare the empirical density function with the limiting probability density function.
Recall the bull graph \((S, \lfrta)\) defined in Section 1.
Open the simulation of the constant rate distribution on the bull graph. Note the shape of the density function. Run the simulation 1000 times and compare the empirical density function with the probability density function.
Suppose that \(\bs Y = (Y_1, Y_2, \ldots)\) is the random walk on the bull graph \((S, \lfrta)\) corresponding to the constant rate distribution.
Open the simulation of the random walk on the bull graph. Run the simulation 1000 times and compare the empirical density function with the limiting probability density function.
For \(k \in \{3, 4, \ldots\}\), let \((\N_k, \lfrta)\) denote the wheel graph of order \(k\) as defined in Section 1.
Find the rate constant \(\alpha\) and the probability density function \(f\) of the unique constant rate distribution for \((\N_k, \lfrta)\).
As noted in Section 3 the largest eigenvalue of the adjacency matrix \(L\) is \(1 + \sqrt{k + 1}\), and moreover a corresponding eigenfunction \(g\) is given by \(g(0) = \sqrt{k + 1} - 1\) and \(g(x) = 1\) for \(x \in \{1, 2, \ldots, k\}\). Hence the rate constant is \[\alpha = \frac{1}{1 + \sqrt{k + 1}} = \frac{\sqrt{k + 1} - 1}{k}\] The corresponding density function \(f\) is obtained by normalizing \(g\). Since \(c = \sum_{x \in S} g(x) = k - 1 + \sqrt{k + 1}\), \begin{align*} f(0) & = \frac{\sqrt{k + 1} - 1}{k - 1 + \sqrt{k + 1}}\\ f(x) & = \frac{1}{k - 1 + \sqrt{k + 1}}, \quad x \in \{1, 2, \ldots, k\} \end{align*}
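The eigenvalue claim and the constant rate property can be verified directly. The Python sketch below builds the adjacency matrix of the wheel graph (with the convention above: hub 0 and rim vertices \(1, \ldots, k\)) and checks that \(g\) is an eigenfunction for \(1 + \sqrt{k + 1}\) and that the normalized density has constant rate \(1/(1 + \sqrt{k + 1})\).

```python
# Verify the wheel graph solution: for the adjacency matrix of W_k (hub 0,
# rim 1..k), g(0) = sqrt(k+1) - 1, g(rim) = 1 is an eigenfunction for the
# eigenvalue 1 + sqrt(k+1), and the normalized density has constant rate
# 1/(1 + sqrt(k+1)).
from math import sqrt, isclose

for k in [3, 4, 5, 10]:
    n = k + 1
    L = [[0] * n for _ in range(n)]
    for i in range(1, n):
        L[0][i] = L[i][0] = 1                    # spokes
        j = i % k + 1                            # next rim vertex, cyclically
        L[i][j] = L[j][i] = 1                    # rim cycle
    lam = 1 + sqrt(k + 1)
    g = [sqrt(k + 1) - 1] + [1.0] * k
    Lg = [sum(L[i][j] * g[j] for j in range(n)) for i in range(n)]
    assert all(isclose(Lg[i], lam * g[i]) for i in range(n))

    alpha = 1 / lam
    f = [x / sum(g) for x in g]                  # normalized density
    F = [sum(L[i][j] * f[j] for j in range(n)) for i in range(n)]
    assert all(isclose(f[i], alpha * F[i]) for i in range(n))
```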
Open the simulation of the constant rate distribution on the wheel graph. Vary \(k\) and note the shape of the probability density function and the value of the rate constant. Run the simulation 1000 times and compare the empirical density function with the probability density function.
Find the rate constant \(\alpha\) and the density function \(f\) of the constant rate distribution explicitly for the wheel graph \((\N_3, \lfrta)\).
From Example , \(\alpha = 1 / 3\) and \(f(x) = 1 / 4\) for \(x \in \N_3\). But we already knew this since \((\N_3, \lfrta)\) is regular with degree \(3\).
Find the rate constant \(\alpha\) and the density function \(f\) of the constant rate distribution explicitly for the wheel graph \((\N_4, \lfrta)\).
From Example , \(\alpha = \left(\sqrt{5} - 1\right) / 4 \approx 0.309\). For the constant rate density function, \(f(0) = \sqrt 5 - 2\) and \(f(x) = \left(3 - \sqrt 5\right) / 4\) for \(x \in \{1, 2, 3, 4\}\).
Suppose that \(X_k\) has the constant rate distribution on the wheel graph \((\N_k, \lfrta)\). Show that the distribution of \(X_k / k\) converges to the uniform distribution on \([0, 1]\) as \(k \to \infty\).
Let \(p_k\) denote the common value of \(f_k(j)\) for \(j \in \{1, 2, \ldots, k\}\) as given above, so that \(f_k(0) = 1 - k p_k\). Then for \(x \in [0, 1]\), \begin{align*} \P(X_k / k \le x) & = \sum_{j = 0}^k \bs{1}(j / k \le x) f_k(j) = f_k(0) + p_k \sum_{j = 1}^k \bs{1}(j / k \le x) \\ & = f_k(0) + k p_k \frac{1}{k} \sum_{j = 1}^k \bs{1}(j / k \le x) \end{align*} But as \(k \to \infty\), \(f_k(0) \to 0\), \(k p_k \to 1\), and \(\frac{1}{k} \sum_{j = 1}^k \bs{1}(j / k \le x) \to \int_0^x 1 \, dt = x\).
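A numerical illustration of the limit: for large \(k\), the distribution function of \(X_k / k\) is uniformly close to \(x\) on \([0, 1]\).

```python
# Numerical illustration of the uniform limit, using the density from the
# wheel graph example: f_k(0) = (sqrt(k+1)-1)/c and p_k = 1/c, where
# c = k - 1 + sqrt(k+1).
from math import sqrt

def cdf(k, x):
    s = sqrt(k + 1)
    c = k - 1 + s                    # normalizing constant from the example
    f0, p = (s - 1) / c, 1 / c       # f_k(0) and the common rim value p_k
    return f0 + p * sum(1 for j in range(1, k + 1) if j / k <= x)

k = 10_000
err = max(abs(cdf(k, x) - x) for x in [i / 100 for i in range(101)])
assert err < 0.02                    # sup-distance from the uniform CDF
```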
Suppose that \(X\) has the constant rate distribution on the wheel graph \((\N_k, \lfrta)\) and let \(\bs Y = (Y_1, Y_2, \ldots)\) denote the random walk on the graph associated with \(X\).
Open the simulation of the random walk on the wheel graph. Vary \(k\) and note the shape of the invariant probability density function. Run the simulation 1000 times and compare the empirical density function with the invariant probability density function.