In this section we study an interesting collection of graphs on \(\R^n\) that are induced (in the sense of Section 1.7) by the standard graph \(([0, \infty), \le)\) and the standard norm functions. For \(n \in \N_+\), we give \(\R^n\) the usual Borel \(\sigma\)-algebra \(\ms R^n\) and \(n\)-dimensional Lebesgue measure \(\lambda^n\) as the reference measure. We use \(\mu^{n - 1}\) to denote Hausdorff measure on a smooth manifold of Hausdorff dimension \(n - 1\) in \(\R^n\) (so a surface in \(\R^n\) in the general sense). In this section, \(\Gamma\) denotes the ordinary gamma special function.
First we need a review.
Recall the standard norms on \(\R^n\) for \(n \in \N_+\):
Part (c) justifies the notation for the infinity norm. When \(n = 1\), the norms are all the same: \(\|x\|_k = |x|\) for \(k \in [1, \infty]\) and \(x \in \R\). When \(n \in \{2, 3, \ldots\}\), the norms are all different, of course. For \(n \in \N_+\) and \(k \in [1, \infty]\), the mapping \(\|\cdot\|_k\) is a (Lipschitz) continuous (and hence measurable) function from \(\R^n\) onto \([0, \infty)\). The \(k = 1\) norm on \(\R^2\) is also referred to as the taxicab norm since it corresponds to distance measured along a perfectly square street grid.
For \(n \in \N_+\) and \(k \in [1, \infty]\), define \(b_{n, k} = \lambda^n\{\bs{x} \in \R^n: \|\bs x\|_k \le 1\}\), the volume of the unit ball in \(\R^n\) under the \(k\) norm. Then \begin{align*} b_{n, k} & = 2^n \frac{\Gamma^n(1 + 1 / k)}{\Gamma(1 + n / k)}, \quad k \in [1, \infty) \\ b_{n, \infty} & = 2^n \end{align*}
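As a quick numerical sanity check (a sketch in plain Python; the helper names are ours, not from the text), the closed-form volume can be compared with a Monte Carlo estimate over the enclosing cube \([-1, 1]^n\):

```python
import math
import random

def ball_volume(n, k):
    """Volume b_{n,k} of the unit ball in R^n under the k-norm,
    via b_{n,k} = 2^n * Gamma(1 + 1/k)^n / Gamma(1 + n/k); the
    limiting case k = inf gives the cube volume 2^n."""
    if math.isinf(k):
        return 2.0 ** n
    return 2.0 ** n * math.gamma(1 + 1 / k) ** n / math.gamma(1 + n / k)

def ball_volume_mc(n, k, samples=200_000, seed=1):
    """Monte Carlo check for finite k: 2^n times the fraction of
    uniform points in [-1, 1]^n whose k-norm is at most 1."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        x = [rng.uniform(-1, 1) for _ in range(n)]
        if sum(abs(t) ** k for t in x) <= 1:
            hits += 1
    return 2.0 ** n * hits / samples
```

The formula recovers the familiar values \(b_{2,2} = \pi\), \(b_{2,1} = 2\) (the diamond), and \(b_{3,2} = 4\pi/3\), and the Monte Carlo estimate agrees to sampling error.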
The volume formulas are standard results.
Of course, the term volume is used in a general sense. Note that \(b_{n, k} \to b_{n, \infty}\) as \(k \to \infty\) for \(n \in \N_+\), again justifying the notation for the infinity norm. The extension to arbitrary radius is straightforward.
For \(n \in \N_+\), \(k \in [1, \infty]\), and \(t \in [0, \infty)\), \(b_{n, k} t^n = \lambda^n \{\bs x \in \R^n: \|\bs x\|_k \le t\}\) is the volume of the ball in \(\R^n\) of radius \(t\) in the \(k\) norm.
Here is the collection of fundamental graphs that we will study.
For \(n \in \N_+\) and \(k \in [1, \infty]\), let \((\R^n, \Rta_k)\) denote the graph induced by \(\|\cdot\|_k\) and the standard graph \(([0, \infty), \le)\), so that \(\bs x \Rta_k \bs y\) if and only if \(\|\bs x\|_k \le \|\bs y\|_k\) for \(\bs x, \, \bs y \in \R^n\). Let \(\R^n_k(t) = \{\bs x \in \R^n: \|\bs x\|_k = t\}\), the surface of the ball of radius \(t \in [0, \infty)\) relative to the \(k\) norm, and the inverse image at \(t\) of the function \(\|\cdot\|_k\).
Note that the relation \(\Rta_k\) is reflexive and transitive. If we define \(\equiv_k\) on \(\R^n\) by \(\bs x \equiv_k \bs y\) if and only if \(\|\bs x\|_k = \|\bs y\|_k\), then \(\equiv_k\) is an equivalence relation with \(\{\R^n_k(t): t \in [0, \infty)\}\) as the collection of equivalence classes. Then \(\Rta_k\) can be extended to a partial order on this collection. The basic assumption in Section 1.7 on induced graphs is satisfied:
Let \(v_{n, k}\) denote the Jacobian of the \(k\)-norm function. If \(g: \R^n \to [0, \infty)\) is measurable, then \[\int_{\R^n} g(\bs x) v_{n, k}(\bs x) \, d\lambda^n(\bs x) = \int_0^\infty \left(\int_{\R^n_k(t)} g(\bs x) \, d\mu^{n - 1}(\bs x)\right) \, dt\]
This is simply the co-area formula (an extension of Fubini's theorem) applied to the \(k\)-norm function \(\bs x \mapsto \|\bs x\|_k\) on \(\R^n\). The meaning of the term Jacobian depends on the context, but here \(v_{n, k} = \|\nabla_{n, k}\|_2\) where \(\nabla_{n, k}\) is the gradient of \(\|\cdot\|_k\) on \(\R^n\) (the vector of first partial derivatives). The Jacobian is critical for the transition from an integral with respect to Lebesgue measure \(\lambda^n\) on the left to the iterated integral on the right over the level sets \(\R^n_k(t)\) with respect to Hausdorff measure \(\mu^{n - 1}\), and then over \(t\) with respect to Lebesgue measure. Suppose that \(h: \R^n \to [0, \infty)\) is measurable and we let \(g = h / v_{n, k}\) in the co-area formula. Then \[\int_{\R^n} h(\bs x) \, d\lambda^n(\bs x) = \int_0^\infty \left(\int_{\R^n_k(t)} \frac{h(\bs x)}{v_{n, k}(\bs x)} \, d\mu^{n - 1}(\bs x)\right) \, dt\] Specializing further, let \(h = \bs 1_A\) where \(A \in \ms R^n\). Then \[\lambda^n(A) = \int_0^\infty \left(\int_{\R^n_k(t) \cap A} \frac{1}{v_{n, k}(\bs x)} \, d\mu^{n - 1}(\bs x) \right) \, dt \]
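To see the last identity in action, take \(k = 2\), where the Jacobian \(v_{n, 2}\) is identically \(1\) (the gradient of the Euclidean norm is a unit vector), and let \(A\) be the annulus between radii \(1\) and \(2\) in the plane. The inner integral is then the circumference \(2 \pi t\), and the co-area formula reduces to familiar shell integration. A minimal numeric sketch:

```python
import math

# Area of the annulus A = {x in R^2 : 1 <= ||x||_2 <= 2}, computed directly.
area_direct = math.pi * (2 ** 2 - 1 ** 2)

# Co-area version with k = 2: the Jacobian v_{2,2} is identically 1, so
# lambda^2(A) = integral over t in [1, 2] of mu^1(circle of radius t) dt,
# where mu^1 of the circle is 2*pi*t.  Midpoint rule:
steps = 100_000
h = (2 - 1) / steps
area_coarea = sum(2 * math.pi * (1 + (i + 0.5) * h) * h for i in range(steps))
```

Both computations give \(3 \pi\), as they must.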
From the details in the proof above, we have the following definition in the context of Section 1.7:
The reference measure on \(\R^n_k(t)\) is the measure \(\nu_{n, k}\) that has density \(1 / v_{n, k}\) with respect to Hausdorff measure \(\mu^{n - 1}\).
Our next result gives the functions that play a critical role (again in the context of Section 1.7) in the construction of constant rate distributions on the induced graph.
For \(n \in \N_+\) and \(k \in [1, \infty]\), define \[\beta_{n, k}(t) = \nu_{n, k}[\R^n_k(t)] = \int_{\R^n_k(t)} \frac{1}{v_{n, k}(\bs x)} \, d\mu^{n - 1}(\bs x), \quad t \in [0, \infty)\] Then \[\beta_{n, k}(t) = n b_{n, k} t^{n - 1}, \quad t \in [0, \infty)\]
The key fact in the argument is that the gradient \(\nabla_{n, k}\) of the \(k\) norm function satisfies \(\nabla_{n, k} (t \bs x) = \nabla_{n, k}(\bs x)\) for \(t \in (0, \infty)\) and hence by a change of variables, \[\int_{\R^n_k(t)} \frac{1}{v_{n, k}(\bs x)} \, d\mu^{n - 1}(\bs x) = t^{n - 1} \int_{\R^n_k(1)} \frac{1}{v_{n, k}(\bs x)} \, d\mu^{n - 1}(\bs x), \quad t \in (0, \infty)\] Now, from the co-area formula applied to the unit ball in \(\R^n\) relative to the \(k\) norm (see the details of the proof above) we have \[b_{n, k} = \int_0^1 \int_{\R^n_k(s)} \frac{1}{v_{n, k}(\bs x)} \, d\mu^{n - 1}(\bs x) \, ds = \int_0^1 s^{n - 1} \int_{\R^n_k(1)} \frac{1}{v_{n, k}(\bs x)} \, d\mu^{n - 1}(\bs x) \, ds = \frac{1}{n} \int_{\R^n_k(1)} \frac{1}{v_{n, k}(\bs x)} \, d\mu^{n - 1}(\bs x)\] So it follows that \[\int_{\R^n_k(1)} \frac{1}{v_{n, k}(\bs x)} \, d\mu^{n - 1}(\bs x) = n b_{n, k}\] and then \[ n b_{n, k} t^{n - 1} = t^{n - 1} \int_{\R^n_k(1)} \frac{1}{v_{n, k}(\bs x)} \, d\mu^{n - 1}(\bs x) = \int_{\R^n_k(t)} \frac{1}{v_{n, k}(\bs x)} \, d\mu^{n - 1}(\bs x) = \beta_{n, k}(t)\]
Note that \(\beta_{n, k}(t)\) is the derivative with respect to \(t\) of \(b_{n, k} t^n\), the volume of the ball in \(\R^n\) of radius \(t\) relative to the \(k\) norm. That is, relative to the reference measure \(\nu_{n, k}\), the surface area of the ball of radius \(t\) is the derivative of the volume of the ball. In the cases \(k = 1\), \(k = 2\), and \(k = \infty\), the Jacobian is a constant:
For \(n \in \N_+\),
From the Jacobians above we can give the surface area of the unit ball in \(\R^n\) in the \(k\) norm relative to Hausdorff measure for \(k = 1, \, 2, \, \infty\). So for \(n \in \N_+\) and \(k \in [1, \infty]\), let \(c_{n, k} = \mu^{n - 1}[\R^n_k(1)]\). Then \begin{align*} c_{n, 1} & = \frac{2^n \sqrt{n}}{(n - 1)!} \\ c_{n, 2} & = \frac{2 \pi^{n / 2}}{\Gamma(n / 2)} \\ c_{n, \infty} & = n 2^n \end{align*} But except for these special cases, the Jacobian is a complicated function and there is no simple closed formula for the surface area of the unit ball in \(\R^n\) relative to Hausdorff measure.
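The three closed forms are easy to check against familiar values: the circle's circumference \(2\pi\), the sphere's surface area \(4\pi\), and the perimeters \(4\sqrt{2}\) and \(8\) of the \(\ell^1\) and \(\ell^\infty\) unit "circles" in the plane. A small sketch (the helper name is ours):

```python
import math

def surface_area(n, k):
    """Hausdorff measure c_{n,k} of the unit sphere in R^n for the
    three special norms treated in the text: k = 1, 2, infinity."""
    if k == 1:
        return 2 ** n * math.sqrt(n) / math.factorial(n - 1)
    if k == 2:
        return 2 * math.pi ** (n / 2) / math.gamma(n / 2)
    if math.isinf(k):
        return n * 2 ** n
    raise ValueError("no simple closed form for general k")
```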
The left walk function \(u_m\) of order \(m \in \N_+\) for the graph \((\R^n, \Rta_k)\) is given by \[u_m(\bs x) = \frac{b^m_{n, k}}{m!} \|\bs x\|^{m n}_k, \quad \bs x \in \R^n\]
This follows from \(u_0 = 1\) and the recursion relation
\[u_{m + 1}(\bs x) = \int_{\bs y \Rta \bs x} u_m(\bs y) \, d\lambda^n(\bs y), \quad \bs x \in \R^n\]
The integral can be evaluated using the co-area formula above. But there is also a simple combinatorial argument. For \(\bs x \in \R^n\), the measure of the set
\[\{(\bs y_1, \bs y_2, \ldots \bs y_m): \bs y_i \in \R^n, \, \|\bs y_i\|_k \le \|\bs x\|_k \text{ for each } i \in \{1, 2, \ldots, m\}\}\]
is of course \(b^m_{n,k} \|\bs x\|_k^{m n}\). But another algorithm for constructing this set is to select a walk \((\bs x_1, \bs x_2, \ldots, \bs x_m)\) of length \(m\) ending in \(\bs x\) for the graph \((\R^n, \Rta_k)\) and then select an ordering \((\bs y_1, \bs y_2, \ldots, \bs y_m)\) of \((\bs x_1, \bs x_2, \ldots, \bs x_m)\). By definition, the measure of the set of walks is \(u_m(\bs x)\), and there are \(m!\) permutations of each walk, so it follows that
\[m! u_m(\bs x) = b^m_{n,k} \|\bs x\|_k^{m n}\]
Note that \(u_m(\bs x)\) is the left walk function \(t \mapsto t^m / m!\) of order \(m\) for the standard graph \(([0, \infty), \le)\), evaluated at \(b_{n,k} \|\bs x\|_k^n\) which is the volume of the ball of radius \(\|\bs x\|_k\) in the \(k\) norm. We will see this pattern repeatedly. In particular, it follows that the left generating function of \((\R^n, \Rta_k)\) is the left generating function of \(([0, \infty), \le)\) evaluated at \(b_{n,k} \|\bs x\|_k^n\):
The left generating function \(U\) of \((\R^n, \Rta_k)\) is given by \[U(\bs x, t) = \exp\left(b_{n,k} \|\bs x\|^n_k t\right), \quad \bs x \in \R^n, \, t \in \R\]
This follows easily from the definition: \[U(\bs x, t) = \sum_{m = 0}^\infty u_m(\bs x) t^m = \sum_{m = 0}^\infty \frac{\left(b_{n, k} \|\bs x\|^n_k t\right)^m} {m!} = \exp\left(b_{n, k} \|\bs x\|^n_k t\right), \quad \bs x \in \R^n, \, t \in \R\]
Suppose now that \(\bs X = (X_1, X_2, \ldots, X_n)\) is a random variable in \(\R^n\), so that \(\|\bs X\|_k\) is the corresponding induced variable in \([0, \infty)\).
Basic functions:
These follow from the general results of Section 1.7.
We assume that \(\hat f(t) \gt 0\) for almost all \(t \in [0, \infty)\), so that \(\hat F(t) \gt 0\) for almost all \(t \in [0, \infty)\) and \(F(\bs x) \gt 0\) for almost all \(\bs x \in \R^n\). For \(t \in [0, \infty)\), the conditional distribution of \(\bs X\) given \(\|\bs X\|_k = t\) has density function \(\bs x \mapsto f(\bs x \mid t) = f(\bs x) / \hat f(t)\) on \(\R^n_k(t)\) (relative to the reference measure \(\nu_{n, k}\)). The general results of Section 1.7 apply, of course, but as usual we are most interested in constant rate distributions. Here are the main results:
For \(n \in \N_+\) and \(k \in [1, \infty]\), the graph \((\R^n, \Rta_k)\) has a unique distribution with constant rate \(\alpha\) for each \(\alpha \in (0, \infty)\). If random variable \(\bs X\) has this distribution then
The results follow from Section 1.7. Specifically, \(\bs X\) has constant rate \(\alpha \in (0, \infty)\) for \((\R^n, \Rta_k)\) if and only if \(\|\bs X\|_k\) has rate function \(\hat r\) for \(([0, \infty), \le)\) where \(\hat r(t) = \alpha \beta_{n,k}(t) = \alpha n b_{n,k} t^{n-1}\) for \(t \in [0, \infty)\). So \(\|\bs X\|_k\) has reliability function \(\hat F\) for \(([0, \infty), \le)\) given by \[\hat F(t) = \exp\left(-\int_0^t \hat r(s) \, ds\right) = \exp\left(-\alpha b_{n, k} t^n\right), \quad t \in [0, \infty)\] which we recognize as the Weibull distribution in (a). And then \(\bs X\) has reliability function \(F\) for \((\R^n, \Rta_k)\) given by \(F(\bs x) = \hat F(\|\bs x\|_k)\) for \(\bs x \in \R^n\), and has density function \(f\) given by \(f(\bs x) = \alpha \hat F(\|\bs x\|_k)\) for \(\bs x \in \R^n\).
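The proof suggests a direct way to simulate the constant rate distribution when \(k = 2\), where the conditional distribution on each sphere is uniform (the Jacobian is constant): draw the radius from the Weibull reliability function by inversion, and an independent direction from normalized Gaussians. A sketch in plain Python (the function names are ours):

```python
import math
import random

def sample_constant_rate(n, alpha, rng):
    """Draw X in R^n with constant rate alpha for (R^n, ->_2).
    Radius: ||X||_2 has reliability exp(-alpha * b_{n,2} * t^n), so
    T = (-ln(U) / (alpha * b))^(1/n).  Direction: uniform on the
    sphere (valid for k = 2, where the Jacobian is constant)."""
    b = 2 ** n * math.gamma(1.5) ** n / math.gamma(1 + n / 2)  # b_{n,2}
    t = (-math.log(rng.random()) / (alpha * b)) ** (1 / n)
    g = [rng.gauss(0, 1) for _ in range(n)]
    norm = math.sqrt(sum(v * v for v in g))
    return [t * v / norm for v in g]

rng = random.Random(2)
sample = [sample_constant_rate(2, 1.0, rng) for _ in range(100_000)]
# E(||X||_2^n) should be 1 / (alpha * b_{n,2}) = 1 / pi for n = 2, alpha = 1.
mean_rn = sum(x * x + y * y for x, y in sample) / len(sample)
```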
Note that \(f(\bs x)\) is the ordinary exponential density function for the standard graph \(([0, \infty), +)\), with rate \(\alpha\), evaluated at \(b_{n, k} \|\bs x\|_k^n\), which again is the volume of the ball of radius \(\|\bs x\|_k\) in the \(k\) norm. From the Jacobian results above, the conditional distribution in part (c) is uniform with respect to Hausdorff measure \(\mu^{n - 1}\) when \(k = 1\), \(k = 2\), or \(k = \infty\). All of the standard results for the Weibull distribution hold for the random variable \(\|\bs X\|_k\). In particular,
Suppose that \(\bs X\) has the distribution with constant rate \(\alpha \in (0, \infty)\) for the graph \((\R^n, \Rta_k)\) where \(n \in \N_+\) and \(k \in [1, \infty]\). Then \[\E(\|\bs X\|_k^m) = \frac{\Gamma(1 + m / n)}{(\alpha b_{n, k})^{m / n}}, \quad m \in (0, \infty)\]
In particular, note that \(\E(\|\bs X\|_k^n) = 1 / (\alpha b_{n, k})\).
Suppose again that \(\bs X = (X_1, X_2, \ldots, X_n)\) has the distribution with constant rate \(\alpha \in (0, \infty)\) for the graph \((\R^n, \Rta_k)\) where \(n \in \N_+\) and \(k \in [1, \infty]\). Then
These results follow from the density function in part (b) above.
Let \(n \in \N_+\) and \(k \in [1, \infty]\) and consider the family of constant rate distributions on \((\R^n, \Rta_k)\).
Suppose again that \(\bs X\) has the distribution with constant rate \(\alpha \in (0, \infty)\) for the graph \((\R^n, \Rta_k)\). The entropy of \(\bs X\) is \(H(\bs X) = 1 - \ln \alpha\).
From the general result in Section 1.5, \(H(\bs X) = - \ln \alpha - \E[\ln F(\bs X)]\), where \(F\) is the reliability function. But from the moment result above, \(-\E[\ln F(\bs X)] = \E[\alpha b_{n, k} \|\bs X\|_k^n] = 1\).
So the entropy of \(\bs X\) is the same as for the exponential distribution with rate \(\alpha\), and in particular does not depend on \(n \in \N_+\) or \(k \in [1, \infty]\).
The case where the norm index is the same as the dimension is particularly interesting. In this case, abbreviate \(b_{n,n}\) by \(b_n\) so that \[b_n = \left[\frac{2}{n} \Gamma\left(\frac{1}{n}\right)\right]^n = 2^n \Gamma^n\left(1 + \frac{1}{n}\right), \quad n \in \N_+\] where again \(\Gamma\) is the ordinary gamma special function.
Let \(n \in \N_+\) and suppose that \(\bs X = (X_1, X_2, \ldots, X_n)\) has the distribution with constant rate \(\alpha \in (0, \infty)\) for the graph \((\R^n, \Rta_n)\). Then \(\bs X\) is a sequence of independent, identically distributed variables with common density function \(g\) on \(\R\) defined by \[g(x) = \alpha^{1/n} \exp(-\alpha b_n |x|^n), \quad x \in \R\]
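One convenient way to sample this density (an observation of ours, derivable from the form of \(g\) by a change of variables): \(\alpha b_n |X|^n\) has the gamma distribution with shape \(1 / n\), and \(\sgn(X)\) is an independent fair sign. A sketch:

```python
import math
import random

def sample_gen_normal(n, alpha, rng):
    """Generalized normal with index n and rate alpha, using the fact
    that W = alpha * b_n * |X|^n ~ Gamma(1/n, 1) with sgn(X) an
    independent fair sign (a change of variables from the density)."""
    b_n = (2 / n * math.gamma(1 / n)) ** n
    w = rng.gammavariate(1 / n, 1.0)
    x = (w / (alpha * b_n)) ** (1 / n)
    return x if rng.random() < 0.5 else -x

rng = random.Random(7)
xs = [sample_gen_normal(2.0, 1.0, rng) for _ in range(100_000)]
mean = sum(xs) / len(xs)
var = sum(x * x for x in xs) / len(xs)
# For n = 2, alpha = 1 the density is exp(-pi x^2): normal, variance 1/(2 pi).
```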
The distributions on \(\R\) defined by the density function above are generalized normal distributions (see the papers by Goodman and by Sinz et al.). The distribution actually makes sense for any \(n \in (0, \infty)\), not just positive integers, and forms an interesting class of special distributions. We will use terminology that is appropriate for our setting:
For \(n, \, \alpha \in (0, \infty)\), the distribution defined above is the generalized normal distribution with index \(n\) and rate \(\alpha\).
For \(n = 1, \, 2\) the generalized normal distributions reduce to standard special distributions:
Suppose that \(X\) has the generalized normal distribution on \(\R\) with index \(n \in (0, \infty)\) and rate \(\alpha \in (0, \infty)\).
As the form of \(g\) shows, \(\alpha^{-1 / n}\) is a scale parameter of the family of distributions. That is, if \(Z\) has the standard distribution with index \(n\) and rate \(1\), and so with density function \(z \mapsto \exp(-b_n |z|^n)\), then \(X = \alpha^{-1 / n} Z\) has the distribution with index \(n\) and rate \(\alpha\), and with density function \(g\) above. The following theorem gives some basic properties:
Let \(g\) denote the density function of the generalized normal distribution on \(\R\) with index \(n \in (0, \infty)\) and rate \(\alpha \in (0, \infty)\), given above. Then
Part (a) is clear. Parts (b)–(d) follow from standard calculus.
Note that the inflection points in (d) converge to \(\pm 1 / 2\) as \(n \to \infty\).
Suppose again that \(X\) has the generalized normal distribution on \(\R\) with index \(n \in (0, \infty)\) and rate \(\alpha \in (0, \infty)\), with density function \(g\).
The results follow from known results for the generalized normal distribution, but we will give separate proofs for completeness. Suppose that \(Z\) has the standard distribution with index \(n\) and rate \(1\), so that \(X = \alpha^{-1 / n} Z\) has the generalized normal distribution with index \(n\) and rate \(\alpha\).
In particular, the variance and kurtosis are given by \[\var(X) = \alpha^{-2 / n} \left(\frac{n}{2}\right)^2 \frac{\Gamma(3 / n)}{\Gamma^3(1 / n)}, \; \kurt(X) = \frac{\Gamma(5 / n) \Gamma(1 / n)}{\Gamma^2(3 / n)}, \quad n \in (0, \infty)\] This class of distributions on \(\R\) is interesting because it generalizes the Laplace distribution and the mean-zero normal distribution, and because of the surprising convergence result in part (d). Since the distribution of \(X\) is symmetric about \(0\), it also follows that \(|X|\) has density function \(2 g\) on \([0, \infty)\), that \(\sgn(X)\) is uniformly distributed on \(\{-1, 1\}\), and that \(|X|\) and \(\sgn(X)\) are independent.
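The moment formulas are easy to evaluate, and in particular they recover the classical values at the two special indices: kurtosis \(6\) for the Laplace case \(n = 1\) and kurtosis \(3\) for the normal case \(n = 2\). A small sketch with our own helper names:

```python
import math

def gn_variance(n, alpha):
    """Variance of the generalized normal distribution with index n
    and rate alpha."""
    return alpha ** (-2 / n) * (n / 2) ** 2 * math.gamma(3 / n) / math.gamma(1 / n) ** 3

def gn_kurtosis(n):
    """Kurtosis; note that it does not depend on the rate alpha,
    since alpha^{-1/n} is a scale parameter."""
    return math.gamma(5 / n) * math.gamma(1 / n) / math.gamma(3 / n) ** 2
```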
The app below is a simulation of the generalized normal distribution. The parameters \(n\) and \(\alpha\) can be varied with the scrollbars.
The (ordinary) distribution function can be expressed in terms of the gamma and (upper) incomplete gamma functions, but is not particularly helpful:
Suppose again that \(X\) has the generalized normal distribution on \(\R\) with index \(n \in (0, \infty)\) and rate \(\alpha \in (0, \infty)\). The distribution function \(G\) is given by \[G(y) = 1 - \frac{\Gamma\left[1 / n, \alpha \, 2^n y^n \Gamma^n(1 + 1 / n)\right]}{2 \Gamma(1 / n)}, \quad y \ge 0\] and by symmetry, \(G(y) = 1 - G(-y)\) for \(y \lt 0\).
We can rephrase parts (a) and (b) of the result above:
Constant rate distributions for \(n = 1, \, 2\).
Return to the simulation of the generalized normal distribution above. Set \(n = 1\) and \(n = 2\) and note the shape of the graph. Run the simulation in both cases and compare the empirical density function to the probability density function.
Part (b) above suggests that the uniform distribution on \([-\frac 1 2, \frac 1 2]\) has constant rate for the graph corresponding to \(n = \infty\). That is indeed true, properly understood. Let \(I = [-\frac 1 2, \frac 1 2]\) and consider \(I^\infty\) with norm \[\|\bs x\|_\infty = \sup\{|x_i|: i \in \N_+\}, \quad \bs x = (x_1, x_2, \ldots) \in I^\infty\] The graph \((I^\infty, \Rta_\infty)\) induced by the function \(\bs x \mapsto \|\bs x\|_\infty\) and the standard graph \(([0, \infty), \le)\) is defined in the usual way: \(\bs x \Rta_\infty \bs y\) if and only if \(\|\bs x\|_\infty \le \|\bs y\|_\infty\) for \(\bs x, \, \bs y \in I^\infty\). In terms of the measure structure, we give \(I\) the usual Borel \(\sigma\)-algebra \(\ms I\) and \(I^\infty\) the corresponding product \(\sigma\)-algebra \(\ms I^\infty \), namely the \(\sigma\)-algebra generated by sets of the form \(\prod_{i = 1}^\infty A_i\) where \(A_i \in \ms I\) for each \(i \in \N_+\) and \(A_i = I\) for all but finitely many \(i \in \N_+ \). Finally, for the reference measures, let \(\lambda\) denote Lebesgue measure on \((I, \ms I)\) (which is simply the uniform probability distribution) and let \(\lambda^\infty\) denote the corresponding product measure on \((I^\infty, \ms I^\infty)\).
Suppose that \(\bs X = (X_1, X_2, \ldots)\) is a sequence of independent variables, each uniformly distributed on \([-\frac 1 2, \frac 1 2]\). Then \(\bs X\) has constant rate 1 for the graph \((I^\infty, \Rta_\infty)\).
Random variable \(X_i\) has constant density function \(1\) on \(I\) and similarly \(\bs X\) also has constant density function \(1\) on \(I^\infty\). On the other hand, the reliability function \(F\) of \(\bs X\) for \((I^\infty, \Rta_\infty)\) is \[F(\bs x) = \P(\bs x \Rta_\infty \bs X) = \P(\|\bs x\|_\infty \le \|\bs X\|_\infty), \quad \bs x \in I^\infty\] But \(\P(\|\bs X\|_\infty = \frac 1 2) = 1\). To see this, note that if \(\epsilon \gt 0\) then with probability \(1\), \(|X_i| \gt \frac 1 2 - \epsilon\) for some (and in fact infinitely many) \(i \in \N_+\). Hence, \(F\) is also the constant function 1 on \(I^\infty\) so \(F = f\).
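The key step in the proof, that the sup-norm of the infinite sequence equals \(\frac 1 2\) almost surely, is easy to see numerically: the partial maximum \(\max(|X_1|, \ldots, |X_m|)\) climbs toward \(\frac 1 2\) quickly as \(m\) grows. A minimal sketch:

```python
import random

# With probability 1, the sup-norm of an i.i.d. uniform sequence on
# [-1/2, 1/2] equals 1/2; for moderate m the partial maximum
# max(|X_1|, ..., |X_m|) is already very close to 1/2.
rng = random.Random(3)
m = 10_000
partial_max = max(abs(rng.uniform(-0.5, 0.5)) for _ in range(m))
```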
Return again to the simulation of the generalized normal distribution above. Starting with \(n = 1\), increase \(n\) to the maximum value and note the change in shape of the graph.
There are two interesting and tractable cases in the bivariate setting (\(n = 2\)) corresponding to the extreme values of the norm parameter (\(k = 1\) and \(k = \infty\)). In both cases, various distribution functions are best expressed in terms of the error function and its complement.
The error function \(\erf\) is defined by \[ \erf(x) = \frac{2}{\sqrt{\pi}} \int_0^x e^{-t^2} \, dt, \quad x \in \R \] and the complementary error function is \(\erfc = 1 - \erf\).
The error function is closely related to the standard normal distribution function \(\Phi\): \[ \Phi(x) = \frac 1 2 + \frac 1 2 \erf\left(\frac{x}{\sqrt{2}}\right), \quad x \in \R \]
Suppose that \((X, Y)\) has the distribution with constant rate \(\alpha \in (0, \infty)\) for the graph \((\R^2, \Rta_1)\). Then \((X, Y)\) has density function \(f\) given by \[f(x, y) = \alpha \exp[-2 \alpha (|x| + |y|)^2], \quad (x, y) \in \R^2\]
Of course, the density is constant on the squares \(|x| + |y| = t \) for \(t \in (0, \infty)\), in contrast to a bivariate normal density, for example, which is constant on ellipses. From the general results above, \(X\) and \(Y\) are uncorrelated and identically distributed, and the distribution is symmetric about 0. So in particular, the variables have mean 0 and skewness 0. The family of distributions is a scale family with scale parameter \(1 / \sqrt{\alpha}\).
Suppose again that \((X, Y)\) has the distribution with constant rate \(\alpha \in (0, \infty)\) for the graph \((\R^2, \Rta_1)\). The common density \(g\) of \(X\) and \(Y\) is given by \[g(x) = \sqrt{\frac{\pi \alpha}{2}} \erfc\left(\sqrt{2 \alpha} |x|\right), \quad x \in \R \] The variance is \(1 / (6 \alpha)\) and the kurtosis is \(18 / 5\).
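A numerical check of this density (our own sketch, using the standard-library erfc): integrating \(g\) and its second and fourth moments by the midpoint rule recovers total mass \(1\), variance \(1 / (6 \alpha)\), and kurtosis \(18 / 5\) for, say, \(\alpha = 1\).

```python
import math

alpha = 1.0

def g(x):
    """Common marginal density of X and Y for the constant rate
    distribution on (R^2, ->_1) with rate alpha."""
    return math.sqrt(math.pi * alpha / 2) * math.erfc(math.sqrt(2 * alpha) * abs(x))

# Midpoint rule on [-L, L]; the tails are Gaussian-like and negligible.
L, steps = 8.0, 200_000
h = 2 * L / steps
total = second = fourth = 0.0
for i in range(steps):
    x = -L + (i + 0.5) * h
    w = g(x) * h
    total += w
    second += x * x * w
    fourth += x ** 4 * w
```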
The density function \(g\) above has the following properties:
This follows from standard calculus:
So this distribution might be considered a second-order version of the Laplace distribution.
The app below is a simulation of the norm distribution with \(n = 2\) and \(k = 1\). The rate parameter \(\alpha\) can be varied with the scrollbar.
Suppose again that \((X, Y)\) has the distribution with constant rate \(\alpha \in (0, \infty)\) for the graph \((\R^2, \Rta_1)\). The common distribution function \(G\) of \(X\) and \(Y\) is given by \[G(y) = 1 - \frac{1}{2} \exp\left(-2 \alpha y^2\right) + \sqrt{\frac{\alpha \pi}{2}} y \, \erfc\left(\sqrt{2 \alpha} \, y\right), \quad y \in [0, \infty)\] and \(G(y) = 1 - G(-y)\) for \(y \in (-\infty, 0)\).
Suppose that \((X, Y)\) has the distribution with constant rate \(\alpha \in (0, \infty)\) for the graph \((\R^2, \Rta_\infty)\). Then \((X, Y)\) has joint density function \(f\) given by \[f(x, y) = \alpha \exp\left[-4 \alpha \left(|x| \vee |y|\right)^2\right], \quad(x, y) \in \R^2\]
This density is constant on the squares \(|x| \vee |y| = t\) for \(t \in (0, \infty)\). As in the general case, \(X\) and \(Y\) are uncorrelated (but dependent) with identical distributions that are symmetric about 0. So in particular, the variables have mean 0 and skewness 0.
Suppose again that \((X, Y)\) has the distribution with constant rate \(\alpha \in (0, \infty)\) for the graph \((\R^2, \Rta_\infty)\). Then \(X\) and \(Y\) have common density function \(g\) given by \[g(x) = 2 \alpha |x| \exp\left(-4 \alpha x^2\right) + \frac{\sqrt{\pi \alpha}}{2} \erfc\left(2 \sqrt{\alpha} |x|\right), \quad x \in \R \] The common variance is \(1 / (6 \alpha)\) and the common kurtosis is \(27 / 10\).
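As in the \(k = 1\) case, a numerical integration (our own sketch, with the standard-library erfc) confirms total mass \(1\), variance \(1 / (6 \alpha)\), and kurtosis \(27 / 10\) for, say, \(\alpha = 1\).

```python
import math

alpha = 1.0

def g(x):
    """Common marginal density of X and Y for the constant rate
    distribution on (R^2, ->_infinity) with rate alpha."""
    a = abs(x)
    return (2 * alpha * a * math.exp(-4 * alpha * a * a)
            + math.sqrt(math.pi * alpha) / 2 * math.erfc(2 * math.sqrt(alpha) * a))

# Midpoint rule on [-L, L]; the tails are Gaussian-like and negligible.
L, steps = 6.0, 200_000
h = 2 * L / steps
total = second = fourth = 0.0
for i in range(steps):
    x = -L + (i + 0.5) * h
    w = g(x) * h
    total += w
    second += x * x * w
    fourth += x ** 4 * w
```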
It's interesting that the variance is the same as in the case \(n = 2\), \(k = 1\).
The density function \(g\) above has the following properties:
This follows from standard calculus:
The app below is a simulation of the norm distribution with \(n = 2\) and \(k = \infty\). The rate parameter \(\alpha\) can be varied with the scrollbar.
Suppose again that \((X, Y)\) has the distribution with constant rate \(\alpha \in (0, \infty)\) for the graph \((\R^2, \Rta_\infty)\). The common distribution function \(G\) of \(X\) and \(Y\) is given by \[G(y) = 1 - \frac{1}{2} \exp\left(-4 \alpha y^2\right) + \frac{\sqrt{\alpha \pi}}{2} y \, \erfc\left(2 \sqrt{\alpha} \, y\right), \quad y \in [0, \infty)\] and \(G(y) = 1 - G(-y)\) for \(y \in (-\infty, 0)\).
Returning to the general case of \(n \in \N_+\) and \(k \in [1, \infty]\), consider the random walk \((\bs Y_1, \bs Y_2, \ldots)\) on \((\R^n, \Rta_k)\) associated with \(\bs X\), where \(\bs X\) has constant rate \(\alpha \in (0, \infty)\) for \((\R^n, \Rta_k)\). From our general results and the left walk function above we have
For \(m \in \N_+\), random variable \(\bs Y_m\) has density function \(f_m\) given by \[f_m(\bs x) = \alpha^m \frac{1}{(m - 1)!} (b_{n,k} \|\bs x\|^n_k)^{m - 1} \exp\left(-\alpha b_{n,k} \|\bs x\|^n_k\right), \quad \bs x \in \R^n\]
Note that \(f_m(\bs x)\) is the ordinary gamma density function for the standard graph \(([0, \infty), \le)\), (with order \(m\) and rate \(\alpha\)), evaluated at \(b_{n,k} \|\bs x\|_k^n\).
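In light of this remark, checking that \(f_m\) is a probability density reduces, through the co-area formula with surface measure \(\beta_{2,2}(t) = 2 \pi t\) for \(n = k = 2\), to a one-dimensional gamma integral. A quick numerical sketch for the illustrative values \(\alpha = 3/2\), \(m = 3\) (our choices, not from the text):

```python
import math

# For n = k = 2 we have b_{2,2} = pi, so with s = pi t^2 the integral of
# f_m over R^2 becomes int_0^inf alpha^m s^(m-1)/(m-1)! e^(-alpha s) ds = 1,
# the total mass of the gamma density with order m and rate alpha.
alpha, m = 1.5, 3
L, steps = 6.0, 200_000
h = L / steps
total = 0.0
for i in range(steps):
    t = (i + 0.5) * h
    s = math.pi * t * t
    density = alpha ** m * s ** (m - 1) / math.factorial(m - 1) * math.exp(-alpha * s)
    total += density * 2 * math.pi * t * h
```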