Quite a large number of multivariate reliability models, and in particular multivariate exponential distributions,
have been formulated over the years. Our goal in this section and the next is not an exhaustive review of this literature, but rather to consider a few that can be generalized to, or have special connections to, the semigroup and graph theory in this text. We start once again with the standard space \(([0, \infty), +, \lambda)\) where \(\lambda\) is Lebesgue measure on the \(\sigma\)-algebra \(\ms B\) of Borel subsets of \([0, \infty)\). The corresponding graph, of course, is \(([0, \infty), \le)\). For \(n \in \N_+\), let \(\left([0, \infty)^n, +, \lambda^n\right)\) denote the power space of order \(n\). The corresponding graph \(\left([0, \infty)^n, \le^n\right)\) is the power graph of \(([0, \infty), \le)\) of order \(n\). Since \(\le\) is a total order on \([0, \infty)\), our usual lattice notation becomes \(x \wedge y = \min\{x, y\}\) and \(x \vee y = \max\{x, y\}\) for \(x, \, y \in [0, \infty)\).
In the most general sense, a random vector \(\bs{X} = (X_1, X_2, \ldots, X_n)\) has a multivariate exponential distribution on \([0, \infty)^n\) if \(X_i\) has an exponential distribution on \(([0, \infty), +)\) for each \(i \in \{1, 2, \ldots, n\}\).
That is, \(X_i\) has an ordinary exponential distribution on \([0, \infty)\) for each \(i \in \{1, 2, \ldots, n\}\). Of course, the statement that \(\bs{X}\) has a multivariate exponential distribution on \([0, \infty)^n\) does not mean that \(\bs{X}\) has an exponential distribution on the semigroup \( \left([0, \infty)^n, +\right) \), except of course when \(n = 1\). We know the full story:
Random vector \(\bs{X}\) has an exponential distribution on \( \left([0, \infty)^n, +\right) \) if and only if \((X_1, X_2, \ldots, X_n)\) are independent and \(X_i\) has an exponential distribution on \(([0, \infty), +)\) for each \(i \in \{1, 2, \ldots, n\}\).
The definition is too weak and the proposition is too strong for an interesting multivariate reliability model. In this context, other conditions need to be imposed on \(\bs{X}\), often multivariate memoryless conditions of some sort. Among the best known of the multivariate exponential distributions are the Marshall-Olkin distributions. These distributions were discussed in Section 2.7, since they can be defined in the setting of a general positive semigroup \((S, \cdot)\) whose partial order graph \((S, \preceq)\) is a lattice. In this section we review the results for the standard setting above (the original setting of Marshall and Olkin), including results that do not make sense in the general semigroup setting. We start with the bivariate case, where the notation and results are simplest, before moving on to the general multivariate case.
Suppose that \(U, \, V, \, W\) are independent and have exponential distributions on \(([0, \infty), +)\), with rates \(\alpha, \, \beta, \, \delta \in (0, \infty)\) respectively. Let \(X = U \wedge W\) and \(Y = V \wedge W\). Then \((X, Y)\) has the Marshall-Olkin distribution on \( \left([0, \infty)^2, +\right) \) with parameters \(\alpha, \, \beta, \, \delta\).
Let \(F_1, \, F_2, \, F_3\) denote the reliability functions of \(U, \, V, \, W\) respectively for \(([0, \infty), \le)\). Then \(F_1(t) = e^{-\alpha t}, \, F_2(t) = e^{-\beta t}, \, F_3(t) = e^{-\delta t}\) for \(t \in [0, \infty)\).
The motivation for the definition is a two-component system. Random variable \(U\) is the arrival time of a shock
that is fatal for component 1, but not component 2; random variable \(V\) is the arrival time of a shock that is fatal for component 2, but not component 1; and random variable \(W\) is the arrival time of an event that is fatal for both components.
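The construction above is easy to simulate directly. The following sketch (assuming NumPy; the function name `marshall_olkin_sample` is just illustrative) draws \((X, Y) = (U \wedge W, V \wedge W)\) and checks one marginal mean:

```python
import numpy as np

def marshall_olkin_sample(alpha, beta, delta, size, rng):
    """Draw (X, Y) = (U ∧ W, V ∧ W) where U, V, W are independent
    exponential variables with rates alpha, beta, delta."""
    u = rng.exponential(1 / alpha, size)  # shock fatal to component 1 only
    v = rng.exponential(1 / beta, size)   # shock fatal to component 2 only
    w = rng.exponential(1 / delta, size)  # shock fatal to both components
    return np.minimum(u, w), np.minimum(v, w)

rng = np.random.default_rng(0)
x, y = marshall_olkin_sample(alpha=1.0, beta=2.0, delta=0.5, size=100_000, rng=rng)
# Marginally, X is exponential with rate alpha + delta, so E(X) = 1/1.5
print(x.mean())  # ≈ 0.667
```

Note that NumPy's `exponential` is parameterized by the scale (mean), so rate \(\alpha\) corresponds to scale \(1/\alpha\).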
Suppose that \((X, Y)\) has the Marshall-Olkin distribution on \( \left([0, \infty)^2, +\right) \) with parameters \( \alpha, \, \beta, \, \delta \in (0, \infty) \). Then each of the following has an exponential distribution on \( ([0, \infty), +) \).
These results are well known in elementary probability, but also follow from the more general results in Section 2.7. In the notation of the construction above,
In particular, \( (X, Y) \) has a bivariate exponential distribution in the sense of the definition above.
Suppose that \((X, Y)\) has the Marshall-Olkin distribution on \( \left([0, \infty)^2, +\right) \) with parameters \(\alpha, \, \beta, \, \delta \in (0, \infty)\). Then \((X, Y)\) has reliability function \(H\) for \( \left([0, \infty)^2, +\right) \) given by \[H(x, y) = \exp[-\alpha x - \beta y - \delta (x \vee y)], \quad (x, y) \in [0, \infty)^2 \]
This also follows from the general theory in Section 2.7. In the notation of the construction above, the reliability function \(H\) of \((X, Y)\) is given by \(H(x, y) = F_1(x) F_2(y) F_3(x \vee y)\) for \((x, y) \in [0, \infty)^2\).
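As a numerical sanity check (a minimal sketch, assuming NumPy), the empirical value of \(\P(X \ge x, Y \ge y)\) can be compared with the closed form for \(H\):

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, beta, delta = 1.0, 2.0, 0.5
n = 200_000
u = rng.exponential(1 / alpha, n)
v = rng.exponential(1 / beta, n)
w = rng.exponential(1 / delta, n)
x, y = np.minimum(u, w), np.minimum(v, w)

def H(s, t):
    # H(x, y) = exp[-alpha x - beta y - delta (x ∨ y)]
    return np.exp(-alpha * s - beta * t - delta * max(s, t))

s, t = 0.3, 0.7
emp = np.mean((x >= s) & (y >= t))  # empirical P(X ≥ s, Y ≥ t)
print(emp, H(s, t))  # both ≈ 0.13
```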
Suppose that \((X, Y)\) has reliability function \(H\) on \( \left([0, \infty)^2, + \right) \). Then \((X, Y)\) has a Marshall-Olkin distribution if and only if \((X, Y)\) has a bivariate exponential distribution as in the definition above and satisfies the partial memoryless property
\[ H(t + x, t + y) = H(t, t) H(x, y), \quad x, \, y, \, t \in [0, \infty) \]This also follows from the general theory in Section 2.7, but can be verified easily from the reliability function above.
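For the direct verification, the key fact is that shifting both coordinates by \(t\) shifts the maximum by \(t\), that is, \((t + x) \vee (t + y) = t + (x \vee y)\). Hence

```latex
\begin{align*}
H(t + x, t + y) &= \exp[-\alpha(t + x) - \beta(t + y) - \delta((t + x) \vee (t + y))] \\
&= \exp[-\alpha(t + x) - \beta(t + y) - \delta(t + (x \vee y))] \\
&= \exp[-(\alpha + \beta + \delta) t] \exp[-\alpha x - \beta y - \delta (x \vee y)] \\
&= H(t, t) H(x, y)
\end{align*}
```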
Let \(S_1 = \{(x, y) \in [0, \infty)^2: x \lt y\}\), \(S_2 = \{(x, y) \in [0, \infty)^2: y \lt x\}\), and let \(\Delta = \{(x, x): x \in [0, \infty)\}\) denote the diagonal of \([0, \infty)^2\). Then \((S_1, +)\) and \((S_2, +)\) are sub-semigroups of \(([0, \infty)^2, +)\) (with invariant measure \(\lambda^2\)), while \((\Delta, +)\) is a sub-semigroup isomorphic to \(([0, \infty), +)\), with invariant measure \(\lambda_1\) given by \(\lambda_1(A) = \lambda(A_1)\) for measurable \(A \subseteq \Delta\), where \(A_1 = \{x \in [0, \infty): (x, x) \in A\}\). A Marshall-Olkin distribution on \([0, \infty)^2\) is a mixed distribution, with positive probability on \(\Delta\).
Suppose that \((X, Y)\) has the Marshall-Olkin distribution on \(([0, \infty)^2, +)\) with parameters \(\alpha, \, \beta, \, \delta \in (0, \infty)\). Then
This follows from standard results on independent exponential variables, by considering the 6 possible orderings of \((U, V, W)\):
Parts (a) and (b) can also be computed from the reliability function \(H\) above via \[h(x, y) = \frac{\partial^2}{\partial x \, \partial y} H(x, y), \quad (x, y) \in S_1 \cup S_2\]
So \(h\) is the density of the absolutely continuous part of the distribution on \(S_1 \cup S_2\), while part (c) gives the density of the singular part of the distribution on \(\Delta\). In particular, \[\P(X = Y) = \frac{\delta}{\alpha + \beta + \delta}\] The rate function of \((X, Y)\) (with respect to the appropriate measures) is constant on the three parts of the domain above.
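The positive probability on the diagonal is easy to see in simulation: \(X = Y\) occurs exactly when the common shock \(W\) arrives before both \(U\) and \(V\), and the equality is then exact even in floating point. A minimal sketch (assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(2)
alpha, beta, delta = 1.0, 2.0, 0.5
n = 100_000
u = rng.exponential(1 / alpha, n)
v = rng.exponential(1 / beta, n)
w = rng.exponential(1 / delta, n)
x, y = np.minimum(u, w), np.minimum(v, w)

# X = Y exactly when W < U and W < V, since then X = Y = W
p_emp = np.mean(x == y)
print(p_emp, delta / (alpha + beta + delta))  # both ≈ 0.143
```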
Suppose again that \((X, Y)\) has the Marshall-Olkin distribution on \(([0, \infty)^2, +)\) with parameters \(\alpha, \, \beta, \, \delta \in (0, \infty)\). Then relative to this semigroup,
So if \(\alpha = \beta\) then \((X, Y)\) has constant rate \(\alpha (\alpha + \delta)\) on \(S_1 \cup S_2\) (with respect to \(\lambda^2\)). If \(\alpha \in (0, 1)\) and \(\delta = \alpha^2 / (1 - \alpha)\) then \((X, Y)\) has constant rate on all of \([0, \infty)^2\) (but of course relative to different measures on \(S_1 \cup S_2\) and \(\Delta\)).
Suppose again that \((X, Y)\) has the Marshall-Olkin distribution on \(([0, \infty)^2, +)\) with parameters \(\alpha, \, \beta, \, \delta \in (0, \infty)\). Then
Parts (a) and (b) follow immediately from the results above. Part (c) can be computed using the density functions above, but a conditioning argument, as in the previous proposition, also works: \begin{align*} \E(XY, U \lt V \lt W) &= \frac{\alpha^2 \beta + 3 \alpha \beta^2 + 3 \alpha \beta \delta}{(\alpha + \beta + \delta)^3 (\beta + \delta)^2} \\ \E(XY, V \lt U \lt W) &= \frac{\alpha \beta^2 + 3 \alpha^2 \beta + 3 \alpha \beta \delta}{(\alpha + \beta + \delta)^3 (\alpha + \delta)^2} \\ \E(XY, U \lt W \lt V) &= \frac{\alpha^2 \delta + 3 \alpha \delta^2 + 3 \alpha \beta \delta}{(\alpha + \beta + \delta)^3 (\beta + \delta)^2} \\ \E(XY, V \lt W \lt U) &= \frac{\beta^2 \delta + 3 \beta \delta^2 + 3 \alpha \beta \delta}{(\alpha + \beta + \delta)^3 (\alpha + \delta)^2} \\ \E(XY, W \lt U \wedge V) &= \frac{2 \delta}{(\alpha + \beta + \delta)^3} \end{align*} Summing and simplifying gives \[\E(X Y) = \frac{1}{\alpha + \beta + \delta} \left(\frac{1}{\alpha + \delta} + \frac{1}{\beta + \delta}\right)\] and then part (c) follows.
It's interesting that \(\cor(X, Y) = \P(X = Y)\).
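Both the product moment and the correlation identity \(\cor(X, Y) = \P(X = Y) = \delta / (\alpha + \beta + \delta)\) can be checked by Monte Carlo (a minimal sketch, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(3)
alpha, beta, delta = 1.0, 2.0, 0.5
n = 500_000
u = rng.exponential(1 / alpha, n)
v = rng.exponential(1 / beta, n)
w = rng.exponential(1 / delta, n)
x, y = np.minimum(u, w), np.minimum(v, w)

s = alpha + beta + delta
exy = (1 / s) * (1 / (alpha + delta) + 1 / (beta + delta))  # E(XY) from the proposition
corr = np.corrcoef(x, y)[0, 1]
print(np.mean(x * y), exy)  # both ≈ 0.305
print(corr, delta / s)      # both ≈ 0.143
```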
The extension of the Marshall-Olkin distribution to higher dimensions is a bit complicated, and requires some additional notation to state the definition and results cleanly, just as in Section 2.7 in the abstract setting. For this subsection, fix \(n \in \N_+\). Let \(B_n\) denote the set of bit strings \(b = b_1 b_2 \cdots b_n\) of length \(n\), excluding the all-zero string.
Suppose that \(\{Z_b: b \in B_n\}\) is a collection of independent variables, and that \(Z_b\) has the exponential distribution on \(([0, \infty), +)\) with rate \(\alpha_b \in (0, \infty)\). Define \[ X_i = \min\{Z_b: b \in B_n, b_i = 1\}, \quad i \in \{1, 2, \ldots, n\} \] Then \((X_1, X_2, \ldots, X_n)\) has the Marshall-Olkin distribution on \(([0, \infty)^n, +)\) with parameters \(\{\alpha_b: b \in B_n\}\).
So a collection of \(2^n - 1\) independent exponential variables on \(([0, \infty), +)\) is required for the construction of the Marshall-Olkin variable on \(([0, \infty)^n, +)\). The marginal distributions are of the same type.
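The general construction translates directly into code. In the sketch below (assuming NumPy; bit strings are coded as tuples of 0s and 1s, and `mo_sample` is an illustrative name), the shock \(Z_b\) is applied to every coordinate \(i\) with \(b_i = 1\):

```python
import itertools
import numpy as np

def mo_sample(n, alpha, size, rng):
    """alpha: dict mapping nonzero bit strings (tuples of 0/1) to rates.
    Returns an array of shape (size, n): X_i = min{Z_b : b_i = 1}."""
    x = np.full((size, n), np.inf)
    for b, rate in alpha.items():
        z = rng.exponential(1 / rate, size)  # shock Z_b with rate alpha_b
        for i in range(n):
            if b[i] == 1:
                x[:, i] = np.minimum(x[:, i], z)
    return x

rng = np.random.default_rng(4)
n = 3
# one rate for each of the 2^n - 1 = 7 nonzero bit strings
alpha = {b: 1.0 for b in itertools.product((0, 1), repeat=n) if any(b)}
x = mo_sample(n, alpha, 100_000, rng)
# X_i is exponential with rate sum{alpha_b : b_i = 1} = 4 here, so E(X_i) = 0.25
print(x.mean(axis=0))
```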
Suppose that \(\bs{X} = (X_1, X_2, \ldots, X_n)\) has a Marshall-Olkin distribution on \(([0, \infty)^n, +)\) with parameters \(\{\alpha_b \in (0, \infty): b \in B_n\}\), and that \((j_1, j_2, \ldots, j_k)\) is a subsequence of \((1, 2, \ldots, n)\). Then
In particular, \(X_i\) has an exponential distribution on \(([0, \infty), +)\) with rate \(\sum\{\alpha_b: b \in B_n, b_i = 1\}\), so \(\bs{X}\) has a multivariate exponential distribution in the sense of the definition above. From part (c), \(\bs{X}\) has positive probability on each of the \(2^n - n - 1\) hyperplanes in \([0, \infty)^n\).
Suppose that \(\bs{X} = (X_1, X_2, \ldots, X_n)\) has a Marshall-Olkin distribution on \(([0, \infty)^n, +)\) with parameters \(\{\alpha_b \in (0, \infty): b \in B_n\}\). Let \(H\) denote the reliability function of \(\bs{X}\) on \(([0, \infty)^n, +)\). Then \[H(x_1, x_2, \ldots, x_n) = \exp\left(-\sum_{b \in B_n} \alpha_b \, \max\{x_i: b_i = 1\}\right), \quad (x_1, x_2, \ldots, x_n) \in [0, \infty)^n \]
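The \(n\)-dimensional reliability function can be checked against simulation in the same way as the bivariate case. A minimal sketch (assuming NumPy; all rates set to \(1/2\) for concreteness):

```python
import itertools
import numpy as np

rng = np.random.default_rng(5)
n = 3
bits = [b for b in itertools.product((0, 1), repeat=n) if any(b)]
alpha = {b: 0.5 for b in bits}

size = 200_000
x = np.full((size, n), np.inf)
for b in bits:
    z = rng.exponential(1 / alpha[b], size)
    mask = np.array(b, dtype=bool)
    x[:, mask] = np.minimum(x[:, mask], z[:, None])

def H(point):
    # H(x) = exp(-sum over b of alpha_b * max{x_i : b_i = 1})
    s = sum(alpha[b] * max(point[i] for i in range(n) if b[i] == 1) for b in bits)
    return np.exp(-s)

pt = (0.2, 0.4, 0.1)
emp = np.mean(np.all(x >= np.array(pt), axis=1))  # empirical P(X_1 ≥ 0.2, X_2 ≥ 0.4, X_3 ≥ 0.1)
print(emp, H(pt))  # both ≈ 0.35
```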
On the \(n!\) regions of \([0, \infty)^n\) defined by the strict orderings of the variables \((x_1, x_2, \ldots, x_n)\), the density function of \(\bs X\) (with respect to \(\lambda^n\)) can be found by differentiating the reliability function above: \[h(x_1, x_2, \ldots, x_n) = (-1)^n \frac{\partial^n}{\partial x_1 \, \partial x_2 \cdots \partial x_n} H(x_1, x_2, \ldots, x_n)\] The generalization of the partial memoryless property is straightforward.
Suppose again that \(\bs{X} = (X_1, X_2, \ldots, X_n)\) has a Marshall-Olkin distribution on \(([0, \infty)^n, +)\) with reliability function \(H\). Then \(\bs{X}\) has the partial memoryless property \[ H(t + x_1, t + x_2, \ldots, t + x_n) = H(t, t, \ldots, t) H(x_1, x_2, \ldots, x_n), \quad t \in [0, \infty), \, (x_1, x_2, \ldots, x_n) \in [0, \infty)^n \]
Suppose again that \(\bs{X} = (X_1, X_2, \ldots, X_n)\) has a Marshall-Olkin distribution on \(([0, \infty)^n, +)\) with parameters \(\{\alpha_b \in (0, \infty): b \in B_n\}\). For distinct \(i, \, j \in \{1, 2, \ldots, n\}\), \[\cor(X_i, X_j) = \P(X_i = X_j) = \frac{\sum\{\alpha_b: b \in B_n, b_i = b_j = 1\}}{\sum\{\alpha_b: b \in B_n, b_i = 1 \text{ or } b_j = 1\}}\]
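The pairwise identity can also be checked by Monte Carlo; as in the bivariate case, \(X_i = X_j\) exactly when the minimizing shock is one shared by coordinates \(i\) and \(j\). A minimal sketch (assuming NumPy, with all rates equal to 1):

```python
import itertools
import numpy as np

rng = np.random.default_rng(6)
n = 3
bits = [b for b in itertools.product((0, 1), repeat=n) if any(b)]
alpha = {b: 1.0 for b in bits}

size = 500_000
x = np.full((size, n), np.inf)
for b in bits:
    z = rng.exponential(1 / alpha[b], size)
    for k in range(n):
        if b[k] == 1:
            x[:, k] = np.minimum(x[:, k], z)

i, j = 0, 1
num = sum(alpha[b] for b in bits if b[i] == 1 and b[j] == 1)  # rates with b_i = b_j = 1
den = sum(alpha[b] for b in bits if b[i] == 1 or b[j] == 1)   # rates with b_i = 1 or b_j = 1
corr = np.corrcoef(x[:, i], x[:, j])[0, 1]
p_eq = np.mean(x[:, i] == x[:, j])
print(corr, p_eq, num / den)  # all ≈ 0.333
```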