1. Geometric Sums: The Rényi Theorem
Assume that all the random variables and processes mentioned below are defined on one and the same probability space $(\Omega, \mathcal{F}, \mathsf{P})$. Let $X_1, X_2, \ldots$ be independent identically distributed random variables. Let $N_p$ be a random variable with the geometric distribution
$$\mathsf{P}(N_p = k) = p(1-p)^{k-1}, \quad k = 1, 2, \ldots, \quad p \in (0, 1). \tag{1}$$
Assume that $N_p$ is independent of $X_1, X_2, \ldots$ Consider the random variables
$$S_{N_p} = X_1 + \cdots + X_{N_p}.$$
Random variables of this form are called geometric (random) sums.

The distribution function of a random variable $X$ will be denoted $F_X(x)$. Everywhere in what follows, let $E$ denote the random variable with the standard exponential distribution: $\mathsf{P}(E < x) = 1 - e^{-x}$ for $x \ge 0$. The uniform distance between the distributions of random variables $X$ and $Y$ will be denoted
$$\rho(X, Y) = \sup_x |F_X(x) - F_Y(x)|.$$
In what follows, the symbol $\mathcal{L}(X)$ will stand for the distribution of a random variable $X$. The symbol $\circ$ will denote the product of independent random variables.
The statement of the problem considered in this paper goes back to the mid-1950s, when A. Rényi noticed that any renewal point process, iteratively subjected to the operation of elementary rarefaction followed by an appropriate contraction of time, tends to the Poisson process [1,2]. The operation of elementary rarefaction assumes that each point of the point process, independently of the other points, is either removed with probability $1-p$ or remains as it is with probability $p$ ($0 < p < 1$). The limit Poisson process is characterized by the property that the distribution of the time intervals between successive points is exponential. Moreover, it is easy to see that, at each iteration of rarefaction, the time interval between successive points in the rarefied process is representable as the sum of a geometrically distributed random number of independent random variables in which the number of summands is independent of the summands. These objects are called geometric sums. Geometric sums proved to be important mathematical models in many fields, e.g., risk theory and insurance, reliability theory, etc. It suffices to mention the famous Pollaczek–Khinchin formula for the ruin probability in a classical risk process and some recent publications [3,4] dealing with important applications of geometric random sums and their generalizations to modeling counting processes.
The publication of [5] in 1984 strongly stimulated interest in the analytic and asymptotic properties of geometric sums. In that paper, the notions of geometric infinite divisibility and geometric stability were introduced. The geometric stability of a random variable $X$ means that if $X_1, X_2, \ldots$ are independent identically distributed random variables with the same distribution as that of $X$, and $N_p$ is the random variable with geometric distribution (1) independent of $X_1, X_2, \ldots$, then for each $p \in (0, 1)$ there exists a constant $a_p > 0$ such that
$$a_p\big(X_1 + \cdots + X_{N_p}\big) \stackrel{d}{=} X. \tag{2}$$
In [5] it was shown that geometrically stable distributions, and only they, can be limiting for geometric random sums. (For the case of nonnegative summands, this statement was earlier proven by I. N. Kovalenko [6] who, in terms of Laplace transforms, introduced the class of distributions that, as turned out later, actually coincides with the class of geometrically stable distributions on $[0, \infty)$.)
A significant contribution to the theory of geometric summation was made by V. V. Kalashnikov. The results were summarized in his wonderful and widely cited book [7]. That book was followed by many other important publications, for example, [8,9,10,11].
Formally, the Rényi theorem states that, as $p \to 0$ (or, which is the same, as the expectation of the sum infinitely increases), the distribution of a geometric sum normalized by its expectation converges to the exponential law: if $\mathsf{E}X_1 = 1$, then for every $x \ge 0$
$$\mathsf{P}\big(pS_{N_p} < x\big) \longrightarrow 1 - e^{-x} \quad \text{as } p \to 0.$$
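As a quick numerical illustration of the theorem (a sketch, with hypothetical uniform $U(0,2)$ summands chosen so that $\mathsf{E}X_1 = 1$), one can simulate normalized geometric sums and watch the empirical uniform distance to $E$ shrink as $p$ decreases:

```python
import numpy as np

def normalized_geometric_sums(p, n, rng):
    """n samples of p * (X_1 + ... + X_{N_p}) with X_i ~ U(0, 2), E X_i = 1."""
    counts = rng.geometric(p, size=n)                 # N_p on {1, 2, ...}
    draws = rng.uniform(0.0, 2.0, size=counts.sum())
    offsets = np.concatenate(([0], np.cumsum(counts)[:-1]))
    return p * np.add.reduceat(draws, offsets)        # segment sums

def distance_to_exp(sample):
    """Empirical uniform (Kolmogorov) distance to the standard exponential law."""
    x = np.sort(sample)
    n = len(x)
    f = 1.0 - np.exp(-x)
    return float(max(np.max(np.arange(1, n + 1) / n - f),
                     np.max(f - np.arange(n) / n)))

rng = np.random.default_rng(12345)
for p in (0.5, 0.1, 0.02):
    print(p, round(distance_to_exp(normalized_geometric_sums(p, 100_000, rng)), 4))
```

The printed distances decrease with $p$, in agreement with the convergence rate bounds discussed in the next section.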
2. Convergence Rate Bounds in the Classical Rényi Theorem
The first result on the convergence rate in the Rényi theorem was obtained by A. D. Solovyev [12]. He considered the case of nonnegative summands $X_1, X_2, \ldots$ with $\mathsf{E}X_1 = 1$ and proved a bound for the uniform distance $\rho(pS_{N_p}, E)$ in terms of the moments of the summands. The result of Solovyev was extended by V. V. Kalashnikov and S. Yu. Vsekhsvyatskii to the case $\mathsf{E}|X_1|^s < \infty$, $1 < s \le 2$, in [13], where they showed that
$$\rho\big(pS_{N_p}, E\big) \le C\,p^{s-1}\,\mathsf{E}|X_1|^s,$$
with $C$ being a finite absolute constant; also see [14]. In [15], it was proven that if $\mathsf{E}X_1 = 1$ and $\mathsf{E}X_1^2 < \infty$, then this bound holds for $s = 2$ with an explicit constant.
In [7], cited above, the bounds for the rate of convergence in the Rényi theorem were formulated in terms of the Zolotarev $\zeta$-metric. To make the importance of the $\zeta$-metric clearer, recall that, by the definition of weak convergence, random variables $Y_1, Y_2, \ldots$ are said to converge to a random variable $Y$ weakly if $\mathsf{E}f(Y_n) \to \mathsf{E}f(Y)$ as $n \to \infty$ for any $f \in \mathcal{C}$, where the set $\mathcal{C}$ contains all continuous bounded functions. However, for the construction of convergence rate bounds, it is not convenient to use the quantities $|\mathsf{E}f(Y_n) - \mathsf{E}f(Y)|$, because the set $\mathcal{C}$ is too wide. V. M. Zolotarev noticed that for this purpose it is more appropriate to consider convergence only on some special sub-classes of $\mathcal{C}$. He suggested narrowing the set $\mathcal{C}$ to the class of differentiable bounded functions with Lipschitz derivatives. This suggestion resulted in the definition of the 'ideal' $\zeta$-metric.
The formal definition of the $\zeta_s$-metric is as follows. Let $s > 0$. The number $s$ can be uniquely represented as $s = m + \alpha$, where $m$ is an integer and $0 < \alpha \le 1$. Let $\mathcal{F}_s$ be the set of all real-valued bounded functions $f$ on $\mathbb{R}$ that are $m$ times differentiable and satisfy
$$\big|f^{(m)}(x) - f^{(m)}(y)\big| \le |x - y|^{\alpha}, \quad x, y \in \mathbb{R}.$$
In 1976, V. M. Zolotarev [16] introduced the $\zeta_s$-metric in the space of probability distributions by the equality
$$\zeta_s(X, Y) = \sup\big\{|\mathsf{E}f(X) - \mathsf{E}f(Y)| : f \in \mathcal{F}_s\big\};$$
also see [17,18]. In particular, it can be proved that
$$\zeta_1(X, Y) = \int_{-\infty}^{\infty}\big|F_X(x) - F_Y(x)\big|\,dx; \tag{5}$$
e.g., see the derivation of Equation (1.4.23) in [18].
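The identity for $\zeta_1$ is easy to check numerically. In the sketch below (an illustration only), the test pair is $X \sim \mathrm{Exp}(1)$ and $Y \sim \mathrm{Exp}(2)$; since $F_X \le F_Y$ pointwise, the integral equals the difference of the means, $1 - 1/2 = 1/2$:

```python
import numpy as np

# zeta_1 as the L1 distance between distribution functions:
# for X ~ Exp(1), Y ~ Exp(2) we have F_X <= F_Y, so the integral equals
# E[X] - E[Y] = 1 - 1/2 = 1/2 exactly.
x = np.linspace(0.0, 60.0, 1_000_001)
g = np.abs((1.0 - np.exp(-x)) - (1.0 - np.exp(-2.0 * x)))
zeta1 = float(np.sum(0.5 * (g[1:] + g[:-1]) * np.diff(x)))  # trapezoidal rule
print(zeta1)  # close to 0.5
```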
The following properties of the $\zeta_s$-metrics will be used below. First of all, any probability metric satisfies the triangle inequality and, therefore,
$$\zeta_s(X, Y) \le \zeta_s(X, Z) + \zeta_s(Z, Y)$$
for any random variables $X$, $Y$, and $Z$. Some other properties of $\zeta_s$-metrics will be presented in the form of lemmas.

Lemma 1. For any random variables $X$ and $Y$ and any $c > 0$,
$$\zeta_s(cX, cY) = c^{s}\,\zeta_s(X, Y).$$

Lemma 2. Let $X$, $Y$, and $Z$ be random variables such that both $X$ and $Y$ are independent of $Z$. Then
$$\zeta_s(X + Z, Y + Z) \le \zeta_s(X, Y).$$

For the proofs of these statements, see that of Theorem 1.4.2 in [18]. The property of $\zeta_s$-metrics stated by Lemma 1 is called homogeneity of order $s$ of the metric $\zeta_s$, whereas its property stated in Lemma 2 is called regularity. The proof of regularity of the metric $\zeta_s$ given in [18] can be easily extended to sums of an arbitrary number of independent random variables. Namely, for any $n \in \mathbb{N}$, let $X_1, \ldots, X_n$ and $Y_1, \ldots, Y_n$ be two sets of independent random variables. Then
$$\zeta_s\Big(\sum_{i=1}^{n}X_i,\ \sum_{i=1}^{n}Y_i\Big) \le \sum_{i=1}^{n}\zeta_s(X_i, Y_i). \tag{7}$$
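For $n = 2$, the extension is a single application of the triangle inequality combined with the regularity of Lemma 2 (the general case follows by induction):

```latex
\zeta_s(X_1 + X_2,\, Y_1 + Y_2)
  \le \zeta_s(X_1 + X_2,\, Y_1 + X_2) + \zeta_s(Y_1 + X_2,\, Y_1 + Y_2)
  \le \zeta_s(X_1, Y_1) + \zeta_s(X_2, Y_2),
```

where Lemma 2 is applied first with $Z = X_2$ and then with $Z = Y_1$; the required independence holds because the variables within each set are independent of those in the other.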
In what follows, we will sometimes use the following semi-additivity property of the $\zeta_s$-metric.

Lemma 3. Let $X$ and $Y$ be random variables with the distribution functions $F_X$ and $F_Y$, respectively. Let $Z$ be a random variable, and let $F_X(\cdot\,|\,z)$ and $F_Y(\cdot\,|\,z)$ be conditional distribution functions of $X$ and $Y$ given $Z = z$, respectively, so that
$$F_X(x) = \int F_X(x\,|\,z)\,d\mathsf{P}_Z(z), \qquad F_Y(x) = \int F_Y(x\,|\,z)\,d\mathsf{P}_Z(z).$$
Then
$$\zeta_s(X, Y) \le \int \zeta_s\big(F_X(\cdot\,|\,z),\ F_Y(\cdot\,|\,z)\big)\,d\mathsf{P}_Z(z).$$
For the proof, see Proposition 2.1.2 in [7] or Lemma 3 in [19].

The tractability of $\zeta_s$-metrics in terms of weak convergence and their attractive properties inspired Zolotarev to call these metrics ideal.
Let us return to the discussion of the convergence rate estimates in the Rényi theorem. The results presented in [7,14] concern geometric sums of not necessarily nonnegative summands and are as follows. Let $1 < s \le 2$ and $\zeta_s(X_1, E) < \infty$. Then
$$\zeta_s\big(pS_{N_p}, E\big) \le p^{s-1}\,\zeta_s(X_1, E), \tag{8}$$
together with a companion bound (9) for the uniform distance. These results actually present estimates of the geometric stability of the exponential distribution. It should be noted that the definition of the $\zeta$-metric used in [7] was more general than that used by Zolotarev, in that the boundedness of functions of the class $\mathcal{F}_s$ was not assumed.
I. G. Shevtsova and M. A. Tselishchev [20] proved a general result for independent and not necessarily identically distributed random summands $X_1, X_2, \ldots$, but with identical nonzero expectations (say, equal to $a$) and finite second moments, that implies the bounds (10) and (11). In [19], it was proved that if $\mathsf{E}X_1 = a \ne 0$ and $\mathsf{E}X_1^2 < \infty$, then the analogous bound (12) holds for the metric $\zeta_2$. Inequalities (10) and (12) establish the best known moment bounds for the convergence rate in the classical Rényi theorem in terms of the $\zeta$-metrics of the first and second orders.
3. Generalizations of the Rényi Theorem
The normalization of a sum of random variables by its expectation in the classical Rényi theorem is traditional for the laws of large numbers. Therefore, it is possible to regard the Rényi theorem as the law of large numbers for geometric sums. In its general form, the law of large numbers for random sums in which the summands are independent and identically distributed random variables was proven in [21]. It was demonstrated in that paper that the distribution of a non-randomly normalized random sum converges to some distribution if and only if the distribution of the number of summands under the same normalization converges to the same distribution (up to a scale parameter).
Inequalities (11) and (12) were obtained as particular cases of a more general result concerning mixed Poisson random sums since, as is known, the geometric distribution of the random variable $N_p$ can be represented in terms of a mixed Poisson law:
$$\mathsf{P}(N_p = k + 1) = \int_0^{\infty} e^{-\lambda}\frac{\lambda^{k}}{k!}\,d\mathsf{P}_{\mu E}(\lambda), \quad k = 0, 1, 2, \ldots, \tag{13}$$
with $\mu = (1-p)/p$. Representation (13) points at a natural direction of development of the studies related to the Rényi theorem, leading to a more general class of possible limit laws and, correspondingly, to a more general set of possible distributions of the number of summands.
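The mixed Poisson representation of the geometric law is easy to verify by simulation. The sketch below assumes the mixing variable $((1-p)/p)\,E$ and a unit shift placing the support on $\{1, 2, \ldots\}$ (a convention only, since different sources shift the geometric law differently):

```python
import numpy as np

rng = np.random.default_rng(2024)
p, n = 0.3, 1_000_000
mu = (1.0 - p) / p

# Mixed Poisson draw: Lambda = mu * E with E ~ Exp(1), then Poisson(Lambda),
# shifted by one so that the support is {1, 2, ...}.
mixed = rng.poisson(mu * rng.exponential(size=n)) + 1

# Compare the empirical pmf with the geometric probabilities p (1 - p)^(k-1).
for k in range(1, 6):
    print(k, round(float(np.mean(mixed == k)), 4), round(p * (1 - p) ** (k - 1), 4))
```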
A direct way to construct generalizations of the geometric distribution is to replace the mixing exponential distribution in (13) by a more general distribution from some class $\mathcal{Q}$ containing the exponential distribution.
Let $N(t)$, $t \ge 0$, be the standard Poisson process (that is, the Poisson process with unit intensity), and let $\Lambda$ be a nonnegative random variable independent of $N(t)$ for each $t > 0$. Let the random variable $N_\Lambda$ be defined as
$$N_\Lambda = N(\Lambda).$$
The random variable so defined has a mixed Poisson distribution:
$$\mathsf{P}(N_\Lambda = k) = \int_0^{\infty} e^{-\lambda}\frac{\lambda^{k}}{k!}\,d\mathsf{P}_\Lambda(\lambda), \quad k = 0, 1, 2, \ldots,$$
where $\mathsf{P}_\Lambda$ is the distribution function of $\Lambda$. Mixed Poisson distributions constitute a very wide class.
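For a concrete check of the mixed Poisson formula, one can take gamma mixing, for which the mixture is negative binomial in closed form (the parameter values below are arbitrary):

```python
import math
import numpy as np

r, mu = 2.5, 0.4   # gamma mixing density: mu^r x^(r-1) e^(-mu x) / Gamma(r)
lam = np.linspace(1e-8, 200.0, 400_001)
mix = np.exp(r * math.log(mu) + (r - 1) * np.log(lam) - mu * lam - math.lgamma(r))

def mixed_poisson_pmf(k):
    """P(N = k): integral of e^(-lam) lam^k / k! against the mixing density."""
    f = np.exp(-lam + k * np.log(lam) - math.lgamma(k + 1)) * mix
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(lam)))  # trapezoid

def negative_binomial_pmf(k):
    p = mu / (1.0 + mu)   # success probability matching the gamma mixing law
    return math.exp(math.lgamma(r + k) - math.lgamma(k + 1) - math.lgamma(r)
                    + r * math.log(p) + k * math.log(1.0 - p))

for k in range(5):
    print(k, round(mixed_poisson_pmf(k), 6), round(negative_binomial_pmf(k), 6))
```

The two columns agree to quadrature accuracy, anticipating the negative binomial construction discussed next.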
In [19], it was proven that if $\mathsf{E}X_1$ is finite and $\Lambda$ is some nonnegative random variable, then for $1 < s \le 2$ we have the general bound (15) for the $\zeta_s$-distance between the normalized mixed Poisson random sum and its limit law. If, in addition, $\mathsf{E}|X_1|^s < \infty$, then the moment-type bound (16) holds. In particular, explicit forms of these bounds for $s = 1$ and $s = 2$ are given in [19].
Of course, the first idea is to replace the exponential mixing distribution in (13) by the gamma distribution. Let $G_{r,\mu}$ be a random variable with the gamma distribution with parameters $r$ and $\mu$ corresponding to the probability density function
$$g(x; r, \mu) = \frac{\mu^{r}x^{r-1}}{\Gamma(r)}\,e^{-\mu x}, \quad x \ge 0, \tag{17}$$
where $r > 0$, $\mu > 0$. Let the number of summands be $N(G_{r,\mu})$ with $\mu = p/(1-p)$, $p \in (0, 1)$. Then this random variable has the negative binomial distribution with parameters $r$ and $p$:
$$\mathsf{P}\big(N(G_{r,\mu}) = k\big) = \frac{\Gamma(r+k)}{k!\,\Gamma(r)}\,p^{r}(1-p)^{k}, \quad k = 0, 1, 2, \ldots$$
Its expectation equals $r(1-p)/p$, so that, for each $r > 0$, the negative binomial random sum normalized by its expectation converges in distribution to $G_{r,r}$; that is, the limit distribution in the law of large numbers for negative binomial random sums (or the "generalized Rényi theorem") is gamma with both the shape and scale parameters equal to $r$, and for $1 < s \le 2$ the bound (18) holds. In particular, (19) presents this bound for $s = 1$.
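This "generalized Rényi theorem" can be illustrated by simulation. The sketch below samples the negative binomial count via the gamma–Poisson mixture, uses hypothetical $U(0,2)$ summands with unit mean, and compares the normalized sums with a gamma sample of shape $r$ and mean one:

```python
import numpy as np

rng = np.random.default_rng(99)
r, p, n = 2.5, 0.02, 50_000

def nb_random_sums(r, p, n, rng):
    """n normalized negative binomial random sums of U(0, 2) summands."""
    # N is negative binomial via the gamma-Poisson (mixed Poisson) mixture
    counts = rng.poisson(rng.gamma(shape=r, scale=(1.0 - p) / p, size=n))
    draws = np.append(rng.uniform(0.0, 2.0, size=counts.sum()), 0.0)
    offsets = np.concatenate(([0], np.cumsum(counts)[:-1]))
    sums = np.add.reduceat(draws, offsets)
    sums[counts == 0] = 0.0            # reduceat leaves a stray element there
    return sums / (r * (1.0 - p) / p)  # divide by E[N] * E[X_1]

sample = nb_random_sums(r, p, n, rng)
limit = rng.gamma(shape=r, scale=1.0 / r, size=n)   # gamma limit with mean 1

# two-sample Kolmogorov-type distance on a grid
grid = np.linspace(0.0, 5.0, 501)
d = float(np.max(np.abs(np.searchsorted(np.sort(sample), grid)
                        - np.searchsorted(np.sort(limit), grid)) / n))
print(round(float(np.mean(sample)), 3), round(d, 4))
```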
To make the parametrization of the distribution of the number of summands more traditional by using the parameters $r$ and $p$, let us denote this random variable in an alternative way: $N_{r,p} = N(G_{r,\,p/(1-p)})$. In these terms, (18) and (19) can be rewritten as (20) and (21). With $r = 1$, bounds (20) and (21) turn into (11) and (12), respectively.
For the case $s = 2$ in this problem, a bound more accurate in $p$ (but less accurate in $r$) was independently obtained in [22]; it involves $\lceil r \rceil$, the least integer no less than $r$. In [19], more examples can be found, say, the upper bounds for the $\zeta_s$-distance between the distribution of a generalized negative binomial random sum and the generalized gamma distribution, with $1 < s \le 2$. (For the case $s = 2$ in that problem, a more accurate bound was independently obtained in [22].)
4. Convergence Rate Bounds for Mixed Geometric Sums
Another reasonable way to generalize the geometric distribution is to take as $\mathcal{Q}$ the class of mixed exponential distributions. This class is very wide and actually contains all distributions with distribution functions $F$ such that $1 - F$ is the Laplace transform of some other probability distribution on the nonnegative half-line. For example, this class contains Weibull distributions with shape parameter at most one, Pareto-type distributions, exponential power distributions with shape parameter at most one, gamma distributions with shape parameter at most one, Mittag-Leffler distributions, one-sided Linnik distributions, etc.
Let $Q$ be a nonnegative random variable independent of $E$, and let the mixing random variable be $\Lambda = \mu\,Q \circ E$ with some $\mu > 0$. In this case, the distribution function $F$ of $Q \circ E$ satisfies $1 - F(x) = \mathsf{E}e^{-x/Q}$; that is, $1 - F$ is the Laplace transform of the random variable $Q^{-1}$. It is easy to see that a mixed Poisson random sum with the mixing random variable $\Lambda = \mu\,Q \circ E$ is a mixed geometric random sum. Indeed, because $N(t)$, $Q$, and $E$ are assumed independent, by the Fubini theorem, for $k = 0, 1, 2, \ldots$ we have
$$\mathsf{P}(N_\Lambda = k) = \int_0^{\infty}\!\!\int_0^{\infty} e^{-\mu q y}\,\frac{(\mu q y)^{k}}{k!}\,e^{-y}\,dy\,d\mathsf{P}_Q(q) = \int_0^{\infty}\frac{1}{1+\mu q}\Big(\frac{\mu q}{1+\mu q}\Big)^{k}d\mathsf{P}_Q(q). \tag{22}$$
Here, the integrands are geometric probabilities with the parameter $p(q) = 1/(1+\mu q)$. Moreover, the convergence rate bound (23) holds and, in particular, its moment version (24). These inequalities are particular cases of (15) and (16).
In addition to the examples presented in [19], consider some more particular cases of (16).

Example 1. The case where $Q$ has the (generalized) gamma distribution. We say that a random variable has the generalized gamma distribution (GG distribution) if its density has the form
$$f(x) = \frac{|\gamma|\,\mu^{r}}{\Gamma(r)}\,x^{\gamma r - 1}e^{-\mu x^{\gamma}}, \quad x \ge 0, \tag{25}$$
with $\gamma \ne 0$, $\mu > 0$, $r > 0$.
The class of GG distributions was proposed in 1925 by the Italian economist L. Amoroso [23] and is often associated with the work of E. W. Stacy [24], who introduced this family as the class of probability distributions containing both gamma and Weibull distributions. This family embraces practically all of the most popular absolutely continuous distributions on $[0, \infty)$. The GG distributions serve as reliable models in reliability testing, life-time analysis, image processing, economics, social network analysis, etc. Apparently, the GG distributions are popular because most of them are adequate asymptotic approximations appearing in limit theorems of probability theory in rather simple limit settings. An analog of the law of large numbers for random sums in which the GG distributions are limit laws was proven in [25]. In [26], the maximum entropy principle was used to justify the applicability of GG distributions; also see [27,28].
In this case, the random variable $N_\Lambda$ has the generalized negative binomial distribution [29]. For the convergence bounds for negative binomial random sums, see (20), (21) and [20], and for the convergence bounds for generalized negative binomial random sums, see [19,22].
As a particular case of the GG distribution, consider the Weibull distribution. In this case, in (25), $r = 1$ and $\gamma \in (0, 1]$. For convenience, without loss of generality, also assume that $\mu = 1$. In other words, we consider the case where the mixed exponential distribution is the Weibull distribution with $\mathsf{P}(W_\gamma > x) = e^{-x^{\gamma}}$, $x \ge 0$. In [30], it was demonstrated that if $\gamma \in (0, 1]$, then
$$W_\gamma \stackrel{d}{=} E \circ S_\gamma^{-1},$$
where $S_\gamma$ is a nonnegative random variable with the strictly stable distribution given by its characteristic function
$$\mathsf{E}e^{itS_\gamma} = \exp\Big\{-|t|^{\gamma}\exp\Big\{-\frac{i\pi\gamma}{2}\,\mathrm{sign}\,t\Big\}\Big\}. \tag{26}$$
This means that, in the case under consideration, $Q \stackrel{d}{=} S_\gamma^{-1}$. As this is so, in [31] it was proven that $\mathsf{E}S_\gamma^{-1} = \Gamma(1 + 1/\gamma)$, so that, in the case of the Weibull limit distribution with $\gamma \in (0, 1]$, bound (24) takes an explicit form.
Example 2. Let $P_r$ be a random variable with the Pareto type II distribution defined by the probability density
$$f(x) = \frac{r}{(1+x)^{r+1}}, \quad x \ge 0,$$
where $r > 0$. This distribution is also called the Lomax distribution [32]. This distribution is used in business, economics, actuarial science, Internet traffic modeling, queueing theory, and other fields. Consider the case where the mixed exponential distribution coincides with that of $P_r$. It is easy to see that in this case the random variable $Q$ has the inverse gamma distribution, that is, $Q \stackrel{d}{=} G_{r,1}^{-1}$, where $G_{r,1}$ has the gamma distribution with the probability density $g(x; r, 1)$ (see (17)), so that $\mathsf{E}Q = \mathsf{E}G_{r,1}^{-1} = \frac{1}{r-1}$ whenever $r > 1$. Hence, if $r > 1$, then $\mathsf{E}Q < \infty$, so that bound (24) takes an explicit form.
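The inverse gamma mixing claim admits a direct check: if $Q = 1/G_{r,1}$ and $E$ is standard exponential, the product $Q \circ E$ should exhibit the Lomax tail $(1+x)^{-r}$, because $\mathsf{E}e^{-xG_{r,1}}$ is the Laplace transform of the gamma law. A simulation sketch:

```python
import numpy as np

rng = np.random.default_rng(7)
r, n = 3.0, 1_000_000

# Lomax tail check: E / G_{r,1} should satisfy P(X > x) = (1 + x)^(-r).
sample = rng.exponential(size=n) / rng.gamma(shape=r, scale=1.0, size=n)
for x in (0.5, 1.0, 2.0):
    print(x, round(float(np.mean(sample > x)), 4), round((1.0 + x) ** (-r), 4))
```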
Example 3. Let $\gamma \in (0, 1]$. By $Y_\gamma$, we denote a random variable with the exponential power distribution defined by the density
$$f(x) = \frac{\gamma}{\Gamma(1/\gamma)}\,e^{-x^{\gamma}}, \quad x \ge 0.$$
Consider the case where the mixed exponential distribution is that of $Y_\gamma$. In [31], it was proven that if $\gamma \in (0, 1]$, then $Y_\gamma \stackrel{d}{=} E \circ U_\gamma$, where the random variable $U_\gamma$ has a probability density expressed through the density of the strictly stable distribution defined by the characteristic function (26). Moreover, in [31], it was demonstrated that $\mathsf{E}U_\gamma < \infty$ for $\gamma \in (0, 1]$. Therefore, in the case under consideration, $\mathsf{E}Q = \mathsf{E}U_\gamma < \infty$, and bound (24) takes an explicit form.
As one more example, consider convergence rate bounds for mixed Poisson random sums with the one-sided Linnik mixing distribution.
Example 4. In 1953, Yu. V. Linnik [33] introduced the class of symmetric distributions corresponding to the characteristic functions
$$\mathfrak{f}_\alpha(t) = \frac{1}{1 + |t|^{\alpha}}, \quad t \in \mathbb{R}, \tag{28}$$
where $\alpha \in (0, 2]$. If $\alpha = 2$, then the Linnik distribution turns into the Laplace distribution whose probability density has the form
$$\ell(x) = \frac{1}{2}\,e^{-|x|}, \quad x \in \mathbb{R}. \tag{29}$$
A random variable with Laplace density (29) will be denoted as $\Lambda_0$, and its distribution function as $F_{\Lambda_0}$.
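That the characteristic function of this Laplace density is the Linnik characteristic function with $\alpha = 2$ can be confirmed by numerical integration (assuming the standard scale $\tfrac12 e^{-|x|}$):

```python
import numpy as np

# Characteristic function of the Laplace density (1/2) e^(-|x|): by symmetry,
# E cos(tX) = integral_0^inf cos(t x) e^(-x) dx = 1 / (1 + t^2),
# i.e. the Linnik characteristic function with alpha = 2.
x = np.linspace(0.0, 80.0, 1_000_001)
for t in (0.0, 0.7, 1.5, 3.0):
    f = np.cos(t * x) * np.exp(-x)
    val = float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))  # trapezoid
    print(t, round(val, 6), round(1.0 / (1.0 + t * t), 6))
```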
An overview of the analytic properties of the Linnik distribution can be found in [30], with the main focus on the various mixture representations of this distribution. Apparently, the Linnik distributions are most often recalled as examples of geometric stable distributions supported by the whole real line. Moreover, the Linnik distributions exhaust the class of all symmetric geometrically strictly stable distributions (e.g., see [34]).

In what follows, the notation $L_\alpha$ will stand for a random variable with the Linnik distribution with parameter $\alpha$. The distribution function and density of $L_\alpha$ will be denoted as $F_{L_\alpha}(x)$ and $f_{L_\alpha}(x)$, respectively. It is easy to see that (28) and (29) imply that $L_2$ has the Laplace distribution (29).
In [30], the distribution of the random variable $|L_\alpha|$ with $\alpha \in (0, 2]$ was called the one-sided Linnik distribution.
In [35], it was proven that the Linnik distribution density admits an integral representation, (30), in terms of a strictly stable density. Hence, the density $f^{+}_\alpha$ of the one-sided Linnik law has the corresponding form (31). That is, $1 - F^{+}_\alpha$ is the Laplace transform of a nonnegative random variable whose probability density is determined by (31); hence, the one-sided Linnik distribution is mixed exponential.
In [30], it was shown that if $\alpha \in (0, 2)$, then the probability density of the corresponding mixing ratio has the form (32). Comparing representation (30) with (32), we come to the conclusion that (33) holds, where the two factors are independent nonnegative random variables with one and the same strictly stable distribution given by its characteristic function (26) with characteristic exponent $\alpha/2$ and, furthermore, (34) holds. Moreover, in [31], it was demonstrated that the bound (35) holds for $\alpha \in (0, 2)$.
Now consider a mixed Poisson random sum with the one-sided Linnik mixing distribution. From (31), (33), (23), and (35) with $1 < s \le 2$, we obtain the following statement.

Proposition 1. If $\alpha \in (0, 2)$, $\mathsf{E}X_1 = 1$, and $\mathsf{E}|X_1|^s < \infty$ for some $1 < s \le 2$, then the bound (36) holds.

Note that if $\alpha = 2$, then the one-sided Linnik distribution turns into the standard exponential distribution, and (36) turns into (12).
6. Convergence Rate Bounds for Mixed Poisson Random Sums with the Mittag-Leffler Mixing Distribution
Another important case of the class $\mathcal{Q}$ is the set of Mittag-Leffler distributions. This case is very interesting, as it illustrates that, for the law of large numbers for mixed Poisson random sums (that is, for the generalized Rényi theorem) to hold, it is not necessary that the expectation of the mixed Poisson random sum exist.
Assume that, in (13), the mixing exponential distribution is replaced by the Mittag-Leffler distribution given by its Laplace transform
$$\psi_\delta(s) = \frac{1}{1 + \lambda s^{\delta}}, \quad s \ge 0, \tag{42}$$
where $\delta \in (0, 1]$ and $\lambda > 0$. For convenience, without loss of generality, in what follows we will consider the case of the standard scale, assuming that $\lambda = 1$. As an aside, the class of Laplace transforms (42) coincides with the class introduced by I. N. Kovalenko [6] and, hence, from what has already been said, the Mittag-Leffler distributions exhaust the class of geometrically strictly stable distributions on $[0, \infty)$. A random variable with the Laplace transform (42) with $\lambda = 1$ will be denoted as $M_\delta$.
With $\delta = 1$, the Mittag-Leffler distribution turns into the standard exponential distribution; that is, $M_1 \stackrel{d}{=} E$. However, if $0 < \delta < 1$, then the Mittag-Leffler distribution has a heavy power-type tail:
$$\mathsf{P}(M_\delta > x) \sim \frac{x^{-\delta}}{\Gamma(1-\delta)} \quad \text{as } x \to \infty;$$
see, e.g., [36], so that the moments of the random variable $M_\delta$ of orders no less than $\delta$ are infinite.
But, as was shown in [21], the convergence of the distribution of a mixed Poisson random sum to the Mittag-Leffler distribution can also take place in the cases where the moments of the summands (expectations, variances, etc.) are finite. To make sure of this, consider the following convergence rate bounds. It is known that the Mittag-Leffler distribution admits the representation
$$M_\delta \stackrel{d}{=} E \circ S_\delta' \circ (S_\delta'')^{-1}, \tag{43}$$
that is, it is mixed exponential. Here, $S_\delta'$ and $S_\delta''$ are independent nonnegative random variables with one and the same strictly stable distribution given by its characteristic function (26) (see, e.g., [30]).
Now let $\Lambda = \mu M_\delta$ for some $\mu > 0$ and a fixed $\delta \in (0, 1]$. In this case, with the account of (32), relation (38) takes the form (44). For the case $s = 2$, relation (44) with $\zeta_2$ estimated by the right-hand side of (10) (being consistent with Corollary 1) gives the following bound.
Proposition 2. Let $M_\delta$ be a random variable with the Mittag-Leffler distribution, $\delta \in (0, 1]$. If $\mathsf{E}X_1 = 1$ and $\mathsf{E}X_1^2 < \infty$, then the bound (45) holds.

For other values of $s$, by the approach proposed in [19], it is possible to obtain an explicit (but possibly less accurate) estimate. If $1 < s < 2$, then $x^s \le 1 + x^2$ for each $x \ge 0$ and, hence, $\mathsf{E}|X_1|^s \le 1 + \mathsf{E}X_1^2 < \infty$. From (43), it follows that, in this case, the corresponding moment characteristics of the mixing distribution are finite as well, so that, for admissible $s$, the general bounds apply. Therefore, from (19) and (43), we obtain the following bound.

Proposition 3. If $\mathsf{E}X_1 = 1$ and $\mathsf{E}X_1^2 < \infty$, then, for $\delta \in (0, 1]$ and admissible $1 < s < 2$, the bound obtained by combining (19) and (43) holds.

7. Quantification of the Geometric Stability of the Mittag-Leffler and Linnik Distributions
This section concerns another property of some geometric sums. Namely, here we will consider the property of geometric stability of some probability distributions.
Recall that the geometric stability of the distribution of a random variable $X$ means that if $X_1, X_2, \ldots$ are independent identically distributed random variables with the same distribution as that of $X$, and $N_p$ is the random variable with geometric distribution (1) independent of $X_1, X_2, \ldots$, then for each $p \in (0, 1)$ there exists a constant $a_p > 0$ such that relation (2) holds. In what follows, we will concentrate our attention on the property of strict geometric stability, which means that in (2), the constants $a_p$ have the special form, namely, $a_p = \mu\,p^{1/\alpha}$ for some $\alpha \in (0, 2]$ and $\mu > 0$. For the sake of convenience and without loss of generality, assume that $\mu = 1$. As (2) holds for any $p \in (0, 1)$, we can let $p \to 0$, so that (2) can be also regarded as a 'limit theorem' for geometric sums in which, unlike Rényi-theorem-type laws of large numbers, the limit law, as $p \to 0$, is completely determined by the distribution of an individual summand.
In many papers, the Mittag-Leffler and Linnik distributions (for the corresponding definitions, see Section 6 and Example 4) are noted as examples of geometrically strictly stable distributions. As this is so, $\alpha = \delta$ for the Mittag-Leffler distribution with parameter $\delta$, and the exponent in $a_p = p^{1/\alpha}$ coincides with the parameter $\alpha$ for the Linnik distribution.
First, consider the Mittag-Leffler distribution. Let $M_\delta^{(1)}, M_\delta^{(2)}, \ldots$ be independent random variables with one and the same Mittag-Leffler distribution coinciding with that of $M_\delta$. Then, in accordance with (2), for any $p \in (0, 1)$,
$$M_\delta \stackrel{d}{=} p^{1/\delta}\sum_{j=1}^{N_p}M_\delta^{(j)}, \tag{48}$$
where the random variable $N_p$ has geometric distribution (1) and is independent of $M_\delta^{(1)}, M_\delta^{(2)}, \ldots$ The aim of the following statement is to illustrate this circumstance and generalize Kalashnikov's bound (8) to all geometrically stable distributions on $[0, \infty)$. In other words, the aim is to obtain an estimate for the stability of the representation of the Mittag-Leffler distribution as a geometric convolution.
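For $\delta = 1$, the geometric-convolution property can be observed exactly in simulation: a geometric convolution of standard exponential terms, scaled by $p$, is again standard exponential (the Mittag-Leffler law with $\delta = 1$). A sketch:

```python
import numpy as np

rng = np.random.default_rng(11)
p, n = 0.2, 1_000_000

# Geometric convolution of Exp(1) terms: the sum of N_p iid Exp(1) variables
# is Gamma(N_p, 1); scaled by p it is again exactly Exp(1) (the delta = 1 case).
counts = rng.geometric(p, size=n)
sums = p * rng.gamma(shape=counts, scale=1.0)
x = np.sort(sums)
ks = float(np.max(np.abs(np.arange(1, n + 1) / n - (1.0 - np.exp(-x)))))
print(round(ks, 4))   # pure sampling noise, on the order of 1/sqrt(n)
```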
Theorem 2. Let $M_\delta$ be a random variable with the Mittag-Leffler distribution, $\delta \in (0, 1]$, $\delta < s \le 2$, $\zeta_s(X_1, M_\delta) < \infty$. Then
$$\zeta_s\Big(p^{1/\delta}\sum_{j=1}^{N_p}X_j,\ M_\delta\Big) \le p^{s/\delta - 1}\,\zeta_s(X_1, M_\delta). \tag{46}$$

Proof. By virtue of (48), for any $p \in (0, 1)$, we have
$$\zeta_s\Big(p^{1/\delta}\sum_{j=1}^{N_p}X_j,\ M_\delta\Big) = \zeta_s\Big(p^{1/\delta}\sum_{j=1}^{N_p}X_j,\ p^{1/\delta}\sum_{j=1}^{N_p}M_\delta^{(j)}\Big),$$
where $M_\delta^{(1)}, M_\delta^{(2)}, \ldots$ are independent random variables with one and the same Mittag-Leffler distribution coinciding with that of $M_\delta$. Therefore, by Lemmas 3 and 1, with the account of (7), we have
$$\zeta_s\Big(p^{1/\delta}\sum_{j=1}^{N_p}X_j,\ M_\delta\Big) \le p^{s/\delta}\sum_{k=1}^{\infty}\mathsf{P}(N_p = k)\,k\,\zeta_s(X_1, M_\delta) = p^{s/\delta - 1}\,\zeta_s(X_1, M_\delta). \quad \square$$
Therefore, the appropriately scaled distribution of a geometric random sum may be close to the Mittag-Leffler distribution for two reasons: first, the parameter $p$ may be small enough, and/or, second, the distribution of a separate summand (say, $X_1$) may be close enough to the Mittag-Leffler distribution. In the first case, Theorem 1 serves as an illustration of the transfer theorem for random sums (e.g., see [27]). In this case, $\mathcal{L}(X_1)$ may not be close to $\mathcal{L}(M_\delta)$. The only requirement is that $\zeta_s(X_1, M_\delta)$ is finite. The finiteness of $\zeta_s(X_1, M_\delta)$ means that the tail of the distribution of $X_1$ is equivalent to that of $M_\delta$ as $x \to \infty$. However, this means that $X_1$ belongs to the domain of attraction of a 'usual' strictly stable distribution with the characteristic exponent $\delta$. As is known, in this case, the moments of the random variable $X_1$ of orders no less than $\delta$ do not exist [37]. As this is so, with small $p$, the number of summands in the geometric sum is large and, in accordance with the transfer theorem for random sums, the limit distribution of an appropriately normalized geometric sum has the form of a scale mixture of the strictly stable distribution with characteristic exponent $\delta$, whereas the mixing distribution is the limit law for the standardized number of summands, i.e., is exponential. However, in the situation under discussion, this mixture is exactly the Mittag-Leffler distribution; for details, see, e.g., [30] and the references therein. In the second case, the parameter $p$ may not be small, and the closeness of the distribution of a geometric sum to the Mittag-Leffler distribution can be provided by the smallness of the distance between $\mathcal{L}(X_1)$ and $\mathcal{L}(M_\delta)$.
As has already been said, estimate (46) makes sense if $\zeta_s(X_1, M_\delta) < \infty$. To clarify the meaning of this condition, consider the case $s = 1$. In this case, the metric $\zeta_1$ turns into the mean metric (5), also sometimes referred to as the Kantorovich or Wasserstein distance. Assume that $F_{X_1}(x) = F_{M_\delta}(x) + h(x)$, where $h(x)$ is the corresponding 'discrepancy'. In this case, the condition of finiteness of $\zeta_1(X_1, M_\delta)$ means that the discrepancy $h$ must be integrable:
$$\int_{-\infty}^{\infty}|h(x)|\,dx < \infty, \tag{47}$$
so that Theorem 2 implies the following statement.

Corollary 2. Let $\delta \in (0, 1)$ and $s = 1$. Assume that $\zeta_1(X_1, M_\delta) < \infty$, that is, (47) holds. Then
$$\zeta_1\Big(p^{1/\delta}\sum_{j=1}^{N_p}X_j,\ M_\delta\Big) \le p^{1/\delta - 1}\int_{-\infty}^{\infty}|h(x)|\,dx.$$

If $\delta = 1$, then the Mittag-Leffler distribution turns into the exponential distribution, so that bounds (8) and (9) can be used.
Now turn to the Linnik distribution. Let $L_\alpha^{(1)}, L_\alpha^{(2)}, \ldots$ be independent random variables with one and the same Linnik distribution coinciding with that of $L_\alpha$. Then, in accordance with (2), for any $p \in (0, 1)$,
$$L_\alpha \stackrel{d}{=} p^{1/\alpha}\sum_{j=1}^{N_p}L_\alpha^{(j)},$$
where the random variable $N_p$ has geometric distribution (1) and is independent of $L_\alpha^{(1)}, L_\alpha^{(2)}, \ldots$ Therefore, just as in the case of the Mittag-Leffler distribution, the appropriately scaled distribution of a geometric random sum may be close to the Linnik distribution for two reasons: first, the parameter $p$ may be small enough, and/or, second, the distribution of a separate summand (say, $X_1$) may be close enough to the Linnik distribution.

Theorem 3. Let $L_\alpha$ be a random variable with the Linnik distribution, $\alpha \in (0, 2)$, $\alpha < s \le 2$, $\zeta_s(X_1, L_\alpha) < \infty$. Then
$$\zeta_s\Big(p^{1/\alpha}\sum_{j=1}^{N_p}X_j,\ L_\alpha\Big) \le p^{s/\alpha - 1}\,\zeta_s(X_1, L_\alpha). \tag{49}$$

Proof. This theorem can be proved by exactly the same reasoning that was used to prove Theorem 2. $\square$

Denote $h(x) = F_{X_1}(x) - F_{L_\alpha}(x)$, $x \in \mathbb{R}$. Theorem 3 implies the following analog of Corollary 2 for the Linnik distribution.
Corollary 3. Let $\alpha \in (0, 1)$ and $s = 1$. Assume that the discrepancy $h$ is integrable:
$$\int_{-\infty}^{\infty}|h(x)|\,dx < \infty.$$
Then
$$\zeta_1\Big(p^{1/\alpha}\sum_{j=1}^{N_p}X_j,\ L_\alpha\Big) \le p^{1/\alpha - 1}\int_{-\infty}^{\infty}|h(x)|\,dx.$$

It should be noted that the conditions $s > \delta$ in Theorem 2 and $s > \alpha$ in Theorem 3, as well as the conditions on $s$ in Corollaries 1 and 2, were assumed only to provide the convergence of the right-hand sides of (46) and (49) to zero as $p \to 0$. However, in general, in these inequalities, other values of $s$ are also admissible since, as has already been said, with arbitrary fixed $p$, the smallness of the right-hand sides can be provided by the smallness of the $\zeta_s$-metrics between the distribution of an individual summand and the corresponding geometrically stable law.