1. Introduction
The exploitation of the regenerative structure has a long and successful history in the development of both theory and algorithms for Markov chains and processes, going back to the pioneering work of Doeblin [1], in which the central limit theorem for Markov chains was studied. In the 1970s, Athreya and Ney [2] and Nummelin [3] independently showed how Harris recurrent Markov chains can be viewed as regenerative stochastic processes. Sigman [4] later extended these regeneration ideas to the setting of Harris recurrent Markov processes in continuous time. Important contributions were also made to this literature on regeneration by Vladimir Kalashnikov, both in book form [5] and in various papers published over the course of his research life [6,7,8,9,10,11,12].
In view of Professor Kalashnikov’s major contributions to this research domain, this note also discusses regeneration. In particular, we provide a new characterization of the class of regenerative Markov processes. Specifically, we recall in Section 2 that such processes can be identified with the class of chains and processes that are recurrent in the sense of Harris. The main contribution of the current note is that this class of Markov processes is exactly the class for which there exists a single random time T (not necessarily a randomized stopping time) at which the chain or process has a distribution that does not depend on its initial state; see Theorems 3 and 5. With only this property assumed, the Markov process must in fact then be wide-sense regenerative in discrete time, and one-dependent regenerative in continuous time. A useful review of some of these regeneration concepts can be found in [13]. In particular, when a single such time T exists, such processes then necessarily possess an infinite sequence of randomized stopping times at which the process is identically distributed, and at which the process also exhibits various forms of “cycle independence” relative to that sequence of times. In other words, this seemingly weak property involving a single T is equivalent to the much stronger property that the process regenerates at an infinite sequence of randomized stopping times.

In this sense, there is some similarity to the results of [14], in which it is shown that for general (possibly non-Markov) stochastic processes, the existence of a single wide-sense (or classical) regeneration time implies the existence of an infinite sequence of such wide-sense (or classical) regeneration times. In contrast to their results, the current paper assumes a Markov structure, but makes no independence assumptions related to either T or the post-T process, and so makes much weaker demands on T.
2. Main Results
We start with a discussion in the setting of discrete-time Markov chains. Let $S$ be a separable metric space, and suppose that $\mathcal{S}$ is its associated Borel $\sigma$-algebra. Put $\Omega = S^{\infty}$, and let $\mathcal{F}$ be the associated $\sigma$-algebra on $\Omega$ induced by the product topology. For $\omega = (\omega_0, \omega_1, \ldots) \in \Omega$, let $X_i(\omega) = \omega_i$ (for $i \geq 0$) be the $i$’th coordinate projection. Given a one-step transition kernel $P = (P(x, dy) : x, y \in S)$ and $x \in S$, let $P_x$ be the probability on $(\Omega, \mathcal{F})$ under which $P_x(X_{i+1} \in dy \mid X_0, \ldots, X_i) = P(X_i, dy)$ for $i \geq 0$ (with $P_x(X_0 = x) = 1$), so that $X = (X_n : n \geq 0)$ is a Markov chain with transition kernel P starting from x.
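To make the coordinate-process construction concrete, here is a minimal simulation sketch; the particular kernel (a Gaussian random walk on $[0, \infty)$ reflected at the origin) and all parameter values are illustrative assumptions, not objects from this paper.

```python
import numpy as np

def sample_kernel(x, rng):
    """One draw from a hypothetical transition kernel P(x, dy):
    a Gaussian random walk on [0, infinity), reflected at the origin."""
    return abs(x + rng.normal(loc=-0.5, scale=1.0))

def simulate_chain(x0, n_steps, rng):
    """Sample (X_0, ..., X_n) under P_x: X_0 = x0 and, given the past,
    X_{i+1} is drawn from P(X_i, .)."""
    path = np.empty(n_steps + 1)
    path[0] = x0
    for i in range(n_steps):
        path[i + 1] = sample_kernel(path[i], rng)
    return path

rng = np.random.default_rng(0)
print(simulate_chain(x0=2.0, n_steps=10, rng=rng))
```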
Definition 1. We say that P induces a Harris recurrent Markov chain on S if there exists a non-trivial non-negative σ-finite measure η on $\mathcal{S}$ for which $\eta(A) > 0$ implies that
$$P_x(X_n \in A \text{ infinitely often}) = 1$$
for $x \in S$.

The theory developed by [2,3] showed that Harris recurrence is equivalent to wide-sense regeneration. To state this result, let $\Lambda = [0,1]^{\infty}$ and let $\mathcal{L}$ be the associated Borel product $\sigma$-algebra. Put $\tilde{\Omega} = \Omega \times \Lambda$, and let $\tilde{\mathcal{F}}$ be the associated Borel product $\sigma$-algebra. For $\tilde{\omega} = (\omega, \lambda) \in \tilde{\Omega}$, let $X_i(\tilde{\omega}) = \omega_i$ and $U_i(\tilde{\omega}) = \lambda_i$ for $i \geq 0$. Given a transition kernel P on $(S, \mathcal{S})$, we say that the family $(\tilde{P}_x : x \in S)$ of probabilities on $(\tilde{\Omega}, \tilde{\mathcal{F}})$ is consistent with P if for each $x \in S$ and $B \in \mathcal{F}$,
- (i) $\tilde{P}_x((X_n : n \geq 0) \in B) = P_x(B)$;
- (ii) $(U_n : n \geq 0)$ is a sequence of independent and identically distributed (iid) random variables (rv’s) under $\tilde{P}_x$.

The existence of the sequence $(U_n : n \geq 0)$ on the same probability space that supports the Markov chain allows the possibility of constructing random times that exhibit regeneration structure.
Definition 2. We say that P induces wide-sense regeneration if for some family $(\tilde{P}_x : x \in S)$ of probabilities on $(\tilde{\Omega}, \tilde{\mathcal{F}})$ consistent with P, there exist strictly increasing random times $0 \leq T(0) < T(1) < \cdots$ and a probability $\varphi$ on S such that for each $n \geq 0$ and $x \in S$,
- (i) $(X_{T(n)+j} : j \geq 0)$ is independent of $(T(0), T(1), \ldots, T(n))$ under $\tilde{P}_x$;
- (ii) $\tilde{P}_x(X_{T(n)} \in \cdot) = \varphi(\cdot)$.
In the presence of wide-sense regeneration, one may analyze X via the use of renewal equations, thereby greatly simplifying the theory of such Markov chains. Refs. [2,3] essentially proved the following result. (They proved the “only if” direction. The proof uses the existence of C-sets to construct the wide-sense regeneration. The C-set construction, in turn, uses the fact that $\mathcal{S}$ is countably generated. This is why we assume that $\mathcal{S}$ is the Borel $\sigma$-algebra of a separable metric space. The converse follows from, for example, our Theorem 3.)
Theorem 1. The transition kernel P induces a Harris recurrent Markov chain if and only if P induces wide-sense regeneration.
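To illustrate how the auxiliary randomization can be used to build regeneration times of the kind appearing in Theorem 1, the following sketch implements the classical Athreya–Ney–Nummelin splitting for a hypothetical three-state chain satisfying a one-step minorization $P(x, \cdot) \ge \lambda\,\varphi(\cdot)$ for every $x$ (so the whole space plays the role of the small set); the chain, the small set, and the constants are assumptions made purely for illustration.

```python
import numpy as np

# Hypothetical 3-state transition matrix, used purely for illustration.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4]])

# Minorization: P(x, .) >= lam * phi(.) for every state x,
# with phi the normalized column-wise minimum of P.
col_min = P.min(axis=0)
lam = col_min.sum()                         # regeneration probability per step
phi = col_min / lam                         # regeneration measure
residual = (P - lam * phi) / (1.0 - lam)    # residual kernel

def split_chain(x0, n_steps, rng):
    """Simulate the split chain: at each step, an auxiliary uniform decides
    whether the next state is drawn from phi (a regeneration) or from the
    residual kernel. Returns the path and the regeneration times."""
    path, regen_times, x = [x0], [], x0
    for n in range(n_steps):
        if rng.uniform() < lam:             # U_n < lambda: regenerate
            x = rng.choice(3, p=phi)
            regen_times.append(n + 1)
        else:
            x = rng.choice(3, p=residual[x])
        path.append(x)
    return np.array(path), regen_times

rng = np.random.default_rng(1)
path, regen_times = split_chain(x0=0, n_steps=20, rng=rng)
print(path, regen_times)
```

At the recorded times, the state is distributed as $\varphi$ regardless of the past, so these times form classical (hence, in particular, wide-sense) regeneration times for this illustrative chain.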
A related, but different, form of regeneration is the following.
Definition 3. We say that P induces one-dependent regeneration if for some family $(\tilde{P}_x : x \in S)$ on $(\tilde{\Omega}, \tilde{\mathcal{F}})$ consistent with P, there exist strictly increasing random times $(T(n) : n \geq 0)$ with $T(0) \geq 0$, such that for each $x \in S$, the sequence of cycles
$$W_n \triangleq \big( (X_{T(n)+j} : 0 \leq j < T(n+1) - T(n)),\; T(n+1) - T(n) \big), \quad n \geq 0,$$
is one-dependent (in n) under $\tilde{P}_x$, and the distribution of $W_n$ is independent of n and x.

Using the Athreya–Ney–Nummelin regenerative construction, ref. [15] established the “only if” direction of the following result. As for Theorem 1, the other direction follows, for example, from Theorem 3.
Theorem 2. The transition kernel P induces a Harris recurrent Markov chain if, and only if, P induces one-dependent regeneration.
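For intuition (this is a standard observation, and not necessarily the exact route taken in [15]), one-dependence typically arises when the splitting construction is driven by an $m$-step minorization condition with $m > 1$: one assumes there exist a set $C$, a constant $\lambda \in (0, 1]$, and a probability $\varphi$ such that
$$P^m(x, \cdot) \;\geq\; \lambda\, \varphi(\cdot), \qquad x \in C.$$
Regenerations can then only be attempted every $m$ steps, and the $m-1$ intermediate states straddling a regeneration epoch must be generated conditional on both endpoints of that block. This ties the end of one cycle to the start of the next, while cycles separated by at least one full intermediate cycle remain independent, which is precisely one-dependence.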
We now turn to our first new result.
Definition 4. We say that the random time T on $(\tilde{\Omega}, \tilde{\mathcal{F}})$ exhibits distributional invariance with respect to the transition kernel P if there exists a family $(\tilde{P}_x : x \in S)$ of probabilities on $(\tilde{\Omega}, \tilde{\mathcal{F}})$ consistent with P such that the distributions $\tilde{P}_x(X_T \in \cdot)$ do not depend on $x \in S$.

It is obvious that if P induces either wide-sense regeneration or one-dependent regeneration, then $T(n)$ exhibits distributional invariance for each $n \geq 0$.

Theorem 3. The transition kernel P induces a Harris recurrent Markov chain if, and only if, there exists a random time T that exhibits distributional invariance with respect to P.
Proof. If P induces Harris recurrence, then the random time T(0) of Theorem 1 exhibits distributional invariance. On the other hand, if T exhibits distributional invariance with respect to P, then $\varphi(\cdot) \triangleq \tilde{P}_x(X_T \in \cdot)$ does not depend on $x \in S$. Suppose $\varphi(A) > 0$. Then, for each $x \in S$, there exists $m(x) < \infty$ such that
$$\tilde{P}_x(X_T \in A,\; T \leq m(x)) \geq \varphi(A)/2.$$
Furthermore, we may select $m(\cdot)$ to be measurable. Hence, if $\tau_A \triangleq \inf\{n \geq 0 : X_n \in A\}$, it follows that $\tilde{P}_x(\tau_A \leq m(x)) \geq \varphi(A)/2$ for each $x \in S$. Consequently, if $n \geq 0$,
$$\tilde{P}_x\big(X_j \in A \text{ for some } n \leq j \leq n + m(X_n) \,\big|\, X_0, \ldots, X_n\big) \geq \varphi(A)/2.$$
Put $T(0) \triangleq m(X_0)$ and $T(n+1) \triangleq T(n) + \max(m(X_{T(n)}), 1)$ for $n \geq 0$. Then, $(T(n) : n \geq 0)$ is a strictly increasing sequence for which
$$\tilde{P}_x\big(X_j \in A \text{ for some } T(n) \leq j \leq T(n+1) \,\big|\, X_0, \ldots, X_{T(n)}\big) \geq \varphi(A)/2$$
for each $n \geq 0$. It follows from the conditional Borel–Cantelli lemma (see, for example, [16], Corollary 2, p. 324) that
$$\tilde{P}_x(X_n \in A \text{ infinitely often}) = 1,$$
proving that P induces Harris recurrence. □
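The following sketch illustrates the window construction in the proof numerically for a hypothetical three-state chain with $A = \{0\}$: it computes a horizon $m(x)$ for which the chain started from $x$ hits $A$ within $m(x)$ steps with probability at least $1/2$, and then checks by simulation how often the windows $[T(n), T(n+1)]$ contain a visit to $A$. All numerical choices are illustrative assumptions, not objects from the paper.

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4]])
A = {0}   # target set, standing in for a set with phi(A) > 0

def hit_prob(x, m):
    """P_x(tau_A <= m), computed via the taboo (sub-stochastic) kernel on A's complement."""
    if x in A:
        return 1.0
    rest = [s for s in range(3) if s not in A]
    Q = P[np.ix_(rest, rest)]
    return 1.0 - np.linalg.matrix_power(Q, m).sum(axis=1)[rest.index(x)]

# m(x): smallest horizon with hitting probability at least 1/2
m = [next(k for k in range(1, 100) if hit_prob(x, k) >= 0.5) for x in range(3)]

rng = np.random.default_rng(4)
path = [1]
for _ in range(500):
    path.append(rng.choice(3, p=P[path[-1]]))

# windows [T(n), T(n+1)] with T(0) = m(X_0) and T(n+1) = T(n) + max(m(X_{T(n)}), 1)
t, hits, windows = m[path[0]], 0, 0
while t + max(m[path[t]], 1) <= 500:
    t_next = t + max(m[path[t]], 1)
    hits += any(s in A for s in path[t:t_next + 1])
    windows += 1
    t = t_next
print(f"{hits}/{windows} windows contained a visit to A")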
As noted in the Introduction, the seemingly weak assumption of the existence of a distributionally invariant random time T already implies the existence of an entire sequence of wide-sense or one-dependent regeneration times.
We now extend this theory to continuous time. Given a separable metric space S, let $\Omega$ be the space of functions $\omega : [0, \infty) \to S$, such that $\omega$ is right continuous everywhere, with left limits at each $t > 0$ (RCLL). For $\omega \in \Omega$ and $t \geq 0$, put $X_t(\omega) \triangleq \omega(t)$, $\mathcal{F}_t \triangleq \sigma(X_s : 0 \leq s \leq t)$, and $\mathcal{F} \triangleq \sigma(X_s : s \geq 0)$. Let $(P_x : x \in S)$ be a family of probabilities on $(\Omega, \mathcal{F})$ for which $P_x(X_0 = x) = 1$ for $x \in S$. Furthermore, we require that for each $x \in S$, $s, t \geq 0$, and non-negative (measurable) f,
$$E_x\big[f(X_{t+s}) \mid \mathcal{F}_t\big] = (P_s f)(X_t),$$
where $(P_s f)(y) \triangleq E_y f(X_s)$ for $y \in S$ and $E_x$ is the expectation associated with $P_x$. It follows that X is a time-homogeneous Markov process under each $P_x$. We further assume that X is a Feller process (i.e., for each bounded continuous $f : S \to \mathbb{R}$ and $t \geq 0$, $E_x f(X_t)$ is continuous in x).
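As a concrete (purely illustrative) instance of such a family, the following sketch simulates an Ornstein–Uhlenbeck process on $S = \mathbb{R}$, a Feller process with continuous paths whose transition kernel $P_t(x, \cdot)$ is Gaussian and can be sampled exactly; the process and its parameters are assumptions chosen for illustration and are not taken from this paper.

```python
import numpy as np

def ou_step(x, s, rng):
    """Exact draw from the transition kernel P_s(x, .) of the SDE dX = -X dt + dB:
    X_{t+s} | X_t = x  ~  Normal(x * exp(-s), (1 - exp(-2 s)) / 2)."""
    mean = x * np.exp(-s)
    var = 0.5 * (1.0 - np.exp(-2.0 * s))
    return mean + np.sqrt(var) * rng.normal()

def simulate_path(x0, times, rng):
    """Sample (X_t : t in times) under P_{x0}; by time-homogeneity and the Markov
    property, each new value depends on the path only through the current state
    and the elapsed time."""
    values, x, prev = [x0], x0, 0.0
    for t in times:
        x = ou_step(x, t - prev, rng)
        values.append(x)
        prev = t
    return np.array(values)

rng = np.random.default_rng(5)
print(simulate_path(x0=1.0, times=np.linspace(0.5, 5.0, 10), rng=rng))
```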
We now state the definition of Harris recurrence in continuous time; see, for example, [17].

Definition 5. The process X and its associated probabilities $(P_x : x \in S)$ on Ω is said to be Harris recurrent in continuous time if there exists a non-trivial non-negative σ-finite measure η for which
$$P_x\left( \int_0^\infty I(X_s \in B)\, ds = \infty \right) = 1$$
for each $x \in S$ whenever $\eta(B) > 0$. (Here, $I(X_s \in B)$ is the indicator rv corresponding to the event $\{X_s \in B\}$.)

It has been known since the 1990s that such Harris recurrence implies the existence of one-dependent regeneration times (see [4]). As in discrete time, the statement of this result requires the use of a probability space upon which auxiliary randomization can be defined. To this end, let $\tilde{\Omega} \triangleq \Omega \times [0,1]^{\infty}$. For $\tilde{\omega} = (\omega, \lambda) \in \tilde{\Omega}$, let $X_t(\tilde{\omega}) \triangleq \omega(t)$ and $U_i(\tilde{\omega}) \triangleq \lambda_i$ for $t \geq 0$ and $i \geq 0$.

We say that the family $(\tilde{P}_x : x \in S)$ of probabilities on $\tilde{\Omega}$ is consistent with $(P_x : x \in S)$ if for each $x \in S$,
- (i) $\tilde{P}_x((X_t : t \geq 0) \in \cdot) = P_x(\cdot)$;
- (ii) $(U_n : n \geq 0)$ is a sequence of iid rv’s on [0,1] under $\tilde{P}_x$.
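As a simple illustration of Definition 5 (an example added here for concreteness; it is not discussed in the paper), standard one-dimensional Brownian motion is Harris recurrent in continuous time with η taken to be Lebesgue measure: for any Borel set $B$ with positive Lebesgue measure, the occupation-times formula gives
$$\int_0^\infty I(X_s \in B)\, ds \;=\; \int_B L_\infty^a \, da \;=\; \infty \quad P_x\text{-a.s.},$$
since the local time $L_\infty^a$ at every level $a$ is infinite almost surely by the point recurrence of Brownian motion.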
Definition 6. We say that X and $(P_x : x \in S)$ induce one-dependent regeneration in continuous time if for some family $(\tilde{P}_x : x \in S)$ of probabilities on $\tilde{\Omega}$ consistent with $(P_x : x \in S)$, there exists a sequence of strictly increasing random times $(T(n) : n \geq 0)$ (with $T(0) \geq 0$), such that for each $x \in S$, the sequence of cycles
$$W_n \triangleq \big( (X_{T(n)+s} : 0 \leq s < T(n+1) - T(n)),\; T(n+1) - T(n) \big), \quad n \geq 0,$$
is a one-dependent sequence under $\tilde{P}_x$, and the distribution of $W_n$ is independent of n and x.

As noted earlier, ref. [4] proved the forward implication of the next theorem. As in discrete time, the separability of S is used to ensure that $\mathcal{S}$ is countably generated (so that the “splitting construction” of Athreya–Ney–Nummelin applies). The reverse implication is a consequence of (for example) Theorem 5.

Theorem 4. The process X and the family $(P_x : x \in S)$ is Harris recurrent in continuous time if, and only if, X and $(P_x : x \in S)$ induce one-dependent regeneration in continuous time.
In view of our discrete time discussion, an obvious question is whether Harris recurrence in continuous time implies wide-sense regeneration. This issue, first raised in the 1990s, remains an open question (see, for example, [18]).
Our second major result is the extension of Theorem 3 to continuous time.
Definition 7. We say that the random time T on $\tilde{\Omega}$ exhibits distributional invariance in continuous time with respect to X and $(P_x : x \in S)$ if there exists a family $(\tilde{P}_x : x \in S)$ of probabilities on $\tilde{\Omega}$ consistent with $(P_x : x \in S)$ such that the distributions $\tilde{P}_x(X_T \in \cdot)$ do not depend on $x \in S$.

Theorem 5. The process X and its probabilities $(P_x : x \in S)$ is Harris recurrent in continuous time if, and only if, there exists a random time T on $\tilde{\Omega}$ exhibiting distributional invariance.
Proof. The proof is similar to that of Theorem 3. Put $\varphi(\cdot) \triangleq \tilde{P}_x(X_T \in \cdot)$, which, by distributional invariance, does not depend on $x \in S$. Given A for which $\varphi(A) > 0$, let $m(\cdot)$ be a measurable function for which
$$\tilde{P}_x(X_T \in A,\; T \leq m(x)) \geq \varphi(A)/2, \quad x \in S. \qquad (1)$$
Set $T(0) \triangleq m(X_0)$ and $T(n+1) \triangleq T(n) + \max(m(X_{T(n)}), 1)$ for $n \geq 0$. As in discrete time, (1) implies that for each $n \geq 0$,
$$\tilde{P}_x\big(X_s \in A \text{ for some } T(n) \leq s \leq T(n+1) \,\big|\, X_u : 0 \leq u \leq T(n)\big) \geq \varphi(A)/2$$
for $x \in S$; see [19] for a discussion of the measurability of hitting times of generic Borel sets A. The conditional Borel–Cantelli lemma again implies that
$$\tilde{P}_x\big(X_s \in A \text{ for some } s \geq 0\big) = 1 \qquad (2)$$
for each $x \in S$. Given (2), Theorem 1 of [20] then implies that X and its probabilities $(P_x : x \in S)$ are Harris recurrent in continuous time. Note that Theorem 1 of [20] requires that X is a Borel right process (strong Markov process with right continuous sample paths). Since every Feller process with RCLL sample paths satisfies the strong Markov property (see, for example, [21], Theorem 3.1, p. 102), the requirement is met.

The forward direction is an immediate consequence of [4]. □