Editorial

Markov and Semi-Markov Chains, Processes, Systems, and Emerging Related Fields

by P.-C.G. Vassiliou 1,* and Andreas C. Georgiou 2,*
1 Department of Statistical Science, University College London, Gower St, London WC1E 6BT, UK
2 Quantitative Methods and Decision Analysis Lab, Department of Business Administration, University of Macedonia, 54636 Thessaloniki, Greece
* Authors to whom correspondence should be addressed.
Mathematics 2021, 9(19), 2490; https://doi.org/10.3390/math9192490
Submission received: 25 September 2021 / Revised: 26 September 2021 / Accepted: 26 September 2021 / Published: 4 October 2021
Probability resembles the ancient Roman god Janus: like Janus, it has a face with two different sides, which correspond to the metaphorical gateways and transitions between the past and the future. Probability can be seen as a limbo state between abstractness and concreteness. This inherent duality renders one side the closest possible to a branch of pure mathematics, derived from certain axioms in classical areas of algebra.
Nonetheless, with its other side, probability is, indeed, an applied or applicable mathematics discipline, most commonly known as applied probability, although, in our opinion, the common distinction between pure and applied mathematics is, all too often, merely artificial and, at times, fuzzy. This side, without being less demanding in mathematical or stochastic terms, gives birth to valuable models for studying everyday phenomena of the real world. Stochastic processes are, by now, well established as an extension of probability theory. In the area of stochastic processes, Markov and semi-Markov processes play a vital role as an independent area of study, generating important and novel applications and new mathematical results.
The Special Issue entitled Markov and Semi-Markov Chains, Processes, Systems, and Emerging Related Fields includes fourteen articles published in the journal Mathematics, in the section “Probability and Statistics”, between January and August 2021. The Guest Editors acted as Academic Editors for all papers except their own; for those, the Editorial Board appointed Academic Editors who were unknown to the authors and whose names were disclosed only after publication and with their consent. We hope that this volume provides opportunities for future research ideas and that the interested reader will discover these paths between the lines and the mathematical formulas of the published papers.
The Guest Editors would like to thank the Chief Editors and the Editorial Board of the journal Mathematics for their invitation to edit the present volume. We cordially thank the authors for contributing to the publication of the volume by submitting their significant research articles and addressing all comments and suggestions with diligence and enthusiasm. We are also grateful to the anonymous reviewers of the volume since, without their valuable assistance, this venture could not have been completed.
We would also like to express our gratitude to the Editorial Manager, Dr. Syna Mu, for his continuous efforts to facilitate the workflow of this issue, for the excellent collaboration with the Guest Editors, and for arranging the partial funding of the publication of the present volume. The Guest Editors would also like to thank Professors András Telcs and Alexander Zeifman for acting as Academic Editors for our own contributed articles. Last, but not least, many thanks are due to the numerous Editorial Assistants who successfully undertook the tedious task of managing the large number of submissions in the present volume.
We now proceed to a brief presentation of the articles, categorizing them into three sub-areas, and also provide the reader with some useful references that may ease the way into the mathematical background of the papers. The order of the sub-areas (sections) generally follows the title of the Special Issue, and the articles within each section are sorted by date of publication.
i. Markov Chains, Processes, and Markov Systems
Markov processes are stochastic processes that exhibit the Markov property, while Markov chains are their discrete-time, discrete-state-space counterpart. That is, the probabilistic dependence on the past is only through the present state, which contains all the necessary information for the evolution of the process. Useful introductory texts on homogeneous and non-homogeneous Markov chains and processes are [1,2,3] and ([4], Chapter 3). For Markov chains on a general state space, an excellent reference is [5]. For Markov systems, or open Markov models, which are generalizations of the Markov chain, an introductory review paper is [6]. We now provide a brief description of the articles of the Special Issue that could be included in this category:
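For readers less familiar with the terminology, the Markov property underlying all the models in this section can be written, in standard textbook notation (not tied to any of the cited papers), as
\[
\Pr\bigl(X_{n+1}=j \mid X_n=i,\, X_{n-1}=i_{n-1},\dots,X_0=i_0\bigr)=\Pr\bigl(X_{n+1}=j \mid X_n=i\bigr)=p_{ij}(n),
\]
where the one-step transition probabilities p_{ij}(n) depend on the time n in the non-homogeneous case and are constant in the homogeneous one.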
i.1 Geometric Ergodicity of the Random Walk Metropolis with Position-Dependent Proposal Covariance, by Samuel Livingstone [7]. In this paper, the ergodic behaviour of a Markov Chain Monte Carlo (MCMC) method, specifically the Metropolis–Hastings method, is analysed. MCMC methods are used for estimating expectations with respect to a probability measure π, which need not be normalized. This is done by sampling a Markov chain whose asymptotic distribution is π and computing empirical averages. For the quality of the estimators it is vital to have conditions on π under which the Markov chain converges at a geometric rate. In the present work, a Metropolis–Hastings algorithm with proposal N(x, hG(x)⁻¹) is studied, where x is the current state, and its ergodicity properties are investigated. It is shown that appropriate selections of G(x) strongly influence the ergodicity properties in comparison with the standard Random Walk Metropolis.
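As a rough sketch of the kind of sampler analysed in [7] (the target log-density, the scaling h, and the matrix function G below are placeholders chosen for illustration, not quantities taken from the paper):

```python
import numpy as np

def _log_normal_pdf(a, mean, cov):
    """Log-density of N(mean, cov) at a, up to an additive constant."""
    diff = a - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (logdet + diff @ np.linalg.solve(cov, diff))

def position_dependent_rwm(log_pi, G, x0, h=1.0, n_iter=10_000, seed=None):
    """Metropolis-Hastings with state-dependent proposal N(x, h * G(x)^{-1}).

    Because the proposal covariance depends on the current state, the proposal
    is not symmetric and the Hastings correction must be kept in the acceptance
    ratio, unlike the plain Random Walk Metropolis."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    chain = np.empty((n_iter, x.size))
    for t in range(n_iter):
        cov_x = h * np.linalg.inv(G(x))
        y = rng.multivariate_normal(x, cov_x)
        cov_y = h * np.linalg.inv(G(y))
        log_alpha = (log_pi(y) - log_pi(x)
                     + _log_normal_pdf(x, y, cov_y)   # q(x | y)
                     - _log_normal_pdf(y, x, cov_x))  # q(y | x)
        if np.log(rng.uniform()) < log_alpha:
            x = y
        chain[t] = x
    return chain

# Toy usage: a standard Gaussian target; G(x) = I recovers the plain RWM.
samples = position_dependent_rwm(lambda x: -0.5 * x @ x,
                                 lambda x: np.eye(2), x0=np.zeros(2))
```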
i.2 Non-Homogeneous Markov Set Systems, by P.-C.G. Vassiliou [8]. The class of stochastic processes defined as non-homogeneous Markov systems (NHMS) is, in effect, a generalization of a Markov chain. It provides a general framework for the many stochastic models used to describe populations of different kinds of entities with great diversity. In the present study, for the first time, the basic parameters of an NHMS are given as intervals rather than point estimates. It is proven that, under certain conditions of convexity of the intervals, the set of the relative expected population structures of memberships is compact and convex. A series of theorems on the asymptotic behaviour and the limit set of the expected relative population structure are provided and proved. Finally, an application to geriatric and stroke patients in a hospital is presented, through which solutions are provided for problems that usually surface in such applications.
i.3 Period-Life of a Branching Process with Migration and Continuous Time, by Prysyazhnyk, K., Bazylevych, I., Mitkvova, L., and Ivanochko, I. [9]. Branching processes (BPs) are a common tool for the mathematical representation of real processes, such as chemical, biological, and demographic ones. The reason is that BPs can easily describe the population dynamics of entities under different contexts (from physics and chemistry to biology and information technology). There exists a large number of variants of BPs, and, in this article, the authors investigate the Markov branching process model with migration in continuous time.
The period-life is the length of the time interval between the moment when the process is initiated by a positive number of particles and the moment when there are no individuals in the population for the first time. The differential equation and the probability generating function of the random process that describes the behaviour of the process within its period-life are presented. In addition, a limit theorem for the period-life of the subcritical and critical branching process with migration and continuous time (BPMCT) is obtained.
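A minimal simulation sketch of the period-life is given below; the binary-splitting, death, and immigration rates are illustrative assumptions and do not reproduce the exact migration mechanism studied in [9].

```python
import numpy as np

def simulate_period_life(lam=0.8, mu=1.0, theta=0.0, z0=1, t_max=1e4, seed=None):
    """Simulate one trajectory of a simple continuous-time branching process and
    return its period-life: the time from the start (z0 > 0 particles) until the
    population first hits zero.

    Each particle independently splits into two at rate `lam` and dies at rate
    `mu`; new particles immigrate at rate `theta`.  Returns np.inf if extinction
    is not observed before `t_max`."""
    rng = np.random.default_rng(seed)
    t, z = 0.0, z0
    while t < t_max:
        if z == 0:
            return t                       # first visit to zero ends the period-life
        total_rate = z * (lam + mu) + theta
        t += rng.exponential(1.0 / total_rate)
        u = rng.uniform() * total_rate
        if u < z * lam:                    # a particle splits into two
            z += 1
        elif u < z * (lam + mu):           # a particle dies
            z -= 1
        else:                              # immigration of a new particle
            z += 1
    return np.inf

# Monte Carlo estimate of the mean period-life in a subcritical case.
lives = [simulate_period_life(lam=0.8, mu=1.0, z0=1, seed=i) for i in range(1000)]
print(np.mean(lives))
```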
i.4 Optimizing a Multi-State Cold-Standby System with Multiple Vacations in the Repair and Loss of Units, by Ruiz-Castro, J.E. [10]. This article focuses on redundant systems and preventive maintenance as fundamental pillars in ensuring system reliability, minimizing failures, and reducing costs. In particular, the author studies a complex multi-state system subject to multiple events, such as several types of internally or externally induced failures. The analysis takes into account the loss of units due to non-repairable failures, and it is assumed that the system can still operate with one less unit. There is also a repair person whose behaviour is determined by the number of units within the repair facility and the vacation policy applied. The system is modelled via Markovian Arrival Processes with marked arrivals. The author presents the stationary distributions and multiple measures related to system and financial performance.
i.5 Using Markov Models to Characterize and Predict Process Target Compliance, by McClean, S. [11]. A general phase-type Markov model with several absorbing states, different targets, and Poisson arrivals is presented to predict process target compliance. Several theoretical results are provided, and closed-form analytic expressions are obtained, which provide useful characterizations and predictions of target compliance with sufficient lead time. The results are illustrated using data from a stroke patient unit, where there are multiple discharge destinations for patients, namely death, a private nursing home, or the patient’s own home, and where different discharge destinations may require disparate targets. Key performance indicators are also established, which are important and commonplace in health care, business, and industrial processes.
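The sketch below illustrates how compliance-by-target-time probabilities can be read off a simple phase-type model; the two transient phases, the two discharge destinations, and all numerical rates are invented for illustration and are not the estimates reported in [11].

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical continuous-time phase-type model with two transient phases
# (acute, rehab) and two absorbing discharge destinations (home, nursing home).
S = np.array([[-0.30, 0.25],      # sub-generator over the transient phases
              [ 0.00, -0.10]])
R = np.array([[0.03, 0.02],       # exit rates into each absorbing destination
              [0.07, 0.03]])
alpha = np.array([1.0, 0.0])      # all patients start in the acute phase

def compliance_by(t):
    """P(absorbed in each destination by time t) = alpha (exp(St) - I) S^{-1} R."""
    return alpha @ (expm(S * t) - np.eye(2)) @ np.linalg.solve(S, R)

print(compliance_by(30.0))        # e.g. probability of each discharge within a 30-day target
```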
i.6 Open Markov Type Population Models: From Discrete to Continuous Time, by Esquivel, M.L., Krasii, N.P., and Guerreiro, G.R. [12]. The study of homogeneous open Markov population models in discrete and continuous time with a discrete state space has a long and important history of seventy-five years. Over the last forty years, attention has shifted to the study of non-homogeneous Markov systems or, equivalently, non-homogeneous open Markov population models in discrete and continuous time with a discrete state space. More recently, there have also been studies of non-homogeneous Markov systems in discrete time with a general state space. The main contribution of the present work is to extend the results on open Markov chains in discrete time to some continuous-time processes of Markov type, using different methods of associating a continuous process with an observed process in discrete time.
i.7 Partial Diffusion Markov Model of Heterogeneous TCP Link: Optimization with Incomplete Information, by Borisov, A., Bosov, A., Miller, G., and Sokolov, I. [13]. This paper deals with an old acquaintance that is still an object of perpetual investigation and evolution: the Transmission Control Protocol (TCP). The authors present a new mathematical model of TCP using a partially observable controllable Markov jump process (MJP) with a finite state space. The observations of the stochastic dynamic system are formed by low-frequency counting processes of packet losses and timeouts and a high-frequency compound Poisson process of packet acknowledgements.
In this respect, the entire information transmission process is considered as a stochastic control problem with incomplete information. The first aim of the paper is to present a new mathematical model of the TCP link operation over a heterogeneous (wired/wireless) channel, and the second is to present a new TCP prototype version based on the solution of the optimal MJP state control problem under complete information, as well as the solution of the optimal MJP state filtering problem given the diffusion and counting observations. The performance of the proposed model is demonstrated with numerical experiments.
i.8 Evaluating the Efficiency of Off-Ball Screens in Elite Basketball Teams via Second-Order Markov Modelling, by Stavropoulos, N., Papadopoulou, A.A., and Kolias, P. [14]. This paper falls in the area of sport-oriented stochastic modelling, including sports performance analytics and mathematical optimization. The systematic use of performance indicators in the strategy orientation of sports teams has been the subject of extended research in recent years, and basketball is not an exception. The authors employ second-order, partially non-homogeneous Markov models to gain insight into the behaviours and interactions of the players using the screens and the final attempt of the shots on the weak side. More specifically, they develop a second-order Markov modelling framework to evaluate the characteristics of off-ball screens that affect the finishing move and the outcome of the offensive movement. In addition, they examine how time, expressed either as the quarter of play or as the shot clock (0–24 s), could influence the transition probabilities from screens and finishing moves to outcomes. The authors used a sample of 1170 possessions of the FIBA Basketball Champions League 2018–2019, and the particular variables of interest were the type of screen on the weak side, the finishing move, and the outcome of the shot. The proposed model provides useful information for coaches, who can use it in both individual and group training programs as a part of their strategic planning for performance improvement.
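A minimal sketch of how second-order transition probabilities of this kind can be estimated from possession data follows; the state labels are invented and serve only to show the encoding of (previous state, current state) pairs.

```python
from collections import Counter, defaultdict

def second_order_transition_probs(sequences):
    """Maximum-likelihood estimate of P(next | prev2, prev1) from a list of
    possession sequences, each a list of categorical states."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for a, b, c in zip(seq, seq[1:], seq[2:]):
            counts[(a, b)][c] += 1          # condition on the pair of preceding states
    return {pair: {nxt: n / sum(ctr.values()) for nxt, n in ctr.items()}
            for pair, ctr in counts.items()}

# Toy possessions: screen type -> finishing move -> outcome (labels are made up).
possessions = [["down_screen", "catch_and_shoot", "made"],
               ["flare_screen", "drive", "missed"],
               ["down_screen", "catch_and_shoot", "missed"]]
print(second_order_transition_probs(possessions))
```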
ii. Semi-Markov Chains, Processes, and Semi-Markov Systems
Semi-Markov chains are generalizations of Markov chains in which the time of transition from each state to another is a random variable. The same applies to semi-Markov processes, except that time is now continuous. A very good text on semi-Markov chains and processes for the interested reader is [15]. For semi-Markov systems, or open semi-Markov models, which are, again, generalizations of the semi-Markov chain, the first paper that introduced them was [16], and this is a good place to start. We provide below a brief description of the articles that could be categorized in this section:
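For orientation, the semi-Markov kernel underlying the models of this section can be written, in standard homogeneous notation not tied to any particular paper, as
\[
Q_{ij}(t)=\Pr\bigl(J_{n+1}=j,\; T_{n+1}-T_n\le t \,\big|\, J_n=i\bigr),\qquad p_{ij}=\lim_{t\to\infty}Q_{ij}(t),
\]
so that the embedded sequence of visited states (J_n) is an ordinary Markov chain with transition matrix (p_{ij}), while the sojourn times T_{n+1}-T_n may follow arbitrary distributions depending on the states involved.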
ii.1 Discrete Time Hybrid Semi-Markov Models in Manpower Planning, by Verbeken, B. and Guerry, M.-A. [17]. The present work is on non-homogeneous semi-Markov systems and, in particular, their traditional roots in manpower planning. Non-homogeneous semi-Markov systems have found important applications in a large variety of areas, such as biological phenomena, ecological modelling, DNA analysis, credit risk in mathematical finance, reliability and survival analysis, disability insurance problems, and wind and tornado problems. The paper argues that, in a semi-Markov model for manpower problems, there is an advantage in treating some of the transitions as purely Markovian, that is, with a transition distribution in these states that is constant in the duration of stay. There is also a section where solutions are provided for the problems that surface when applying such models to manpower systems.
ii.2 On State Occupancies, First Passage Times and Duration in Non-Homogeneous Semi-Markov Chains, by Georgiou, A.C., Papadopoulou, A.A., Kolias, P., Palikrousis, H., and Farmakioti, E. [18]. A basic aspect of semi-Markov chains (SMCs) is the use of general sojourn time distributions. This paper offers insights into three classes of relevant probabilities of a semi-Markov process: the first passage time, occupancy, and duration probabilities. Starting from the recursive relations of these probabilities, the paper provides closed forms for the three classes in terms of the basic parameters of the process. The analytical results are accompanied by illustrations on human genome DNA strands, which are often studied using Markovian models. There exist several algorithmic approaches for analysing the occupancy and appearance of words in DNA sequences; however, the results suggest that the proposed modelling framework can also be used to investigate the structure of genome sequences.
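As an indication of the kind of recursion the paper builds on, the first passage time probabilities of a homogeneous discrete-time semi-Markov chain satisfy the standard relation
\[
f_{ij}(t)=q_{ij}(t)+\sum_{k\neq j}\sum_{u=1}^{t-1} q_{ik}(u)\, f_{kj}(t-u),
\]
where q_{ij}(t) is the semi-Markov kernel (the probability that, upon entering state i, the next transition is to j and occurs exactly t time units later) and f_{ij}(t) is the probability that j is reached for the first time exactly t time units after entering i; the paper works with the non-homogeneous analogues of such relations.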
ii.3 Sequential Interval Reliability for Discrete-Time Homogeneous Semi-Markov Repairable Systems, by Barbu, V.S., D’Amico, G., and Gkelsinis, T. [19]. This is another paper of the Special Issue concerned with reliability indicators; this one, however, is dedicated to semi-Markov systems. The authors introduce a new reliability measure, namely the sequential interval reliability (SIR), for homogeneous semi-Markov repairable systems in discrete time. This measure generalises the notion of interval reliability and takes into account the dependence on what is called the final backward (recurrence) time.
As mentioned by the authors, interval reliability was first introduced and studied for continuous-time semi-Markov systems. This measure computes the probability that a system is in a working state during a sequence of non-overlapping intervals, which is important in applications where a system must perform during consecutive time periods, in the case of extreme events spanning several time periods, in electricity consumption where certain thresholds are exceeded, or in financial modelling and relevant credit scoring models. The article studies the sequential interval reliability both in the transient regime, providing a recurrence formula, and asymptotically, as time tends to infinity. The paper concludes with a numerical example.
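Schematically, and with notation used only for illustration (Z denotes the semi-Markov chain and U the subset of working states), the interval reliability and its sequential extension can be written as
\[
IR(k,p)=\Pr\bigl(Z_t\in U \text{ for all } t\in\{k,\dots,k+p\}\bigr),\qquad
SIR=\Pr\bigl(Z_t\in U \text{ for all } t\in [k_1,k_1+p_1]\cup\dots\cup[k_m,k_m+p_m]\bigr),
\]
for a family of non-overlapping time windows; the latter is the quantity whose recurrence formula and asymptotic behaviour are derived in [19].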
iii. Related Stochastic Processes
Introductory material on the theory of stochastic processes is well established, and there are many useful introductory or advanced texts that could help any interested reader. Any text at a medium mathematical level will suffice as a first useful tool for the articles that follow. Nevertheless, as a first course in stochastic processes, the book [20] may well serve the purpose. For the more advanced reader, the books [21,22,23] are excellent and may well suffice for further reading.
We provide a brief description of the articles that could be categorized in this section:
iii.1 Tails of the Moments for Sums with Dominatedly Varying Random Summands, by Dirma, M., Paukstys, S., and Siaulys, J. [24]. This paper investigates the asymptotic behaviour of the tails of the moments of randomly weighted sums with possibly dependent, dominatedly varying summands. The findings improve and generalise related results in the literature. For example, the authors achieve sharper asymptotic bounds under a pairwise quasi-asymptotic independence structure. In addition, the exponent condition is relaxed so that the moment exponent may be any fixed non-negative real number. In the case of randomly weighted sums, the boundedness condition on the random weights is replaced by a less restrictive moment condition. The authors illustrate and confirm their asymptotic results with a Monte Carlo simulation covering three specific cases of random sums from disjoint sub-classes of dominatedly varying distributions.
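For reference, a distribution F with tail \overline{F}=1-F is dominatedly varying (belongs to the class \mathcal{D}) if
\[
\limsup_{x\to\infty}\frac{\overline{F}(xy)}{\overline{F}(x)}<\infty \quad \text{for every (equivalently, for some) } y\in(0,1);
\]
this standard definition, which covers in particular the regularly varying distributions, is recalled here only for the reader's convenience.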
iii.2 Particle Filtering: A Priori Estimation of Observational Errors of a State-Space Model with Linear Observation Equation, by Lykou, R. and Tsaklidis, G. [25]. This is the first of two papers related to state-space modelling methods. Particle Filter (PF) methodology deals with the estimation of the latent variables of stochastic processes, taking into consideration noisy observations generated by those latent variables. The paper focuses on state-space models with linear observation equations and provides an estimation of the errors of missing observations (in cases of missing data), aiming at the approximation of the weights under a Missing At Random (MAR) assumption.
In this article, the observational errors are estimated prior to the upcoming observations. This action is added to the basic algorithm of the filter as a new step for the acquisition of the state estimations. As mentioned above, this intervention is mainly useful in the presence of missing data, as well as in sample tracking for impoverishment issues. The linearity assumption permits sequential replacements of missing values with equal quantities of known distributions. The contribution of the a priori estimation step to the study of impoverishment phenomena is also exhibited through a Markov System (MS) framework. A simulation example is provided, highlighting the advantages of the proposed algorithm over existing approaches.
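For readers unfamiliar with the methodology, the following is a generic bootstrap particle filter for a state-space model with a linear observation equation; it does not implement the a priori error-estimation step proposed in [25], whose point is precisely to do better than simply skipping the update when observations are missing, and the model functions and dimensions below are placeholders.

```python
import numpy as np

def bootstrap_particle_filter(ys, f, H, Q, R, x0_sampler, n_particles=500, seed=None):
    """Generic bootstrap particle filter for
        x_t = f(x_{t-1}) + v_t,   v_t ~ N(0, Q)
        y_t = H x_t + w_t,        w_t ~ N(0, R)   (linear observation equation).
    Observations given as np.nan are treated as missing and skip reweighting."""
    rng = np.random.default_rng(seed)
    particles = x0_sampler(n_particles, rng)                   # shape (N, d)
    d = particles.shape[1]
    R_inv = np.linalg.inv(R)
    means = []
    for y in ys:
        # propagate every particle through the state equation
        noise = rng.multivariate_normal(np.zeros(d), Q, size=n_particles)
        particles = np.array([f(p) for p in particles]) + noise
        if not np.any(np.isnan(y)):
            # reweight by the Gaussian observation likelihood and resample
            resid = np.atleast_2d(y) - particles @ H.T
            w = np.exp(-0.5 * np.einsum('ni,ij,nj->n', resid, R_inv, resid))
            w = w + 1e-300                                     # guard against underflow
            w /= w.sum()
            idx = rng.choice(n_particles, size=n_particles, p=w)
            particles = particles[idx]
        means.append(particles.mean(axis=0))
    return np.array(means)
```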
iii.3 State Space Modeling with Non-Negativity Constraints Using Quadratic Forms, by Theodosiadou, O. and Tsaklidis, G. [26]. This article is the second on state-space modelling methods. It proposes a method in state-space modelling which deals with hidden components that are subject to non-negativity constraints. It is known that state-space models are used for the estimation of hidden random variables when noisy observations are available; however, if the state vector is subject to constraints, the standard Kalman filtering algorithm can no longer be applied directly.
The proposed model’s state equation, describing the dynamic evolution of the hidden state vector, is expressed through non-negative definite quadratic forms and, in fact, represents a non-negative-valued Markovian stochastic process of order one. The proposed method leads to a constrained optimization problem for which stationary points are derived and conditions for feasibility are provided. The proposed methodology exhibits a lower computational load when compared to other nonlinear filtering methods.
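Schematically, and only as an illustration of the general idea rather than the exact model of [26], a hidden component defined through a quadratic form
\[
x_t=w_t^{\top} A\, w_t,\qquad A\succeq 0,
\]
with w_t a Gaussian random vector, is non-negative by construction; this is what makes quadratic forms a natural device for imposing non-negativity constraints within a state-space model.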

References

  1. Iosifescu, M. Finite Markov Processes and Applications; John Wiley & Son: New York, NY, USA, 1980. [Google Scholar]
  2. Isaacson, D.; Madsen, R. Markov Chains Theory and Applications; John Wiley & Son: New York, NY, USA, 1976. [Google Scholar]
  3. Seneta, E. Non-Negative Matrices and Markov Chains; Springer: Berlin/Heidelberg, Germany, 1981. [Google Scholar]
  4. Vassiliou, P.-C.G. Discrete-Time Asset Pricing Models in Applied Stochastic Finance; John Wiley & Son: New York, NY, USA, 2010. [Google Scholar]
  5. Meyn, S.; Tweedie, R. Markov Chains and Stochastic Stability; Cambridge University Press: Cambridge, UK, 2009. [Google Scholar]
  6. Vassiliou, P.-C.G. The evolution of the theory of non-homogeneous Markov systems. Appl. Stoch. Models Data Anal. 1997, 13, 159–176. [Google Scholar] [CrossRef]
  7. Livingstone, S. Geometric ergodicity of the random walk Metropolis with position-dependent proposal covariance. Mathematics 2021, 9, 341. [Google Scholar] [CrossRef]
  8. Vassiliou, P.-C.G. Non-homogeneous Markov set systems. Mathematics 2021, 9, 471. [Google Scholar] [CrossRef]
  9. Prysyazhnyk, K.; Bazylevych, I.; Mitkvova, L.; Ivanochko, I. Period-Life of a Branching Process with Migration and continuous time. Mathematics 2021, 9, 868. [Google Scholar] [CrossRef]
  10. Ruiz-Castro, J. Optimizing a multi-state cold-standby system with multiple vacations in the repair and loss of units. Mathematics 2021, 9, 913. [Google Scholar] [CrossRef]
  11. McClean, S. Using Markov models to characterize and predict process target compliance. Mathematics 2021, 9, 1187. [Google Scholar] [CrossRef]
  12. Esquivel, M.; Krasii, N.; Guerreiro, G. Open Markov type population models: From discrete to continuous time. Mathematics 2021, 9, 1496. [Google Scholar] [CrossRef]
  13. Borisov, A.; Bosov, A.; Miller, G.; Sokolov, I. Partial diffusion Markov model of heterogeneous TCP link: Optimization with incomplete information. Mathematics 2021, 9, 1632. [Google Scholar] [CrossRef]
  14. Stavropoulos, N.; Papadopoulou, A.; Kolias, P. Evaluating the efficiency of off-ball screens in elite basketball teams via second order Markov modelling. Mathematics 2021, 9, 1991. [Google Scholar] [CrossRef]
  15. Howard, R. Dynamic Probabilistic Systems: Semi-Markov and Decision Processes; Dover Publications: New York, NY, USA, 2007. [Google Scholar]
  16. Vassiliou, P.-C.G.; Papadopoulou, A.A. Non-homogeneous semi-Markov systems and maintainability of the state sizes. J. Appl. Probab. 1992, 29, 519–534. [Google Scholar] [CrossRef]
  17. Verbeken, B.; Guerry, M.A. Discrete time hybrid semi-Markov models in Manpower planning. Mathematics 2021, 9, 1681. [Google Scholar] [CrossRef]
  18. Georgiou, A.; Papadopoulou, A.; Kolias, P.; Palikrousis, H.; Farmakioti, E. On state occupancies, first passage times and duration in non-homogeneous semi-Markov chains. Mathematics 2021, 9, 1745. [Google Scholar] [CrossRef]
  19. Barbu, V.; D’Amico, G.; Gkelsinis, T. Sequential interval reliability for discrete-time homogeneous semi-Markov repairable systems. Mathematics 2021, 9, 1997. [Google Scholar] [CrossRef]
  20. Karlin, S.; Taylor, H. A First Course in Stochastic Processes; Academic Press: New York, NY, USA, 1975. [Google Scholar]
  21. Cox, D.; Miller, H. The Theory of Stochastic Processes; Chapman and Hall: London, UK, 1965. [Google Scholar]
  22. Doob, J. Stochastic Processes; All Time Classic Series; John Wiley & Son: New York, NY, USA, 1965. [Google Scholar]
  23. Grimmett, G.; Stirzaker, D. Probability and Random Processes, 3rd ed.; Oxford University Press: Oxford, UK, 2001. [Google Scholar]
  24. Dirma, M.; Paukstys, S.; Siaulys, J. Tails of the Moments for sums with dominatedly varying random summands. Mathematics 2021, 9, 824. [Google Scholar] [CrossRef]
  25. Lykou, R.; Tsaklidis, G. Particle filtering: A priori estimation of observational errors of a state-space model with linear observation equation. Mathematics 2021, 9, 1445. [Google Scholar] [CrossRef]
  26. Theodosiadou, O.; Tsaklidis, G. State space modelling with non-negativity constraints using quadratic forms. Mathematics 2021, 9, 1908. [Google Scholar] [CrossRef]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
