Search Results (2,105)

Search Parameters:
Keywords = maximum likelihood estimator

22 pages, 411 KB  
Article
Infodemic Source Detection with Information Flow: Foundations and Scalable Computation
by Zimeng Wang, Chao Zhao, Qiaoqiao Zhou, Chee Wei Tan and Chung Chan
Entropy 2025, 27(9), 936; https://doi.org/10.3390/e27090936 - 6 Sep 2025
Abstract
We consider the problem of identifying the source of a rumor in a network, given only a snapshot observation of infected nodes after the rumor has spread. Classical approaches, such as the maximum likelihood (ML) and joint maximum likelihood (JML) estimators based on the conventional Susceptible–Infectious (SI) model, exhibit degeneracy, failing to uniquely identify the source even in simple network structures. To address these limitations, we propose a generalized estimator that incorporates independent random observation times. To capture the structure of information flow beyond graphs, our formulations consider rate constraints on the rumor and the multicast capacities for cyclic polylinking networks. Furthermore, we develop forward elimination and backward search algorithms for rate-constrained source detection and validate their effectiveness and scalability through comprehensive simulations. Our study establishes a rigorous and scalable foundation for infodemic source detection.
(This article belongs to the Special Issue Applications of Information Theory to Machine Learning)
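
As a rough illustration of the ML baseline this abstract starts from, the sketch below approximates P(snapshot | source) by Monte Carlo simulation of a discrete-time SI spread on a toy graph; the graph, the spread rule, and the trial count are assumptions, and the paper's rate-constrained formulation is not reproduced.

```python
# Hypothetical illustration: Monte Carlo ML rumor-source detection under a
# simple discrete-time SI model, not the paper's rate-constrained estimator.
import random

def simulate_si(adj, source, n_infected, rng):
    """Spread from `source`, infecting one uniformly chosen frontier node
    per step, until n_infected nodes are infected."""
    infected, frontier = {source}, set(adj[source])
    while len(infected) < n_infected and frontier:
        v = rng.choice(sorted(frontier))
        infected.add(v)
        frontier |= {u for u in adj[v] if u not in infected}
        frontier.discard(v)
    return frozenset(infected)

def ml_source(adj, snapshot, trials=5000, seed=0):
    """Approximate argmax_s P(snapshot | source=s) by simulation counts."""
    rng, target = random.Random(seed), frozenset(snapshot)
    score = {s: sum(simulate_si(adj, s, len(target), rng) == target
                    for _ in range(trials)) / trials for s in snapshot}
    return max(score, key=score.get), score

# Toy line graph 0-1-2-3-4 with observed infected set {1, 2, 3}: the center
# node 2 attains the highest simulated likelihood, as expected by symmetry.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(ml_source(adj, {1, 2, 3}))
```
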
34 pages, 31206 KB  
Article
Statistical Evaluation of Alpha-Powering Exponential Generalized Progressive Hybrid Censoring and Its Modeling for Medical and Engineering Sciences with Optimization Plans
by Heba S. Mohammed, Osama E. Abo-Kasem and Ahmed Elshahhat
Symmetry 2025, 17(9), 1473; https://doi.org/10.3390/sym17091473 - 6 Sep 2025
Abstract
This study explores advanced methods for analyzing the two-parameter alpha-power exponential (APE) distribution using data from a novel generalized progressive hybrid censoring scheme. The APE model is inherently asymmetric, exhibiting positive skewness across all valid parameter values due to its right-skewed exponential base, with the alpha-power transformation amplifying or dampening this skewness depending on the power parameter. The proposed censoring design offers new insights into modeling lifetime data that exhibit non-monotonic hazard behaviors. It enhances testing efficiency by simultaneously imposing fixed-time constraints and ensuring a minimum number of failures, thereby improving inference quality over traditional censoring methods. We derive maximum likelihood and Bayesian estimates for the APE distribution parameters and key reliability measures, such as the reliability and hazard rate functions. Bayesian analysis is performed using independent gamma priors under a symmetric squared error loss, implemented via the Metropolis–Hastings algorithm. Interval estimation is addressed using two normality-based asymptotic confidence intervals and two credible intervals obtained through a simulated Markov Chain Monte Carlo procedure. Monte Carlo simulations across various censoring scenarios demonstrate the stable and superior precision of the proposed methods. Optimal censoring patterns are identified based on the observed Fisher information and its inverse. Two real-world case studies—breast cancer remission times and global oil reserve data—illustrate the practical utility of the APE model within the proposed censoring framework. These applications underscore the model’s capability to effectively analyze diverse reliability phenomena, bridging theoretical innovation with empirical relevance in lifetime data analysis.
(This article belongs to the Special Issue Unlocking the Power of Probability and Statistics for Symmetry)
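
A minimal sketch of the maximum likelihood step, assuming a complete (uncensored) sample rather than the paper's progressive hybrid censoring: the APE density f(x; α, λ) = (ln α / (α − 1)) λe^(−λx) α^(1−e^(−λx)) is fitted by direct numerical optimization.

```python
# Hypothetical complete-sample MLE for the alpha-power exponential (APE) law;
# the paper's censoring scheme and Bayesian analysis are not reproduced.
import numpy as np
from scipy.optimize import minimize

def ape_negloglik(theta, x):
    a, lam = np.exp(theta)                  # log-parameters keep a, lam > 0
    F0 = 1.0 - np.exp(-lam * x)             # exponential baseline CDF
    logf = (np.log(np.log(a) / (a - 1.0))   # valid for a != 1
            + np.log(lam) - lam * x + F0 * np.log(a))
    return -np.sum(logf)

x = np.random.default_rng(1).exponential(2.0, 300)    # toy lifetimes
res = minimize(ape_negloglik, x0=np.log([2.0, 1.0]), args=(x,),
               method="Nelder-Mead")
alpha_hat, lambda_hat = np.exp(res.x)
print(f"alpha = {alpha_hat:.3f}, lambda = {lambda_hat:.3f}")
```
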
30 pages, 6483 KB  
Article
The Generative Adversarial Approach: A Cautionary Tale of Finite Samples
by Marcos Escobar-Anel and Yiyao Jiao
Algorithms 2025, 18(9), 564; https://doi.org/10.3390/a18090564 - 5 Sep 2025
Abstract
Given the relevance and wide use of the Generative Adversarial (GA) methodology, this paper examines its finite-sample properties, and hence its benefits and pitfalls, from both statistical and numerical perspectives. We set up a simple and ideal “controlled experiment” in which the input data are an i.i.d. Gaussian series whose mean is to be learned, and the discriminant and generator are in the same distributional family, not a neural network (NN) as in the popular GAN. We show that, even with the ideal discriminant, the classical GA methodology delivers a biased estimator while producing multiple local optima, confusing numerical methods. The situation worsens when the discriminator is in the correct parametric family but is not the oracle, leading to the absence of a saddle point. To improve the quality of the estimators within the GA method, we propose an alternative loss function, the alternative GA method, that leads to a unique saddle point with better statistical properties. Our findings are intended to start a conversation on the potential pitfalls of GA and GAN methods. In this spirit, the ideas presented here should be explored in other distributional cases and will be extended to the actual use of an NN for discriminators and generators.
(This article belongs to the Section Evolutionary Algorithms and Machine Learning)
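
The controlled experiment described can be caricatured in a few lines: Gaussian data with an unknown mean, a generator θ + ε with frozen noise, and a logistic discriminator family maximized in an inner loop. The sample sizes, the discriminator family, and the optimizers below are illustrative assumptions, not the paper's setup.

```python
# A hypothetical toy of the GA minimax: min over theta of max over a logistic
# discriminator of E[log D(real)] + E[log(1 - D(fake))].
import numpy as np
from scipy.optimize import minimize, minimize_scalar

rng = np.random.default_rng(0)
real = rng.normal(1.5, 1.0, 50)          # observed sample, true mean 1.5
noise = rng.normal(0.0, 1.0, 50)         # generator noise, frozen across theta

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-np.clip(t, -30, 30)))   # clip avoids log(0)

def inner_value(theta):                  # max over the discriminator family
    fake = theta + noise
    def neg_v(wb):
        w, b = wb
        return -(np.mean(np.log(sigmoid(w * real + b)))
                 + np.mean(np.log(1.0 - sigmoid(w * fake + b))))
    return -minimize(neg_v, x0=[0.5, 0.0], method="Nelder-Mead").fun

res = minimize_scalar(inner_value, bounds=(-2.0, 5.0), method="bounded")
print("GA estimate:", res.x, "vs sample mean:", real.mean())
```
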

18 pages, 684 KB  
Article
A New Topp–Leone Odd Weibull Flexible-G Family of Distributions with Applications
by Fastel Chipepa, Mahmoud M. Abdelwahab, Wellington Fredrick Charumbira and Mustafa M. Hasaballah
Mathematics 2025, 13(17), 2866; https://doi.org/10.3390/math13172866 - 5 Sep 2025
Abstract
Generalized distributions have gained considerably wider acceptance over the past two decades. In this paper, we introduce a new generalized distribution: the Topp–Leone odd Weibull flexible-G family of distributions (FoD). The new FoD combines two existing FoD, the Topp–Leone-G and odd Weibull flexible-G families, and is more flexible than either of the two considered separately. Some selected statistical properties of the new model are derived, and three special cases from the proposed family are considered. The new model exhibits symmetry and long or short tails, and it also accommodates various levels of kurtosis. Monte Carlo simulation studies were conducted to verify the consistency of the maximum likelihood estimators. Two real data examples illustrate the flexibility of the new model in comparison to other competing models; the developed model outperformed all of the selected competitors.
(This article belongs to the Section D1: Probability and Statistics)
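
The Monte Carlo consistency check mentioned above can be mimicked with a generic harness; the stand-in below fits a Weibull shape (not the paper's family) and watches the bias and RMSE of the MLE shrink as n grows.

```python
# Generic MLE-consistency harness with a Weibull stand-in for the new family.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(7)
true_shape = 1.8
for n in (50, 200, 800):
    est = np.array([weibull_min.fit(weibull_min.rvs(true_shape, size=n,
                                                    random_state=rng),
                                    floc=0)[0] for _ in range(300)])
    print(f"n={n}: bias={est.mean() - true_shape:+.4f}, "
          f"rmse={np.sqrt(np.mean((est - true_shape) ** 2)):.4f}")
```
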

19 pages, 2496 KB  
Article
Study on Multifactorial Effects Influencing the Critical Hot-Spot Temperature of Emulsified Matrix and Its Thermal Safety
by Yibo Zhang, Yan He and Xingxing Liang
Processes 2025, 13(9), 2840; https://doi.org/10.3390/pr13092840 - 4 Sep 2025
Abstract
This study focuses on the critical ignition conditions of an emulsified matrix, defining the critical hot-spot temperature as the temperature at which the ignition probability of the emulsified matrix reaches 1% under the influence of an internal heat source within a fixed duration. An experimental system was established, and the critical hot-spot temperature of the emulsified matrix was systematically determined by combining the Langley method with maximum likelihood estimation for statistical analysis. Furthermore, the influence of bubble content and ambient pressure on the critical hot-spot temperature was investigated. The study reveals that the critical hot-spot temperature decreases with increasing ambient pressure (at 1 atm, 2 atm, and 3 atm) and bubble content (at 0%, 1.5%, and 3%). However, under the coupled effects of ambient pressure and bubbles, bubble overflow phenomena may attenuate their influence.
(This article belongs to the Section Chemical Processes and Systems)
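
A hedged sketch of the statistical step: fit a normal response curve to go/no-go ignition data by maximum likelihood, then invert it at the 1% probability level. The temperatures and outcomes below are invented, and the paper's Langley sequential design is not reproduced.

```python
# Probit-style MLE on invented go/no-go ignition data; the 1% critical
# hot-spot temperature is read off the fitted response curve.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

T = np.array([410, 420, 430, 440, 450, 460, 470, 480], float)  # test temps
y = np.array([0, 0, 0, 1, 0, 1, 1, 1])                         # 1 = ignition

def negloglik(p):
    mu, sigma = p[0], np.exp(p[1])           # log-sigma keeps sigma > 0
    prob = norm.cdf((T - mu) / sigma)        # ignition probability at T
    return -np.sum(y * np.log(prob + 1e-12)
                   + (1 - y) * np.log(1 - prob + 1e-12))

res = minimize(negloglik, x0=[450.0, np.log(15.0)], method="Nelder-Mead")
mu, sigma = res.x[0], np.exp(res.x[1])
t_crit = mu + sigma * norm.ppf(0.01)         # temperature with 1% ignition
print(f"critical hot-spot temperature: {t_crit:.1f}")
```
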

20 pages, 952 KB  
Article
Noise-Robust-Based Clock Parameter Estimation and Low-Overhead Time Synchronization in Time-Sensitive Industrial Internet of Things
by Long Tang, Fangyan Li, Zichao Yu and Haiyong Zeng
Entropy 2025, 27(9), 927; https://doi.org/10.3390/e27090927 - 3 Sep 2025
Abstract
Time synchronization is critical for task-oriented and time-sensitive Industrial Internet of Things (IIoT) systems. Nevertheless, achieving high-precision synchronization with low communication overhead remains a key challenge due to the constrained resources of IIoT devices. In this paper, we propose a single-timestamp time synchronization scheme that significantly reduces communication overhead by utilizing the access point's (AP's) existing mechanism of periodically collecting sensor-device data. The reduced communication overhead alleviates network congestion, which is essential for achieving low end-to-end latency in synchronized IIoT networks. Furthermore, to mitigate the impact of random delay noise on clock parameter estimation, we propose a noise-robust Maximum Likelihood Estimation (NR-MLE) algorithm that jointly optimizes synchronization accuracy and resilience to random delays. Specifically, we decompose the collected timestamp matrix into two low-rank matrices and use gradient descent to minimize the reconstruction error plus a regularization term, approximating the true signal and removing the noise. The denoised timestamp matrix is then used to jointly estimate clock skew and offset via MLE, with the corresponding Cramér–Rao Lower Bounds (CRLBs) derived. Simulation results demonstrate that the NR-MLE algorithm achieves higher clock parameter estimation accuracy than conventional MLE and exhibits strong robustness against increasing noise levels.
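
A compact sketch of the two-stage idea as summarized above, with all dimensions, rates, and noise levels assumed: a rank-2 factorization of a device-by-round timestamp matrix is fitted by gradient descent, and per-device clock skew and offset are then recovered by Gaussian MLE, which here reduces to least squares.

```python
# Illustrative two-stage sketch of the NR-MLE idea (all sizes and rates
# are assumptions, not the paper's settings).
import numpy as np

rng = np.random.default_rng(3)
D, N, rank, lam, lr = 8, 60, 2, 1e-3, 0.5
t = np.linspace(0.0, 1.0, N)                   # master timestamps
skew = 1.0 + rng.uniform(-1e-4, 1e-4, D)       # per-device clock skew
off = rng.uniform(0.0, 1.0, D)                 # per-device clock offset
Y = skew[:, None] * t + off[:, None] + rng.normal(0.0, 0.02, (D, N))

U = rng.normal(0.0, 0.1, (D, rank))            # Y carries a rank-2 signal,
V = rng.normal(0.0, 0.1, (N, rank))            # so a rank-2 factor denoises
for _ in range(20000):                         # gradient descent on mean
    R = (U @ V.T - Y) / (D * N)                # squared error + L2 penalty
    U, V = U - lr * (R @ V + lam * U), V - lr * (R.T @ U + lam * V)

Yd = U @ V.T                                   # denoised timestamp matrix
A = np.column_stack([t, np.ones(N)])
for d in range(D):                             # Gaussian MLE == least squares
    s_hat, o_hat = np.linalg.lstsq(A, Yd[d], rcond=None)[0]
    print(f"device {d}: skew {s_hat:.5f} (true {skew[d]:.5f}), "
          f"offset {o_hat:.3f} (true {off[d]:.3f})")
```
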

27 pages, 5825 KB  
Article
A New One-Parameter Model by Extending Maxwell–Boltzmann Theory to Discrete Lifetime Modeling
by Ahmed Elshahhat, Hoda Rezk and Refah Alotaibi
Mathematics 2025, 13(17), 2803; https://doi.org/10.3390/math13172803 - 1 Sep 2025
Abstract
The Maxwell–Boltzmann (MB) distribution is fundamental in statistical physics, providing an exact description of particle speed or energy distributions. In this study, a discrete formulation derived via the survival function discretization technique extends the MB model’s theoretical strengths to realistically handle lifetime and reliability data recorded in integer form, enabling accurate modeling under inherently discrete or censored observation schemes. The proposed discrete MB (DMB) model preserves the continuous MB’s flexibility in capturing diverse hazard rate shapes, while directly addressing the discrete and often censored nature of real-world lifetime and reliability data. Its formulation accommodates right-skewed, left-skewed, and symmetric probability mass functions with an inherently increasing hazard rate, enabling robust modeling of negatively skewed and monotonic-failure processes where competing discrete models underperform. We establish a comprehensive suite of distributional properties, including closed-form expressions for the probability mass, cumulative distribution, hazard functions, quantiles, raw moments, dispersion indices, and order statistics. For parameter estimation under Type-II censoring, we develop maximum likelihood, Bayesian, and bootstrap-based approaches and propose six distinct interval estimation methods encompassing frequentist, resampling, and Bayesian paradigms. Extensive Monte Carlo simulations systematically compare estimator performance across varying sample sizes, censoring levels, and prior structures, revealing the superiority of Bayesian–MCMC estimators with highest posterior density intervals in small- to moderate-sample regimes. Two genuine datasets—spanning engineering reliability and clinical survival contexts—demonstrate the DMB model’s superior goodness-of-fit and predictive accuracy over eleven competing discrete lifetime models.
(This article belongs to the Special Issue New Advance in Applied Probability and Statistical Inference)
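
The survival-discretization construction is easy to make concrete: P(K = k) = S(k) − S(k + 1), with S the continuous Maxwell–Boltzmann survival function, after which the single-parameter MLE is a one-dimensional search. The complete-data sketch below is a stand-in, not the paper's Type-II censored analysis.

```python
# Survival-discretization sketch: P(K = k) = S(k) - S(k + 1) with S the
# continuous Maxwell-Boltzmann survival function (complete data assumed).
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import erf

def mb_sf(x, a):
    """Maxwell-Boltzmann survival function with scale a."""
    x = np.asarray(x, dtype=float)
    return 1.0 - (erf(x / (np.sqrt(2.0) * a))
                  - np.sqrt(2.0 / np.pi) * (x / a) * np.exp(-x**2 / (2 * a**2)))

def dmb_pmf(k, a):
    return mb_sf(k, a) - mb_sf(k + 1, a)

rng = np.random.default_rng(5)
speeds = np.linalg.norm(rng.normal(0.0, 2.5, (400, 3)), axis=1)  # MB draws
ks = np.floor(speeds)              # flooring IS the survival discretization
res = minimize_scalar(lambda a: -np.sum(np.log(dmb_pmf(ks, a) + 1e-300)),
                      bounds=(0.1, 20.0), method="bounded")
print(f"scale MLE: {res.x:.3f} (true 2.5)")
```
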

11 pages, 4728 KB  
Article
Identification of Interacting Objects and Evaluation of Interaction Loss from Wideband Double-Directional Channel Measurement Data by Using Point Cloud Data
by Djiby Marema Diallo and Jun-ichi Takada
Electronics 2025, 14(17), 3495; https://doi.org/10.3390/electronics14173495 - 31 Aug 2025
Abstract
This paper proposes an approach to identify interacting objects (IOs) and determine their interaction losses (ILs) using point cloud data together with wideband double-directional channel sounding data. The scattering points (SPs) were identified by applying a maximum likelihood-based approach to the high-resolution path parameters estimated from the channel sounding data, and the IOs were then identified via visual inspection of the SPs within a 3D point cloud. The proposed approach uses all path parameters to evaluate an approximate likelihood for every candidate SP and select the most likely one, regardless of the propagation mechanism. The technique was demonstrated at a suburban residential site at 11 GHz. The results show that the approach identified IOs that are not usually considered in ray-tracing simulations.
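
One hedged geometric reading of the likelihood step: for each candidate point in the cloud, predict the single-bounce delay and azimuths from the Tx/Rx geometry and score them against a measured path under independent Gaussian errors. The positions, the measured path, and the error spreads below are invented.

```python
# Hypothetical single-path SP search over a point cloud (geometry and
# error spreads are assumptions for illustration).
import numpy as np

C = 3e8                                          # speed of light, m/s
tx, rx = np.array([0.0, 0.0, 5.0]), np.array([40.0, 10.0, 2.0])
cloud = np.random.default_rng(9).uniform([0, -20, 0], [60, 30, 15], (5000, 3))

meas = {"delay": 2.1e-7, "aod": 0.45, "aoa": -2.2}   # one estimated path
sig = {"delay": 2e-9, "aod": 0.02, "aoa": 0.02}      # assumed error spreads

delay = (np.linalg.norm(cloud - tx, axis=1)
         + np.linalg.norm(cloud - rx, axis=1)) / C   # Tx -> point -> Rx
aod = np.arctan2(cloud[:, 1] - tx[1], cloud[:, 0] - tx[0])
aoa = np.arctan2(cloud[:, 1] - rx[1], cloud[:, 0] - rx[0])

loglik = -0.5 * (((delay - meas["delay"]) / sig["delay"]) ** 2
                 + ((aod - meas["aod"]) / sig["aod"]) ** 2
                 + ((aoa - meas["aoa"]) / sig["aoa"]) ** 2)
print("most likely scattering point:", cloud[np.argmax(loglik)])
```
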

17 pages, 1862 KB  
Article
Molecular Epidemiology of SARS-CoV-2 Detected from Different Areas of the Kandy District of Sri Lanka from November 2020–March 2022
by Bushran N. Iqbal, Sibra R. M. Shihab, Tao Zhang, Aadhil Ahamed, Shiyamalee Arunasalam, Samanthika Jagoda, Leo L. M. Poon, Malik Peiris and Faseeha Noordeen
Viruses 2025, 17(9), 1189; https://doi.org/10.3390/v17091189 - 29 Aug 2025
Abstract
A comprehensive analysis of the molecular epidemiology of SARS-CoV-2 in the Kandy District of Sri Lanka from November 2020 to March 2022 was conducted to address the limited genomic surveillance data available across the country. The study investigated the circulating SARS-CoV-2 lineages, their temporal dynamics, and the associated mutational profiles in the study area. A total of 280 SARS-CoV-2-positive samples were selected, and 252 complete genomes were successfully sequenced using Oxford Nanopore Technology. Lineage classification was performed using the EPI2ME tool, while phylogenetic relationships were inferred through maximum likelihood and time-scaled phylogenetic trees using IQ-TREE2 and BEAST, respectively. Amino acid substitutions were analyzed to understand lineage-specific mutation patterns. Fifteen SARS-CoV-2 lineages were identified, of which B.1.411 (36%) was the most prevalent, followed by Q.8 (21%), AY.28 (9.5%), and the Delta and Omicron variants. The lineage distribution showed a temporal shift from B.1.411 to Alpha, Delta, and finally Omicron, mirroring global trends. Time to the most recent common ancestor analyses provided estimates for the introduction of major variants, while mutation analysis revealed the widespread occurrence of D614G in the spike protein and lineage-specific mutations across structural, non-structural, and accessory proteins. Detection of the Epsilon variant (absent in other national-level studies) in November 2020 highlighted regional heterogeneity in viral spread. This study emphasizes the importance of localized genomic surveillance to capture the true diversity and evolution of SARS-CoV-2 and to facilitate containment strategies in resource-limited settings.
(This article belongs to the Section Coronaviruses)

15 pages, 926 KB  
Systematic Review
Refractive Outcomes in Keratoconus Patients Following Toric Lens Implantation: A Systematic Review and Single-Group Meta-Analysis
by Tun Giap Tan, Kieran O’Kane and Harry W. Roberts
Life 2025, 15(9), 1362; https://doi.org/10.3390/life15091362 - 27 Aug 2025
Abstract
This systematic review and meta-analysis evaluated refractive outcomes, particularly astigmatic correction, in keratoconus following toric intraocular lens (tIOL) implantation. A systematic search identified eligible studies reporting pre- and postoperative refractive cylinder, spherical equivalent (SE), uncorrected distance visual acuity (UDVA), and corrected distance visual acuity (CDVA). Eight studies, comprising 135 eyes, were included. Outcomes were pooled using a random-effects model with restricted maximum likelihood as the estimator for τ². Methodological quality was assessed using the MINORS tool for non-comparative studies and the JBI checklist for case series. Postoperative refractive cylinder and SE improved by 2.28 dioptres (95% CI, 1.60–2.96) and 4.17 dioptres (95% CI, 2.32–6.01), respectively. UDVA and CDVA also improved substantially, with pooled gains of 0.87 logMAR (95% CI, 0.71–1.03) and 0.19 logMAR (95% CI, 0.12–0.26), respectively. Most tIOL rotations did not exceed 10 degrees, with only one case requiring realignment surgery. Complications were infrequent and mostly minor. tIOL implantation is effective in reducing astigmatism and improving vision in stable keratoconus patients. However, limitations in vector analysis and methodological heterogeneity underscore the need for standardised reporting to optimise outcomes.
(This article belongs to the Special Issue Vision Science and Optometry: 2nd Edition)
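
The pooling step can be sketched directly: REML picks the τ² that maximizes the restricted likelihood of the study effects, and inverse-variance weights then give the pooled estimate. The effect sizes and variances below are invented, not the review's data.

```python
# Random-effects meta-analysis with REML for tau^2 (invented toy inputs).
import numpy as np
from scipy.optimize import minimize_scalar

y = np.array([2.1, 2.5, 1.8, 2.9, 2.2])       # per-study effects
v = np.array([0.20, 0.35, 0.15, 0.50, 0.25])  # within-study variances

def neg_restricted_loglik(tau2):
    w = 1.0 / (v + tau2)
    mu = np.sum(w * y) / np.sum(w)             # weighted mean at this tau^2
    return 0.5 * (np.sum(np.log(v + tau2)) + np.log(np.sum(w))
                  + np.sum(w * (y - mu) ** 2))

tau2 = minimize_scalar(neg_restricted_loglik, bounds=(0.0, 10.0),
                       method="bounded").x
w = 1.0 / (v + tau2)
pooled, se = np.sum(w * y) / np.sum(w), np.sqrt(1.0 / np.sum(w))
print(f"tau^2 = {tau2:.3f}, pooled effect = {pooled:.2f} (SE {se:.2f})")
```
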

18 pages, 844 KB  
Article
LINEX Loss-Based Estimation of Expected Arrival Time of Next Event from HPP and NHPP Processes Past Truncated Time
by M. S. Aminzadeh
Analytics 2025, 4(3), 20; https://doi.org/10.3390/analytics4030020 - 26 Aug 2025
Abstract
This article introduces a computational tool for Bayesian estimation of the expected time until the next event occurs in both homogeneous Poisson processes (HPPs) and non-homogeneous Poisson processes (NHPPs), following a truncated time. The estimation utilizes the linear exponential (LINEX) asymmetric loss function and incorporates both gamma and non-informative priors. Furthermore, it presents a minimax-type criterion to ascertain the optimal sample size required to achieve a specified percentage reduction in posterior risk. Simulation studies indicate that estimators employing gamma priors for both HPP and NHPP demonstrate greater accuracy compared to those based on non-informative priors and maximum likelihood estimates (MLE), provided that the proposed data-driven method for selecting hyperparameters is applied.
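
For the HPP case the headline computation is compact. With a Gamma(α, β) prior on the rate λ and n events observed by the truncation time T, the posterior is Gamma(α + n, β + T), and the Bayes estimate of the next arrival time θ = T + 1/λ under LINEX loss is −(1/a) ln E[e^(−aθ) | data], approximated below by posterior sampling; the prior values and event count are illustrative.

```python
# LINEX Bayes estimate of the next HPP arrival time past truncation time T
# (prior hyperparameters, a, and the event count are assumptions).
import numpy as np

rng = np.random.default_rng(11)
a = 0.5                      # LINEX asymmetry: a > 0 penalizes overestimation
alpha0, beta0 = 2.0, 1.0     # Gamma(alpha0, beta0) prior on the rate lambda
n, T = 14, 10.0              # n events observed on (0, T]

lam = rng.gamma(alpha0 + n, 1.0 / (beta0 + T), 200_000)  # posterior draws
theta = T + 1.0 / lam        # expected arrival time of the next event
d_linex = -np.log(np.mean(np.exp(-a * theta))) / a       # -(1/a) ln E[e^-a.theta]
d_sq = theta.mean()          # squared-error (posterior-mean) Bayes estimate
print(f"LINEX estimate {d_linex:.3f} vs squared-error estimate {d_sq:.3f}")
```
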

28 pages, 3244 KB  
Article
A Novel Poisson–Weibull Model for Stress–Strength Reliability Analysis in Industrial Systems: Bayesian and Classical Approaches
by Hadiqa Basit, Mahmoud M. Abdelwahab, Shakila Bashir, Aamir Sanaullah, Mohamed A. Abdelkawy and Mustafa M. Hasaballah
Axioms 2025, 14(9), 653; https://doi.org/10.3390/axioms14090653 - 22 Aug 2025
Abstract
Industrial systems often rely on specialized redundant systems to enhance reliability and prevent unexpected failures. This study introduces a novel three-parameter model, the Poisson–Weibull distribution (PWD), and derives several of its key properties. The primary focus of the study is to develop a stress–strength (SS) model based on this newly developed distribution. Parameter estimation for both the PWD and SS models is carried out using maximum likelihood estimation (MLE) and Bayesian estimation techniques. Given the complexity of the proposed distribution, numerical approximation techniques are employed to obtain reliable parameter estimates. A comprehensive simulation study employing Monte Carlo simulation (MCS) and Markov chain Monte Carlo (MCMC) methods examines the behavior of the PWD and SS model parameters under various scenarios. The development of the SS model enhances understanding of the PWD's dynamics while providing practical insights into its real-life applications and limitations. The effectiveness of the proposed distribution and the SS reliability measure is established through applications to real-life data sets.
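
The stress–strength target R = P(X > Y) admits a one-line Monte Carlo estimate once both laws can be sampled. The sketch below uses plain Weibull stand-ins with invented parameters, since the three-parameter Poisson–Weibull density is not reproduced here.

```python
# Monte Carlo stress-strength sketch with Weibull stand-ins (hypothetical
# parameters; the Poisson-Weibull pdf itself is not reproduced here).
import numpy as np

rng = np.random.default_rng(2)
strength = 3.0 * rng.weibull(2.0, 200_000)   # X: strength draws
stress = 2.0 * rng.weibull(1.5, 200_000)     # Y: independent stress draws
R = np.mean(strength > stress)               # reliability R = P(X > Y)
print(f"estimated R = {R:.4f}")
```
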

24 pages, 7349 KB  
Article
Return Level Prediction with a New Mixture Extreme Value Model
by Emrah Altun, Hana N. Alqifari and Kadir Söyler
Mathematics 2025, 13(17), 2705; https://doi.org/10.3390/math13172705 - 22 Aug 2025
Abstract
The generalized Pareto distribution is frequently used for modeling extreme values above an appropriate threshold level. Since determining an appropriate threshold is difficult, mixture extreme value models have risen to prominence. In this study, mixture extreme value models based on the exponentiated Pareto distribution are proposed, with the Weibull, gamma, and log-normal models used as bulk densities. The parameter estimates of the proposed models are obtained using the maximum likelihood approach. Two approaches, based on maximizing the log-likelihood and the Kolmogorov–Smirnov p-value, are used to determine the appropriate threshold value, and their effectiveness is compared using simulation studies. The proposed models are compared with other mixture models through an application to earthquake data. The GammaEP web application is developed to ensure the reproducibility of the results and the usability of the proposed model.
(This article belongs to the Special Issue Mathematical Modelling and Applied Statistics)
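
One of the two threshold-selection rules described, choosing the threshold whose GPD fit to the excesses attains the largest Kolmogorov–Smirnov p-value, reduces to a simple profile search. The sample, the threshold grid, and the omission of the bulk model below are illustrative simplifications.

```python
# Threshold selection by the Kolmogorov-Smirnov p-value rule (toy sample).
import numpy as np
from scipy.stats import genpareto, kstest

rng = np.random.default_rng(4)
x = rng.pareto(2.5, 3000) + 1.0                    # heavy-tailed toy data

best_u, best_p = None, -1.0
for u in np.quantile(x, np.linspace(0.70, 0.95, 26)):  # candidate thresholds
    exc = x[x > u] - u                             # excesses over threshold
    xi, _, beta = genpareto.fit(exc, floc=0)       # GPD fit to the tail
    p = kstest(exc, genpareto(xi, 0.0, beta).cdf).pvalue
    if p > best_p:
        best_u, best_p = u, p
print(f"selected threshold {best_u:.3f} (KS p-value {best_p:.3f})")
```
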

27 pages, 4595 KB  
Article
The Unit Inverse Maxwell–Boltzmann Distribution: A Novel Single-Parameter Model for Unit-Interval Data
by Murat Genç and Ömer Özbilen
Axioms 2025, 14(8), 647; https://doi.org/10.3390/axioms14080647 - 21 Aug 2025
Abstract
The Unit Inverse Maxwell–Boltzmann (UIMB) distribution is introduced as a novel single-parameter model for data constrained within the unit interval (0,1), derived through an exponential transformation of the Inverse Maxwell–Boltzmann distribution. Designed to address the limitations of traditional unit-interval distributions, the UIMB model exhibits flexible density shapes and hazard rate behaviors, including right-skewed, left-skewed, unimodal, and bathtub-shaped patterns, making it suitable for applications in reliability engineering, environmental science, and health studies. This study derives the statistical properties of the UIMB distribution, including moments, quantiles, survival, and hazard functions, as well as stochastic ordering, entropy measures, and the moment-generating function, and evaluates its performance through simulation studies and real-data applications. Various estimation methods, including maximum likelihood, Anderson–Darling, maximum product spacing, least-squares, and Cramér–von Mises, are assessed, with maximum likelihood demonstrating superior accuracy. Simulation studies confirm the model’s robustness under normal and outlier-contaminated scenarios, with MLE showing resilience across varying skewness levels. Applications to manufacturing and environmental datasets reveal the UIMB distribution’s exceptional fit compared to competing models, as evidenced by lower information criteria and goodness-of-fit statistics. The UIMB distribution’s computational efficiency and adaptability position it as a robust tool for modeling complex unit-interval data, with potential for further extensions in diverse domains.
(This article belongs to the Section Mathematical Analysis)
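
Among the methods compared, maximum product spacing is the least familiar: it maximizes the product of CDF spacings at the order statistics. The sketch below contrasts it with the MLE on an exponential stand-in, since the UIMB density is not reproduced here.

```python
# Maximum product of spacings (MPS) vs MLE on an exponential stand-in.
import numpy as np
from scipy.optimize import minimize_scalar

x = np.sort(np.random.default_rng(8).exponential(0.4, 100))

def neg_log_spacings(lam):
    F = 1.0 - np.exp(-lam * x)                 # CDF at the order statistics
    spacings = np.diff(np.concatenate(([0.0], F, [1.0])))
    return -np.sum(np.log(spacings + 1e-300))  # MPS objective (negated)

lam_mps = minimize_scalar(neg_log_spacings, bounds=(1e-3, 100.0),
                          method="bounded").x
lam_mle = 1.0 / x.mean()                       # closed-form exponential MLE
print(f"MPS rate {lam_mps:.3f} vs MLE rate {lam_mle:.3f} (true 2.5)")
```
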

28 pages, 5495 KB  
Article
Model Comparison and Parameter Estimation for Gompertz Distributions Under Constant Stress Accelerated Lifetime Tests
by Shuyu Du and Wenhao Gui
Appl. Sci. 2025, 15(16), 9199; https://doi.org/10.3390/app15169199 - 21 Aug 2025
Abstract
The accelerated lifetime test is a widely used and effective approach in reliability analysis because of its shorter testing duration. In this study, product lifetimes are assumed to follow the Gompertz distribution. This article primarily focuses on performance comparisons between the linear model and the inverse power-law model, both of which are utilized to characterize the relationship between the shape parameter and stress levels. To test model robustness, we also generate data from the Sine-Modified Power Gompertz distribution, a more flexible alternative. We conduct Monte Carlo simulations using four estimation methods: the maximum likelihood method, the least squares method, the maximum product of spacing method, and the Cramér–von Mises method, for small, medium, and large sample sizes. The comparison of mean squared error serves as a critical indicator for evaluating the performance of the different methods and models. Additionally, the shape parameter and reliability function are obtained based on the estimation results. Finally, a real dataset is analyzed to determine the most suitable accelerated life model, and the Akaike Information Criterion is used to further assess model fit. Furthermore, we employ leave-one-out cross-validation (LOOCV) to assess the model's generalizability.
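
A power-law link between the shape parameter and stress is linear on the log-log scale, which suggests a hedged two-step sketch: fit the Gompertz shape by MLE at each stress level, then regress log-shape on log-stress. The stress levels, sample sizes, and link constants below are invented.

```python
# Two-step sketch: per-stress Gompertz shape MLE, then a log-log regression
# for the power-law link (all numbers invented; loc and scale held fixed).
import numpy as np
from scipy.stats import gompertz

rng = np.random.default_rng(6)
stress = np.array([10.0, 20.0, 40.0])
shape_hat = []
for s in stress:
    c_true = 0.5 * (s / 10.0) ** 1.3        # toy power-law link in the shape
    data = gompertz.rvs(c_true, size=150, random_state=rng)
    c_mle, _, _ = gompertz.fit(data, floc=0, fscale=1)   # one-parameter MLE
    shape_hat.append(c_mle)

# The power-law link c = a * s^m is linear on log-log axes.
m, log_a = np.polyfit(np.log(stress), np.log(shape_hat), 1)
print(f"estimated exponent m = {m:.2f} (true 1.3)")
```
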