Search Results (91)

Search Parameters:
Keywords = exponential-generalized Gamma

34 pages, 31206 KB  
Article
Statistical Evaluation of Alpha-Powering Exponential Generalized Progressive Hybrid Censoring and Its Modeling for Medical and Engineering Sciences with Optimization Plans
by Heba S. Mohammed, Osama E. Abo-Kasem and Ahmed Elshahhat
Symmetry 2025, 17(9), 1473; https://doi.org/10.3390/sym17091473 - 6 Sep 2025
Abstract
This study explores advanced methods for analyzing the two-parameter alpha-power exponential (APE) distribution using data from a novel generalized progressive hybrid censoring scheme. The APE model is inherently asymmetric, exhibiting positive skewness across all valid parameter values due to its right-skewed exponential base, with the alpha-power transformation amplifying or dampening this skewness depending on the power parameter. The proposed censoring design offers new insights into modeling lifetime data that exhibit non-monotonic hazard behaviors. It enhances testing efficiency by simultaneously imposing fixed-time constraints and ensuring a minimum number of failures, thereby improving inference quality over traditional censoring methods. We derive maximum likelihood and Bayesian estimates for the APE distribution parameters and key reliability measures, such as the reliability and hazard rate functions. Bayesian analysis is performed using independent gamma priors under a symmetric squared error loss, implemented via the Metropolis–Hastings algorithm. Interval estimation is addressed using two normality-based asymptotic confidence intervals and two credible intervals obtained through a simulated Markov Chain Monte Carlo procedure. Monte Carlo simulations across various censoring scenarios demonstrate the stable and superior precision of the proposed methods. Optimal censoring patterns are identified based on the observed Fisher information and its inverse. Two real-world case studies—breast cancer remission times and global oil reserve data—illustrate the practical utility of the APE model within the proposed censoring framework. These applications underscore the model’s capability to effectively analyze diverse reliability phenomena, bridging theoretical innovation with empirical relevance in lifetime data analysis. Full article
(This article belongs to the Special Issue Unlocking the Power of Probability and Statistics for Symmetry)
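The Bayesian machinery this abstract names (independent gamma priors, squared error loss, Metropolis–Hastings) can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it uses a synthetic complete (uncensored) exponential sample as stand-in data rather than generalized progressive hybrid censored data, unit Gamma(1, 1) priors, and a random-walk proposal on the log scale — all assumptions for the sketch.

```python
import numpy as np

def ape_logpdf(x, alpha, lam):
    # log-density of the two-parameter alpha-power exponential (APE) model, alpha != 1:
    # f(x) = (ln(alpha)/(alpha - 1)) * lam * exp(-lam*x) * alpha**(1 - exp(-lam*x))
    return (np.log(np.log(alpha) / (alpha - 1.0)) + np.log(lam)
            - lam * x + (1.0 - np.exp(-lam * x)) * np.log(alpha))

def log_posterior(theta, x, a0=1.0, b0=1.0):
    alpha, lam = np.exp(theta)              # sample on the log scale to keep both positive
    if abs(alpha - 1.0) < 1e-8:
        return -np.inf                      # alpha = 1 is excluded (exponential limit)
    # independent Gamma(a0, b0) priors on alpha and lam, plus the log-scale Jacobian
    logprior = ((a0 - 1.0) * np.log(alpha) - b0 * alpha
                + (a0 - 1.0) * np.log(lam) - b0 * lam + theta.sum())
    return ape_logpdf(x, alpha, lam).sum() + logprior

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=80)     # stand-in complete sample (no censoring)

theta = np.array([np.log(2.0), np.log(0.5)])
cur = log_posterior(theta, x)
draws = []
for _ in range(4000):                       # random-walk Metropolis-Hastings
    prop = theta + 0.15 * rng.standard_normal(2)
    lp = log_posterior(prop, x)
    if np.log(rng.uniform()) < lp - cur:
        theta, cur = prop, lp
    draws.append(np.exp(theta))
post = np.array(draws)[1000:]               # discard burn-in
alpha_hat, lam_hat = post.mean(axis=0)      # Bayes estimates under squared error loss
```

Under squared error loss the Bayes estimate is simply the posterior mean, which is why the chain average suffices here; credible intervals would come from posterior quantiles of the same draws.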
24 pages, 7349 KB  
Article
Return Level Prediction with a New Mixture Extreme Value Model
by Emrah Altun, Hana N. Alqifari and Kadir Söyler
Mathematics 2025, 13(17), 2705; https://doi.org/10.3390/math13172705 - 22 Aug 2025
Viewed by 253
Abstract
The generalized Pareto distribution is frequently used for modeling extreme values above an appropriate threshold level. Because determining an appropriate threshold is difficult, mixture extreme value models have risen to prominence. In this study, mixture extreme value models based on the exponentiated Pareto distribution are proposed. The Weibull, gamma, and log-normal models are used as bulk densities. The parameter estimates of the proposed models are obtained using the maximum likelihood approach. Two different approaches based on maximization of the log-likelihood and Kolmogorov–Smirnov p-value are used to determine the appropriate threshold value. The effectiveness of these methods is compared using simulation studies. The proposed models are compared with other mixture models through an application study on earthquake data. The GammaEP web application is developed to ensure the reproducibility of the results and the usability of the proposed model. Full article
(This article belongs to the Special Issue Mathematical Modelling and Applied Statistics)
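The threshold-selection idea — fit a generalized Pareto tail at several candidate thresholds and keep the one maximizing the Kolmogorov–Smirnov p-value — can be sketched directly with scipy. The sample, the candidate quantiles, and the use of a plain lognormal stand-in (rather than a mixture with a Weibull, gamma, or log-normal bulk) are assumptions for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
data = rng.lognormal(mean=0.0, sigma=1.0, size=1000)    # synthetic heavy-tailed sample

best = None
for q in (0.80, 0.85, 0.90, 0.95):                      # candidate threshold quantiles
    u = np.quantile(data, q)
    exc = data[data > u] - u                            # exceedances over the threshold
    c, _, scale = stats.genpareto.fit(exc, floc=0.0)    # GPD fit to the tail
    p = stats.kstest(exc, stats.genpareto.cdf, args=(c, 0.0, scale)).pvalue
    if best is None or p > best[1]:
        best = (u, p, c, scale)

u_star, p_star, shape_hat, scale_hat = best             # threshold maximizing the KS p-value
```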
21 pages, 4164 KB  
Article
Geostatistical Analysis and Delineation of Groundwater Potential Zones for Their Implications in Irrigated Agriculture of Punjab Pakistan
by Aamir Shakoor, Imran Rasheed, Muhammad Nouman Sattar, Akinwale T. Ogunrinde, Sabab Ali Shah, Hafiz Umar Farid, Hareef Ahmed Keerio, Asim Qayyum Butt, Amjad Ali Khan and Malik Sarmad Riaz
World 2025, 6(3), 115; https://doi.org/10.3390/world6030115 - 15 Aug 2025
Viewed by 539
Abstract
Groundwater is essential for irrigated agriculture, yet its use remains unsustainable in many regions worldwide. In countries like Pakistan, the situation is particularly pressing. The irrigated agriculture of Pakistan heavily relies on groundwater resources owing to limited canal-water availability. The groundwater quality in the region ranges from good to poor, with the lower-quality water adversely affecting soil structure and plant health, leading to reduced agricultural productivity. The delineation of quality zones with respect to irrigation parameters is thus crucial for optimizing its sustainable use and management. Therefore, this research study was carried out in the Lower Chenab Canal (LCC) irrigation system to assess the spatial distribution of groundwater quality. The geostatistical analysis was conducted using Gamma Design Software (GS+) and the Kriging interpolation method was applied within a Geographic Information System (GIS) framework to generate groundwater-quality maps. Semivariogram models were evaluated for major irrigation parameters such as electrical conductivity (EC), residual sodium carbonate (RSC), and sodium adsorption ratio (SAR) to identify the best fit for various Ordinary Kriging models. The spherical semivariogram model was the best fit for EC, while the exponential model best suited SAR and RSC. Overlay analysis was performed to produce combined water-quality maps. During the pre-monsoon season, 17.83% of the LCC area demonstrated good irrigation quality, while 42.84% showed marginal quality, and 39.33% was deemed unsuitable for irrigation. In the post-monsoon season, 17.30% of the area had good irrigation quality, 44.53% exhibited marginal quality, and 38.17% was unsuitable for irrigation. The study revealed that Electrical Conductivity (EC) was the primary factor affecting water quality, contributing to 71% of marginal and unsuitable conditions. 
In comparison, the Sodium Adsorption Ratio (SAR) accounted for 38% and Residual Sodium Carbonate (RSC) contributed 45%. Therefore, it is recommended that unsuitable zones be managed through artificial recharge and the planting of salt-tolerant crops to enhance groundwater suitability for agricultural applications. Full article
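The semivariogram-fitting step can be illustrated without GS+ software: compute an empirical semivariogram and fit a spherical model by least squares. The synthetic field, distance bins, and starting values below are assumptions for the sketch, not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def spherical(h, nugget, sill, a):
    # spherical semivariogram: rises from the nugget and levels off at the sill at range a
    h = np.asarray(h, dtype=float)
    rise = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, rise, sill)

rng = np.random.default_rng(3)
pts = rng.uniform(0.0, 100.0, size=(200, 2))                   # synthetic sample locations
z = np.sin(pts[:, 0] / 20.0) + 0.3 * rng.standard_normal(200)  # synthetic EC-like field

# empirical semivariogram: average of 0.5*(z_i - z_j)^2 within distance bins
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
half_sq = 0.5 * (z[:, None] - z[None, :]) ** 2
edges = np.linspace(1.0, 60.0, 13)
h_mid, gam = [], []
for lo, hi in zip(edges[:-1], edges[1:]):
    mask = (d >= lo) & (d < hi)
    if mask.any():
        h_mid.append(d[mask].mean())
        gam.append(half_sq[mask].mean())

popt, _ = curve_fit(spherical, h_mid, gam, p0=[0.1, 0.6, 30.0],
                    bounds=([1e-6, 1e-6, 1.0], [5.0, 5.0, 120.0]))
nugget_hat, sill_hat, range_hat = popt
```

Swapping `spherical` for an exponential model, `nugget + (sill - nugget) * (1 - exp(-h/a))`, and comparing residuals is how one would reproduce the model-selection step (spherical best for EC, exponential best for SAR and RSC).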
28 pages, 835 KB  
Article
Progressive First-Failure Censoring in Reliability Analysis: Inference for a New Weibull–Pareto Distribution
by Rashad M. EL-Sagheer and Mahmoud M. Ramadan
Mathematics 2025, 13(15), 2377; https://doi.org/10.3390/math13152377 - 24 Jul 2025
Viewed by 293
Abstract
This paper explores statistical techniques for estimating unknown lifetime parameters using data from a progressive first-failure censoring scheme. The failure times are modeled with a new Weibull–Pareto distribution. Maximum likelihood estimators are derived for the model parameters, as well as for the survival and hazard rate functions, although these estimators do not have explicit closed-form solutions. The Newton–Raphson algorithm is employed for the numerical computation of these estimates. Confidence intervals for the parameters are approximated based on the asymptotic normality of the maximum likelihood estimators. The Fisher information matrix is calculated using the missing information principle, and the delta technique is applied to approximate confidence intervals for the survival and hazard rate functions. Bayesian estimators are developed under squared error, linear exponential, and general entropy loss functions, assuming independent gamma priors. Markov chain Monte Carlo sampling is used to obtain Bayesian point estimates and the highest posterior density credible intervals for the parameters and reliability measures. Finally, the proposed methods are demonstrated through the analysis of a real dataset. Full article
(This article belongs to the Section D1: Probability and Statistics)
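Given MCMC draws from a posterior, the Bayes point estimates under the three losses named here have simple closed forms: the posterior mean for squared error, −(1/a)·ln E[e^(−aθ)] for linear exponential (LINEX), and (E[θ^(−q)])^(−1/q) for general entropy. A sketch with a stand-in gamma posterior sample; the loss constants a and q are arbitrary choices.

```python
import numpy as np

def bayes_estimates(draws, a=0.5, q=0.5):
    """Bayes point estimates of a positive parameter from posterior draws under
    squared error (SEL), LINEX(a), and general entropy (GE, q) losses."""
    sel = draws.mean()                                  # posterior mean
    linex = -np.log(np.mean(np.exp(-a * draws))) / a    # LINEX-loss estimate
    ge = np.mean(draws ** (-q)) ** (-1.0 / q)           # general-entropy-loss estimate
    return sel, linex, ge

rng = np.random.default_rng(2)
draws = rng.gamma(shape=4.0, scale=0.5, size=10_000)    # stand-in posterior sample, mean 2
sel, linex, ge = bayes_estimates(draws)
```

By Jensen's inequality both the LINEX (for a > 0) and general entropy estimates sit below the posterior mean, which is the usual motivation for asymmetric losses when overestimation is costlier.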
22 pages, 1823 KB  
Article
Heavy Rainfall Probabilistic Model for Zielona Góra in Poland
by Marcin Wdowikowski, Monika Nowakowska, Maciej Bełcik and Grzegorz Galiniak
Water 2025, 17(11), 1673; https://doi.org/10.3390/w17111673 - 31 May 2025
Viewed by 847
Abstract
The research focuses on probabilistic modeling of maximum rainfall in Zielona Góra, Poland, to improve urban drainage system design. The study utilizes archived pluviographic data from 1951 to 2020, collected at the IMWM-NRI meteorological station. These data include 10 min rainfall records and aggregated hourly and daily totals. The study employs various statistical distributions, including Fréchet, gamma, generalized exponential (GED), Gumbel, log-normal, and Weibull, to model rainfall intensity–duration–frequency (IDF) relationships. After testing the goodness of fit using the Anderson–Darling test, Bayesian Information Criterion (BIC), and relative residual mean square error (rRMSE), the GED distribution was found to best describe rainfall patterns. A key outcome is the development of a new rainfall model based on the GED distribution, allowing for the estimation of precipitation amounts for different durations and exceedance probabilities. However, the study highlights limitations, such as the need for more accurate local models and a standardized rainfall atlas for Poland. Full article
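The distribution-comparison step generalizes to any candidate set. Here is a sketch ranking four of the named families by BIC on synthetic annual maxima; the data are simulated and the GED family is omitted because scipy has no direct analogue, so only the mechanics are illustrated.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
annual_max = rng.gumbel(loc=30.0, scale=8.0, size=70)   # synthetic annual maxima (mm)

candidates = {"Gumbel": stats.gumbel_r, "gamma": stats.gamma,
              "log-normal": stats.lognorm, "Weibull": stats.weibull_min}
n = annual_max.size
bic = {}
for name, dist in candidates.items():
    params = dist.fit(annual_max)                       # maximum likelihood fit
    loglik = dist.logpdf(annual_max, *params).sum()
    bic[name] = len(params) * np.log(n) - 2.0 * loglik  # smaller BIC is better
best = min(bic, key=bic.get)
```

In practice one would repeat this per duration (10 min, hourly, daily) and add the Anderson–Darling statistic as a second criterion before committing to a family for the IDF curves.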
31 pages, 1059 KB  
Article
Bayesian and Non-Bayesian for Generalized Kavya–Manoharan Exponential Distribution Based on Progressive-Stress ALT Under Generalized Progressive Hybrid Censoring Scheme
by Ehab M. Almetwally, Osama M. Khaled, Hisham M. Almongy and Haroon M. Barakat
Axioms 2025, 14(6), 410; https://doi.org/10.3390/axioms14060410 - 28 May 2025
Viewed by 400
Abstract
Accelerated life tests are vital in reliability studies, especially as new technologies create highly reliable products to meet market demand and competition. Progressive stress accelerated life testing (PSALT) allows continual stress adjustments. For reliability and survival analysis in accelerated life studies, generalized progressive hybrid censoring (GPHC) is very important. Research on GPHC in PSALT models is lacking despite its growing importance. In this investigation, PSALT is augmented with binomial elimination and generalized progressive hybrid censoring. The PSALT model is examined under the generalized Kavya–Manoharan exponential distribution based on the GPHC scheme. Model parameters are estimated by maximum likelihood and by Bayesian methods with gamma priors. Bayesian estimators are obtained under squared error and entropy loss functions, with informative priors in the simulations and non-informative priors in the application. Various censoring schemes are evaluated using Monte Carlo simulation. The methodology is demonstrated using two real-world accelerated life test data sets. Full article
20 pages, 2421 KB  
Article
Socioeconomic Profile of Agricultural Producers and Production Systems in Municipalities of Piauí, Brazil
by Creusa Carvalho da Costa, Ana Cristina Alves Rodrigues, Caroline Chaves Arantes, Graciliano Galdino Alves dos Santos and Emil José Hernández Ruz
Sustainability 2025, 17(9), 4137; https://doi.org/10.3390/su17094137 - 2 May 2025
Viewed by 896
Abstract
Floodplain agriculture is a practice that involves cultivating arable soils along riverbanks and reservoirs, which become submerged during the rainy season. This study aimed to analyze the socioeconomic aspects of floodplain farmers in the municipalities of Amarante, Floriano, and Uruçuí along the banks of the Parnaíba River in northeastern Brazil. We conducted semi-structured interviews using the rapport technique. Data were analyzed using generalized linear models with four distributions (gamma, inverse Gaussian, exponential, and Gaussian), with the aim of identifying patterns and relationships between socioeconomic variables and production system profiles. The average age of respondents was 49 years across the three communities, with a predominance of male farmers. Regarding the length of residence, communities in Uruçuí had lived in the area the longest. In terms of monthly income, 80% of farmers earned up to one minimum wage. Land size analysis indicated that properties in Amarante had the highest average land area in hectares. We conclude that agriculture in the studied region is dominated by manual planting, low adoption of technology, and scarce use of soil conservation techniques, pointing to a need for more sustainable agricultural practices, management plans, and rural extension services. Full article
33 pages, 15492 KB  
Article
Seasonal Bias Correction of Daily Precipitation over France Using a Stitch Model Designed for Robust Representation of Extremes
by Philippe Ear, Elena Di Bernardino, Thomas Laloë, Adrien Lambert and Magali Troin
Atmosphere 2025, 16(4), 480; https://doi.org/10.3390/atmos16040480 - 19 Apr 2025
Viewed by 1047
Abstract
Highly resolved and accurate daily precipitation data are required for impact models to perform adequately and correctly measure the impacts of high-risk events. In order to produce such data, bias correction is often needed. Most of those statistical methods correct the probability distributions of daily precipitation by modeling them with either empirical or parametric distributions. A recent semi-parametric model based on a penalized Berk–Jones (BJ) statistical test, which allows for automatic and personalized splicing of parametric and non-parametric distributions, has been developed. This method, called the Stitch-BJ model, was found to be able to model daily precipitation correctly and showed interesting potential in a bias correction setting. In the present study, we will consolidate these results by taking into account the seasonal properties of daily precipitation in an out-of-sample context and by considering dry days probabilities in our methodology. We evaluate the performance of the Stitch-BJ method in this seasonal bias correction setting against more classical models such as the Gamma, Exponentiated Weibull (ExpW), Extended Generalized Pareto (EGP) or empirical distributions. Results show that a seasonal separation of data is necessary in order to account for intra-annual non-stationarity. Moreover, the Stitch-BJ distribution was able to consistently perform as well as or better than all the other considered models over the validation set, including the empirical distribution, which is often used due to its robustness. Finally, while methods for correcting dry day probabilities can be easily applied, their relevance can be discussed as temporal and spatial correlations are often neglected. Full article
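The splice idea — a bulk distribution below a threshold stitched to a parametric tail above it — can be illustrated generically. This sketch stitches an empirical bulk to a generalized Pareto tail at a fixed 95th-percentile splice point (unlike the automatic Berk–Jones splice selection of the Stitch-BJ model), on synthetic wet-day rainfall; all of those choices are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
wet = rng.gamma(shape=0.8, scale=6.0, size=2000)       # synthetic wet-day rainfall (mm)

u = np.quantile(wet, 0.95)                             # fixed splice point
zeta = np.mean(wet > u)                                # fraction of mass in the tail
exc = wet[wet > u] - u
c, _, scale = stats.genpareto.fit(exc, floc=0.0)       # GPD fit above the splice point
sorted_wet = np.sort(wet)

def spliced_cdf(x):
    """Empirical CDF below u; GPD tail above: F(x) = 1 - zeta*(1 - H(x - u))."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    bulk = np.searchsorted(sorted_wet, x, side="right") / wet.size
    tail = 1.0 - zeta * stats.genpareto.sf(x - u, c, loc=0.0, scale=scale)
    return np.where(x <= u, bulk, tail)
```

Continuity at the splice point falls out of the construction: the empirical CDF equals 1 − zeta at u, and the tail branch starts at the same value.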
25 pages, 1729 KB  
Article
Exploring the Lindley Distribution in Stochastic Frontier Analysis: Numerical Methods and Applications
by İsmail Yenilmez
Symmetry 2024, 16(12), 1688; https://doi.org/10.3390/sym16121688 - 19 Dec 2024
Cited by 3 | Viewed by 1092
Abstract
This study introduces the Lindley Stochastic Frontier Analysis (LSFA) model, a novel approach that incorporates the Lindley distribution to enhance the flexibility and accuracy of efficiency estimation. The LSFA model is compared against traditional SFA models, including the half-normal, exponential, and gamma models, using advanced numerical methods such as the Gauss–Hermite Quadrature, Monte Carlo Integration, and Simulated Maximum Likelihood Estimation for parameter estimation. Simulation studies revealed that the LSFA model outperforms the alternatives in scenarios involving small sample sizes and complex, skewed distributions, particularly those characterized by gamma distributions. In contrast, traditional models such as the half-normal model perform better in larger samples and simpler settings, while the gamma model is particularly effective under exponential inefficiency distributions. Among the numerical techniques, the Gauss–Hermite Quadrature demonstrates a strong performance for half-normal distributions, the Monte Carlo Integration offers consistent results across models, and the Simulated Maximum Likelihood Estimation shows robustness in handling gamma and Lindley distributions despite higher errors in simpler cases. The application to a banking dataset assessed the performance of 12 commercial banks pre-COVID-19 and during COVID-19, demonstrating LSFA's superior ability to handle skewed and intricate data structures. LSFA achieved the best overall reliability in terms of the root mean square error and bias, while the gamma model emerged as the most accurate for minimizing absolute and percentage errors. These results highlight LSFA's potential for evaluating efficiency during economic shocks, such as the COVID-19 pandemic, where data patterns may deviate from standard assumptions.
This study highlights the advantages of the Lindley distribution in capturing non-standard inefficiency patterns, offering a valuable alternative to simpler distributions like the exponential and half-normal models. However, the LSFA model’s increased computational complexity highlights the need for advanced numerical techniques. Future research may explore the integration of generalized Lindley distributions to enhance model adaptability with enriched numerical optimization to establish its effectiveness across diverse datasets. Full article
(This article belongs to the Special Issue Symmetric or Asymmetric Distributions and Its Applications)
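The computational core of such a model is integrating the inefficiency term out of the composed error numerically. Here is a sketch for a Lindley inefficiency term using Gauss–Laguerre quadrature, which is matched to the half-line integral (the paper itself uses Gauss–Hermite quadrature, Monte Carlo integration, and simulated maximum likelihood); the noise scale and Lindley parameter are arbitrary.

```python
import numpy as np

def lindley_pdf(u, theta):
    # Lindley density: theta^2/(1+theta) * (1+u) * exp(-theta*u) on u >= 0
    return theta ** 2 / (1.0 + theta) * (1.0 + u) * np.exp(-theta * u)

def composed_error_pdf(eps, sigma_v=0.8, theta=1.5, n_nodes=50):
    """Marginal density of eps = v - u with v ~ N(0, sigma_v^2) noise and
    u ~ Lindley(theta) inefficiency, integrated out by Gauss-Laguerre quadrature."""
    nodes, weights = np.polynomial.laguerre.laggauss(n_nodes)
    phi = (np.exp(-0.5 * ((eps + nodes) / sigma_v) ** 2)
           / (sigma_v * np.sqrt(2.0 * np.pi)))
    # int_0^inf phi(eps + u) f_L(u) du  ~=  sum_i w_i * e^{x_i} * phi(eps + x_i) * f_L(x_i)
    return float(np.sum(weights * np.exp(nodes) * phi * lindley_pdf(nodes, theta)))
```

Summing the log of this density over residuals gives the simulated log-likelihood that an optimizer would maximize over the frontier coefficients, sigma_v, and theta.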
27 pages, 699 KB  
Article
Estimating the Lifetime Parameters of the Odd-Generalized-Exponential–Inverse-Weibull Distribution Using Progressive First-Failure Censoring: A Methodology with an Application
by Mahmoud M. Ramadan, Rashad M. EL-Sagheer and Amel Abd-El-Monem
Axioms 2024, 13(12), 822; https://doi.org/10.3390/axioms13120822 - 25 Nov 2024
Cited by 2 | Viewed by 1055
Abstract
This paper investigates statistical methods for estimating unknown lifetime parameters using a progressive first-failure censoring dataset. The failure mode’s lifetime distribution is modeled by the odd-generalized-exponential–inverse-Weibull distribution. Maximum-likelihood estimators for the model parameters, including the survival, hazard, and inverse hazard rate functions, are obtained, though they lack closed-form expressions. The Newton–Raphson method is used to compute these estimations. Confidence intervals for the parameters are approximated via the normal distribution of the maximum-likelihood estimation. The Fisher information matrix is derived using the missing information principle, and the delta method is applied to approximate the confidence intervals for the survival, hazard rate, and inverse hazard rate functions. Bayes estimators were calculated with the squared error, linear exponential, and general entropy loss functions, utilizing independent gamma distributions for informative priors. Markov-chain Monte Carlo sampling provides the highest-posterior-density credible intervals and Bayesian point estimates for the parameters and reliability characteristics. This study evaluates these methods through Monte Carlo simulations, comparing Bayes and maximum-likelihood estimates based on mean squared errors for point estimates, average interval widths, and coverage probabilities for interval estimators. A real dataset is also analyzed to illustrate the proposed methods. Full article
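A progressive first-failure censored sample can be generated with the standard uniform-spacings algorithm for progressive Type-II censoring, applied to the first-failure population CDF 1 − (1 − F)^k. The exponential lifetime model, group size, and removal scheme below are placeholders for illustration, not the paper's odd-generalized-exponential–inverse-Weibull setup.

```python
import numpy as np

def progressive_uniform(R, rng):
    """Progressively Type-II censored uniform sample (uniform-spacings algorithm)
    for removal scheme R = (R_1, ..., R_m)."""
    m = len(R)
    W = rng.uniform(size=m)
    # V_i = W_i^(1/(i + R_m + ... + R_{m-i+1})); U_i = 1 - V_m * ... * V_{m-i+1}
    V = np.array([W[i] ** (1.0 / (i + 1 + sum(R[m - i - 1:]))) for i in range(m)])
    return np.array([1.0 - np.prod(V[m - i - 1:]) for i in range(m)])

# progressive FIRST-failure censoring: n groups of k items; observing only the first
# failure in each surviving group is equivalent to sampling from CDF 1 - (1 - F)^k
lam, k = 0.5, 3                      # exponential rate (stand-in model), group size
R = [2, 0, 1, 0, 2]                  # m = 5 observed failures, n = m + sum(R) = 10 groups
rng = np.random.default_rng(5)
u = progressive_uniform(R, rng)
x = -np.log((1.0 - u) ** (1.0 / k)) / lam   # quantile transform of the first-failure CDF
```

Replacing the last line with the quantile function of any lifetime model yields censored data for simulation studies like the one described in the abstract.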
30 pages, 11511 KB  
Article
Sources and Radiations of the Fermi Bubbles
by Vladimir A. Dogiel and Chung-Ming Ko
Universe 2024, 10(11), 424; https://doi.org/10.3390/universe10110424 - 12 Nov 2024
Viewed by 1483
Abstract
Two enigmatic gamma-ray features in the galactic central region, known as Fermi Bubbles (FBs), were found from Fermi-LAT data. An energy release (e.g., by tidal disruption events in the Galactic Center, GC) generates a cavity with a shock that expands into the local ambient medium of the galactic halo. A decade or so ago, a phenomenological model of the FBs was suggested as a result of routine star disruptions by the supermassive black hole in the GC, which might provide enough energy for large-scale structures like the FBs. In 2020, analytical and numerical models of the FBs as a process of routine tidal disruption of stars near the GC were developed; these disruption events can provide enough cumulative energy to form and maintain large-scale structures like the FBs. The disruption events are expected to occur at a rate of 10⁻⁴–10⁻⁵ yr⁻¹, providing an average power of energy release from the GC into the halo of Ė ≈ 3 × 10⁴¹ erg s⁻¹, which is needed to support the FBs. Analysis of the evolution of superbubbles in exponentially stratified disks concluded that the FB envelope would be destroyed by Rayleigh–Taylor (RT) instabilities at late stages. The shell is composed of swept-up bubble gas and is much thinner than the envelope it bounds. We assume that hydrodynamic turbulence is excited in the FB envelope by the RT instability. In this case, the universal energy spectrum of turbulence may develop in the inertial range of wavenumbers of fluctuations (the Kolmogorov–Obukhov spectrum). From our model we suppose the power of the FBs is transformed partly into the energy of hydrodynamic turbulence in the envelope. If so, hydrodynamic turbulence may generate MHD fluctuations, which accelerate cosmic rays there and generate gamma-ray and radio emission from the FBs. We hope that this model may explain the observed nonthermal emission from the bubbles. Full article
(This article belongs to the Special Issue Studying Astrophysics with High-Energy Cosmic Particles)
16 pages, 818 KB  
Article
Starlikeness and Convexity of Generalized Bessel-Maitland Function
by Muhammad Umar Nawaz, Daniel Breaz, Mohsan Raza and Luminiţa-Ioana Cotîrlă
Axioms 2024, 13(10), 691; https://doi.org/10.3390/axioms13100691 - 4 Oct 2024
Cited by 1 | Viewed by 961
Abstract
The main objective of this research is to establish sufficiency criteria for starlikeness and convexity of order δ, k-uniform starlikeness, k-uniform convexity, lemniscate starlikeness and convexity, exponential starlikeness and convexity, and uniform convexity of the generalized Bessel-Maitland function. Corollaries of these results are also provided, together with illustrative examples. Functional inequalities involving the gamma function are the main tools of this investigation. The outcomes of this study generalize a number of conclusions from earlier studies. Full article
12 pages, 392 KB  
Article
A New Extension of the Exponentiated Weibull–Poisson Family Using the Gamma-Exponentiated Weibull Distribution: Development and Applications
by Kuntalee Chaisee, Manad Khamkong and Pawat Paksaranuwat
Symmetry 2024, 16(7), 780; https://doi.org/10.3390/sym16070780 - 21 Jun 2024
Cited by 1 | Viewed by 1126
Abstract
This study proposes a new five-parameter distribution called the gamma-exponentiated Weibull–Poisson (GEWP) distribution. As an extension of the exponentiated Weibull–Poisson family, the GEWP distribution offers a more flexible tool for analyzing a wider variety of data due to its theoretically and practically advantageous properties. It encompasses established distributions like the exponential, Weibull, and exponentiated Weibull. The GEWP distribution proposed in this paper is obtained by combining the gamma–exponentiated Weibull (GEW) and the exponentiated Weibull–Poisson (EWP) distributions, and therefore serves as an extension of both. This makes the GEWP a viable alternative for describing the variability of occurrences, enabling analysis in situations where GEW and EWP may be limited. This paper analyzes the probability distribution functions and provides the survival and hazard rate functions, the sub-models, the moments, the quantiles, and the maximum likelihood estimation of the GEWP distribution. Then, numerical experiments on parameter estimation of the GEWP distribution for several finite sample sizes are presented. Finally, a comparative study of the GEWP distribution and its sub-models is conducted via goodness-of-fit tests on real datasets to illustrate its potential. Full article
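The sub-model claims are easy to verify numerically: with exponentiation parameter a = 1 the exponentiated Weibull CDF collapses to the Weibull, and with a = k = 1 to the exponential. A quick check, with arbitrary parameter values:

```python
import numpy as np
from scipy import stats

def ew_cdf(x, a, k, lam):
    # exponentiated Weibull CDF: (1 - exp(-(x/lam)^k))^a
    return (1.0 - np.exp(-(x / lam) ** k)) ** a

x = np.linspace(0.1, 5.0, 50)
# a = 1 recovers the Weibull; a = k = 1 recovers the exponential
weibull_ok = np.allclose(ew_cdf(x, 1.0, 2.0, 1.5),
                         stats.weibull_min.cdf(x, 2.0, scale=1.5))
expon_ok = np.allclose(ew_cdf(x, 1.0, 1.0, 1.5), stats.expon.cdf(x, scale=1.5))
```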
20 pages, 4288 KB  
Article
A Physics-Based Tweedie Exponential Dispersion Process Model for Metal Fatigue Crack Propagation and Prognostics
by Lin Yang, Zirong Wang, Zhen Chen and Ershun Pan
Processes 2024, 12(5), 849; https://doi.org/10.3390/pr12050849 - 23 Apr 2024
Cited by 1 | Viewed by 1302
Abstract
Most structural faults in metal parts can be attributed to fatigue crack propagation. The analysis and prognostics of fatigue crack propagation play essential roles in the health management of mechanical systems. Due to the impacts of different uncertainty factors, the crack propagation process exhibits significant randomness, which causes difficulties in fatigue life prediction. To improve prognostic accuracy, a physics-based Tweedie exponential dispersion process (TEDP) model is proposed via integrating Paris Law and the stochastic process. This TEDP model can capture both the crack growth mechanism and uncertainty. Compared with other existing models, the TEDP, which takes the Wiener, gamma, and inverse Gaussian processes as special cases, is more general and flexible in modeling complex degradation paths. The probability density function of the model is derived based on saddlepoint approximation. The unknown parameters are estimated via maximum likelihood. Then, the analytic expressions of the distributions of lifetime and product reliability are presented. Significant findings include that the proposed TEDP model substantially enhances predictive accuracy in lifetime estimations of mechanical systems under varying operational conditions, as demonstrated in a practical case study on fatigue crack data. This model not only provides highly accurate lifetime predictions, but also offers deep insights into the reliability assessments of mechanically stressed components. Full article
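The idea of a stochastic process whose mean increment follows Paris law can be sketched with its gamma-process special case (the Tweedie exponential dispersion family is more general). All constants below — Paris coefficients, stress range, crack sizes, and the increment coefficient of variation — are illustrative, not fitted values.

```python
import numpy as np

rng = np.random.default_rng(9)
C, m_exp = 4e-12, 3.0          # illustrative Paris-law constants: da/dN = C * (dK)^m
dsigma = 80.0                  # stress range (arbitrary consistent units)
a0, a_crit = 1.0, 25.0         # initial and critical crack length
dN = 1000                      # cycles per increment block
cv2 = 0.25                     # squared coefficient of variation of the increments

def crack_life(rng):
    """Cycles to reach a_crit with gamma-distributed growth increments whose mean
    follows Paris law: E[da] = C * (dK)^m_exp * dN, dK = dsigma * sqrt(pi * a)."""
    a, n = a0, 0
    while a < a_crit:
        mu = C * (dsigma * np.sqrt(np.pi * a)) ** m_exp * dN  # mean growth per block
        a += rng.gamma(shape=1.0 / cv2, scale=cv2 * mu)       # gamma increment, mean mu
        n += dN
    return n

lives = np.array([crack_life(rng) for _ in range(300)])
mean_life, p10 = lives.mean(), np.quantile(lives, 0.10)       # empirical life distribution
```

Replacing the gamma increment with a normal or inverse Gaussian one gives the other special cases the abstract mentions; the TEDP model unifies these through a single dispersion parameterization.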
13 pages, 6047 KB  
Article
An Ultra-Throughput Boost Method for Gamma-Ray Spectrometers
by Wenhui Li, Qianqian Zhou, Yuzhong Zhang, Jianming Xie, Wei Zhao, Jinglun Li and Hui Cui
Energies 2024, 17(6), 1456; https://doi.org/10.3390/en17061456 - 18 Mar 2024
Cited by 1 | Viewed by 1440
Abstract
(1) Background: In nuclear medicine and nuclear power plants, energy spectrum measurement and radionuclide identification are required in strong radiation fields to ensure nuclear safety and security, preventing damage to nuclear facilities caused by natural disasters or the criminal smuggling of nuclear materials. High count rates can lead to signal accumulation, negatively affecting the performance of gamma spectrometers, and in severe cases, even damaging the detectors. Higher pulse throughput with better energy resolution is the ultimate goal of a gamma-ray spectrometer. Traditionally, pileup pulses, which cause dead time and affect throughput, are rejected to maintain good energy resolution. (2) Method: In this paper, an ultra-throughput boost (UTB) off-line processing method was used to improve the throughput and reduce the pileup effect of the spectrometer. Firstly, by fitting the impulse signal of the detector, the response matrix was built by the functional model of a dual exponential tail convolved with the Gaussian kernel; then, a quadratic programming method based on a non-negative least squares (NNLS) algorithm was adopted to solve the constrained optimization problem for the inversion. (3) Results: Both the simulated and experimental results of the UTB method show that most of the impulses in the pulse sequence from the scintillator detector were restored to δ-like pulses, and the throughput of the UTB method for the NaI(Tl) spectrometer reached 207 kcps with a resolution of 7.71% @661.7 keV. A reduction was also seen in the high energy pileup phenomenon. (4) Conclusions: We conclude that the UTB method can restore individual and piled-up pulses to δ-like sequences, effectively boosting pulse throughput and suppressing high-energy tailing and sum peaks caused by the pileup effect at the cost of a slight loss in energy resolution. Full article
(This article belongs to the Special Issue Advancements in Nuclear Energy Technology)
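The inversion step — build a response matrix from a dual-exponential tail convolved with a Gaussian kernel and solve a non-negativity-constrained least-squares problem — can be sketched with scipy's plain NNLS solver (rather than the paper's quadratic-programming formulation). The pulse shape constants, arrival times, and noise level below are assumptions.

```python
import numpy as np
from scipy.optimize import nnls

t = np.arange(200)                              # sample index axis

def template(t0, tau1=4.0, tau2=20.0, sig=2.0):
    # dual exponential tail starting at t0, convolved with a Gaussian kernel
    s = np.where(t >= t0, np.exp(-(t - t0) / tau2) - np.exp(-(t - t0) / tau1), 0.0)
    g = np.exp(-0.5 * (np.arange(-8, 9) / sig) ** 2)
    return np.convolve(s, g / g.sum(), mode="same")

# response matrix: one shifted pulse template per candidate arrival time
H = np.column_stack([template(t0) for t0 in t])

rng = np.random.default_rng(6)
truth = np.zeros(t.size)
truth[[40, 52, 120]] = [1.0, 0.8, 1.3]          # three pulses; the first two pile up
y = H @ truth + 0.01 * rng.standard_normal(t.size)

x, _ = nnls(H, y)                               # non-negative least-squares inversion
```

The recovered vector `x` concentrates near the true arrival times, i.e. the piled-up waveform is restored to a δ-like sequence, which is exactly the behavior the UTB method exploits to boost throughput.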