Search Results (541)

Search Parameters:
Keywords = probabilistic risk assessment

22 pages, 539 KB  
Article
Assessing the Risk of Damage to Underground Utilities Caused by Spatial Data Quality with Fuzzy Logic
by Marek Ślusarski and Anna Przewięźlikowska
Appl. Sci. 2025, 15(22), 11980; https://doi.org/10.3390/app152211980 - 11 Nov 2025
Abstract
One of the sources of risk inherent to construction projects is the quality of spatial data. Damage to buried pipes and cables often causes accidents, delays, or stoppages of construction works. Fuzzy logic is a method for studying such risk: it is employed to describe complex or poorly defined phenomena that can hardly be characterised with probabilistic methods. The article proposes a method for assessing the risk of damaging underground utilities based on a fuzzy inference engine. The authors first defined linguistic variables and assigned them values based on risk factors. The membership functions for the linguistic variables were modelled using expert judgement. Then, they determined qualitative fuzzy sets with the rule base. Finally, the fuzzy values were converted into crisp values; the defuzzification technique employed was the centre of gravity. The proposed method can assess the risk of damage to underground utilities for spatial data of diverse quality classes and will be employed to generate large-scale risk maps. The proposed fuzzy logic solution is an effective and appropriate tool for assessing the risk of damage to underground utilities arising from the quality of subsurface data. It should not be regarded as a universal substitute for PRA (Probabilistic Risk Assessment) but as a complementary methodology that is particularly well-suited to risk assessment in data-poor environments characterised by epistemic uncertainty and reliance on qualitative expert judgement. Full article
(This article belongs to the Section Civil Engineering)
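The abstract above outlines a Mamdani-style fuzzy inference chain (linguistic variables, expert-shaped membership functions, a rule base, and centre-of-gravity defuzzification). The sketch below is a minimal plain-NumPy illustration of that chain, not the authors' implementation: the input variables (positional error, utility density), membership functions, and rules are invented placeholders.

```python
# Minimal Mamdani-style fuzzy risk sketch with centre-of-gravity defuzzification.
# Inputs, membership functions, and rules are illustrative placeholders only.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

risk = np.linspace(0, 10, 1001)                      # risk-score universe
low, med, high = tri(risk, -2, 0, 5), tri(risk, 2, 5, 8), tri(risk, 5, 10, 12)

# Crisp inputs (hypothetical): positional error of utility data [m], utility density [0-1]
err, density = 0.6, 0.7
err_small, err_large = tri(err, -0.5, 0.0, 0.5), tri(err, 0.3, 1.0, 1.7)
dens_low, dens_high = tri(density, -0.5, 0.0, 0.5), tri(density, 0.4, 1.0, 1.6)

# Rule base (illustrative): IF error large AND density high THEN risk high, etc.
act_high = min(err_large, dens_high)
act_med = max(min(err_large, dens_low), min(err_small, dens_high))
act_low = min(err_small, dens_low)

# Aggregate the clipped output sets and defuzzify by the centre of gravity
agg = np.maximum.reduce([np.minimum(low, act_low),
                         np.minimum(med, act_med),
                         np.minimum(high, act_high)])
crisp_risk = np.trapz(agg * risk, risk) / np.trapz(agg, risk)
print(f"crisp risk score: {crisp_risk:.2f} / 10")
```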
17 pages, 829 KB  
Article
vulneraR: An R Package for Uncertainty Analysis in Coastal Vulnerability Studies
by Federico Mattia Stefanini, Sid Ambrosini and Felice D’Alessandro
Mathematics 2025, 13(22), 3603; https://doi.org/10.3390/math13223603 - 10 Nov 2025
Abstract
Coastal vulnerability describes the susceptibility of a system to adverse effects from natural hazards. It is typically evaluated using spatial data on geographical attributes and is often synthesized using tools such as a Coastal Vulnerability Index (CVI). However, the literature highlights that there is no universal method for assessing vulnerability, emphasizing the importance of site-specific adaptations. A key challenge in coastal risk management is dealing with the inherent uncertainty of environmental variables and their future dynamics. Incorporating this uncertainty is essential for producing reliable assessments and informed decision-making. In this paper, we present an R package that facilitates the implementation of probabilistic graphical models explicitly incorporating epistemic uncertainty. This approach allows for vulnerability assessments even in situations where data availability is limited. The proposed methodology aims to deliver a more flexible and transparent framework for vulnerability analysis under uncertainty, providing valuable support to local policymakers, in particular during the early phases of intervention planning and technology selection for coastal mitigation strategies. Full article
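vulneraR itself is an R package; the snippet below is only a rough Python analogue of the underlying idea, propagating epistemic uncertainty through a weighted vulnerability index by Monte Carlo. The variables, Beta priors, weights, and the 0.6 threshold are hypothetical and are not part of the package.

```python
# Rough Python analogue of the idea behind vulneraR (the package itself is in R):
# propagate epistemic uncertainty in expert-scored coastal variables through a
# weighted vulnerability index by Monte Carlo. Everything below is hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Expert-elicited Beta distributions over normalised (0-1) vulnerability scores
erosion = rng.beta(4, 2, n)     # shoreline erosion rate, skewed high
elevation = rng.beta(2, 5, n)   # low-lying share of the coast, skewed low
wave = rng.beta(3, 3, n)        # wave exposure, roughly symmetric

weights = np.array([0.5, 0.3, 0.2])                  # hypothetical relative importances
cvi = weights @ np.vstack([erosion, elevation, wave])

print(f"CVI mean = {cvi.mean():.2f}, 90% interval = "
      f"[{np.quantile(cvi, 0.05):.2f}, {np.quantile(cvi, 0.95):.2f}]")
print(f"P(CVI > 0.6) = {(cvi > 0.6).mean():.3f}")
```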
21 pages, 7074 KB  
Review
Bayesian Network Modeling for Risk-Based Water Quality Decisions with Sparse Data: Case Study of the Kiso River
by Ola Mohamed and Nagahisa Hirayama
Processes 2025, 13(11), 3636; https://doi.org/10.3390/pr13113636 - 10 Nov 2025
Abstract
The study aims to explore the causal relationships among climate, hydrological, and water quality variables in the Kiso River Basin, Japan, using a discrete Bayesian Network (BN) model. The BN was developed to represent probabilistic dependencies between climate factors (rainfall, air temperature), hydrological conditions (river flow levels), and water quality indicators (pH, dissolved oxygen [DO], electrical conductivity, ammonia, turbidity, organic pollution, and water temperature). The model used hourly monitoring data collected between 2016 and 2023, and the continuous variables were discretized based on national environmental thresholds to evaluate exceedance probabilities under different hydro-climatic scenarios. Results showed that air temperature strongly influenced water temperature, with a stabilizing effect under constant flow conditions. Rainfall and river flow were key drivers of turbidity; heavy rainfall and high flow increased the probability of exceeding turbidity thresholds by nearly 80%. Elevated ammonia levels during heavy rainfall and low temperatures reflected runoff and limited nitrification processes. Electrical conductivity decreased during high flows due to dilution, while dissolved oxygen was affected by low flows, turbidity, and temperature. As static BNs cannot model temporal dynamics, supplementary cross-correlation analyses were conducted to assess short-term responses among variables, revealing that most water quality parameters respond within ±24 h to changes in hydrological conditions. This study demonstrates that discrete BNs can effectively translate long-term monitoring data into practical, decision-relevant risk assessments to support adaptive water quality management in dynamic river systems. Full article
(This article belongs to the Section Environmental and Green Processes)
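As a toy illustration of how a discrete Bayesian network turns a hydro-climatic scenario into an exceedance probability, the snippet below marginalises invented conditional probability tables; the values are not those estimated for the Kiso River model.

```python
# Toy discrete Bayesian-network style calculation: P(turbidity exceeds threshold)
# under a rainfall scenario. All conditional probabilities are invented,
# not those estimated for the Kiso River model.
import numpy as np

# P(flow level | rainfall): rows = rainfall (none, moderate, heavy), cols = flow (low, normal, high)
p_flow_given_rain = np.array([[0.60, 0.35, 0.05],
                              [0.20, 0.60, 0.20],
                              [0.05, 0.35, 0.60]])

# P(turbidity exceedance | rainfall, flow)
p_exceed = np.array([[0.02, 0.05, 0.20],    # no rain
                     [0.05, 0.15, 0.45],    # moderate rain
                     [0.10, 0.40, 0.80]])   # heavy rain

def exceedance_probability(rain_idx: int) -> float:
    """Marginalise over flow to get P(exceedance | rainfall scenario)."""
    return float(p_flow_given_rain[rain_idx] @ p_exceed[rain_idx])

for name, idx in [("no rain", 0), ("moderate rain", 1), ("heavy rain", 2)]:
    print(f"P(turbidity exceedance | {name}) = {exceedance_probability(idx):.2f}")
```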
17 pages, 3076 KB  
Article
Operational Flexibility Assessment of a Power System Considering Uncertainty of Flexible Resources Supported by Wind Turbines Under Load Shedding Operation
by Guifen Jiang, Jiayin Xu, Yuming Shen, Peiru Feng, Hao Yang, Xu Gui, Yipeng Cao, Mingcheng Chen, Ming Wei and Yinghao Ma
Processes 2025, 13(11), 3635; https://doi.org/10.3390/pr13113635 - 10 Nov 2025
Abstract
The high proportion of renewable energy introduces significant operation risks to the system’s flexibility balance due to its volatility and randomness. Traditional regulation methods struggle to meet the urgent demand for flexible resources. Utilizing wind turbines (WTs) under load shedding operation can provide additional reserve capacity, thereby reducing the risk of insufficient system flexibility. However, since wind speed and turbine output exhibit a cubic relationship, minor fluctuations in wind speed can lead to significant variations in output and reserve capacity. This increases the uncertainty in the supply of flexible resources from WTs, posing challenges to power system flexibility assessment. This paper investigates a method for assessing power system flexibility considering the uncertainty of flexible resources supported by WTs under load shedding operation. Firstly, according to the flexibility supply control model of WTs under load shedding operation, the analytical relationship between output, flexible resources, and wind speed under a specific wind energy conversion coefficient is constructed; secondly, combined with a probabilistic model of wind speed based on nonparametric kernel density estimation, the wind turbine flexible resource uncertainty model is constructed; thirdly, Monte Carlo simulation is used to obtain the sampled wind speed data, and the operational flexibility assessment method of the power system considering the flexibility uncertainty of WTs under load shedding operation is proposed. Finally, through case studies, the validity of the proposed model and method was verified. The analysis concludes that load shedding operation of WTs can enhance the system’s flexible resources to a certain extent but cannot provide stable bi-directional regulation capabilities. Full article
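A compact sketch of the sampling step described above, assuming SciPy's gaussian_kde as the nonparametric density estimator: fit a KDE to wind-speed data, draw Monte Carlo samples, and map them through a simplified cubic power curve with a load-shedding margin. The turbine parameters and the 10% shedding fraction are placeholders, not the paper's control model.

```python
# Sketch of the uncertainty-propagation step: nonparametric KDE of wind speed,
# Monte Carlo sampling, and a simplified cubic power curve with a load-shedding
# margin. Turbine parameters and the 10% shedding fraction are placeholders.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
measured_speeds = rng.weibull(2.0, 500) * 8.0        # stand-in for historical wind data [m/s]

kde = gaussian_kde(measured_speeds)                  # nonparametric density estimate
samples = np.clip(kde.resample(10_000).ravel(), 0.0, None)

# Simplified power curve: cubic between cut-in and rated speed, flat up to cut-out
cut_in, rated_speed, cut_out, rated_power = 3.0, 12.0, 25.0, 2.0   # m/s, m/s, m/s, MW
power = np.where(samples < cut_in, 0.0,
        np.where(samples < rated_speed, rated_power * (samples / rated_speed) ** 3,
        np.where(samples < cut_out, rated_power, 0.0)))

shed_fraction = 0.10                                 # operate at 90% of available power
reserve = shed_fraction * power                      # upward flexibility held in reserve

print(f"mean available power: {power.mean():.2f} MW")
print(f"reserve, 5th-95th percentile: {np.percentile(reserve, 5):.2f}-{np.percentile(reserve, 95):.2f} MW")
```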
24 pages, 2853 KB  
Article
Uncertainty-Driven Reliability Analysis Using Importance Measures and Risk Priority Numbers
by Maria Valentina Clavijo, Fernando Guevara Carazas, Juan David Arango Castrillón and Carmen Elena Patino-Rodriguez
Appl. Sci. 2025, 15(22), 11867; https://doi.org/10.3390/app152211867 - 7 Nov 2025
Abstract
Uncertainty is a key factor in the reliability assessment of complex engineering systems, especially when they operate under variable conditions that affect component degradation. This study presents a framework for the systematic and uncertainty-based prioritization of critical components and failure modes. The method combines Reliability Block Diagrams, Fault Tree Analysis, and Importance Measures with Failure Mode and Effects Analysis. This two-level approach links component failures with their effect on system reliability. Uncertainty is introduced through the statistical parameters of component reliability distributions and the resulting impact on system behavior is examined. Components with the highest importance are then examined through Failure Mode and Effects Analysis to identify main failure modes and calculate their Risk Priority Numbers. The framework is applied to a fleet of Solid Waste Collection and Compaction Trucks used by a waste management company in a Colombian city. This system operates under high-load variability, mechanical shocks, and environmental stress. The combined Importance Measures and Risk Priority Number analysis provides a probabilistic basis for identifying critical components and their dominant failure modes, linking reliability uncertainty with maintenance prioritization. The results show that combining Importance Measures and Risk Priority Number improves the identification of critical components and dominant failure modes, supporting maintenance prioritization based on reliability impact. The framework offers a practical approach for reliability assessment and maintenance planning under uncertainty, linking component-level uncertainty with system performance to guide decision-making in complex systems. Full article
(This article belongs to the Special Issue Uncertainty and Reliability Analysis for Engineering Systems)
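The two quantitative ingredients named in the abstract, an Importance Measure and the Risk Priority Number, can be illustrated in a few lines. The sketch below uses the Birnbaum measure on an invented series-parallel block diagram and RPN = severity x occurrence x detection with made-up FMEA scores, not the truck-fleet data.

```python
# Two building blocks from the abstract in miniature: Birnbaum importance for a
# small series-parallel reliability block diagram, and RPN = S x O x D for
# failure modes. System layout, reliabilities, and FMEA scores are invented.
import numpy as np

def system_reliability(r):
    """Component 1 in series with a parallel pair (2, 3): R = r1 * (1 - (1-r2)(1-r3))."""
    r1, r2, r3 = r
    return r1 * (1.0 - (1.0 - r2) * (1.0 - r3))

r = np.array([0.95, 0.90, 0.85])          # hypothetical component reliabilities

# Birnbaum importance: dR_sys/dr_i, evaluated by setting component i to 1 and to 0
birnbaum = []
for i in range(len(r)):
    up, down = r.copy(), r.copy()
    up[i], down[i] = 1.0, 0.0
    birnbaum.append(system_reliability(up) - system_reliability(down))
print("Birnbaum importance:", np.round(birnbaum, 3))

# Risk Priority Numbers for failure modes of the most important component (scores 1-10)
failure_modes = {"hydraulic seal leak": (7, 5, 4),    # (severity, occurrence, detection)
                 "compactor ram wear":  (6, 6, 3),
                 "sensor drift":        (3, 4, 8)}
rpn = {mode: s * o * d for mode, (s, o, d) in failure_modes.items()}
for mode, value in sorted(rpn.items(), key=lambda kv: -kv[1]):
    print(f"RPN {value:4d}  {mode}")
```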
30 pages, 658 KB  
Article
Quantitative Metrics for Balancing Privacy and Utility in Pseudonymized Big Data
by Soonseok Kim
Electronics 2025, 14(21), 4350; https://doi.org/10.3390/electronics14214350 - 6 Nov 2025
Abstract
The increasing demand for data utilization has renewed attention to the trade-off between privacy protection and data utility, particularly concerning pseudonymized datasets. Traditional methods for evaluating re-identification risk and utility often rely on fragmented and incompatible metrics, complicating the assessment of the overall effectiveness of pseudonymization strategies. This study proposes a novel quantitative metric—Relative Utility–Threat (RUT)—which enables the integrated evaluation of safety (privacy) and utility in pseudonymized data. Our method transforms various risk and utility metrics into a unified probabilistic scale (0–1), facilitating standardized and interpretable comparisons. Through scenario-based analyses using synthetic datasets that reflect different data distributions (balanced, skewed, and sparse), we demonstrate how variations in pseudonymization intensity influence both privacy and utility. The results indicate that certain data characteristics significantly affect the balance between protection and usability. This approach relies on simple, lightweight computations—scanning the data once, grouping similar records, and comparing their distributions. Because these operations naturally parallelize in distributed environments such as Spark, the proposed framework can efficiently scale to large pseudonymized datasets. Full article
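The paper's RUT metric is not reproduced here; the sketch below only illustrates the general idea of mapping a re-identification risk measure and a utility measure onto a common 0-1 scale. The specific stand-ins (mean 1/group-size risk, one-minus-total-variation utility) and the toy records are illustrative assumptions.

```python
# Illustrative stand-in for putting privacy risk and utility on a common 0-1 scale
# (the paper's actual RUT metric is not reproduced here). Risk: mean re-identification
# probability 1/|quasi-identifier group|; utility: 1 minus the total variation distance
# between the original and pseudonymized distributions of a column. Toy records only.
from collections import Counter

original      = [("30s", "F", "A"), ("30s", "F", "B"), ("40s", "M", "A"),
                 ("40s", "M", "A"), ("20s", "F", "C"), ("20s", "M", "C")]
pseudonymized = [("30s", "*", "A"), ("30s", "*", "B"), ("40s", "*", "A"),
                 ("40s", "*", "A"), ("20s", "*", "C"), ("20s", "*", "C")]

def reidentification_risk(records):
    """Mean probability of singling a record out of its equivalence class (0-1)."""
    groups = Counter(records)
    return sum(1.0 / groups[rec] for rec in records) / len(records)

def utility(orig, pseud, column):
    """1 - total variation distance between the two distributions of one column (0-1)."""
    p, q = Counter(r[column] for r in orig), Counter(r[column] for r in pseud)
    n = len(orig)
    tvd = 0.5 * sum(abs(p[k] - q[k]) / n for k in set(p) | set(q))
    return 1.0 - tvd

print(f"risk: {reidentification_risk(original):.2f} -> {reidentification_risk(pseudonymized):.2f} (lower is safer)")
print(f"utility of suppressed column: {utility(original, pseudonymized, 1):.2f}")
print(f"utility of untouched column : {utility(original, pseudonymized, 2):.2f}")
```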
22 pages, 1128 KB  
Article
Beverage Consumption Patterns in Spanish and Italian Adults: A Comparative Study
by Valentina Micheluzzi, Alessio Lo Cascio, Michela Capoferri, Michela Piredda and Elena Sandri
Beverages 2025, 11(6), 158; https://doi.org/10.3390/beverages11060158 - 6 Nov 2025
Abstract
Background: Beverage intake is a consequential yet underappreciated driver of health in Mediterranean settings. Comparative evidence for Spain and Italy based on harmonised measures is scarce. This study addresses that gap by profiling beverage portfolios and their sociodemographic correlates in parallel adult samples from both countries. Methods: We conducted a cross-sectional analysis of adults in Spain (n = 483) and Italy (n = 403) using aligned, validated instruments (NutSo-HH; NutSo-HH-Ita). Outcomes were water (Wtr), sugar-sweetened soft drinks (Sfd), juice (Juc), energy drinks (End), coffee (Cff), alcohol (Alc), and episodes of intoxication (Gtd). Associations were assessed via non-parametric tests, multivariable linear models, and an EBIC-selected Gaussian graphical model (GGM). Main results: Italians reported higher Alc and Gtd; Spaniards reported higher Sfd and Juc. Wtr was comparable across countries, and Cff differences were marginal. Age and sex emerged as the most consistent correlates (older age and male sex with higher Alc; younger age with higher Sfd), whereas education and income were not stable determinants. The GGM suggested behavioural clustering of Sfd–Juc–End, with weak partial correlations for other beverages after adjustment. Implications: Distinct country profiles imply differentiated priorities. In Spain, interventions could prioritise reducing sugar-sweetened beverage intake among younger adults through age-targeted primary care counselling, mandatory water (and unsweetened milk) availability in schools, tiered excise taxes on sugar-sweetened drinks, and restrictions on child- and youth-directed marketing of high-sugar beverages. In Italy, primary care and community health services could routinely screen adults for risky alcohol use and deliver brief, culturally attuned advice that promotes lower-risk patterns of wine consumption during meals. Given the cross-sectional design, self-report measures, and non-probabilistic sampling, findings should be interpreted as context-sensitive markers rather than causal determinants; nevertheless, they highlight concrete prevention approaches and regulatory levers for each country’s beverage-related health risks. Full article
29 pages, 3863 KB  
Article
Stochastic Finite Element-Based Reliability Analysis of Construction Disturbance Induced by Boom-Type Roadheaders in Karst Tunnels
by Wenyun Ding, Yude Shen, Wenqi Ding, Yongfa Guo, Yafei Qiao and Jixiang Tang
Appl. Sci. 2025, 15(21), 11789; https://doi.org/10.3390/app152111789 - 5 Nov 2025
Abstract
Tunnel construction in karst formations faces significant geological uncertainties, which pose challenges for quantifying construction risks using traditional deterministic methods. This paper proposes a probabilistic reliability analysis framework that integrates the Stochastic Finite Element Method (SFEM), a Radial Basis Function Neural Network (RBFNN) surrogate model, and Monte Carlo Simulation (MCS) method. The probability distributions of rock mass mechanical parameters and karst geometric parameters were established based on field investigation and geophysical prospecting data. The accuracy of the finite element model was verified through existing physical model tests, with the lateral karst condition identified as the most unfavorable scenario. Limit state functions with control indices, including tunnel crown settlement, invert uplift, ground surface settlement and convergence, were defined. A high-precision surrogate model was constructed using RBFNN (average R2 > 0.98), and the failure probabilities of displacement indices were quantitatively evaluated via MCS (10,000 samples). Results demonstrate that the overall failure probability of tunnel construction is 3.31%, with the highest failure probability observed for crown settlement (3.26%). Sensitivity analysis indicates that the elastic modulus of the disturbed rock mass and the clear distance between the karst cavity and the tunnel are the key parameters influencing deformation. This study provides a probabilistic risk assessment tool and a quantitative decision-making basis for tunnel construction in karst areas. Full article
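A condensed sketch of the surrogate-plus-Monte-Carlo step, with SciPy's RBFInterpolator standing in for the trained RBFNN and an analytic toy function standing in for the finite element model; the limit state (allowable crown settlement) and input distributions are placeholders, not the calibrated karst-tunnel model.

```python
# Condensed surrogate + Monte Carlo step: scipy's RBFInterpolator stands in for the
# trained RBFNN surrogate, and an analytic toy function stands in for the FE model.
# The limit state and input distributions are placeholders, not the karst-tunnel model.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(42)

def fem_crown_settlement(x):
    """Stand-in for the expensive FE run: settlement [mm] from (E [GPa], clear distance [m])."""
    E, dist = x[:, 0], x[:, 1]
    return 30.0 / E + 8.0 / (dist + 0.5)

# Small design of experiments to train the surrogate
X_train = np.column_stack([rng.uniform(0.5, 8.0, 80),       # elastic modulus of disturbed rock
                           rng.uniform(0.3, 6.0, 80)])      # cavity-tunnel clear distance
surrogate = RBFInterpolator(X_train, fem_crown_settlement(X_train))

# Monte Carlo simulation on the cheap surrogate
n = 100_000
X_mc = np.column_stack([rng.lognormal(np.log(3.0), 0.25, n),
                        np.clip(rng.normal(3.0, 0.8, n), 0.3, None)])
settlement = surrogate(X_mc)

allowable = 15.0                                             # allowable crown settlement [mm]
pf = float((settlement > allowable).mean())
print(f"estimated probability of exceeding the settlement limit: {pf:.4f}")
```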
16 pages, 1901 KB  
Article
Risk Assessment Framework for Structural Failures of Polar Ship Under Ice Loads
by Kai Sun, Xiaodong Chen, Shunying Ji and Haitian Yang
J. Mar. Sci. Eng. 2025, 13(11), 2099; https://doi.org/10.3390/jmse13112099 - 4 Nov 2025
Abstract
For polar ships, navigation in ice-covered regions poses a high risk to structural safety. To study the structural risk induced by ice loads, a risk assessment framework is proposed based on a probabilistic analysis. The fatigue failure probability is derived with the first-order second-moment (FOSM) method. Typical ice load cases are extracted as a joint probability distribution of ice thickness and ship speed, based on shipboard measurements. Equivalent fatigue stresses for each case are calculated using a coupled discrete element method (DEM) and finite element method (FEM), and fatigue failure probabilities are obtained via linear cumulative damage theory. The ultimate strength failure probability is derived from reliability theory. The probability distribution of load-carrying capacity for the bow structure, determined by the moment estimation method, is used as the structural resistance, while the ice load distribution identified from shipboard monitoring is treated as the external load. Considering both the likelihood and consequence of failure, a risk matrix is constructed to assess structural failure risk. Inspection and maintenance intervals are then proposed according to the assessed risk levels. This approach offers a quantitative basis for structural risk management, supporting safe navigation and efficient maintenance planning for polar ships. Full article
(This article belongs to the Section Ocean Engineering)
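The FOSM step referenced above reduces, for normally distributed resistance R and load effect S, to beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2) and Pf = Phi(-beta); the numbers and risk-matrix thresholds below are illustrative, not the bow-structure values from the paper.

```python
# Core of the FOSM step: with normal resistance R and load effect S,
# beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2) and Pf = Phi(-beta).
# Numbers and thresholds are illustrative, not the bow-structure values from the paper.
from math import sqrt
from scipy.stats import norm

mu_R, sigma_R = 12.0, 1.5     # load-carrying capacity of the bow panel (illustrative units)
mu_S, sigma_S = 7.0, 2.0      # ice-induced load effect from monitoring (same units)

beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)
pf = norm.cdf(-beta)
print(f"reliability index beta = {beta:.2f}, failure probability Pf = {pf:.2e}")

# Simple risk-matrix style classification of the likelihood (thresholds invented)
level = "high" if pf > 1e-2 else "medium" if pf > 1e-4 else "low"
print(f"likelihood category: {level}")
```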
14 pages, 2799 KB  
Article
Application of Dynamic PRA to Nuclear Power Plant Operation Support—Evaluation of Plant Operation Support Using a Simple Plant Model
by Nami Yamamoto, Mami Kagimoto, Yohei Ueno, Takafumi Narukawa and Takashi Takata
J. Nucl. Eng. 2025, 6(4), 46; https://doi.org/10.3390/jne6040046 - 4 Nov 2025
Abstract
Following the Great East Japan Earthquake in 2011, there has been an increased focus on risk assessment and the practical application of its findings to safety enhancement. In particular, dynamic probabilistic risk assessment (PRA) used in conjunction with plant dynamics analysis is being considered for accident management (AM) and operational support. Determining countermeasure priorities in AM can be challenging due to the diversity of accident scenarios. In multi-unit operations, the complexity of scenarios increases in cases of simultaneous disasters, which makes establishing response operations priorities more difficult. Dynamic PRA methods can efficiently generate and assess complex scenarios by incorporating changes in plant state. This paper introduces the continuous Markov chain Monte Carlo (CMMC) method, a dynamic PRA approach, as a tool for prioritizing countermeasures to support nuclear power plant operations. The proposed method involves three steps: (1) generating exhaustive scenarios that include events, operator actions, and system responses; (2) classifying scenarios according to countermeasure patterns; and (3) assigning priority based on risk data for each pattern. An evaluation was conducted using a simple plant model to analyze event countermeasure patterns for addressing steam generator tube rupture during single-unit operation. The generated scenario patterns included depressurization by opening a pressurizer relief valve (DP), depressurization via heat removal through the steam generator (DSG), and both operations combined (DP + DSG). The timing of the response operations varied randomly, resulting in multiple scenarios. The assessment, based on reactor pressure vessel water level and the potential for core damage, showed that the time margin to core damage depended on the countermeasure pattern. The findings indicate that the effectiveness of each countermeasure can be evaluated and that it is feasible to identify which countermeasure should be prioritized. Full article
(This article belongs to the Special Issue Probabilistic Safety Assessment and Management of Nuclear Facilities)
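A toy rendition of the three steps listed in the abstract (generate scenarios with randomised countermeasure timing, classify by pattern, compare risk per pattern). The one-line water-level model, rates, and thresholds are invented and bear no relation to the paper's CMMC plant simulator.

```python
# Toy rendition of the three steps in the abstract: sample scenarios with random
# countermeasure timing, classify by pattern (DP, DSG, DP+DSG), and compare the
# chance of reaching a damage criterion. The one-line water-level model and all
# numbers are invented; they are not the paper's plant simulator.
import numpy as np

rng = np.random.default_rng(7)
patterns = ["DP", "DSG", "DP+DSG"]
recovery_rate = {"DP": 0.6, "DSG": 0.8, "DP+DSG": 1.2}     # [%/min], invented effectiveness
leak_rate, damage_level, horizon = 1.0, -40.0, 120          # %/min loss, % threshold, minutes

def time_margin(pattern: str, action_time: float) -> float:
    """Minutes until the water level crosses the damage threshold (inf if never)."""
    level, t = 0.0, 0.0
    while t < horizon:
        rate = -leak_rate + (recovery_rate[pattern] if t >= action_time else 0.0)
        level += rate
        t += 1.0
        if level <= damage_level:
            return t
    return np.inf

margins = {p: [] for p in patterns}
for _ in range(2000):                                       # exhaustive-style random scenarios
    p = rng.choice(patterns)
    margins[p].append(time_margin(p, action_time=rng.uniform(5.0, 60.0)))

for p in patterns:                                          # prioritise by damage frequency
    m = np.array(margins[p])
    print(f"{p:7s}  P(core damage within {horizon} min) = {np.isfinite(m).mean():.2f}")
```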
33 pages, 6956 KB  
Article
Probabilistic Analysis of Creep and Shrinkage Effects on Prestressed Concrete Bridges Using Solid Element Models
by Jun Lu, Hongwei Zhang, Zhibin Jin and Xuezhi Deng
Buildings 2025, 15(21), 3973; https://doi.org/10.3390/buildings15213973 - 3 Nov 2025
Abstract
Concrete creep and shrinkage are critical factors affecting the long-term performance of extradosed bridges, leading to deflection, stress redistribution, and potential cracking. Predicting these effects is challenging due to uncertainties in empirical models and a lack of long-term data. While beam element models are common in design, they often fail to capture complex stress fields in disturbed regions (D-regions), potentially leading to non-conservative assessments of crack resistance. This study presents a computationally efficient probabilistic framework that integrates the First-Order Second-Moment (FOSM) method with a high-fidelity solid element model to analyze these time-dependent effects. Our analysis reveals that solid element models predict 14% higher long-term deflections and 64% greater sensitivity to creep and shrinkage parameters compared to beam models, which underestimate both the mean and variability of deformation. The FOSM-based framework proves highly efficient, with its prediction for the standard deviations of bridge deflection falling within 7.1% of those from the more computationally intensive Probability Density Evolution Method. Furthermore, we found that time-varying parameters have a minimal effect on principal stress directions, validating a scalar application of FOSM with less than 3% error. The analysis shows that uncertainties from creep and shrinkage models increase the 95% quantile of in-plane principal stresses by 0.58 MPa, which is approximately 23% of the material’s tensile strength and increases the cracking risk. This research underscores the necessity of using high-fidelity models and probabilistic methods for the reliable design and long-term assessment of complex concrete bridges. Full article
(This article belongs to the Section Building Structures)
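The FOSM idea used here amounts to first-order variance propagation, sigma_Y^2 ≈ sum_i (dY/dX_i)^2 sigma_i^2, with gradients taken by finite differences about the mean point; the toy deflection function and parameter coefficients of variation below are stand-ins for the solid-element model.

```python
# FOSM in miniature: first-order variance propagation sigma_Y^2 ~ sum (dY/dx_i)^2 sigma_i^2,
# with gradients taken by central finite differences about the mean point.
# The toy deflection function and parameter CoVs are stand-ins for the solid-element model.
import numpy as np

def midspan_deflection(x):
    """Stand-in response: deflection [mm] from (creep coefficient, shrinkage [microstrain], E [GPa])."""
    phi, eps_sh, E = x
    return 40.0 * (1.0 + phi) / E + 0.015 * eps_sh

mu = np.array([2.0, 400.0, 34.0])               # mean creep coeff., shrinkage, elastic modulus
sigma = mu * np.array([0.30, 0.25, 0.10])       # coefficients of variation (illustrative)

grad = np.zeros_like(mu)
for i in range(len(mu)):
    h = 1e-4 * mu[i]
    up, down = mu.copy(), mu.copy()
    up[i] += h
    down[i] -= h
    grad[i] = (midspan_deflection(up) - midspan_deflection(down)) / (2.0 * h)

mean_defl = midspan_deflection(mu)
std_defl = float(np.sqrt(np.sum((grad * sigma) ** 2)))
print(f"deflection: mean = {mean_defl:.1f} mm, std = {std_defl:.1f} mm")
print(f"95% quantile (normal assumption) = {mean_defl + 1.645 * std_defl:.1f} mm")
```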
17 pages, 2025 KB  
Article
Breast Organ Dose and Radiation Exposure Reduction in Full-Spine Radiography: A Phantom Model Using PCXMC
by Manami Nemoto and Koichi Chida
Diagnostics 2025, 15(21), 2787; https://doi.org/10.3390/diagnostics15212787 - 3 Nov 2025
Abstract
Background/Objectives: Full-spine radiography is frequently performed from childhood to adulthood, raising concerns about radiation-induced breast cancer risk. To assess probabilistic risks such as cancer, accurate estimation of equivalent and effective organ doses is essential. The purpose of this study is to investigate X-ray imaging conditions for radiation reduction based on breast organ dose and to evaluate the accuracy of simulation software for dose calculation. Methods: Breast organ doses from full-spine radiography were calculated using the Monte Carlo-based dose calculation software PCXMC. Breast organ doses were estimated under various technical conditions of full-spine radiography (tube voltage, distance, grid presence, and beam projection). Dose reduction methods were explored, and variations in dose and error due to phantom characteristics and photon history number were evaluated. Results: Among the X-ray conditions, the greatest radiation reduction effect was achieved by changing the imaging direction. Changing from the anteroposterior to posteroanterior direction reduced doses by approximately 76.7% to 89.1% (127.8–326.7 μGy) in children and 80.4% to 91.1% (411.3–911.1 μGy) in adults. In addition, the study highlighted how phantom characteristics and the number of photon histories influence estimated doses and calculation error, with approximately 2 × 10⁶ photon histories recommended to achieve a standard error ≤ 2%. Conclusions: Modifying radiographic conditions is effective for reducing breast radiation exposure in patients with scoliosis. Furthermore, to ensure the accuracy of dose calculation software, the number of photon histories must be adjusted under certain conditions and used while verifying the standard error. This study demonstrates how technical modifications, projection selection, and phantom characteristics influence breast radiation exposure, thereby supporting the need for patient-tailored imaging strategies that minimize radiation risk while maintaining diagnostic validity. The findings may be useful in informing radiographic protocols and the development of safer imaging guidelines for both pediatric and adult patients undergoing spinal examinations. Full article
(This article belongs to the Special Issue Recent Advances in Diagnostic and Interventional Radiology)
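Two small calculations sit behind the abstract's numbers: the percentage dose reduction from switching projection, and the 1/sqrt(N) scaling that links photon histories to Monte Carlo standard error. The dose values and the reference standard error in the sketch are assumed, not taken from the study.

```python
# Two small calculations behind the abstract's numbers: percentage dose reduction
# when switching projection, and the 1/sqrt(N) scaling of Monte Carlo standard
# error used to size the number of photon histories. All inputs are assumed.
from math import ceil

# Dose reduction from anteroposterior (AP) to posteroanterior (PA) projection
dose_ap, dose_pa = 1000.0, 100.0          # breast organ dose [uGy], illustrative values
reduction = 100.0 * (dose_ap - dose_pa) / dose_ap
print(f"AP -> PA dose reduction: {reduction:.1f}%")

# Monte Carlo standard error scales as 1/sqrt(N_histories)
n_ref, se_ref = 1e5, 0.09                 # assumed: 9% relative SE observed at 1e5 histories
target_se = 0.02                          # target relative standard error (2%)
n_required = ceil(n_ref * (se_ref / target_se) ** 2)
print(f"histories needed for SE <= {target_se:.0%}: ~{n_required:.1e}")
```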
20 pages, 1224 KB  
Article
Explainable AI for Coronary Artery Disease Stratification Using Routine Clinical Data
by Nurdaulet Tasmurzayev, Baglan Imanbek, Assiya Boltaboyeva, Gulmira Dikhanbayeva, Sarsenbek Zhussupbekov, Qarlygash Saparbayeva and Gulshat Amirkhanova
Algorithms 2025, 18(11), 693; https://doi.org/10.3390/a18110693 - 3 Nov 2025
Abstract
Background: Coronary artery disease (CAD) remains a leading cause of morbidity and mortality. Early diagnosis reduces adverse outcomes and alleviates the burden on healthcare, yet conventional approaches are often invasive, costly, and not always available. In this context, machine learning offers promising solutions. Objective: The objective of this study is to evaluate the feasibility of reliably predicting both the presence and the severity of CAD. The analysis is based on a harmonized, multi-center UCI dataset that includes cohorts from Cleveland, Hungary, Switzerland, and Long Beach. The work aims to assess the accuracy and practical utility of models built exclusively on routine tabular clinical and demographic data, without relying on imaging. These models are designed to improve risk stratification and guide patient routing. Methods and Results: The study is based on a uniform and standardized data processing pipeline. This pipeline includes handling missing values, feature encoding, scaling, an 80/20 train–test split and applying the SMOTE method exclusively to the training set to prevent information leakage. Within this pipeline, a standardized comparison of a wide range of models (including gradient boosting, tree-based ensembles, support vector methods, etc.) was conducted with hyperparameter tuning via GridSearchCV. The best results were demonstrated by the CatBoost model: accuracy = 0.8278, recall = 0.8407, and F1-score = 0.8436. Conclusions: A key distinction of this work is the comprehensive evaluation of the models’ practical suitability. Beyond standard metrics, the analysis of calibration curves confirmed the reliability of the probabilistic predictions. Patient-level interpretability using SHAP showed that the model relies on clinically significant predictors, including ST-segment depression. Calibrated and explainable models based on readily available data are positioned as a practical tool for scalable risk stratification and decision support, especially in resource-constrained settings. Full article
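A compressed sketch of the pipeline pieces named above (SMOTE on the training split only, CatBoost, a calibration curve, SHAP attributions), run on synthetic data; hyperparameters and the data are placeholders, not the study's UCI preprocessing or tuned model.

```python
# Compressed sketch of the pipeline pieces named in the abstract: SMOTE applied to the
# training split only, CatBoost, a calibration curve, and SHAP values. Synthetic data
# and placeholder hyperparameters; not the study's UCI preprocessing or tuned model.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.calibration import calibration_curve
from imblearn.over_sampling import SMOTE
from catboost import CatBoostClassifier
import shap

X, y = make_classification(n_samples=1500, n_features=10, weights=[0.7, 0.3], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)

X_tr, y_tr = SMOTE(random_state=0).fit_resample(X_tr, y_tr)     # oversample training data only

model = CatBoostClassifier(depth=6, iterations=300, learning_rate=0.1, verbose=False)
model.fit(X_tr, y_tr)

proba = model.predict_proba(X_te)[:, 1]
frac_pos, mean_pred = calibration_curve(y_te, proba, n_bins=10)  # reliability-diagram points
print("calibration bins (mean predicted vs observed fraction of positives):")
print(np.round(np.column_stack([mean_pred, frac_pos]), 2))

shap_values = shap.TreeExplainer(model).shap_values(X_te)        # per-patient attributions
print("mean |SHAP| per feature:", np.round(np.abs(shap_values).mean(axis=0), 3))
```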
23 pages, 2577 KB  
Article
A Hybrid STL-Based Ensemble Model for PM2.5 Forecasting in Pakistani Cities
by Moiz Qureshi, Atef F. Hashem, Hasnain Iftikhar and Paulo Canas Rodrigues
Symmetry 2025, 17(11), 1827; https://doi.org/10.3390/sym17111827 - 31 Oct 2025
Abstract
Air pollution, particularly fine particulate matter (PM2.5), poses severe risks to human health and the environment in densely populated urban areas. Accurate short-term forecasting of PM2.5 concentrations is therefore crucial for timely public health advisories and effective mitigation strategies. This work proposes a hybrid approach that combines machine learning models with STL decomposition to provide precise short-term PM2.5 predictions. Daily PM2.5 series from four major Pakistani cities—Islamabad, Lahore, Karachi, and Peshawar—are first pre-processed to handle missing values, outliers, and variance instability. The data are then decomposed via seasonal-trend decomposition using Loess (STL), which explicitly exploits the symmetric and recurrent structure of seasonal patterns. Each decomposed component (trend, seasonality, and remainder) is modeled independently using an ensemble of statistical and machine learning approaches. Forecasts are combined through a weighted aggregation scheme that balances bias–variance trade-offs and preserves distributional consistency. The final recombined forecasts provide one-day-ahead PM2.5 predictions with associated uncertainty measures. The model evaluation employs multiple statistical accuracy metrics, distributional diagnostics, and out-of-sample validation to assess its performance. The results demonstrate that the proposed framework consistently outperforms conventional benchmark models, yielding robust, interpretable, and probabilistically coherent forecasts. This study demonstrates how periodic and recurrent seasonal structure decomposition and probabilistic ensemble methods enhance the statistical modeling of environmental time series, offering actionable insights for urban air quality management. Full article
(This article belongs to the Special Issue Unlocking the Power of Probability and Statistics for Symmetry)
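A bare-bones version of the decompose-forecast-recombine idea: STL (from statsmodels) splits a PM2.5 series into trend, seasonal, and remainder parts, each part gets a naive forecast, and the pieces are summed with a crude uncertainty band. The synthetic series and per-component models stand in for the paper's statistical and machine-learning ensemble.

```python
# Bare-bones version of the decomposition-and-recombine idea: STL splits the PM2.5
# series into trend, seasonal, and remainder parts, each part is forecast with a
# naive model, and the forecasts are summed. Synthetic data and simple per-component
# models stand in for the paper's ensemble.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

rng = np.random.default_rng(3)
days = pd.date_range("2023-01-01", periods=365, freq="D")
pm25 = (80 + 0.05 * np.arange(365)                      # slow trend
        + 25 * np.sin(2 * np.pi * np.arange(365) / 7)   # weekly cycle
        + rng.normal(0, 10, 365))                       # irregular component
series = pd.Series(pm25, index=days)

stl = STL(series, period=7, robust=True).fit()

# One-day-ahead forecast: extrapolate trend linearly, repeat the seasonal cycle,
# take the remainder's mean, then recombine.
trend_fc = stl.trend.iloc[-1] + (stl.trend.iloc[-1] - stl.trend.iloc[-8]) / 7.0
seasonal_fc = stl.seasonal.iloc[-7]
resid_fc = stl.resid.mean()
point_fc = trend_fc + seasonal_fc + resid_fc

# Crude uncertainty band from the remainder's spread
lo, hi = point_fc + np.quantile(stl.resid, [0.05, 0.95])
print(f"one-day-ahead PM2.5 forecast: {point_fc:.1f} ug/m3 (90% band {lo:.1f} - {hi:.1f})")
```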
19 pages, 7595 KB  
Article
Probabilistic Forecasting Model for Tropical Cyclone Intensity Based on Diffusion Model
by Jingjia Luo, Peng Yang and Fan Meng
Remote Sens. 2025, 17(21), 3600; https://doi.org/10.3390/rs17213600 - 31 Oct 2025
Abstract
Reliable forecasting of tropical cyclone (TC) intensity—particularly rapid intensification (RI) events—remains a major challenge in meteorology, largely due to the inherent difficulty of accurately quantifying predictive uncertainty. Traditional numerical approaches are computationally expensive, while statistical models often fail to capture the highly nonlinear relationships involved. Mainstream machine learning models typically provide only deterministic point forecasts and lack the ability to represent uncertainty. To address this limitation, we propose the Tropical Cyclone Diffusion Model (TCDM), the first conditional diffusion-based probabilistic forecasting framework for TC intensity. TCDM integrates multimodal meteorological data, including satellite imagery, reanalysis fields, and environmental predictors, to directly generate the full probability distribution of future intensities. Experimental results show that TCDM not only achieves highly competitive deterministic accuracy (low MAE and RMSE; high R²), but also delivers high-quality probabilistic forecasts (low CRPS; high PICP). Moreover, it substantially improves RI detection by achieving higher hit rates with fewer false alarms. Compared with traditional ensemble-based methods, TCDM provides a more efficient and flexible approach to probabilistic forecasting, offering valuable support for TC risk assessment and disaster preparedness. Full article
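Two of the probabilistic scores mentioned (CRPS and PICP) can be computed directly from forecast samples, which is how a generative model like TCDM would typically be evaluated; the sample-based CRPS estimator E|X - y| - 0.5 E|X - X'| and the synthetic forecast ensembles below are illustrative, not the paper's evaluation code.

```python
# Two of the probabilistic scores mentioned in the abstract, computed from forecast
# samples: the sample-based CRPS, E|X - y| - 0.5 E|X - X'|, and the prediction-interval
# coverage probability (PICP). The forecast ensembles are synthetic placeholders.
import numpy as np

def crps_from_samples(samples: np.ndarray, observed: float) -> float:
    """Sample estimator of the continuous ranked probability score (lower is better)."""
    term1 = np.mean(np.abs(samples - observed))
    term2 = 0.5 * np.mean(np.abs(samples[:, None] - samples[None, :]))
    return float(term1 - term2)

rng = np.random.default_rng(5)
observed_intensity = np.array([55.0, 70.0, 95.0, 120.0])               # best-track winds [kt]
ensembles = observed_intensity[:, None] + rng.normal(0, 8, (4, 200))   # synthetic forecast samples

crps = np.mean([crps_from_samples(e, y) for e, y in zip(ensembles, observed_intensity)])

lower, upper = np.percentile(ensembles, [5, 95], axis=1)
picp = np.mean((observed_intensity >= lower) & (observed_intensity <= upper))
print(f"mean CRPS = {crps:.2f} kt, 90% PICP = {picp:.2f}")
```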