Search Results (370)

Search Parameters:
Keywords = credible interval

11 pages, 898 KB  
Article
Comparison of Aerosol Generation Between Bag Valve and Chest Compression-Synchronized Ventilation During Simulated Cardiopulmonary Resuscitation
by Young Taeck Oh, Choung Ah Lee, Daun Choi and Hang A. Park
J. Clin. Med. 2025, 14(19), 6790; https://doi.org/10.3390/jcm14196790 - 25 Sep 2025
Abstract
Background: Cardiopulmonary resuscitation can generate aerosols, potentially exposing healthcare workers (HCWs) to infection. Bag valve ventilation (BV) is widely used but is prone to aerosol dispersion, whereas chest compression-synchronized ventilation (CCSV) maintains a closed respiratory circuit. In this study, we compared aerosol generation between CCSV and BV during chest compressions following endotracheal intubation in a simulated resuscitation setting. Methods: In a randomized crossover design, 12 sessions each of CCSV and BV were conducted on an intubated manikin undergoing mechanical chest compressions for 10 min. Aerosols with ≤5-μm diameter were generated using a saline nebulizer and measured every minute with a particle counter positioned 50 cm from the chest compression site. Bayesian linear regression of minute-by-minute log-transformed aerosol particle counts was used to estimate group differences, yielding posterior means, 95% credible intervals, and posterior probabilities. Results: The aerosol particle counts increased during the initial 3 min with the use of both methods. Thereafter, the aerosol particle counts with CCSV stabilized, whereas those with BV continued to increase. From 4 to 10 min, the posterior probability that CCSV generated fewer particles exceeded 0.98, peaking at 9 min. Both peak and time-averaged log-transformed aerosol particle counts were significantly lower with CCSV than with BV (p = 0.010 and p = 0.020, respectively). Conclusions: In this simulation, CCSV generated significantly fewer aerosols than BV did during chest compressions, with differences emerging after 4 min and persisting thereafter. Thus, CCSV may reduce aerosol exposure of HCWs, supporting its early implementation during resuscitation in infectious disease settings.
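As a quick illustration of the Bayesian comparison described in this abstract, here is a minimal Python sketch, assuming synthetic log-count data and a flat-prior normal approximation rather than the authors' actual regression model; it reports a posterior mean difference, 95% credible interval, and posterior probability:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic minute-level log-transformed particle counts (hypothetical values,
# not the study's data): 12 sessions per ventilation method
log_ccsv = rng.normal(6.0, 0.3, size=12)
log_bv = rng.normal(6.5, 0.3, size=12)

def posterior_mean_draws(x, n_draws=50_000):
    # Large-sample, flat-prior approximation: the posterior of a group mean
    # is roughly Normal(sample mean, s^2 / n)
    return rng.normal(x.mean(), x.std(ddof=1) / np.sqrt(len(x)), size=n_draws)

diff = posterior_mean_draws(log_ccsv) - posterior_mean_draws(log_bv)

lo, hi = np.percentile(diff, [2.5, 97.5])
print(f"posterior mean difference: {diff.mean():.3f}")
print(f"95% credible interval: ({lo:.3f}, {hi:.3f})")
print(f"P(CCSV < BV): {(diff < 0).mean():.3f}")
```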

24 pages, 782 KB  
Article
Multilevel Analysis of Zero-Dose Children in Sub-Saharan Africa: A Three Delays Model Study
by Charles S. Wiysonge, Muhammed M. B. Uthman, Duduzile Ndwandwe and Olalekan A. Uthman
Vaccines 2025, 13(9), 987; https://doi.org/10.3390/vaccines13090987 - 21 Sep 2025
Viewed by 413
Abstract
Background: Zero-dose children represent a critical challenge for achieving universal immunization coverage in sub-Saharan Africa. This study applies the Three Delays Model to examine multilevel factors associated with zero-dose children. Methods: We analyzed data from 30,500 children aged 12–23 months across 28 sub-Saharan African countries using demographic and health surveys (2015–2024). Zero-dose status was defined as not receiving the first dose of diphtheria–tetanus–pertussis vaccine. Multilevel logistic regression models examined individual-, community-, and country-level determinants. Results: Overall, zero-dose prevalence was 12.19% (95% confidence interval: 11.82–12.56), ranging from 0.51% in Rwanda to 40.00% in Chad. Poor maternal health-seeking behavior showed the strongest association (odds ratio (OR) 12.00, 95% credible interval: 9.78–14.55). Paternal education demonstrated clear gradients, with no formal education increasing odds 1.52-fold. Maternal empowerment factors were significant: lack of decision-making power (OR = 1.23), financial barriers (OR = 1.98), and no media access (OR = 1.32). Low community literacy and low country-level health expenditure were associated with increased zero-dose prevalence. Substantial clustering persisted at community (19.5%) and country (18.7%) levels. Conclusions: Zero-dose children concentrate among the most disadvantaged populations, with maternal health-seeking behavior as the strongest predictor. Immediate policy actions should integrate antenatal care with vaccination services, target high-parity mothers, eliminate financial barriers, and increase health expenditure to 15% of national budgets.
(This article belongs to the Section Vaccines and Public Health)
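The community- and country-level clustering percentages quoted above are variance partition coefficients from the multilevel logistic model. A small sketch using the standard latent-variable method, with hypothetical variance components chosen only to reproduce the reported shares:

```python
import math

def variance_partition(sigma2_community, sigma2_country):
    # Latent-variable method for a logit link: the level-1 residual
    # variance is fixed at pi^2 / 3
    resid = math.pi ** 2 / 3
    total = sigma2_community + sigma2_country + resid
    return sigma2_community / total, sigma2_country / total

# Hypothetical variance components (not reported in the paper)
vpc_community, vpc_country = variance_partition(1.040, 0.995)
print(f"community: {vpc_community:.1%}, country: {vpc_country:.1%}")
# -> approximately 19.5% and 18.7%, matching the clustering reported above
```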

31 pages, 12350 KB  
Article
Statistical Evaluation of Beta-Binomial Probability Law for Removal in Progressive First-Failure Censoring and Its Applications to Three Cancer Cases
by Ahmed Elshahhat, Osama E. Abo-Kasem and Heba S. Mohammed
Mathematics 2025, 13(18), 3028; https://doi.org/10.3390/math13183028 - 19 Sep 2025
Viewed by 172
Abstract
Progressive first-failure censoring is a flexible and cost-efficient strategy that captures real-world testing scenarios where only the first failure is observed at each stage while randomly removing remaining units, making it ideal for biomedical and reliability studies. By applying the α-power transformation to the exponential baseline, the proposed model introduces an additional flexibility parameter that enriches the family of lifetime distributions, enabling it to better capture varying failure rates and diverse hazard rate behaviors commonly observed in biomedical data, thus extending the classical exponential model. This study develops a novel computational framework for analyzing an α-powered exponential model under beta-binomial random removals within the proposed censoring test. To address the inherent complexity of the likelihood function arising from simultaneous random removals and progressive censoring, we derive closed-form expressions for the likelihood, survival, and hazard functions and propose efficient estimation strategies based on both maximum likelihood and Bayesian inference. For the Bayesian approach, gamma and beta priors are adopted, and a tailored Metropolis–Hastings algorithm is implemented to approximate posterior distributions under symmetric and asymmetric loss functions. To evaluate the empirical performance of the proposed estimators, extensive Monte Carlo simulations are conducted, examining bias, mean squared error, and credible interval coverage under varying censoring levels and removal probabilities. Furthermore, the practical utility of the model is illustrated through three oncological datasets, including multiple myeloma, lung cancer, and breast cancer patients, demonstrating superior goodness of fit and predictive reliability compared to traditional models. The results show that the proposed lifespan model, under the beta-binomial probability law and within the examined censoring mechanism, offers a flexible and computationally tractable framework for reliability and biomedical survival analysis, providing new insights into censored data structures with random withdrawals.
(This article belongs to the Special Issue New Advance in Applied Probability and Statistical Inference)
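To make the beta-binomial removal mechanism concrete, here is a small simulation sketch, assuming an exponential baseline (the paper's α-power transformation is omitted) and hypothetical parameter values:

```python
import numpy as np

rng = np.random.default_rng(1)

def pffc_sample(m, k, n_groups, lam, a, b):
    """Simulate m progressive first-failure censored times from Exp(lam).

    n_groups groups of k items are on test; at each stage the first failure
    among the surviving groups is recorded, then R_i of the remaining groups
    are withdrawn, with R_i ~ Beta-Binomial(remaining, a, b).
    """
    times, remaining, t = [], n_groups, 0.0
    for i in range(m):
        # First failure among `remaining` groups of k exponential items:
        # by memorylessness, the waiting time is Exp(k * lam * remaining)
        t += rng.exponential(1 / (k * lam * remaining))
        times.append(t)
        remaining -= 1                       # the failing group leaves the test
        if i < m - 1:
            max_r = remaining - (m - 1 - i)  # keep enough groups for m failures
            p = rng.beta(a, b)               # beta-binomial removal probability
            r = rng.binomial(max_r, p) if max_r > 0 else 0
            remaining -= r
    return np.array(times)

print(pffc_sample(m=8, k=5, n_groups=30, lam=0.1, a=2.0, b=3.0).round(3))
```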

29 pages, 19296 KB  
Article
Inference for the Chris–Jerry Lifetime Distribution Under Improved Adaptive Progressive Type-II Censoring for Physics and Engineering Data Modelling
by Heba S. Mohammed, Osama E. Abo-Kasem and Ahmed Elshahhat
Axioms 2025, 14(9), 702; https://doi.org/10.3390/axioms14090702 - 17 Sep 2025
Viewed by 172
Abstract
This paper presents a comprehensive reliability analysis framework for the Chris–Jerry (CJ) lifetime distribution under an improved adaptive progressive Type-II censoring plan. The CJ model, recently introduced to capture skewed lifetime behaviors, is studied under a modified censoring structure designed to provide greater flexibility in terminating life-testing experiments. We derive maximum likelihood estimators for the CJ parameters and key reliability measures, including the reliability and hazard rate functions, and construct approximate confidence intervals using the observed Fisher information matrix and the delta method. To address the intractability of the likelihood function, Bayesian estimators are obtained under independent gamma priors and a squared-error loss function. Because the posterior distributions are not available in closed form, we apply the Metropolis–Hastings algorithm to generate Bayesian estimates and two types of credible intervals. A comprehensive simulation study evaluates the performance of the proposed estimation techniques under various censoring scenarios. The framework is further validated through two real-world datasets: one involving rainfall measurements and another concerning mechanical failure times. In both cases, the CJ model combined with the proposed censoring strategy demonstrates superior fit and reliability inference compared to competing models. These findings highlight the value of the CJ distribution, together with advanced censoring methods, for modeling lifetime data in physics and engineering applications.
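The delta-method interval mentioned above can be illustrated with a deliberately simple stand-in: an exponential model in place of the Chris–Jerry distribution, whose reliability function is more involved. The data and numbers below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(8)
data = rng.exponential(scale=10.0, size=50)   # hypothetical failure times

# Delta method for R(t0) = exp(-lam * t0):
# Var[h(lam_hat)] ~ h'(lam_hat)^2 / I(lam_hat), I = observed Fisher information
n, t0 = data.size, 5.0
lam = 1 / data.mean()                 # MLE of the exponential rate
info = n / lam**2                     # Fisher information for the rate
R = np.exp(-lam * t0)                 # estimated reliability at t0
dR = -t0 * np.exp(-lam * t0)          # dR/dlam
se_R = np.sqrt(dR**2 / info)          # delta-method standard error
print(f"R({t0}) = {R:.3f}, 95% CI = ({R - 1.96*se_R:.3f}, {R + 1.96*se_R:.3f})")
```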

22 pages, 1637 KB  
Article
Optimized Dispatch of a Photovoltaic-Inclusive Virtual Power Plant Based on a Weighted Solar Irradiance Probability Model
by Jiyun Yu, Xinsong Zhang, Xiangyu He, Chaoyue Wang, Jun Lan and Jiejie Huang
Energies 2025, 18(18), 4882; https://doi.org/10.3390/en18184882 - 14 Sep 2025
Viewed by 225
Abstract
Under China’s dual-carbon strategic objectives, virtual power plants (VPPs) actively participate in the coupled electricity–carbon market through the optimized scheduling of distributed energy resources, simultaneously stabilizing grid operations and reducing carbon emissions. Photovoltaic (PV) generation, a cornerstone resource within VPP systems, introduces significant challenges in scheduling due to its inherent output variability. To characterize PV output uncertainty more accurately, a weighted probability distribution of solar irradiance, based on historical irradiance data, is proposed. A rejection sampling technique is then applied to generate solar irradiance scenarios consistent with the proposed weighted solar irradiance probability model. Further, a confidence interval-based filtering mechanism eliminates extreme scenarios, ensuring statistical credibility and enhancing practicability in actual dispatch. Based on the filtered scenarios, a novel dispatch strategy for VPP operation in the electricity–carbon market is proposed. Numerical case studies verify that scenarios generated by the weighted solar irradiance probability model closely replicate historical PV characteristics and that the confidence interval filter effectively excludes improbable extreme scenarios. Compared to conventional normal distribution-based methods, the proposed approach yields dispatch solutions more closely aligned with the optimal dispatch of the historical irradiance data, demonstrating improved accuracy in probabilistic modelling of PV output uncertainty. Consequently, the resulting dispatch strategy is better able to secure the VPP's market revenue under PV output fluctuations.
(This article belongs to the Special Issue New Power System Planning and Scheduling)
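A minimal sketch of the scenario-generation step, assuming a two-component normal mixture as a stand-in for the paper's weighted irradiance density and a central-95% interval filter:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical weighted irradiance density (W/m^2): a two-component mixture
# standing in for the paper's weighted probability model
def target_pdf(x):
    comp1 = np.exp(-0.5 * ((x - 450) / 120) ** 2) / (120 * np.sqrt(2 * np.pi))
    comp2 = np.exp(-0.5 * ((x - 750) / 90) ** 2) / (90 * np.sqrt(2 * np.pi))
    return 0.6 * comp1 + 0.4 * comp2

# Rejection sampling with a uniform proposal on [0, 1200]
M = 1.1 * target_pdf(np.linspace(0, 1200, 2001)).max() * 1200  # envelope constant
x = rng.uniform(0, 1200, 100_000)
u = rng.uniform(0, 1, x.size)
accepted = x[u < target_pdf(x) * 1200 / M]

# Confidence-interval filter: drop extreme scenarios outside the central 95%
lo, hi = np.percentile(accepted, [2.5, 97.5])
scenarios = accepted[(accepted >= lo) & (accepted <= hi)]
print(len(accepted), "accepted;", len(scenarios), "kept after filtering")
```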

34 pages, 31211 KB  
Article
Statistical Evaluation of Alpha-Powering Exponential Generalized Progressive Hybrid Censoring and Its Modeling for Medical and Engineering Sciences with Optimization Plans
by Heba S. Mohammed, Osama E. Abo-Kasem and Ahmed Elshahhat
Symmetry 2025, 17(9), 1473; https://doi.org/10.3390/sym17091473 - 6 Sep 2025
Viewed by 462
Abstract
This study explores advanced methods for analyzing the two-parameter alpha-power exponential (APE) distribution using data from a novel generalized progressive hybrid censoring scheme. The APE model is inherently asymmetric, exhibiting positive skewness across all valid parameter values due to its right-skewed exponential base, with the alpha-power transformation amplifying or dampening this skewness depending on the power parameter. The proposed censoring design offers new insights into modeling lifetime data that exhibit non-monotonic hazard behaviors. It enhances testing efficiency by simultaneously imposing fixed-time constraints and ensuring a minimum number of failures, thereby improving inference quality over traditional censoring methods. We derive maximum likelihood and Bayesian estimates for the APE distribution parameters and key reliability measures, such as the reliability and hazard rate functions. Bayesian analysis is performed using independent gamma priors under a symmetric squared error loss, implemented via the Metropolis–Hastings algorithm. Interval estimation is addressed using two normality-based asymptotic confidence intervals and two credible intervals obtained through a simulated Markov Chain Monte Carlo procedure. Monte Carlo simulations across various censoring scenarios demonstrate the stable and superior precision of the proposed methods. Optimal censoring patterns are identified based on the observed Fisher information and its inverse. Two real-world case studies—breast cancer remission times and global oil reserve data—illustrate the practical utility of the APE model within the proposed censoring framework. These applications underscore the model’s capability to effectively analyze diverse reliability phenomena, bridging theoretical innovation with empirical relevance in lifetime data analysis.
(This article belongs to the Special Issue Unlocking the Power of Probability and Statistics for Symmetry)
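As a sketch of the Metropolis–Hastings step under a gamma prior, here is a random-walk sampler for a plain exponential rate (the APE power parameter is omitted); this simplified case is conjugate, so the exact Gamma posterior is available as a check:

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.exponential(scale=2.0, size=60)   # hypothetical lifetimes
a, b = 2.0, 1.0                              # Gamma(a, b) prior on the rate

def log_post(lam):
    # log posterior up to a constant: Gamma prior x exponential likelihood
    if lam <= 0:
        return -np.inf
    return (a - 1 + data.size) * np.log(lam) - lam * (b + data.sum())

# Random-walk Metropolis-Hastings
lam, chain = 1.0, []
for _ in range(20_000):
    prop = lam + rng.normal(0, 0.1)
    if np.log(rng.uniform()) < log_post(prop) - log_post(lam):
        lam = prop
    chain.append(lam)
draws = np.array(chain[5_000:])              # discard burn-in

lo, hi = np.percentile(draws, [2.5, 97.5])   # equal-tailed 95% credible interval
print(f"posterior mean {draws.mean():.3f}, 95% CrI ({lo:.3f}, {hi:.3f})")
# Sanity check: the exact posterior here is Gamma(a + n, b + sum(data))
```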

16 pages, 1217 KB  
Systematic Review
Efficacy of SGLT2 Inhibitors, GLP-1 Receptor Agonists, DPP-4 Inhibitors, and Sulfonylureas on Moderate-to-Severe COPD Exacerbations Among Patients with Type 2 Diabetes: A Systematic Review and Network Meta-Analysis
by Edoardo Pirera, Domenico Di Raimondo, Lucio D’Anna and Antonino Tuttolomondo
Pharmaceuticals 2025, 18(9), 1337; https://doi.org/10.3390/ph18091337 - 5 Sep 2025
Viewed by 473
Abstract
Background/Objectives: Chronic obstructive pulmonary disease (COPD) and type 2 diabetes mellitus (T2DM) frequently coexist, contributing to worse clinical outcomes and increased risk of exacerbations. While newer glucose-lowering agents have demonstrated cardiovascular and renal benefits, their comparative efficacy on COPD exacerbations remains uncertain. Methods: We systematically searched PubMed, Embase, Web of Science, Cochrane Library, and ClinicalTrials.gov from inception to June 2025. We included randomised controlled trials (RCTs) and observational studies enrolling adults with COPD and T2DM that reported the risk of COPD exacerbations following initiation of SGLT2is, GLP-1RAs, DPP-4is, or sulfonylureas, with an active comparator group. The primary outcome was a composite of moderate-to-severe COPD exacerbations. Secondary outcomes included the individual components separately. A Bayesian random-effects network meta-analysis was performed to estimate risk ratios (RRs) with 95% credible intervals (CrIs). Results: Nine observational studies were ultimately included. No RCTs were retrieved. Compared to sulfonylureas, initiation of SGLT2is (RR 0.64, 0.59–0.69), GLP-1RAs (0.66, 0.60–0.71), and DPP-4is (0.79, 0.74–0.86) was associated with reduced risk of moderate-to-severe exacerbations. Moreover, SGLT2is (0.80, 0.75–0.86) and GLP-1RAs (0.83, 0.77–0.88) were more favourable compared to DPP-4is. Consistent results were found for secondary outcomes. Sensitivity analyses confirmed the robustness of the findings for the primary outcome. Robustness was not consistently observed across all treatment comparisons for secondary outcomes. Conclusions: Among patients with COPD and T2DM, newer glucose-lowering agents, particularly SGLT2is and GLP-1RAs, were associated with significantly lower risk of moderate-to-severe exacerbations. These findings support the potential respiratory benefits of these agents and warrant confirmation through RCTs.
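A useful sanity check on such estimates is the consistency relation underlying a network meta-analysis: on the log scale, an indirect comparison is the difference between the two comparisons against the common comparator. Using the point estimates quoted in the abstract:

```python
import math

# Consistency relation: log RR(SGLT2i vs DPP-4i)
#   = log RR(SGLT2i vs SU) - log RR(DPP-4i vs SU)
rr_sglt2i_su, rr_dpp4i_su = 0.64, 0.79   # point estimates from the abstract
indirect = math.exp(math.log(rr_sglt2i_su) - math.log(rr_dpp4i_su))
print(f"indirect RR, SGLT2i vs DPP-4i: {indirect:.2f}")
# ~0.81, close to the directly reported network estimate of 0.80
```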

14 pages, 1736 KB  
Systematic Review
Performance of Stratification Scores on the Risk of Stroke After a Transient Ischemic Attack: A Systematic Review and Network Meta-Analysis
by Dimitrios Deris, Sabrina Mastroianni, Jonathan Kan, Areti Angeliki Veroniki, Mukul Sharma, Raed A. Joundi, Ashkan Shoamanesh, Abhilekh Srivastava and Aristeidis H. Katsanos
J. Clin. Med. 2025, 14(17), 6268; https://doi.org/10.3390/jcm14176268 - 5 Sep 2025
Viewed by 647
Abstract
Background: Patients after a transient ischemic attack (TIA) are at high risk of subsequent stroke. There are various scores that aim to accurately identify patients at the highest risk of stroke. However, without direct comparisons between these scores, it remains unknown which score has the best predictive utility. Our study aims to identify the risk stratification score with the highest utility to identify patients at high risk for stroke within 90 days after a TIA. Methods: The MEDLINE and Scopus databases were systematically searched on 1 December 2023 for observational cohort studies assessing the ability of a score to predict a stroke within the first 90 days from the index TIA event. Only studies that had a direct comparison of at least two scores were included. A random-effects network meta-analysis was performed. Sensitivity and specificity, along with relevant 95% credible intervals, and between-score and between-study heterogeneity were estimated. We also estimated relative sensitivities and relative specificities compared with the ABCD2 score. We ranked each score according to its predictive accuracy based on both sensitivity and specificity estimates, using the diagnostic odds ratio (DOR) and the summary receiver operating characteristic (SROC) curve. Results: Our systematic review identified 9 studies comprising 14 discrete cohorts. The performance of all scores to identify patients at high risk for stroke recurrence within 90 days following a TIA was low (pooled sensitivity range 48–64%, pooled specificity range 59–72%). In the network meta-analysis, we analyzed 6 studies with 11 discrete cohorts, including data from 8217 patients. The ABCD3-I score demonstrated the highest DOR, followed by the ESRS, ABCD, California, and ABCD2. The SROC curves demonstrated no significant differences in the performance of the scores, using the ABCD score as the common comparator. Conclusions: In this systematic review and network meta-analysis of observational cohort studies of patients who experienced TIA and were followed for the occurrence of subsequent stroke, we failed to identify a score performing significantly better for the prediction of stroke at 90 days. New models are needed for stroke prediction and risk stratification following a TIA.
(This article belongs to the Special Issue Ischemic Stroke: Diagnosis and Treatment)
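The diagnostic odds ratio used for the ranking combines sensitivity and specificity into a single discrimination measure. A one-liner, evaluated at the upper ends of the pooled ranges reported above (64% and 72%):

```python
def diagnostic_odds_ratio(sens, spec):
    # Odds of a positive score in patients with the outcome
    # divided by the odds in patients without it
    return (sens / (1 - sens)) / ((1 - spec) / spec)

print(f"{diagnostic_odds_ratio(0.64, 0.72):.2f}")  # ~4.57: modest discrimination
```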

17 pages, 3444 KB  
Article
Determination of Orbital Parameters of Binary Star Systems Using the MCMC Method
by Nadezhda L. Vaidman, Shakhida T. Nurmakhametova, Anatoly S. Miroshnichenko, Serik A. Khokhlov, Aldiyar T. Agishev, Azamat A. Khokhlov, Yeskendyr K. Ashimov and Berik S. Yermekbayev
Galaxies 2025, 13(5), 101; https://doi.org/10.3390/galaxies13050101 - 2 Sep 2025
Viewed by 522
Abstract
We present new spectroscopic orbits for the bright binaries Mizar B, 3 Pup, ν Gem, 2 Lac, and ϕ Aql. Our analysis is based on medium-resolution (R ≈ 12,000) échelle spectra obtained with the 0.81-m telescope and fiber-fed eShel spectrograph of the Three College Observatory (Greensboro, NC, USA) between 2015 and 2024. Orbital elements were inferred with an affine-invariant Markov-chain Monte-Carlo sampler; convergence was verified through the integrated autocorrelation time and the Gelman–Rubin statistic. Quoted uncertainties are 16th–84th-percentile credible intervals. Compared with previously published orbital solutions for the studied stars, our method improves the root-mean-square residuals by 25–50% and brings the 1σ uncertainties on the radial velocity (RV) semi-amplitudes down to 0.02–0.15 km s⁻¹. These gains translate into markedly tighter mass functions and systemic RVs, providing a robust dynamical baseline for future interferometric and photometric studies. A complete Python analysis pipeline is openly available in a GitHub repository, ensuring full reproducibility. The results demonstrate that a Bayesian RV analysis with well-motivated priors and rigorous convergence checks yields orbital parameters that are both more precise and more reproducible than previous determinations, while offering fully transparent uncertainty budgets.
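A compact sketch of this style of RV fitting, assuming the third-party emcee package (an affine-invariant ensemble sampler of the kind named above, not necessarily the authors' pipeline), a synthetic circular orbit, and flat priors; it reports 16th–84th-percentile credible intervals as in the paper:

```python
import numpy as np
import emcee  # assumed installed: pip install emcee

rng = np.random.default_rng(4)

# Synthetic circular-orbit radial velocities (hypothetical, not TCO data)
P_true, K_true, gamma_true = 20.5, 12.0, -5.0   # days, km/s, km/s
t = np.sort(rng.uniform(0, 100, 40))
err = 0.5
rv = gamma_true + K_true * np.sin(2 * np.pi * t / P_true)
rv += rng.normal(0, err, t.size)

def log_prob(theta):
    # Gaussian likelihood with flat priors inside simple bounds
    P, K, gamma = theta
    if not (1 < P < 100 and 0 < K < 50):
        return -np.inf
    model = gamma + K * np.sin(2 * np.pi * t / P)
    return -0.5 * np.sum(((rv - model) / err) ** 2)

ndim, nwalkers = 3, 32
p0 = np.array([P_true, K_true, gamma_true]) + 1e-3 * rng.normal(size=(nwalkers, ndim))
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
sampler.run_mcmc(p0, 4000, progress=False)

chain = sampler.get_chain(discard=1000, flat=True)
for name, samples in zip(["P", "K", "gamma"], chain.T):
    lo, med, hi = np.percentile(samples, [16, 50, 84])
    print(f"{name}: {med:.3f} (+{hi - med:.3f} / -{med - lo:.3f})")
```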

28 pages, 2302 KB  
Article
New Energy Vehicle Decision-Making for Consumers: An IBULIQOWA Operator-Based DM Approach Considering Information Quality
by Yi Yang, Xiangjun Wang, Jingyi Chen, Jie Chen, Junfeng Yang and Chang Qi
Sustainability 2025, 17(17), 7753; https://doi.org/10.3390/su17177753 - 28 Aug 2025
Viewed by 383
Abstract
New energy vehicles (NEVs) have gained increasing favor among consumers due to their dual advantages of “low cost” and “environmental friendliness.” In recent years, the share of NEVs in the global automotive market has been steadily rising. For instance, in the Chinese market, NEV sales in 2024 increased by 35.5% year-on-year, accounting for 70.5% of global NEV sales. However, as the diversity of NEV brands and models expands, selecting the most suitable model from a vast amount of information has become the primary challenge for NEV consumers. Although online service platforms offer extensive user reviews and rating data, the uncertainty, inconsistent quality, and sheer volume of this information pose significant challenges to decision-making. Against this backdrop, leveraging the strengths of the quasi ordered weighted averaging (QOWA) operator in information aggregation and the two-dimensional “information + quality” representation of interval basic uncertain linguistic information (IBULI), this study proposes a large-scale group data aggregation method for decision support based on the IBULIQOWA operator. This approach aims to assist NEV consumers in making informed decisions from the perspective of information quality. Firstly, the QOWA operator on the unit interval is extended to the closed interval [0, τ], and the extended basic uncertain information quasi ordered weighted averaging (EBUIQOWA) operator is defined. Secondly, to aggregate groups of IBULI, the basic uncertain linguistic information QOWA (BULIQOWA) operator and the IBULIQOWA operator are proposed on the basis of the EBUIQOWA operator, and the monotonicity and degeneracy of the proposed operators are discussed. Finally, for product decision-making on online service platforms and considering the credibility of information, a product decision-making method based on the IBULIQOWA operator is proposed, and its effectiveness and applicability are verified through a case study of NEV product decision-making on a car online service platform, providing a reference for decision support in product ranking on online service platforms.
(This article belongs to the Special Issue Decision-Making in Sustainable Management)
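To fix ideas, a quasi-OWA aggregation reorders the inputs and aggregates them through a generator function g. A minimal sketch with hypothetical ratings and weights (not the paper's operators, which add the uncertainty and quality dimensions on top of this):

```python
import numpy as np

def qowa(values, weights, g, g_inv):
    """Quasi-OWA: sort inputs in descending order, aggregate their
    g-transforms with positional weights (summing to one), invert g."""
    x = np.sort(np.asarray(values, dtype=float))[::-1]
    w = np.asarray(weights, dtype=float)
    return g_inv(np.dot(w, g(x)))

# Hypothetical platform ratings on [0, 10], quadratic generator g(x) = x^2
ratings = [8.5, 6.0, 9.0, 7.5]
weights = [0.4, 0.3, 0.2, 0.1]          # emphasize the highest ratings
print(f"{qowa(ratings, weights, g=lambda x: x**2, g_inv=np.sqrt):.3f}")
```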

33 pages, 2241 KB  
Systematic Review
Dairy Consumption and Risk of Cardiovascular and Bone Health Outcomes in Adults: An Umbrella Review and Updated Meta-Analyses
by Payam Sharifan, Roshanak Roustaee, Mojtaba Shafiee, Zoe L. Longworth, Pardis Keshavarz, Ian G. Davies, Richard J. Webb, Mohsen Mazidi and Hassan Vatanparast
Nutrients 2025, 17(17), 2723; https://doi.org/10.3390/nu17172723 - 22 Aug 2025
Viewed by 3444
Abstract
Background/Objectives: The relationship between dairy consumption and cardiovascular or bone health outcomes remains controversial, with inconsistent findings across existing meta-analyses. In this study, we aimed to systematically evaluate and synthesize the evidence from published meta-analyses on dairy consumption and cardiovascular and bone health outcomes in adults, and to conduct updated meta-analyses incorporating recently published prospective cohort studies. Methods: We performed an umbrella review following PRISMA guidelines, searching published and grey literature up to April 2024. Meta-analyses evaluating dairy intake and its impact on cardiovascular and bone health outcomes were included. Updated meta-analyses were conducted for cardiovascular outcomes, while bone health outcomes were synthesized qualitatively. Methodological quality was assessed using the Joanna Briggs Institute checklist. Random-effects models were applied, and heterogeneity, small-study effects, excess significance, and prediction intervals were evaluated. Results: We included 33 meta-analyses (26 on cardiovascular, 7 on bone health outcomes). Updated meta-analyses showed that total dairy (RR: 0.96), milk (RR: 0.97), and yogurt (RR: 0.92) were significantly associated with reduced CVD risk. Total dairy and low-fat dairy were inversely linked to hypertension (RRs: 0.89, 0.87), and milk and low-fat dairy were associated with reduced stroke risk. Small-study effects were absent for most associations. Credibility was rated as “weak” for most associations, with total dairy and stroke, and total dairy and hypertension showing “suggestive” evidence. For bone health, dairy—especially milk—was linked to higher bone mineral density (BMD). Evidence on osteoporosis risk was mixed, and while total dairy and milk showed inconsistent associations with fractures, cheese and yogurt showed more consistent protective effects. Limited evidence suggested milk may reduce bone resorption markers. Conclusions: This review suggests that dairy consumption, particularly milk and yogurt, is modestly associated with reduced cardiovascular risk, while dairy intake appears to benefit BMD and fracture prevention. However, further research is needed to confirm these associations.
(This article belongs to the Section Nutrition and Public Health)
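One of the quantities evaluated above, the prediction interval, follows the standard random-effects formula. A sketch using the pooled yogurt RR quoted in the abstract, with hypothetical values for the standard error, between-study variance, and study count:

```python
import math
from scipy.stats import t

def prediction_interval(mu, se, tau2, k, level=0.95):
    """Higgins-Thompson-Spiegelhalter prediction interval for the effect in
    a new study (random-effects meta-analysis of k studies, log scale)."""
    tcrit = t.ppf(1 - (1 - level) / 2, df=k - 2)
    half = tcrit * math.sqrt(tau2 + se ** 2)
    return mu - half, mu + half

# RR 0.92 is from the abstract; se, tau2, and k are hypothetical
lo, hi = prediction_interval(mu=math.log(0.92), se=0.02, tau2=0.01, k=12)
print(f"95% prediction interval for RR: ({math.exp(lo):.2f}, {math.exp(hi):.2f})")
```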

18 pages, 1111 KB  
Systematic Review
Comparison with Dietary Groups of Various Macronutrient Ratios on Body Weight and Cardiovascular Risk Factors in Adults: A Systematic Review and Network Meta-Analysis
by Yiling Lou, Hengchang Wang, Linlin Wang, Shen Huang, Yulin Xie, Fujian Song, Zuxun Lu, Furong Wang, Qingqing Jiang and Shiyi Cao
Nutrients 2025, 17(16), 2683; https://doi.org/10.3390/nu17162683 - 19 Aug 2025
Viewed by 1592
Abstract
Background: This network meta-analysis aimed to assess the relative efficacy of macronutrient dietary groups with varying carbohydrate, fat, and protein ratios on weight control and the improvement of cardiovascular risk factors in adults. Methods: We searched PubMed, the Cochrane Central Register of Controlled Trials (CENTRAL), Embase, Web of Science Core Collection, and ClinicalTrials.gov from inception to 30 November 2024, as well as reference lists of related systematic reviews. Eligible randomized controlled trials (RCTs) were included. Literature screening, data extraction, and risk of bias assessment were conducted independently by two reviewers. The changes in body weight, blood glucose, systolic blood pressure, diastolic blood pressure, high density lipoprotein (HDL) cholesterol, low density lipoprotein (LDL) cholesterol, triglycerides, and total cholesterol were the study outcomes. Utilizing a Bayesian framework, a series of random-effects network meta-analyses were conducted to estimate mean differences (MDs) with 95% credible intervals (CrIs) and determine the relative effectiveness of the macronutrient dietary groups. The quality of evidence for each pair of dietary groups was assessed with the online tool confidence in network meta-analysis (CINeMA). Results: This study initially identified 14,988 studies and ultimately included 66 eligible RCTs involving 4301 participants in the analysis. The very low carbohydrate–low protein (VLCLP, MD −4.10 kg, 95% CrI −6.70 to −1.54), the moderate carbohydrate–high protein (MCHP, MD −1.51 kg, 95% CrI −2.90 to −0.20), and the very low carbohydrate–high protein (VLCHP, MD −1.35 kg, 95% CrI −2.52 to −0.26) dietary groups might lead to weight loss compared with the moderate fat–low protein (MFLP) dietary group. Among the dietary groups relative to the MFLP dietary group, the moderate carbohydrate–low protein (MCLP, MD 0.09 mmol/L, 95% CrI 0.02 to 0.16) and VLCHP (MD 0.16 mmol/L, 95% CrI 0.08 to 0.24) dietary groups were less effective in lowering HDL cholesterol, and the VLCHP (MD 0.50 mmol/L, 95% CrI 0.26 to 0.75) dietary group was less effective in lowering LDL cholesterol. In terms of triglyceride reduction, the MCLP (MD −0.33 mmol/L, 95% CrI −0.44 to −0.22), VLCHP (MD −0.31 mmol/L, 95% CrI −0.42 to −0.18), VLCLP (MD −0.14 mmol/L, 95% CrI −0.25 to −0.02), and moderate fat–high protein (MFHP, MD −0.13 mmol/L, 95% CrI −0.21 to −0.06) dietary groups were more efficacious than the MFLP dietary group, while all pairwise comparisons of dietary groups showed minimal to no difference in effects on blood glucose, blood pressure, and total cholesterol. Conclusions: High or moderate certainty evidence reveals that the VLCLP dietary group is the most appropriate for weight loss, while the MCLP dietary group is best for reducing triglycerides. For control of blood glucose, blood pressure, and cholesterol levels, there is little to no difference between macronutrient dietary groups. Additionally, future studies in normal-weight populations are needed to verify the applicability of our findings.
(This article belongs to the Section Nutrition and Public Health)
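As a sketch of the Bayesian random-effects machinery behind such MD estimates, restricted to a single pairwise comparison (the full network model is beyond a snippet), here is an importance-resampling version of the normal-normal model with hypothetical trial data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical trial-level mean differences in body weight (kg) and their SEs
# for one pairwise comparison (e.g., VLCLP vs MFLP); not the study's data
md = np.array([-4.8, -3.5, -4.2, -3.9])
se = np.array([1.2, 0.9, 1.5, 1.1])

# Model: md_i ~ N(mu, se_i^2 + tau^2), flat prior on mu, half-normal(1) on tau.
# Importance resampling: draw tau from its prior, weight by p(data | tau).
tau = np.abs(rng.normal(0, 1, 20_000))
w = 1 / (se ** 2 + tau[:, None] ** 2)        # per-trial precisions given tau
W = w.sum(axis=1)
mu_hat = (w * md).sum(axis=1) / W            # conditional posterior mean of mu
logp = (0.5 * np.log(w).sum(axis=1) - 0.5 * np.log(W)
        - 0.5 * (w * (md - mu_hat[:, None]) ** 2).sum(axis=1))
p = np.exp(logp - logp.max())
idx = rng.choice(tau.size, 5_000, p=p / p.sum())
mu = rng.normal(mu_hat[idx], 1 / np.sqrt(W[idx]))  # mu | tau, data is normal

lo, hi = np.percentile(mu, [2.5, 97.5])
print(f"MD {mu.mean():.2f} kg, 95% CrI ({lo:.2f}, {hi:.2f})")
```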

49 pages, 14879 KB  
Article
Fully Bayesian Inference for Meta-Analytic Deconvolution Using Efron’s Log-Spline Prior
by JoonHo Lee and Daihe Sui
Mathematics 2025, 13(16), 2639; https://doi.org/10.3390/math13162639 - 17 Aug 2025
Viewed by 492
Abstract
Meta-analytic deconvolution seeks to recover the distribution of true effects from noisy site-specific estimates. While Efron’s log-spline prior provides an elegant empirical Bayes solution with excellent point estimation properties, its plug-in nature yields severely anti-conservative uncertainty quantification for individual site effects—a critical limitation for what Efron terms “finite-Bayes inference.” We develop a fully Bayesian extension that preserves the computational advantages of the log-spline framework while properly propagating hyperparameter uncertainty into site-level posteriors. Our approach embeds the log-spline prior within a hierarchical model with adaptive regularization, enabling exact finite-sample inference without asymptotic approximations. Through simulation studies calibrated to realistic meta-analytic scenarios, we demonstrate that our method achieves near-nominal coverage (88–91%) for 90% credible intervals while matching empirical Bayes point estimation accuracy. We provide a complete Stan implementation handling heteroscedastic observations—a critical feature absent from existing software. The method enables principled uncertainty quantification for individual effects at modest computational cost, making it particularly valuable for applications requiring accurate site-specific inference, such as multisite trials and institutional performance assessment.
(This article belongs to the Section D1: Probability and Statistics)
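A stripped-down sketch of the log-spline deconvolution idea: the plug-in empirical Bayes stage only (the paper's contribution is the fully Bayesian extension), with a polynomial basis standing in for the spline basis and synthetic heteroscedastic data:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)

# Synthetic meta-analytic data: true effects from a bimodal mixture,
# observed with heteroscedastic noise (hypothetical, not the paper's data)
n = 200
theta_true = np.where(rng.uniform(size=n) < 0.5,
                      rng.normal(-1.5, 0.4, n), rng.normal(1.0, 0.5, n))
s = rng.uniform(0.3, 0.8, n)              # site-specific standard errors
x = rng.normal(theta_true, s)             # observed site estimates

# Grid and degree-5 polynomial basis for a log-spline-style prior g ∝ exp(Q β)
grid = np.linspace(-4, 4, 81)
tgrid = (grid - grid.mean()) / grid.std()
Q = np.vander(tgrid, 6, increasing=True)[:, 1:]   # columns t, t^2, ..., t^5

# Likelihood matrix (up to a constant): P[i, j] = Normal(x_i | grid_j, s_i)
P = np.exp(-0.5 * ((x[:, None] - grid[None, :]) / s[:, None]) ** 2) / s[:, None]

def neg_loglik(beta, lam=0.1):
    g = np.exp(Q @ beta); g /= g.sum()    # discretized prior on the grid
    f = P @ g                             # marginal density of each estimate
    return -np.log(f).sum() + lam * beta @ beta   # ridge-style regularization

beta_hat = minimize(neg_loglik, np.zeros(5), method="BFGS").x
g_hat = np.exp(Q @ beta_hat); g_hat /= g_hat.sum()

# Plug-in posterior for one site: prior times likelihood, normalized on grid
post0 = g_hat * P[0]; post0 /= post0.sum()
print(f"site 0: x = {x[0]:.2f}, posterior mean = {grid @ post0:.2f}")
```

Note the contrast with the paper: this plug-in stage treats beta_hat as known, which is exactly what produces the anti-conservative site-level intervals the authors set out to fix.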

33 pages, 6324 KB  
Article
The Inverted Hjorth Distribution and Its Applications in Environmental and Pharmaceutical Sciences
by Ahmed Elshahhat, Osama E. Abo-Kasem and Heba S. Mohammed
Symmetry 2025, 17(8), 1327; https://doi.org/10.3390/sym17081327 - 14 Aug 2025
Viewed by 406
Abstract
This study introduces an inverted version of the three-parameter Hjorth lifespan model, characterized by one scale parameter and two shape parameters, referred to as the inverted Hjorth (IH) distribution. This asymmetric distribution can fit various positively skewed datasets more accurately than several existing models in the literature, as it can accommodate data exhibiting an inverted (upside-down) bathtub-shaped hazard rate. We derive key properties of the model, including quantiles, moments, reliability measures, stress–strength reliability, and order statistics. Point estimation of the IH model parameters is performed using maximum likelihood and Bayesian approaches. Moreover, for interval estimation, two types of asymptotic confidence intervals and two types of Bayesian credible intervals are obtained using the same estimation methodologies. As an extension to a complete sampling plan, Type-II censoring is employed to examine the impact of data incompleteness on IH parameter estimation. Monte Carlo simulation results indicate that Bayesian point estimates and credible intervals outperform their classical counterparts across several precision metrics, including mean squared error, average absolute bias, average interval length, and coverage probability. To further assess its performance, two real datasets are analyzed: one from the environmental domain (minimum monthly water flows of the Piracicaba River) and another from the pharmacological domain (plasma indomethacin concentrations). The superiority and flexibility of the inverted Hjorth model are evaluated and compared with several competing models. The results confirm that the IH distribution provides a better fit than several existing lifetime models—such as the inverted Gompertz, inverted log-logistic, inverted Lomax, and inverted Nadarajah–Haghighi distributions—making it a valuable tool for reliability and survival data analysis.
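The effect of Type-II censoring on the likelihood can be seen in a simpler exponential stand-in (the IH likelihood shares the same censored-data structure); the numbers are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(9)

# Type-II censoring: stop the test at the r-th failure out of n units
n, r, scale = 30, 20, 5.0
x = np.sort(rng.exponential(scale, n))[:r]    # first r order statistics

# Censored-exponential MLE of the rate: r failures plus (n - r) units
# still running at the stopping time x_(r)
total_time = x.sum() + (n - r) * x[-1]
lam_hat = r / total_time
print(f"lambda_hat = {lam_hat:.3f} (true rate {1 / scale:.3f})")
```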

28 pages, 875 KB  
Article
Statistical Inference for the Modified Fréchet-Lomax Exponential Distribution Under Constant-Stress PALT with Progressive First-Failure Censoring
by Ahmed T. Farhat, Dina A. Ramadan, Hanan Haj Ahmad and Beih S. El-Desouky
Mathematics 2025, 13(16), 2585; https://doi.org/10.3390/math13162585 - 12 Aug 2025
Viewed by 305
Abstract
Life testing of products often requires extended observation periods. To shorten the duration of these tests, products can be subjected to more extreme conditions than those encountered in normal use, an approach known as accelerated life testing (ALT). This study investigates the estimation of unknown parameters and the acceleration factor for the modified Fréchet-Lomax exponential distribution (MFLED), utilizing Type II progressively first-failure censored (PFFC) samples obtained under the framework of constant-stress partially accelerated life testing (CSPALT). Maximum likelihood (ML) estimation is employed to obtain point estimates for the model parameters and the acceleration factor, while the Fisher information matrix is used to construct asymptotic confidence intervals (ACIs) for these estimates. To improve the precision of inference, two parametric bootstrap methods are also implemented. In the Bayesian context, a method for eliciting prior hyperparameters is proposed, and Bayesian estimates are obtained using the Markov Chain Monte Carlo (MCMC) method. These estimates are evaluated under both symmetric and asymmetric loss functions, and the corresponding credible intervals (CRIs) are computed. A comprehensive simulation study is conducted to compare the performance of ML, bootstrap, and Bayesian estimators in terms of mean squared error and coverage probabilities of confidence intervals. Finally, real-world failure time data of light-emitting diodes (LEDs) are analyzed to demonstrate the applicability and efficiency of the proposed methods in practical reliability studies, highlighting their value in modeling the lifetime behavior of electronic components.
(This article belongs to the Special Issue Statistical Analysis: Theory, Methods and Applications)
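One of the bootstrap variants mentioned can be sketched with an exponential stand-in for the MFLED and hypothetical failure times:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical LED failure times (hours); exponential stand-in for the MFLED
data = rng.exponential(scale=120.0, size=40)
lam_hat = 1 / data.mean()                     # MLE of the rate

# Parametric percentile bootstrap: resample from the fitted model and refit
boot = np.array([1 / rng.exponential(1 / lam_hat, data.size).mean()
                 for _ in range(5_000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"lambda_hat = {lam_hat:.4f}, 95% bootstrap CI = ({lo:.4f}, {hi:.4f})")
```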
