Search Results (135)

Search Parameters:
Keywords = inflation uncertainty

65 pages, 8778 KB  
Systematic Review
Beyond Accuracy: Transferability Limits, Validation Inflation, and Uncertainty Gaps in Satellite-Based Water Quality Monitoring—A Systematic Quantitative Synthesis and Operational Framework
by Saeid Pourmorad, Valerie Graw, Andreas Rienow and Luca Antonio Dimuccio
Remote Sens. 2026, 18(7), 1098; https://doi.org/10.3390/rs18071098 - 7 Apr 2026
Abstract
Satellite remote sensing has become essential for water quality assessment across inland and coastal environments, with rapid improvements in recent years. Significant advances have been made in detecting optically active parameters (such as chlorophyll-a, suspended matter, and turbidity), showing consistently strong performance across multiple studies. Specifically, the median validation performance (R2) derived from the quantitative synthesis indicates R2 = 0.82 for chlorophyll-a (interquartile range—IQR: 0.75–0.90), R2 = 0.80 for total suspended matter (IQR: 0.78–0.85), and R2 = 0.88 for turbidity (IQR: 0.85–0.90). Conversely, the retrieval of optically inactive parameters (such as nutrients like total phosphorus and total nitrogen) remains more context dependent. It exhibits moderate, more variable results, with median R2 = 0.68 (IQR: 0.64–0.74) for total phosphorus and R2 = 0.75 (IQR: 0.70–0.80) for total nitrogen. These findings clearly illustrate the varying success of retrievals of optically active and inactive parameters and underscore the inherent difficulties of indirect estimation methods. However, high reported accuracy has yet to translate into transferable, uncertainty-informed, and operational monitoring systems. This gap stems from structural issues in validation design, physics integration, uncertainty management, and multi-sensor compatibility rather than data limitations alone. We present a PRISMA-guided, distribution-aware quantitative synthesis of 152 peer-reviewed studies (1980–2025), based on a systematic search protocol, to evaluate satellite-based retrievals of both optically active and inactive parameters. Instead of simply averaging performance, we analyse the empirical distributions of validation metrics, considering the validation protocol, sensor type, parameter category, degree of physics integration, and uncertainty quantification. 
The synthesis demonstrates that validation strategy often influences reported results more than the algorithm class itself, with accuracy inflated under non-independent cross-validation methods and notable variability between studies concealed by mean-based reports. Across four decades, four persistent structural challenges remain: limited transferability across sites and sensors beyond calibration areas; weak or implicit physical integration in many data-driven models; lack of or inconsistency in uncertainty quantification; and fragmented multi-sensor harmonisation that restricts operational scalability. To address these issues, we introduce two evidence-based coding frameworks: a physics-integration taxonomy (P0–P4) and an uncertainty-quantification hierarchy (U0–U4). Applying these frameworks shows that most studies remain focused on low-to-moderate levels of physics integration and primarily consider uncertainty at the prediction stage, with limited attention to upstream sources throughout the observation and inference process. Building on this structured synthesis, we propose a transferable, physics-informed, and uncertainty-aware conceptual framework that links model architecture, validation robustness, and probabilistic uncertainty to well-founded design principles. By shifting satellite water quality modelling from isolated algorithm demonstrations towards integrated, evidence-based system design, this study promotes scalable, decision-grade environmental monitoring amid the accelerating impacts of climate change. Full article
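The distribution-aware summary this synthesis relies on (medians with interquartile ranges rather than means) is simple to reproduce; a minimal Python sketch, with hypothetical per-study R² values for illustration only:

```python
import statistics

def summarize_r2(values):
    """Distribution-aware summary of per-study validation R^2 scores:
    report the median and the 25th-75th percentile band (IQR) so that
    between-study spread is not concealed by a single mean."""
    q1, median, q3 = statistics.quantiles(values, n=4)
    return {"median": median, "iqr": (q1, q3)}

# Hypothetical per-study R^2 values for one parameter (illustrative only)
chlorophyll_a = [0.75, 0.78, 0.82, 0.85, 0.90]
print(summarize_r2(chlorophyll_a))
```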
33 pages, 2332 KB  
Article
EvalHack: Answer-Side Prompt Injection for Probing LLM Exam-Grading Panel Stability
by Catalin Anghel, Marian Viorel Craciun, Adina Cocu, Andreea Alexandra Anghel, Antonio Stefan Balau, Adrian Istrate and Aurelian-Dumitrache Anghele
Information 2026, 17(3), 297; https://doi.org/10.3390/info17030297 - 18 Mar 2026
Abstract
Large language models are increasingly used as automated graders, yet their reliability under answer-side manipulation and their behavior in multi-model panels remain insufficiently understood. This paper introduces EvalHack, a matrix benchmark in which a fixed committee of four LLMs grades university-level machine learning exam answers under a strict integer-only contract (0–10) grounded in instructor-authored rubric artifacts. The dataset comprises 100 students answering 10 short, open-ended items (1000 answers). For each answer, the evaluation includes a clean version and two content-preserving adversarial variants that operate only on the student text: A1, a visible coercive suffix appended to the answer, and A2, a stealth variant that uses Unicode control characters (e.g., zero-width and bidirectional marks) to embed an instruction. EvalHack instruments the full grading pipeline, recording item-level member scores, the committee aggregate, within-panel disagreement, and discrepancies to human grades. Empirically, answer-side edits induce systematic score inflation and stronger top-end concentration, with edited answers clustering near the upper end of the scale. Within-panel disagreement, measured as the range between the highest and lowest member score, varies across conditions, with median Consistency Spread values of 3.0 (clean), 2.0 (A1), and 6.0 (A2). Compared to human graders, the panel is more lenient on average (MAE = 1.897; bias human − panel = −1.345). Finally, grouping items by disagreement shows that low-disagreement items exhibit smaller human-panel errors, indicating that within-panel spread can serve as a practical uncertainty signal for routing difficult answers to human review or to larger/more specialized panels. Full article
(This article belongs to the Section Artificial Intelligence)
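The panel-stability statistics in this abstract (Consistency Spread as the max-minus-min member score, and the signed human-panel bias) reduce to elementary computations; a sketch with hypothetical scores:

```python
def consistency_spread(member_scores):
    """Within-panel disagreement: highest minus lowest member score."""
    return max(member_scores) - min(member_scores)

def signed_bias(human_grades, panel_grades):
    """Mean of (human - panel); negative values mean the panel is
    more lenient than the human graders on average."""
    diffs = [h - p for h, p in zip(human_grades, panel_grades)]
    return sum(diffs) / len(diffs)

# Hypothetical four-member panel scores for one answer
print(consistency_spread([7, 9, 6, 8]))
# Hypothetical human vs. panel grades for three answers
print(signed_bias([6, 7, 5], [8, 8, 7]))
```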

22 pages, 348 KB  
Article
Exchange Rate Volatility and Corporate Cash-Flow Resilience: Firm-Level Evidence from MENA Emerging Markets
by Soufiane Jamali and Said Elbouazizi
J. Risk Financial Manag. 2026, 19(3), 222; https://doi.org/10.3390/jrfm19030222 - 17 Mar 2026
Abstract
Exchange rate volatility creates uncertainty for firms in open economies, especially in emerging markets with structural vulnerabilities and shallow financial markets. This work examines the impact of exchange rate volatility on the cash-flow performance of 292 non-financial firms across 11 countries in the Middle East and North Africa (MENA) region from 2014 to 2023. The models were estimated using fixed effects and random effects, with Driscoll–Kraay standard errors and Feasible Generalized Least Squares (FGLS) applied as robustness checks against heteroskedasticity, serial correlation, and cross-sectional dependence. Exchange rate volatility has no statistically significant impact on corporate cash flows across all specifications, confirming the existence of an exchange rate exposure puzzle in emerging markets. Firm size consistently appears as the strongest and most robust predictor of liquidity performance. The macroeconomic growth effect is weaker and context dependent: it is insignificant in baseline panel estimations, negative with Driscoll–Kraay corrections, and marginally positive with FGLS structural controls. The effects of profitability and inflation are virtually nonexistent. These insights inform both financial risk management and policy actions aimed at enhancing corporate stability and supporting sustainable development in emerging markets. Full article
(This article belongs to the Section Financial Markets)
21 pages, 765 KB  
Article
Beyond Production: Institutional and Environmental Drivers of Food Security in East Asia
by Ramzi Knani, Chaker Gabsi and Adel Benhamed
Economies 2026, 14(3), 91; https://doi.org/10.3390/economies14030091 - 12 Mar 2026
Abstract
This study assesses the role of institutional quality, macroeconomic performance, and environmental pressures in shaping food security in East Asia. Using a PMG-ARDL panel model with data from China, Singapore, and Japan—three economies characterized by high institutional standards—the analysis covers the period 1996–2023. The findings highlight a strong and statistically significant long-term effect of institutional quality on food production, underlining the essential role of governance in reducing regulatory uncertainty, attracting agricultural investment, and enabling coherent policy frameworks. Growth in CO2 emissions also exhibits a significant negative impact, underscoring that climate change poses a structural threat to food security. Control variables show that population growth and macroeconomic stability enhance food security, reflecting an ability to adapt to demand. In contrast, the effect of inflation is insignificant in the long term. In the short term, the analysis reveals heterogeneity in adjustment. The error-correction term (ECT) is negative and significant for Singapore, indicating an effective return to long-term equilibrium. In contrast, it is insignificant for Japan and China, suggesting a lack of automatic convergence due to structural specificities in the short term. Overall, the study demonstrates that sustainable food security in advanced East Asian economies relies not only on productive capacity, but also on effective governance, macroeconomic stability, and the integration of climate considerations into long-term policy strategies. Full article

17 pages, 258 KB  
Article
From Theory to Practice, and Back: Student Evidence Testing ZPD, APOS, CLT, and Constructivism in Mathematical Thinking Workshops
by Mashudu Mokhithi, Anita Campbell, Jonathan Shock and Pragashni Padayachee
Educ. Sci. 2026, 16(3), 385; https://doi.org/10.3390/educsci16030385 - 4 Mar 2026
Abstract
University mathematics-support programs rarely test their theoretical foundations against student evidence, particularly in the Global South. This study addresses that gap by analyzing how students’ experiences in Mathematical Thinking Workshops (MTWs) at a South African university confirm, nuance, or challenge assumptions of the Zone of Proximal Development (ZPD), Action–Process–Object–Schema (APOS) theory, Cognitive Load Theory (CLT), and constructivism. We conducted a qualitative secondary analysis of six focus-group interviews (n = 17), using abductive reflexive thematic analysis and an Assumption–Indicator–Evidence matrix that linked design rationales to student narratives. Student accounts strongly supported ZPD, with facilitation and peer norms fostering psychological safety and risk-taking, while also showing that equitable participation required explicit role-rotation routines. APOS-informed task sequencing enabled coordination across representations but operated recursively, with students calling for planned revisiting sessions to consolidate difficult ideas. CLT claims were affirmed where venue conditions and timing inflated extraneous load, highlighting the need for short debriefs and load-aware logistics. Constructivist activity fostered belonging, confidence, and more social views of mathematics but generated uncertainty when tasks ended without brief closure. We conclude by proposing context-aware refinements to these frameworks and outlining a replicable routine for testing educational theory through student evidence. Full article
(This article belongs to the Special Issue Engaging Students to Transform Tertiary Mathematics Education)
21 pages, 1927 KB  
Article
A Dynamic Hybrid Weighting Framework for Teaching Effectiveness Evaluation in Multi-Criteria Decision-Making: Integrating Interval-Valued Intuitionistic Fuzzy AHP and Entropy Triggering
by Chengling Lu and Yanxue Zhang
Entropy 2026, 28(2), 241; https://doi.org/10.3390/e28020241 - 19 Feb 2026
Abstract
Multi-criteria decision-making (MCDM) problems in complex evaluation systems are often characterized by high uncertainty in expert judgments and dynamic variations in indicator importance. Traditional analytic hierarchy process (AHP) and entropy-based weighting methods typically suffer from two inherent limitations: the inability to explicitly quantify expert hesitation and the rigidity of static weight assignment under evolving data distributions. To address these challenges, this paper proposes a dynamic hybrid weighting framework that integrates an interval-valued intuitionistic fuzzy analytic hierarchy process (IVIF-AHP) with an entropy-triggered correction mechanism. First, interval-valued intuitionistic fuzzy numbers are employed to simultaneously model membership, non-membership, and hesitation degrees in pairwise comparisons, enabling a more comprehensive representation of expert uncertainty. Second, an entropy-triggered dynamic fusion strategy is developed by jointly incorporating information entropy and coefficient of variation, allowing adaptive adjustment between subjective expert weights and objective data-driven weights. This mechanism effectively enhances sensitivity to high-dispersion criteria while preserving expert knowledge in low-variability indicators. The proposed framework is formulated in a hierarchical fuzzy decision structure and implemented through a fuzzy comprehensive evaluation process. Its feasibility and robustness are validated through a concrete case study on teaching effectiveness evaluation for a university engineering course, leveraging multi-source data. Comparative analysis demonstrates that the proposed approach effectively mitigates the weight rigidity and evaluation inflation observed in conventional methods. Furthermore, it improves diagnostic resolution and decision stability across different evaluation periods. 
The results indicate that the proposed entropy-triggered IVIF-AHP framework provides a mathematically sound and practically applicable solution for dynamic MCDM problems under uncertainty, with strong potential for extension to other complex evaluation and decision-support systems. Full article
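The entropy side of such a hybrid weighting scheme can be illustrated with the classical entropy-weight computation (a sketch only; the paper's full IVIF-AHP fusion and triggering mechanism are not reproduced here):

```python
import math

def entropy_weights(matrix):
    """Objective criterion weights from Shannon entropy: criteria whose
    values are more dispersed across alternatives carry more information
    and therefore receive larger weights."""
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)
    divergences = []
    for j in range(n):
        column = [row[j] for row in matrix]
        total = sum(column)
        proportions = [x / total for x in column]
        entropy = -k * sum(p * math.log(p) for p in proportions if p > 0)
        divergences.append(1.0 - entropy)
    s = sum(divergences)
    return [d / s for d in divergences]

# Two alternatives, two criteria: the first column is uniform (carries
# no information), so all weight shifts to the dispersed second column.
print(entropy_weights([[1.0, 2.0], [1.0, 8.0]]))
```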

33 pages, 2919 KB  
Article
Life-Cycle Co-Optimization of User-Side Energy Storage Systems with Multi-Service Stacking and Degradation-Aware Dispatch
by Lixiang Lin, Yuanliang Zhang, Chenxi Zhang, Xin Li, Zixuan Guo, Haotian Cai and Xiangang Peng
Processes 2026, 14(3), 477; https://doi.org/10.3390/pr14030477 - 29 Jan 2026
Abstract
The integration of a user-side energy storage system (ESS) faces notable economic challenges, including high upfront investment, uncertainty in quantifying battery degradation, and fragmented ancillary service revenue streams, which hinder large-scale deployment. Conventional configuration studies often handle capacity planning and operational scheduling at different stages, complicating consistent life-cycle valuation under degradation and multi-service participation. This paper proposes a life-cycle multi-service co-optimization model (LC-MSCOM) to jointly determine ESS power–energy ratings and operating strategies. A unified revenue framework quantifies stacked revenues from time-of-use arbitrage, demand charge management, demand response, and renewable energy accommodation, while depth of discharge (DoD)-related lifetime loss is converted into an equivalent degradation cost and embedded in the optimization. The model is validated on a modified IEEE benchmark system using real generation and load data. Results show that LC-MSCOM increases net present value (NPV) by 26.8% and reduces discounted payback period (DPP) by 12.7% relative to conventional benchmarks, and sensitivity analyses confirm robustness under discount-rate, inflation-rate, and tariff uncertainties. By coordinating ESS dispatch with distribution network operating limits (nodal power balance, voltage bounds, and branch ampacity constraints), the framework provides practical, investment-oriented decision support for user-side ESS deployment. Full article
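The life-cycle economics reported here rest on standard net-present-value and discounted-payback calculations; a minimal sketch with hypothetical cash flows and discount rate:

```python
def npv(cash_flows, rate):
    """Net present value: the year-0 entry is the (negative) upfront
    investment, later entries are yearly net revenues."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def discounted_payback(cash_flows, rate):
    """Discounted payback period: first year in which the cumulative
    discounted cash flow turns non-negative (None if never)."""
    cumulative = 0.0
    for t, cf in enumerate(cash_flows):
        cumulative += cf / (1.0 + rate) ** t
        if cumulative >= 0.0:
            return t
    return None

# Hypothetical ESS project: 1000 upfront, 400/year of stacked revenue
flows = [-1000.0, 400.0, 400.0, 400.0, 400.0]
print(round(npv(flows, 0.05), 2))
print(discounted_payback(flows, 0.05))
```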

28 pages, 22450 KB  
Article
Identifying Dominant Inflation Risks in Residential Construction Projects Using Fuzzy Truth Qualification
by Burak Oz and Merve Kocyigit
Sustainability 2026, 18(3), 1317; https://doi.org/10.3390/su18031317 - 28 Jan 2026
Abstract
Persistent inflation has intensified uncertainty in the construction industry, particularly in volatile economies. This study examines inflation-driven risks affecting Turkish residential projects, focusing on rising costs, fluctuating labor and material prices, and associated risks. Power-based linguistic hedges were used to quantify dominant severity levels under uncertainty, based on descriptive statistics and standard deviation thresholds. Results indicate that inflation impacts projects mostly through budget overruns and wage inflation, which exhibit the highest severity and crisis-level risk behaviors. Material price volatility is driven by several factors, particularly macroeconomic instability, currency depreciation, and supply-chain disruptions. Wage inflation exerts sustained pressure on contractor profitability. In contrast, inflation-related effects on schedule, quality, safety, and contractual disputes are secondary and context-dependent. The findings indicate a structural shift in the risk profile of Turkish residential construction, pointing to a need for inflation-resilient cost management, adaptive contracting, and proactive labor planning. Full article

15 pages, 3763 KB  
Article
Understanding the Financial Implications of Antimicrobial Resistance Surveillance in Nepal: Context-Specific Evidence for Policy and Sustainable Financing Strategies
by Yunjin Yum, Monika Karki, Dan Whitaker, Kshitij Karki, Ratnaa Shakya, Hari Prasad Kattel, Amrit Saud, Vishan Gajmer, Pankaj Chaudhary, Shrija Thapa, Rakchya Amatya, Timothy Worth, Claudia Parry, Wongyeong Choi, Clemence Nohe, Adrienne Chattoe-Brown, Deepak C. Bajracharya, Krishna Prasad Rai, Sangita Sharma, Kiran Pandey, Bijaya Kumar Shrestha, Runa Jha and Jung-Seok Lee
Antibiotics 2026, 15(1), 103; https://doi.org/10.3390/antibiotics15010103 - 20 Jan 2026
Abstract
Background/Objectives: Antimicrobial resistance (AMR) surveillance is a cornerstone of national AMR strategies but requires sustained, cross-sectoral financing. While the need for such financing is well recognized, its quantification remains scarce in low- and middle-income countries. This study aimed to estimate the full costs of AMR surveillance across the human health, animal health, and food sectors (2021–2030) in selected facilities in Nepal and generate evidence to inform sustainable financing. Methods: A bottom-up micro-costing approach was used to analyze data from five sites. Costs were adjusted for inflation using projected gross domestic product deflators, and probabilistic sensitivity analyses were conducted to assess uncertainty in laboratory sample volumes under four scenarios. Results: The total cost of AMR surveillance in Nepal was $6.7 million: $3.4 million for human health (50.3% of the aggregated costs), $2.7 million for animal health (39.8%), and $0.7 million for the food sector (9.9%). Laboratories accounted for >90% of total costs, with consumables and personnel as the main cost drivers. Average cost per sample was $150 (animal), $64 (food), and $6 (human). Conclusions: This study offers the first robust, multi-sectoral 10-year cost estimates of AMR surveillance in Nepal. The findings highlight that sustaining AMR surveillance requires predictable domestic financing, particularly to cover recurrent laboratory operations as donor support declines. These results provide cost evidence to support future budgeting and policy planning toward sustainable, nationally financed AMR surveillance in Nepal. Full article
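The deflator-based inflation adjustment mentioned in the methods is a simple rescaling by index numbers; a sketch with hypothetical figures:

```python
def to_target_year_prices(cost, deflator_base, deflator_target):
    """Re-express a cost observed in the base year in target-year
    prices using GDP deflator index numbers."""
    return cost * (deflator_target / deflator_base)

# Hypothetical: $100 of 2021 consumables re-priced for a 2025 budget,
# with deflator indices 100 (2021) and 118 (2025)
print(to_target_year_prices(100.0, 100.0, 118.0))
```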

32 pages, 4385 KB  
Article
Probabilistic Wind Speed Forecasting Under at Site and Regional Frameworks: A Comparative Evaluation of BART, GPR, and QRF
by Khaled Haddad and Ataur Rahman
Climate 2026, 14(1), 21; https://doi.org/10.3390/cli14010021 - 15 Jan 2026
Abstract
Reliable probabilistic wind speed forecasts are essential for integrating renewable energy into power grids and managing operational uncertainty. This study compares Quantile Regression Forests (QRF), Bayesian Additive Regression Trees (BART), and Gaussian Process Regression (GPR) under at-site and regional pooled frameworks using 21 years (2000–2020) of daily wind data from eleven stations in New South Wales and Queensland, Australia. Models are evaluated via strict year-based holdout validation across seven metrics: RMSE, MAE, R2, bias, correlation, coverage, and Continuous Ranked Probability Score (CRPS). Regional QRF achieves exceptional point forecast stability with minimal RMSE increase but suffers persistent under-coverage, rendering probabilistic bounds unreliable. BART attains near-nominal coverage at individual sites but experiences catastrophic calibration collapse under regional pooling, driven by fixed noise priors inadequate for spatially heterogeneous data. In contrast, GPR maintains robust probabilistic skill regionally despite larger point forecast RMSE penalties, achieving the lowest overall CRPS and near-nominal coverage through kernel-based variance inflation. Variable importance analysis identifies surface pressure and minimum temperature as dominant predictors (60–80%), with spatial covariates critical for regional differentiation. Operationally, regional QRF is prioritised for point accuracy, regional GPR for calibrated probabilistic forecasts in risk-sensitive applications, and at-site BART when local data suffice. These findings show that Bayesian machine learning methods can effectively navigate the trade-off between local specificity and regional pooling, a challenge common to wind forecasting in diverse terrain globally. The methodology and insights are transferable to other heterogeneous regions, providing guidance for probabilistic wind forecasting and renewable energy grid integration. Full article
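Two of the probabilistic metrics above, the ensemble CRPS (in its kernel form) and empirical interval coverage, can be sketched with hypothetical data:

```python
def crps_ensemble(members, observed):
    """CRPS estimate for one forecast from an ensemble of members,
    using the kernel form E|X - y| - 0.5 * E|X - X'|."""
    m = len(members)
    accuracy = sum(abs(x - observed) for x in members) / m
    spread = sum(abs(a - b) for a in members for b in members) / (m * m)
    return accuracy - 0.5 * spread

def empirical_coverage(lowers, uppers, observations):
    """Fraction of observations inside their prediction intervals;
    compare against the nominal level (e.g. 0.90) to detect the
    under-coverage discussed above."""
    hits = sum(1 for lo, up, y in zip(lowers, uppers, observations)
               if lo <= y <= up)
    return hits / len(observations)

# Hypothetical two-member ensemble symmetric around the truth
print(crps_ensemble([4.0, 6.0], 5.0))
# Hypothetical 3 intervals, one missed observation
print(empirical_coverage([3, 3, 3], [7, 7, 7], [5, 8, 6]))
```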

27 pages, 3750 KB  
Article
Digital Asset Analytics for DeFi Protocol Valuation: An Explainable Optuna-Tuned Super Learner Ensemble Framework
by Gihan M. Ali
J. Risk Financial Manag. 2026, 19(1), 63; https://doi.org/10.3390/jrfm19010063 - 13 Jan 2026
Abstract
Decentralized Finance (DeFi) has become a major component of digital asset markets, yet accurately valuing protocol performance remains difficult due to high volatility, nonlinear pricing dynamics, and persistent disclosure gaps that amplify valuation risk. This study develops an Optuna-tuned Super Learner stacked ensemble to improve risk-aware DeFi valuation, combining Extremely Randomized Trees (ETs), Support Vector Regression (SVR), and Categorical Boosting (CAT) as heterogeneous base learners, with a K-Nearest Neighbors (KNNs) meta-learner integrating their forecasts. Using an expanding-window panel time-series cross-validation design, the framework achieves significantly higher predictive accuracy than individual models, benchmark ensembles, and econometric baselines, obtaining RMSE = 0.085, MAE = 0.065, and R2 = 0.97—representing a 25–36% reduction in valuation error. Wilcoxon tests confirm that these gains are statistically significant (p < 0.01). SHAP-based interpretability analysis identifies Gross Merchandise Volume (GMV) as the primary valuation determinant, followed by Total Value Locked (TVL) and key protocol design features such as Decentralized Exchange (DEX) classification, while revenue variables and inflation contribute secondary effects. The findings demonstrate how explainable ensemble learning can strengthen valuation accuracy, reduce information-driven uncertainty, and support risk-informed decision-making for investors, analysts, developers, and policymakers operating within rapidly evolving blockchain-based digital asset environments. Full article
(This article belongs to the Section Financial Technology and Innovation)
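The point-accuracy metrics reported above (RMSE, MAE, R²) follow their standard definitions; a sketch with hypothetical values:

```python
def regression_metrics(y_true, y_pred):
    """RMSE, MAE and R^2 for a set of valuation predictions."""
    n = len(y_true)
    errors = [t - p for t, p in zip(y_true, y_pred)]
    mse = sum(e * e for e in errors) / n
    mae = sum(abs(e) for e in errors) / n
    mean_true = sum(y_true) / n
    ss_total = sum((t - mean_true) ** 2 for t in y_true)
    r2 = 1.0 - (mse * n) / ss_total
    return {"rmse": mse ** 0.5, "mae": mae, "r2": r2}

# Hypothetical scaled valuations vs. model output (illustration only)
print(regression_metrics([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8]))
```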

16 pages, 336 KB  
Article
Bayesian Neural Networks with Regularization for Sparse Zero-Inflated Data Modeling
by Sunghae Jun
Information 2026, 17(1), 81; https://doi.org/10.3390/info17010081 - 13 Jan 2026
Abstract
Zero inflation is pervasive across text mining, event log, and sensor analytics, and it often degrades the predictive performance of analytical models. Classical approaches, most notably the zero-inflated Poisson (ZIP) and zero-inflated negative binomial (ZINB) models, address excess zeros but rely on rigid parametric assumptions and fixed model structures, which can limit flexibility in high-dimensional, sparse settings. We propose a Bayesian neural network (BNN) with regularization for sparse zero-inflated data modeling. The method separately parameterizes the zero-inflation probability and the count intensity under ZIP/ZINB likelihoods, while employing Bayesian regularization to induce sparsity and control overfitting. Posterior inference is performed using variational inference. We evaluate the approach through controlled simulations with varying zero ratios and a real-world dataset, and we compare it against Poisson generalized linear models, ZIP, and ZINB baselines. The present study focuses on predictive performance measured by mean squared error (MSE). Across all settings, the proposed method achieves consistently lower prediction error and improved uncertainty quantification, with ablation studies confirming the contribution of the regularization components. These results demonstrate that a regularized BNN provides a flexible and robust framework for sparse zero-inflated data analysis in information-rich environments. Full article
(This article belongs to the Special Issue Feature Papers in Information in 2024–2025)
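The zero-inflated Poisson likelihood that such models parameterize mixes a structural-zero state with an ordinary Poisson count process; a sketch of its probability mass function:

```python
import math

def zip_pmf(k, zero_prob, lam):
    """Zero-inflated Poisson: with probability zero_prob the outcome is
    a structural zero; otherwise counts follow Poisson(lam). Only k = 0
    receives the extra structural mass."""
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    structural = zero_prob if k == 0 else 0.0
    return structural + (1.0 - zero_prob) * poisson

# Excess mass at zero relative to a plain Poisson(2.0)
print(zip_pmf(0, 0.3, 2.0), math.exp(-2.0))
```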

28 pages, 506 KB  
Article
Economic Policy Uncertainty and Firm Profitability in Nigeria: Does Oil Price Volatility Deepen the Shock?
by Olajide O. Oyadeyi, Ehireme Uddin and Esther O. Olusola
Economies 2026, 14(1), 18; https://doi.org/10.3390/economies14010018 - 9 Jan 2026
Abstract
Recent studies have focused on the detrimental effects of global economic policy uncertainty (EPU) on firm profitability. Nevertheless, none of these studies has focused on a developing economy like Nigeria. To understand this, the study conducted a host of regression analyses using the Driscoll and Kraay fixed-effect estimator and the two-step system generalised method of moments to examine the effects of global crude oil prices and domestic and global economic policy uncertainty on firm profitability in Nigeria from 2005 to 2024. The findings indicate that while global EPU had a minimal impact on firm profitability, domestic EPU had a substantial negative impact. The findings remain consistent even across the sub-samples, sensitivity, and robustness analyses. Furthermore, the findings showed that firm size and capital are significant determinants of profitability for Nigerian firms. At the same time, oil prices and their interactions do not affect firm profitability in Nigeria. The study suggests that regulators in the Nigerian business environment can contribute to building a more resilient environment by implementing systems to monitor critical economic indicators and ensure timely responses to emerging challenges. Systematic evaluations of economic uncertainties, including business sentiment, inflation rates, exchange rates, interest rates, and economic growth, can provide valuable insights for policy formulation and interventions aimed at enhancing the profitability of Nigerian firms. Full article
23 pages, 668 KB  
Article
The Impact of Economic Factors on Medium-Term Budget Revenue Forecasts: Insights from an Ex Post Analysis of Advanced Economies
by Berat Kara and Fatih Sarıoğlu
J. Risk Financial Manag. 2026, 19(1), 34; https://doi.org/10.3390/jrfm19010034 - 4 Jan 2026
Abstract
This study investigates the determinants of medium-term revenue forecast errors across eight developed economies: the United States, the United Kingdom, Germany, Ireland, Hong Kong, New Zealand, Australia, and Canada. Examining two- and three-year revenue forecasts, it applies the Holden–Peel test to identify forecast biases and employs panel regression models to assess the economic factors influencing forecast accuracy. The findings indicate that inflation, GDP growth, and budget balance rules are positively associated with forecast errors, whereas unemployment, population growth, and the number of fiscal rules mitigate them. Panel estimates reveal that fiscal-structure variables are not only statistically significant but also economically meaningful determinants of medium-term revenue forecast errors. The results underscore the persistent challenges in achieving accurate revenue forecasts and highlight the need for improved forecasting methodologies to enhance fiscal policy effectiveness and resource allocation. Strengthening forecasting frameworks can yield more reliable revenue projections, reducing fiscal uncertainty and supporting sound economic decision making.
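The Holden–Peel unbiasedness test mentioned above regresses the forecast error on a constant only and checks whether the intercept differs from zero; with a lone constant regressor this reduces to a one-sample t-test of the mean error against zero. A minimal sketch on synthetic forecasts (all numbers and names are illustrative, not the study's data):

```python
import numpy as np
from scipy import stats

def holden_peel_test(actual, forecast):
    """Holden-Peel unbiasedness check: regress the forecast error on a
    constant and test intercept == 0, which is equivalent to a one-sample
    t-test of the mean error against zero."""
    errors = np.asarray(actual, float) - np.asarray(forecast, float)
    t_stat, p_value = stats.ttest_1samp(errors, popmean=0.0)
    return errors.mean(), t_stat, p_value

rng = np.random.default_rng(1)
actual = rng.normal(100.0, 5.0, size=40)               # e.g. realised revenue
unbiased = actual + rng.normal(0.0, 2.0, size=40)      # noisy but unbiased forecast
biased = actual - 3.0 + rng.normal(0.0, 2.0, size=40)  # systematic under-forecast

mean_u, t_u, p_u = holden_peel_test(actual, unbiased)
mean_b, t_b, p_b = holden_peel_test(actual, biased)
print(f"unbiased: mean error {mean_u:+.2f}, p = {p_u:.3f}")
print(f"biased:   mean error {mean_b:+.2f}, p = {p_b:.2g}")
```

A small p-value for the biased series flags a systematic component in the error, which is the pattern the study probes across countries and horizons.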
(This article belongs to the Special Issue Public Budgeting and Finance)
19 pages, 5002 KB  
Article
Deep Learning-Based Diffraction Identification and Uncertainty-Aware Adaptive Weighting for GNSS Positioning in Occluded Environments
by Chenhui Wang, Haoliang Shen, Yanyan Liu, Qingjia Meng and Chuang Qian
Remote Sens. 2026, 18(1), 158; https://doi.org/10.3390/rs18010158 - 3 Jan 2026
Abstract
In natural canyons and urban occluded environments, signal anomalies induced by the satellite diffraction effect are a critical error source affecting the positioning accuracy of deformation monitoring. This paper proposes a deep learning-based method for diffraction signal identification and mitigation. The method utilizes an LSTM network to deeply mine the time-series characteristics of GNSS observation data. We systematically analyze the impact of azimuth, elevation, SNR, and multi-feature combinations on model recognition performance, demonstrating that single features suffer from incomplete information or poor discrimination. Experimental results show that the multi-dimensional feature scheme of "SNR + Elevation + Azimuth" effectively characterizes both signal strength and spatial geometric information, achieving complementary feature advantages. The overall recognition accuracy of the proposed method reaches 84.2%, with an accuracy of 88.0% for anomalous satellites that severely impact positioning precision. Furthermore, we propose an Adaptive Weighting Method for Diffraction Mitigation Based on Uncertainty Quantification. This method constructs a variance inflation model from the probability vector output by the LSTM softmax layer and introduces information entropy to quantify prediction uncertainty, ensuring that the weighting model retains protective capability when the classifier fails or is uncertain. In processing a GNSS dataset collected in a highly occluded environment, the proposed method significantly outperforms traditional cut-off elevation and SNR mask strategies, improving the AFR to 99.9% and enhancing horizontal and vertical positioning accuracy by an average of 80.1% and 76.4%, respectively, thereby effectively boosting positioning accuracy and reliability in occluded environments.
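The entropy-aware variance inflation described in the abstract can be sketched as follows. The functional form below (variance scaled linearly in the predicted diffraction probability, then by a normalised-entropy factor, with a hypothetical gain `k`) is an illustrative assumption, not the paper's actual model; the class layout of `probs` is likewise assumed.

```python
import numpy as np

def softmax_entropy(probs):
    """Normalised Shannon entropy of a class-probability vector
    (0 = fully confident, 1 = maximally uncertain)."""
    p = np.clip(np.asarray(probs, float), 1e-12, 1.0)
    return float(-(p * np.log(p)).sum() / np.log(p.size))

def inflated_variance(base_var, probs, diffraction_index=1, k=4.0):
    """Hypothetical weighting rule: inflate the nominal observation variance
    by the predicted diffraction probability, and further by the classifier's
    entropy so that uncertain predictions never keep full weight."""
    p_diff = probs[diffraction_index]
    h = softmax_entropy(probs)
    return base_var * (1.0 + k * p_diff) * (1.0 + h)

# probs = [P(line-of-sight), P(diffracted)] from a softmax output layer.
clean = inflated_variance(1.0, np.array([0.97, 0.03]))
suspect = inflated_variance(1.0, np.array([0.15, 0.85]))
ambiguous = inflated_variance(1.0, np.array([0.50, 0.50]))
print(clean, suspect, ambiguous)  # the confident line-of-sight case keeps the smallest variance
```

Down-weighting via the inflated variance, rather than hard rejection, preserves geometry when the classifier is unsure, which is the "protective capability" the abstract refers to.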