Search Results (254)

Search Parameters:
Keywords = heteroscedastic modeling

46 pages, 7272 KB  
Article
Prediction Models for Nitrogen Content in Metal at Various Stages of the Basic Oxygen Furnace Steelmaking Process
by Jaroslav Demeter, Branislav Buľko, Peter Demeter and Martina Hrubovčáková
Appl. Sci. 2025, 15(17), 9561; https://doi.org/10.3390/app15179561 - 30 Aug 2025
Viewed by 203
Abstract
Controlling dissolved nitrogen is critical to meeting increasingly stringent steel quality targets, yet the variable kinetics of gas absorption and removal across production stages complicate real-time decision-making. Leveraging a total of 291 metal samples, the research applied ordinary least squares (OLS) regression, enhanced by cointegration diagnostics, to develop four stage-specific models covering pig iron after desulfurization, crude steel in the basic oxygen furnace (BOF) before tapping, and steel at the beginning and end of secondary metallurgy processing. Predictor selection combined thermodynamic reasoning and correlation analysis to produce prediction equations that passed heteroscedasticity, normality, autocorrelation, collinearity, and graphical residual distribution tests. The k-fold cross-validation method was also used to evaluate the models' performance. The models achieved an adequate accuracy of 77.23–83.46% for their respective stages. These findings demonstrate that statistically robust and physically interpretable regressions can capture the complex interplay between kinetics and the various processes that govern nitrogen pick-up and removal. All data are from U. S. Steel Košice, Slovakia; thus, the models capture a specific setup, raw materials, and production practices. After adaptation through knowledge transfer, implementing these models in process control systems could enable proactive parameter optimization and reduce laboratory delays, ultimately minimizing excessive nitrogenation in finished steel.
(This article belongs to the Special Issue Digital Technologies Enabling Modern Industries)
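As a rough illustration of the workflow this abstract describes (not the authors' code), the sketch below fits an OLS model and runs the named diagnostics plus k-fold cross-validation with statsmodels and scikit-learn. The file name, predictor columns, and the accuracy definition (100·(1 − MAPE)) are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan
from statsmodels.stats.stattools import durbin_watson, jarque_bera
from statsmodels.stats.outliers_influence import variance_inflation_factor
from sklearn.model_selection import KFold

df = pd.read_csv("bof_samples.csv")             # hypothetical 291-sample dataset
X = sm.add_constant(df[["temp", "o2_flow", "scrap_ratio"]])  # assumed predictors
y = df["n_ppm"]                                  # dissolved nitrogen content

fit = sm.OLS(y, X).fit()
print(het_breuschpagan(fit.resid, X))            # heteroscedasticity
print(jarque_bera(fit.resid))                    # normality of residuals
print(durbin_watson(fit.resid))                  # autocorrelation
print([variance_inflation_factor(X.values, i)    # collinearity
       for i in range(1, X.shape[1])])

# k-fold cross-validation of predictive accuracy
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    m = sm.OLS(y.iloc[train], X.iloc[train]).fit()
    err = np.abs(m.predict(X.iloc[test]) - y.iloc[test]) / y.iloc[test]
    print("fold accuracy: %.2f%%" % (100 * (1 - err.mean())))
```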

20 pages, 1969 KB  
Article
Contagion or Decoupling? Evidence from Emerging Stock Markets
by Lumengo Bonga-Bonga and Zinzile Lorna Ndiweni
Risks 2025, 13(9), 165; https://doi.org/10.3390/risks13090165 - 29 Aug 2025
Viewed by 175
Abstract
This paper proposes a new way to distinguish between the interdependence, contagion, and decoupling hypotheses in the context of shock transmission and spillover, using a statistical test based on entropy theory. Applying the proposed approach, we examine the three hypotheses when measuring the extent of shock spillover between selected developed and emerging markets during idiosyncratic crisis and normal periods. The US and EU are identified as developed economies, while emerging markets are classified by region to determine whether their responses to shocks from developed economies are homogeneous or heterogeneous depending on the region to which they belong. The suggested entropy test is based on the conditional correlations obtained from an asymmetric dynamic conditional correlation generalized autoregressive conditional heteroscedasticity (A-DCC GARCH) model. In addition to economic methods, statistical methods based on the regime-switching technique are used to date the different phases of the global financial crisis (GFC) and the European sovereign debt crisis (ESDC). Our findings show that all emerging markets decoupled from developed economies in at least one of the phases of the two crises. These findings provide valuable insights for policymakers, investors, and asset managers for portfolio allocation and financial regulation.
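A minimal sketch of the core idea under strong simplifications: univariate GARCH(1,1) fits stand in for the paper's A-DCC GARCH, a rolling correlation of standardized residuals stands in for the dynamic conditional correlation, and a KL divergence compares its distribution across crisis and normal windows. Series names, window lengths, and dates are illustrative.

```python
import numpy as np
import pandas as pd
from arch import arch_model
from scipy.stats import entropy

rets = pd.read_csv("returns.csv", index_col=0, parse_dates=True)  # hypothetical
z = {}
for mkt in ["US", "BRIC_avg"]:
    res = arch_model(100 * rets[mkt], p=1, q=1).fit(disp="off")
    z[mkt] = res.std_resid                       # standardized residuals

corr = z["US"].rolling(60).corr(z["BRIC_avg"]).dropna()
crisis = corr["2008-09":"2009-06"]               # GFC phase (illustrative dates)
normal = corr["2005-01":"2007-06"]

bins = np.linspace(-1, 1, 21)
p, _ = np.histogram(crisis, bins=bins, density=True)
q, _ = np.histogram(normal, bins=bins, density=True)
# a large divergence flags a regime shift in the correlation distribution
print("KL divergence:", entropy(p + 1e-9, q + 1e-9))
```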

18 pages, 2432 KB  
Article
From Volume to Mass: Transforming Volatile Organic Compound Detection with Photoionization Detectors and Machine Learning
by Yunfei Cai, Xiang Che and Yusen Duan
Sensors 2025, 25(17), 5314; https://doi.org/10.3390/s25175314 - 27 Aug 2025
Viewed by 501
Abstract
(1) Objective: Volatile organic compound (VOC) monitoring in industrial parks is crucial for environmental regulation and public health protection. However, current techniques face challenges related to cost and real-time performance. This study aims to develop a dynamic calibration framework for accurate real-time conversion of VOC volume fractions (nmol mol−1) to mass concentrations (μg m−3) in industrial environments, addressing the limitations of conventional monitoring methods such as high costs and delayed response times. (2) Methods: By innovatively integrating a photoionization detector (PID) with machine learning, we developed a robust conversion model utilizing PID signals, meteorological data, and a random forest (RF) algorithm. The system's performance was rigorously evaluated against standard gas chromatography–flame ionization detector (GC-FID) measurements. (3) Results: The proposed framework demonstrated superior performance, achieving a coefficient of determination (R2) of 0.81, a root mean squared error (RMSE) of 48.23 μg m−3, a symmetric mean absolute percentage error (SMAPE) of 62.47%, and a normalized RMSE (RMSEnorm) of 2.07%, outperforming conventional methods. The framework not only achieved minute-level response times but also reduced costs to just 10% of those associated with GC-FID methods. Additionally, the model exhibited strong cross-site robustness, with R2 values ranging from 0.68 to 0.69, although its accuracy was somewhat reduced for high-concentration samples (>1500 μg m−3), where the mean absolute percentage error (MAPE) was 17.8%. The inclusion of SMAPE and RMSEnorm provides a more nuanced understanding of the model's performance, particularly in the context of skewed or heteroscedastic data distributions, thereby offering a more comprehensive assessment of the framework's effectiveness. (4) Conclusions: The framework's innovative combination of PID's real-time capability and RF's nonlinear modeling achieves accurate mass concentration conversion (R2 = 0.81) while maintaining a 95% faster response and a 90% cost reduction compared to GC-FID systems. Compared with traditional single-coefficient PID calibration, this framework significantly improves accuracy and adaptability under dynamic industrial conditions. Future work will apply transfer learning to improve high-concentration detection for pollution tracing and environmental governance in industrial parks.
(This article belongs to the Special Issue Advanced Sensors for Gas Monitoring)
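A hedged sketch of the conversion model's shape: a random forest mapping the PID signal plus meteorological covariates to reference GC-FID mass concentrations, scored with the abstract's metrics. All file and column names are hypothetical.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

df = pd.read_csv("park_monitoring.csv")          # hypothetical training table
X = df[["pid_nmol_mol", "temp_c", "rh_pct", "wind_ms", "pressure_hpa"]]
y = df["gcfid_ug_m3"]                            # reference mass concentration

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.2, random_state=42)
rf = RandomForestRegressor(n_estimators=500, random_state=42).fit(Xtr, ytr)
pred = rf.predict(Xte)

rmse = mean_squared_error(yte, pred) ** 0.5
smape = 100 * np.mean(2 * np.abs(pred - yte) / (np.abs(pred) + np.abs(yte)))
print(f"R2={r2_score(yte, pred):.2f}  RMSE={rmse:.1f}  SMAPE={smape:.1f}%")
```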

29 pages, 4733 KB  
Article
Water Quality Index (WQI) Forecasting and Analysis Based on Neuro-Fuzzy and Statistical Methods
by Amar Lokman, Wan Zakiah Wan Ismail, Nor Azlina Ab Aziz and Anith Khairunnisa Ghazali
Appl. Sci. 2025, 15(17), 9364; https://doi.org/10.3390/app15179364 - 26 Aug 2025
Viewed by 534
Abstract
Water quality is crucial to the economy and ecology because a healthy aquatic ecosystem supports human survival and biodiversity. We have developed the Neuro-Adapt Fuzzy Strategist (NAFS) to improve water quality index (WQI) forecasting accuracy. The objective of the model is to balance improved prediction accuracy against high interpretability and computational efficiency. Neural networks and fuzzy logic improve the NAFS model's flexibility and prediction accuracy, while its optimized backward pass improves training convergence speed and the effectiveness of parameter updates, contributing to better learning performance. The normalization and partial-derivative computations are refined to improve the model. NAFS is compared with an artificial neural network (ANN), the Adaptive Neuro-Fuzzy Inference System (ANFIS), and current machine learning (ML) models such as LSTM, GRU, and Transformer using standard performance evaluation metrics. NAFS outperforms ANFIS and ANN, with an MSE of 1.678 and an RMSE of 1.295, and captures complicated interdependencies among water quality parameters better than either, as shown by principal component analysis (PCA) and Pearson correlation. The performance comparison shows that NAFS outperforms all baseline models with the lowest MAE, MSE, RMSE, and MAPE and the highest R2, confirming its superior accuracy. PCA is employed to reduce data dimensionality and identify the most influential water quality parameters: two principal components account for 72% of the total variance, highlighting key contributors to WQI and supporting feature prioritization in the NAFS model. The Breusch–Pagan test reveals heteroscedasticity in the residuals, justifying the use of non-linear models over linear methods, and the Shapiro–Wilk test indicates non-normality of the residuals. This shows that the NAFS model can handle complex, non-linear environmental variables better than previous water quality prediction research. NAFS not only predicts WQI values but also enhances WQI estimation.
(This article belongs to the Special Issue AI in Wastewater Treatment)
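The residual diagnostics the abstract leans on can be reproduced on any linear baseline; a minimal sketch with assumed file and column names:

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan
from scipy.stats import shapiro

df = pd.read_csv("river_wq.csv")                 # hypothetical dataset
X = sm.add_constant(df[["do", "bod", "cod", "ph", "ss", "an"]])
resid = sm.OLS(df["wqi"], X).fit().resid         # linear baseline residuals

lm_stat, lm_p, _, _ = het_breuschpagan(resid, X) # heteroscedasticity test
w_stat, sw_p = shapiro(resid)                    # normality test
print(f"Breusch-Pagan p={lm_p:.3g}  Shapiro-Wilk p={sw_p:.3g}")
# small p-values on both => linear, homoscedastic models are inadequate,
# motivating a non-linear neuro-fuzzy model such as NAFS
```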

49 pages, 14879 KB  
Article
Fully Bayesian Inference for Meta-Analytic Deconvolution Using Efron’s Log-Spline Prior
by JoonHo Lee and Daihe Sui
Mathematics 2025, 13(16), 2639; https://doi.org/10.3390/math13162639 - 17 Aug 2025
Viewed by 353
Abstract
Meta-analytic deconvolution seeks to recover the distribution of true effects from noisy site-specific estimates. While Efron's log-spline prior provides an elegant empirical Bayes solution with excellent point estimation properties, its plug-in nature yields severely anti-conservative uncertainty quantification for individual site effects—a critical limitation for what Efron terms "finite-Bayes inference." We develop a fully Bayesian extension that preserves the computational advantages of the log-spline framework while properly propagating hyperparameter uncertainty into site-level posteriors. Our approach embeds the log-spline prior within a hierarchical model with adaptive regularization, enabling exact finite-sample inference without asymptotic approximations. Through simulation studies calibrated to realistic meta-analytic scenarios, we demonstrate that our method achieves near-nominal coverage (88–91%) for 90% credible intervals while matching empirical Bayes point estimation accuracy. We provide a complete Stan implementation handling heteroscedastic observations—a critical feature absent from existing software. The method enables principled uncertainty quantification for individual effects at modest computational cost, making it particularly valuable for applications requiring accurate site-specific inference, such as multisite trials and institutional performance assessment.
(This article belongs to the Section D1: Probability and Statistics)
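For orientation, a compact empirical Bayes version of log-spline g-modeling on a grid is sketched below; the paper's contribution is the fully Bayesian analogue, which instead samples the basis coefficients (in Stan). A polynomial basis stands in for Efron's natural spline, the data and tuning constants are illustrative, and the heteroscedastic standard errors enter through the site-specific likelihoods.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

x = np.array([0.2, -0.1, 0.5, 0.9, -0.4, 0.3])     # site estimates (illustrative)
sigma = np.array([0.3, 0.2, 0.4, 0.5, 0.25, 0.3])  # known heteroscedastic SEs

theta = np.linspace(-2, 2, 81)                     # grid on the effect scale
Q = np.vander((theta - theta.mean()) / theta.std(), 6, increasing=True)[:, 1:]
L = norm.pdf(x[:, None], loc=theta[None, :], scale=sigma[:, None])  # likelihoods

def neg_pen_loglik(alpha, c0=1.0):
    # log-spline prior g on the grid, normalized via log-sum-exp
    g = np.exp(Q @ alpha - np.logaddexp.reduce(Q @ alpha))
    return -np.sum(np.log(L @ g + 1e-300)) + c0 * alpha @ alpha

alpha_hat = minimize(neg_pen_loglik, np.zeros(Q.shape[1])).x
g_hat = np.exp(Q @ alpha_hat - np.logaddexp.reduce(Q @ alpha_hat))

post = L * g_hat                                   # site-level posteriors on grid
post /= post.sum(axis=1, keepdims=True)
print("posterior means:", post @ theta)
```

The anti-conservatism the abstract describes is visible here: the plug-in `g_hat` treats `alpha_hat` as known, so the site posteriors ignore hyperparameter uncertainty.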

13 pages, 970 KB  
Article
A Mixture Integer GARCH Model with Application to Modeling and Forecasting COVID-19 Counts
by Wooi Chen Khoo, Seng Huat Ong, Victor Jian Ming Low and Hari M. Srivastava
Stats 2025, 8(3), 73; https://doi.org/10.3390/stats8030073 - 13 Aug 2025
Viewed by 310
Abstract
This article introduces a flexible time series regression model known as the Mixture of Integer-Valued Generalized Autoregressive Conditional Heteroscedasticity (MINGARCH). Mixture models provide versatile frameworks for capturing heterogeneity in count data, including features such as multiple peaks, seasonality, and intervention effects. The proposed model is applied to regional COVID-19 data from Malaysia. To account for geographical variability, five regions—Selangor, Kuala Lumpur, Penang, Johor, and Sarawak—were selected for analysis, covering a total of 86 weeks of data. Comparative analysis with existing time series regression models demonstrates that MINGARCH outperforms alternative approaches. Further investigation into forecasting reveals that MINGARCH yields superior performance in regions with high population density, and significant influencing factors have been identified. In low-density regions, confirmed cases peaked within three weeks, whereas high-density regions exhibited a monthly seasonal pattern. Forecasting metrics—including MAPE, MAE, and RMSE—are significantly lower for the MINGARCH model compared to other models. These results suggest that MINGARCH is well-suited for forecasting disease spread in urban and densely populated areas, offering valuable insights for policymaking.
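As a building block, a single-component Poisson INGARCH(1,1) can be fit by maximum likelihood as sketched below; MINGARCH mixes several such components with mixture weights and per-component parameters. The count series file is hypothetical.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def neg_loglik(params, y):
    # lambda_t = omega + alpha * y_{t-1} + beta * lambda_{t-1}
    omega, alpha, beta = params
    lam = np.empty_like(y, dtype=float)
    lam[0] = y.mean()
    for t in range(1, len(y)):
        lam[t] = omega + alpha * y[t - 1] + beta * lam[t - 1]
    # Poisson log-likelihood: y*log(lam) - lam - log(y!)
    return -np.sum(y * np.log(lam) - lam - gammaln(y + 1))

y = np.loadtxt("weekly_cases.csv")               # hypothetical count series
res = minimize(neg_loglik, x0=[1.0, 0.3, 0.3], args=(y,),
               bounds=[(1e-6, None), (0, 1), (0, 1)], method="L-BFGS-B")
print("omega, alpha, beta =", res.x)
```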

23 pages, 4597 KB  
Article
High-Throughput UAV Hyperspectral Remote Sensing Pinpoints Bacterial Leaf Streak Resistance in Wheat
by Alireza Sanaeifar, Ruth Dill-Macky, Rebecca D. Curland, Susan Reynolds, Matthew N. Rouse, Shahryar Kianian and Ce Yang
Remote Sens. 2025, 17(16), 2799; https://doi.org/10.3390/rs17162799 - 13 Aug 2025
Viewed by 625
Abstract
Bacterial leaf streak (BLS), caused by Xanthomonas translucens pv. undulosa, has become an intermittent yet economically significant disease of wheat in the Upper Midwest during the last decade. Because chemical and cultural controls remain ineffective, breeders rely on developing resistant varieties, yet visual ratings in inoculated nurseries are labor-intensive, subjective, and time-consuming. To accelerate this process, we combined unmanned-aerial-vehicle hyperspectral imaging (UAV-HSI) with a carefully tuned chemometric workflow that delivers rapid, objective estimates of disease severity. Principal component analysis cleanly separated BLS, leaf rust, and Fusarium head blight, with the first component explaining 97.76% of the spectral variance, demonstrating in-field pathogen discrimination. Pre-processing of the hyperspectral cubes, followed by robust Partial Least Squares (RPLS) regression, improved model reliability by managing outliers and heteroscedastic noise. Four variable-selection strategies—Variable Importance in Projection (VIP), Interval PLS (iPLS), Recursive Weighted PLS (rPLS), and Genetic Algorithm (GA)—were evaluated; rPLS provided the best balance between parsimony and accuracy, trimming the predictor set from 244 to 29 bands. Informative wavelengths clustered in the near-infrared and red-edge regions, which are linked to chlorophyll loss and canopy water stress. The best model, RPLS with optimal preprocessing and rPLS variable selection, showed high predictive accuracy, achieving a cross-validated R2 of 0.823 and a cross-validated RMSE of 7.452, demonstrating its effectiveness for detecting and quantifying BLS. We also explored the spectral overlap with Sentinel-2 bands, showing how UAV-derived maps can nest within satellite mosaics to link plot-level scouting to landscape-scale surveillance. Together, these results lay a practical foundation for breeders to speed the selection of resistant lines and for agronomists to monitor BLS dynamics across multiple spatial scales.
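Of the four strategies, VIP is the simplest to sketch; the winning rPLS adds iterative reweighting on top of a fit like this. The spectra and severity arrays are hypothetical.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

X = np.load("canopy_spectra.npy")        # (n_plots, 244) reflectance bands
y = np.load("bls_severity.npy")          # visual severity scores

pls = PLSRegression(n_components=10).fit(X, y)

def vip_scores(pls):
    # standard VIP: weight each band by the y-variance each component explains
    t, w, q = pls.x_scores_, pls.x_weights_, pls.y_loadings_
    p = w.shape[0]
    ssy = np.sum(t ** 2, axis=0) * q.ravel() ** 2   # y-variance per component
    wnorm = w / np.linalg.norm(w, axis=0)
    return np.sqrt(p * (wnorm ** 2 @ ssy) / ssy.sum())

keep = vip_scores(pls) > 1.0             # conventional VIP > 1 rule
print(f"retained {keep.sum()} of {X.shape[1]} bands")
```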

11 pages, 3342 KB  
Proceeding Paper
Fundamentals of Time Series Analysis in Electricity Price Forecasting
by Ciaran O’Connor, Andrea Visentin and Steven Prestwich
Comput. Sci. Math. Forum 2025, 11(1), 16; https://doi.org/10.3390/cmsf2025011016 - 11 Aug 2025
Viewed by 157
Abstract
Time series forecasting is a cornerstone of decision-making in energy and finance, yet many studies fail to rigorously analyse the underlying dataset characteristics, leading to suboptimal model selection and unreliable outcomes. This paper addresses these shortcomings by presenting a comprehensive framework that integrates fundamental time series diagnostics—stationarity tests, autocorrelation analysis, heteroscedasticity, multicollinearity, and correlation analysis—into forecasting workflows. Unlike existing studies that prioritise pre-packaged machine learning and deep learning methods, often at the expense of interpretable statistical benchmarks, our approach advocates for the combined use of statistical models alongside advanced machine learning methods. Using the Day-Ahead Market dataset from the Irish electricity market as a case study, we demonstrate how rigorous statistical diagnostics can guide model selection, improve interpretability, and enhance forecasting accuracy. This work offers a novel, integrative methodology that bridges the gap between statistical rigour and modern computational techniques, improving reliability in time series forecasting.
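A sketch of such a diagnostic battery on a day-ahead price series, with assumed file and column names:

```python
import pandas as pd
from statsmodels.tsa.stattools import adfuller
from statsmodels.stats.diagnostic import acorr_ljungbox, het_arch
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("dam_prices.csv", parse_dates=["ts"], index_col="ts")
price = df["eur_mwh"].dropna()

adf_stat, adf_p, *_ = adfuller(price)            # stationarity (unit root)
lb = acorr_ljungbox(price, lags=[24, 48])        # autocorrelation at daily lags
arch_stat, arch_p, *_ = het_arch(price - price.mean())  # conditional heteroscedasticity
vifs = [variance_inflation_factor(df[["load", "wind", "gas"]].values, i)
        for i in range(3)]                       # multicollinearity of regressors
print(f"ADF p={adf_p:.3g}  ARCH p={arch_p:.3g}  VIFs={vifs}")
print(lb)
```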

26 pages, 3766 KB  
Article
Water Quality Evaluation and Analysis by Integrating Statistical and Machine Learning Approaches
by Amar Lokman, Wan Zakiah Wan Ismail and Nor Azlina Ab Aziz
Algorithms 2025, 18(8), 494; https://doi.org/10.3390/a18080494 - 8 Aug 2025
Viewed by 522
Abstract
Water quality assessment plays a vital role in environmental monitoring and resource management. This study aims to enhance predictive modeling of the Water Quality Index (WQI) using a combination of statistical diagnostics and machine learning techniques. The methodology involves collecting water quality data from six river locations in Malaysia, followed by a series of statistical analyses including assumption testing (Shapiro–Wilk and Breusch–Pagan tests), diagnostic evaluations, feature importance analysis, and principal component analysis (PCA). Decision tree regression (DTR) and autoregressive integrated moving average (ARIMA) models are employed for regression, while random forest is used for classification. Learning curve analysis is conducted to evaluate model performance and generalization. The results indicate that dissolved oxygen (DO) and ammoniacal nitrogen (AN) are the most influential parameters, with normalized importance scores of 1.000 and 0.565, respectively. The Breusch–Pagan test identifies significant heteroscedasticity (p-value = 3.138 × 10−115), while the Shapiro–Wilk test confirms non-normality (p-value = 0.0). PCA effectively reduces dimensionality while preserving 95% of dataset variance, optimizing computational efficiency. Among the regression models, ARIMA demonstrates better predictive accuracy than DTR. Meanwhile, random forest achieves high classification performance and shows strong generalization capability with increasing training data. Learning curve analysis reveals overfitting in the regression model, suggesting the need for hyperparameter tuning, while the classification model demonstrates improved generalization with additional training data. Strong correlations among key parameters indicate potential multicollinearity, emphasizing the need for careful feature selection. These findings highlight the synergy between statistical pre-processing and machine learning, offering a more accurate and efficient approach to water quality prediction for informed environmental policy and real-time monitoring systems.
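A sketch of the classification leg, assuming a feature table with the six standard Malaysian WQI parameters: PCA retaining 95% of the variance feeds a random forest, and a learning curve checks generalization.

```python
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import learning_curve

df = pd.read_csv("six_rivers.csv")               # hypothetical feature table
X, y = df[["do", "bod", "cod", "ph", "ss", "an"]], df["wqi_class"]

model = make_pipeline(StandardScaler(),
                      PCA(n_components=0.95),    # keep 95% of variance
                      RandomForestClassifier(random_state=0))
sizes, train_sc, val_sc = learning_curve(model, X, y, cv=5,
                                         train_sizes=[0.2, 0.4, 0.6, 0.8, 1.0])
print("validation accuracy by training size:", val_sc.mean(axis=1))
```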

34 pages, 1602 KB  
Article
Dynamic Spillovers Among Green Bond Markets: The Impact of Investor Sentiment
by Thuy Duong Le, Ariful Hoque and Thi Le
J. Risk Financial Manag. 2025, 18(8), 444; https://doi.org/10.3390/jrfm18080444 - 8 Aug 2025
Viewed by 665
Abstract
This research investigates the dynamic spillover effects among green bond markets and the impact of investor sentiment on these spillovers. We employ several research methods, including a time-varying parameter vector autoregression, an exponential generalized autoregressive conditional heteroscedasticity model, and a generalized autoregressive conditional heteroscedasticity mixed-data-sampling model. Our sample covers twelve international green bond markets from 3 January 2022 to 31 December 2024. Our results evidence strong correlations among the twelve green bond markets, with the United States and China being net risk receivers and Sweden being the largest net shock transmitter. We also find varied impacts of direct and indirect investor sentiment on the net total directional spillovers. Our research offers fresh contributions to the existing literature in different ways. On the one hand, it adds to the green finance literature by clarifying the dynamic spillovers among leading international green bond markets. On the other hand, it extends behavioral finance research by including direct and indirect investor sentiment in the spillovers of domestic and foreign green bond markets. Our study is also significant for related stakeholders, including investors rebalancing their portfolios and policymakers stabilizing green bond markets.
(This article belongs to the Special Issue Behaviour in Financial Decision-Making)
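One ingredient is easy to sketch with the arch package: an EGARCH(1,1) fit whose conditional volatility would feed the spillover analysis. The index file and column are hypothetical; the TVP-VAR and GARCH-MIDAS stages are not shown.

```python
import pandas as pd
from arch import arch_model

idx = pd.read_csv("green_bond_indices.csv", index_col=0, parse_dates=True)
rets = 100 * idx["US"].pct_change().dropna()     # percent returns

# EGARCH(1,1) with an asymmetry term and Student-t innovations
egarch = arch_model(rets, vol="EGARCH", p=1, o=1, q=1, dist="t").fit(disp="off")
print(egarch.summary())
cond_vol = egarch.conditional_volatility         # input to the spillover stage
```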

18 pages, 366 KB  
Article
Nonparametric Transformation Models for Double-Censored Data with Crossed Survival Curves: A Bayesian Approach
by Ping Xu, Ruichen Ni, Shouzheng Chen, Zhihua Ma and Chong Zhong
Mathematics 2025, 13(15), 2461; https://doi.org/10.3390/math13152461 - 30 Jul 2025
Viewed by 296
Abstract
Double-censored data are frequently encountered in pharmacological and epidemiological studies, where the failure time can only be observed within a certain range and is otherwise either left- or right-censored. In this paper, we present a Bayesian approach for analyzing double-censored survival data with crossed survival curves. We introduce a novel pseudo-quantile I-splines prior for modeling monotone transformations under both random and fixed censoring schemes. Additionally, we incorporate categorical heteroscedasticity using the dependent Dirichlet process (DDP), enabling the estimation of crossed survival curves. Comprehensive simulations further validate the robustness and accuracy of the method, particularly under the fixed censoring scheme, where traditional approaches may not be applicable. In the randomized AIDS clinical trial, incorporating categorical heteroscedasticity yields a new finding: the effect of baseline log RNA levels is significant. The proposed framework provides a flexible and reliable tool for survival analysis, offering an alternative to parametric and semiparametric models.
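The monotone-transformation ingredient can be sketched with an I-spline basis (integrated, rescaled B-spline elements) combined with nonnegative weights; this is only that ingredient, not the paper's pseudo-quantile prior or its DDP component. Knots and weights are illustrative.

```python
import numpy as np
from scipy.interpolate import BSpline

def ispline_basis(x, knots, k=2):
    """Each column: the integral of one degree-k B-spline element, rescaled to [0, 1]."""
    cols = []
    for j in range(len(knots) - k - 1):
        b = BSpline.basis_element(knots[j:j + k + 2])
        ib = b.antiderivative()
        lo, hi = knots[j], knots[j + k + 1]
        vals = np.where(x <= lo, 0.0,
                        np.where(x >= hi, ib(hi) - ib(lo),
                                 ib(np.clip(x, lo, hi)) - ib(lo)))
        cols.append(vals / (ib(hi) - ib(lo)))
    return np.column_stack(cols)

t = np.linspace(0, 10, 200)                        # failure-time scale
knots = np.array([0, 0, 0, 2.5, 5, 7.5, 10, 10, 10], dtype=float)
w = np.array([0.5, 1.2, 0.8, 0.3, 0.9, 1.1])       # nonnegative weights
H = ispline_basis(t, knots) @ w                    # monotone transformation
assert np.all(np.diff(H) >= -1e-12)                # monotonicity holds by construction
```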

29 pages, 5118 KB  
Article
Effective Comparison of Thermo-Mechanical Characteristics of Self-Compacting Concretes Through Machine Learning-Based Predictions
by Armando La Scala and Leonarda Carnimeo
Fire 2025, 8(8), 289; https://doi.org/10.3390/fire8080289 - 23 Jul 2025
Viewed by 485
Abstract
This study proposes different machine learning-based predictors for assessing the residual compressive strength of Self-Compacting Concrete (SCC) subjected to high temperatures. The investigation compares several algorithmic approaches from the literature: Artificial Neural Networks with distinct training algorithms (Bayesian Regularization, Levenberg–Marquardt, Scaled Conjugate Gradient, and Resilient Backpropagation), Support Vector Regression, and Random Forest methods. A training database of 150 experimental data points was derived from a careful literature review, incorporating temperature (20–800 °C), geometric ratio (height/diameter), and corresponding compressive strength values. A statistical analysis revealed complex non-linear relationships between variables, with a strong negative correlation between temperature and strength and a heteroscedastic data distribution, justifying the selection of advanced machine learning techniques. Feature engineering improved model performance through the incorporation of quadratic terms, interaction variables, and cyclic transformations. The Resilient Backpropagation algorithm demonstrated superior performance with the lowest prediction errors, followed by Bayesian Regularization. Support Vector Regression achieved competitive accuracy despite its simpler architecture. Experimental validation using specimens tested up to 800 °C showed good reliability of the developed systems, with prediction errors ranging from 0.33% to 23.35% across different temperature ranges.
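A sketch of the comparison setup for the two non-neural baselines, with quadratic feature engineering as the abstract describes; the data file and column layout are assumptions.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler, PolynomialFeatures
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

data = np.loadtxt("scc_fire_tests.csv", delimiter=",", skiprows=1)
X, y = data[:, :2], data[:, 2]          # [temperature_C, height/diameter] -> strength

# quadratic terms and interactions enter via PolynomialFeatures(2)
svr = make_pipeline(StandardScaler(), PolynomialFeatures(2), SVR(C=10.0))
rf = RandomForestRegressor(n_estimators=300, random_state=0)
for name, model in [("SVR", svr), ("RF", rf)]:
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean CV R2 = {r2.mean():.3f}")
```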

37 pages, 100736 KB  
Article
Hybrid GIS-Transformer Approach for Forecasting Sentinel-1 Displacement Time Series
by Lama Moualla, Alessio Rucci, Giampiero Naletto, Nantheera Anantrasirichai and Vania Da Deppo
Remote Sens. 2025, 17(14), 2382; https://doi.org/10.3390/rs17142382 - 10 Jul 2025
Cited by 1 | Viewed by 515
Abstract
This study presents a deep learning-based approach for forecasting Sentinel-1 displacement time series, with particular attention to irregular temporal patterns—an aspect often overlooked in previous works. Displacement data were generated using the Parallel Small BAseline Subset (P-SBAS) technique via the Geohazard Thematic Exploitation Platform (G-TEP). Initial experiments on a regular dataset from Lombardy employed Long Short-Term Memory (LSTM) models to forecast multiple future time steps. Empirical analysis determined that optimal forecasting is achieved with a 50-time-step input sequence, and that predicting 10% of the input sequence length strikes a balance between temporal coverage and accuracy. The investigation then extended to irregular datasets from Lisbon and Washington, comparing two preprocessing strategies: imputation and the inclusion of time intervals as a second feature. While imputation improved one-step predictions, it was inadequate for multi-step forecasting. To address this, a Time-Gated LSTM (TG-LSTM) was implemented. TG-LSTM outperformed standard LSTM for irregular data in one-step prediction but faced limitations in handling heteroscedasticity and computational cost during multi-step forecasting. These issues were effectively resolved using Temporal Fusion Transformers (TFT), which achieved the best performance, with RMSE values of 1.71 mm/year (Lisbon) and 1.26 mm/year (Washington). A key contribution of this work is the development of a GIS-integrated forecasting toolbox that incorporates LSTM models for regular sequences and TG-LSTM/TFT models for irregular ones. The toolbox enables both single- and multi-step displacement predictions, offering a scalable solution for geohazard monitoring and early warning applications.
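The windowing rule the abstract reports as optimal is easy to state in code: 50-step inputs predicting the next 5 steps (10% of the input length). The displacement array is hypothetical.

```python
import numpy as np

series = np.load("ps_displacement_mm.npy")   # one P-SBAS time series (hypothetical)
n_in, n_out = 50, 5                          # output = 10% of input length

X, Y = [], []
for i in range(len(series) - n_in - n_out + 1):
    X.append(series[i:i + n_in])                 # LSTM input window
    Y.append(series[i + n_in:i + n_in + n_out])  # multi-step target
X, Y = np.stack(X)[..., None], np.stack(Y)   # (samples, 50, 1) and (samples, 5)
print(X.shape, Y.shape)
# for irregular acquisitions, the paper's TG-LSTM/TFT variants additionally
# take the inter-acquisition time gaps as a second input feature
```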

16 pages, 692 KB  
Article
Exchange Rate Volatility and Its Impact on International Trade: Evidence from Zimbabwe
by Iveny Makore and Chisinga Ngonidzashe Chikutuma
J. Risk Financial Manag. 2025, 18(7), 376; https://doi.org/10.3390/jrfm18070376 - 7 Jul 2025
Cited by 1 | Viewed by 3300
Abstract
Zimbabwe’s economy has experienced extreme exchange rate fluctuations over the past decades, driven by persistent macroeconomic instability and episodes of hyperinflation. Instability in exchange rates can significantly impact trade balances, inflation rates, and overall economic resilience, making the impact of exchange rate volatility (ERV) on international trade crucial to understand in such a context. This study investigates the impact of ERV on international trade in Zimbabwe, addressing a literature gap related to the country’s unique economic challenges and hyperinflation. Using the Generalized Autoregressive Conditional Heteroscedasticity (GARCH) model on data from 1990 to 2023, the study finds a negative relationship between ERV and international trade. The analysis suggests that inflation reduces imports, while foreign direct investment (FDI) and the balance of payments (BOP) increase export uncertainties. This study recommends optimal fiscal and monetary management to mitigate ERV and enhance trade stability, offering insights for policymakers to strengthen Zimbabwe’s trade resilience amid exchange rate fluctuations.
(This article belongs to the Section Financial Markets)
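A sketch of the volatility measure, assuming a daily rate series: a GARCH(1,1) fit whose conditional volatility would proxy ERV in the trade equations.

```python
import pandas as pd
from arch import arch_model

fx = pd.read_csv("zwl_usd.csv", index_col=0, parse_dates=True)["rate"]  # hypothetical
rets = 100 * fx.pct_change().dropna()

garch = arch_model(rets, p=1, q=1).fit(disp="off")   # GARCH(1,1)
erv = garch.conditional_volatility                   # exchange rate volatility proxy
print(garch.params)
```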

24 pages, 2253 KB  
Article
Modeling Spatial Data with Heteroscedasticity Using PLVCSAR Model: A Bayesian Quantile Regression Approach
by Rongshang Chen and Zhiyong Chen
Entropy 2025, 27(7), 715; https://doi.org/10.3390/e27070715 - 1 Jul 2025
Viewed by 372
Abstract
Spatial data not only enables smart cities to visualize, analyze, and interpret data related to location and space, but also helps departments make more informed decisions. We apply Bayesian quantile regression (BQR) to the partially linear varying coefficient spatial autoregressive (PLVCSAR) model for spatial data to improve prediction performance. The model can capture the linear and nonlinear effects of covariates on the response at different quantile points. Approximating the nonparametric functions with free-knot splines, we develop a Bayesian sampling approach implementable via Markov chain Monte Carlo (MCMC) and design an efficient Metropolis–Hastings-within-Gibbs sampling algorithm to explore the joint posterior distribution. Computational efficiency is achieved through a modified reversible-jump MCMC algorithm incorporating adaptive movement steps to accelerate chain convergence. Simulation results demonstrate that our estimator is robust to alternative spatial weight matrices and outperforms both quantile regression (QR) and instrumental variable quantile regression (IVQR) in finite samples at different quantiles. The effectiveness of the proposed model and estimation method is demonstrated on real data on Boston median house prices.
(This article belongs to the Special Issue Bayesian Hierarchical Models with Applications)
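For intuition, a minimal random-walk Metropolis sampler for linear Bayesian quantile regression with the asymmetric Laplace working likelihood is sketched below on synthetic heteroscedastic data; the paper's model adds varying coefficients, a spatial lag, and free-knot splines.

```python
import numpy as np

rng = np.random.default_rng(0)
n, tau = 200, 0.5                                   # sample size, quantile level
X = np.column_stack([np.ones(n), rng.normal(size=n)])
# synthetic response with heteroscedastic noise
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n) * (1 + 0.5 * np.abs(X[:, 1]))

def log_post(beta):
    u = y - X @ beta
    # asymmetric Laplace log-likelihood (negative check loss)
    loglik = np.sum(np.where(u >= 0, -tau * u, (1 - tau) * u))
    return loglik - 0.5 * beta @ beta / 100.0       # vague normal prior

beta, draws = np.zeros(2), []
for _ in range(20000):
    prop = beta + 0.05 * rng.normal(size=2)         # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(beta):
        beta = prop
    draws.append(beta)
print("posterior mean:", np.mean(draws[5000:], axis=0))
```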
