Econometrics, Volume 13, Issue 3 (September 2025) – 14 articles

Cover Story: In this paper, we analyse how changes in prescription drug sales influenced mortality and hospital use in Belgium between 1998 and 2019. Increased use of newer drugs, those not sold before 1999, correlates with significant reductions in years of life lost and hospital days. In 2018, life-years lost before the age of 85 were reduced by 31%, and hospital days in 2019 were reduced by 20%. If the reduction in hospital use is ignored, the cost per life-year gained was EUR 6824; however, savings in hospital care exceeded spending on newer drugs.
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • Papers are published in both HTML and PDF forms; PDF is the official format. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
27 pages, 358 KB  
Article
Re-Examining Confidence Intervals for Ratios of Parameters
by Zaka Ratsimalahelo
Econometrics 2025, 13(3), 37; https://doi.org/10.3390/econometrics13030037 - 20 Sep 2025
Viewed by 148
Abstract
This paper considers the problem of constructing confidence intervals (CIs) for nonlinear functions of parameters, particularly ratios of parameters, a common issue in econometrics and statistics. Classical CIs (such as the Delta method and the Fieller method) often fail in small samples due to biased parameter estimators and skewed distributions. We extend the Delta method using the Edgeworth expansion to correct for the skewness that arises when estimated parameters have non-normal, asymmetric distributions. The resulting bias-corrected confidence intervals are easy to compute and have good coverage probability, converging to the nominal level at a rate of O(n^(−1/2)), where n is the sample size. We also propose bias-corrected estimators based on second-order Taylor expansions, in line with the “almost unbiased ratio estimator”. We then correct the CIs according to the Delta method and the Edgeworth expansion. Thus, our new methods for constructing confidence intervals account for both the bias and the skewness of the distribution of the nonlinear functions of parameters. We conduct a simulation study comparing the confidence intervals of our new methods with the two classical methods. The methods evaluated include Fieller’s interval, the Delta interval with and without bias correction, and the Edgeworth-expansion interval with and without bias correction. The results show that the new bias-corrected methods generally perform well in terms of coverage probabilities and average interval lengths, and we recommend them for constructing confidence intervals for nonlinear functions of estimated parameters. Full article
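As a point of reference for the two classical intervals the paper benchmarks against, the sketch below computes a Delta-method and a Fieller confidence interval for a ratio of two estimated parameters. All numbers (point estimates and covariance matrix) are hypothetical, and the paper's bias- and skewness-corrected intervals are not reproduced here.

```python
import numpy as np
from scipy import stats

# Hypothetical point estimates and covariance matrix of (a, b);
# the parameter of interest is the ratio theta = a / b.
a_hat, b_hat = 1.2, 0.8
cov = np.array([[0.04, 0.01],
                [0.01, 0.03]])
z = stats.norm.ppf(0.975)          # 95% two-sided critical value
theta_hat = a_hat / b_hat

# Delta method: Var(theta_hat) ~ g' Cov g with gradient g = (1/b, -a/b^2).
g = np.array([1.0 / b_hat, -a_hat / b_hat**2])
se_delta = np.sqrt(g @ cov @ g)
ci_delta = (theta_hat - z * se_delta, theta_hat + z * se_delta)

# Fieller: collect theta with (a - theta*b)^2 <= z^2 * Var(a - theta*b),
# i.e. solve the quadratic A*theta^2 + B*theta + C <= 0.
vaa, vab, vbb = cov[0, 0], cov[0, 1], cov[1, 1]
A = b_hat**2 - z**2 * vbb
B = -2 * (a_hat * b_hat - z**2 * vab)
C = a_hat**2 - z**2 * vaa
disc = B**2 - 4 * A * C
if A > 0 and disc > 0:             # bounded interval between the two roots
    ci_fieller = ((-B - np.sqrt(disc)) / (2 * A), (-B + np.sqrt(disc)) / (2 * A))
else:                              # unbounded / whole-line case
    ci_fieller = (-np.inf, np.inf)

print(f"theta_hat = {theta_hat:.3f}")
print(f"Delta   95% CI: ({ci_delta[0]:.3f}, {ci_delta[1]:.3f})")
print(f"Fieller 95% CI: ({ci_fieller[0]:.3f}, {ci_fieller[1]:.3f})")
```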
30 pages, 6284 KB  
Article
Integration and Risk Transmission Dynamics Between Bitcoin, Currency Pairs, and Traditional Financial Assets in South Africa
by Benjamin Mudiangombe Mudiangombe and John Weirstrass Muteba Mwamba
Econometrics 2025, 13(3), 36; https://doi.org/10.3390/econometrics13030036 - 19 Sep 2025
Viewed by 303
Abstract
This study offers new insights into the integration and dynamic asymmetric volatility risk spillovers between Bitcoin, currency pairs (USD/ZAR, GBP/ZAR, and EUR/ZAR), and traditional financial assets (ALSI, Bond, and Gold) in South Africa, using daily data spanning the period from 2010 to 2024 and employing Time-Varying Parameter Vector Autoregression (TVP-VAR) and wavelet coherence. The findings reveal strengthened integration between traditional financial assets and currency pairs, and weak integration with BTC/ZAR. Furthermore, BTC/ZAR and the traditional financial assets were receivers of shocks, while the currency pairs were transmitters of spillovers. Gold emerged as an attractive investment during periods of inflation or currency devaluation. The assets have a total connectedness index of 28.37%, indicating limited systemic risk. Distinct patterns were observed across short-, medium-, and long-term time scales and frequencies. Gold's negative influence on BTC/ZAR points to diversification benefits and potential hedging strategies. Bitcoin's high volatility and lack of regulatory oversight remain deterrents for institutional investors. This study lays a solid foundation for understanding financial dynamics in South Africa, offering valuable insights for investors and policymakers interested in the intricate linkages between BTC/ZAR, currency pairs, and traditional financial assets, and allowing for more targeted policy measures. Full article
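For orientation, the sketch below shows how a total connectedness (spillover) index of the kind reported in the abstract (28.37%) can be computed from a VAR forecast-error variance decomposition. It uses simulated placeholder series and statsmodels' orthogonalized FEVD in a constant-parameter VAR, not the TVP-VAR with generalized decompositions used in the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
# Placeholder return series standing in for BTC/ZAR, USD/ZAR, ALSI, Gold, ...
data = pd.DataFrame(rng.standard_normal((500, 4)),
                    columns=["BTCZAR", "USDZAR", "ALSI", "GOLD"])

res = VAR(data).fit(maxlags=2, ic="aic")
horizon = 10
decomp = res.fevd(horizon).decomp        # shape: (neqs, horizon, neqs)
theta = decomp[:, -1, :]                 # h-step-ahead variance decomposition matrix

# Total connectedness: average share of each series' forecast-error variance
# that is attributable to shocks in the other series.
n = theta.shape[0]
tci = 100 * (theta.sum() - np.trace(theta)) / n
print(f"Total connectedness index at h={horizon}: {tci:.1f}%")
```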
23 pages, 1850 KB  
Article
Forecasting of GDP Growth in the South Caucasian Countries Using Hybrid Ensemble Models
by Gaetano Perone and Manuel A. Zambrano-Monserrate
Econometrics 2025, 13(3), 35; https://doi.org/10.3390/econometrics13030035 - 10 Sep 2025
Viewed by 453
Abstract
This study forecasts the gross domestic product (GDP) of the South Caucasian nations (Armenia, Azerbaijan, and Georgia) by scrutinizing the accuracy of various econometric methodologies. The topic is noteworthy given the significant economic development these countries have exhibited during the post-COVID-19 recovery. The seasonal autoregressive integrated moving average (SARIMA) model, the exponential smoothing state space (ETS) model, the neural network autoregressive (NNAR) model, and the trigonometric exponential smoothing state space model with Box–Cox transformation, ARMA errors, and trend and seasonal components (TBATS), together with their feasible hybrid combinations, were employed. The empirical investigation utilized quarterly GDP data at market prices from 1Q-2010 to 2Q-2024. According to the results, the hybrid models significantly outperformed the corresponding single models, handling the linear and nonlinear components of the GDP time series more effectively. Rolling-window cross-validation showed that the hybrid ETS-NNAR-TBATS model for Armenia, the hybrid ETS-NNAR-SARIMA model for Azerbaijan, and the hybrid ETS-SARIMA model for Georgia were the best-performing models. The forecasts also suggest that Georgia is likely to record the strongest GDP growth over the projection horizon, followed by Armenia and Azerbaijan. These findings confirm that hybrid models constitute a reliable technique for forecasting GDP in the South Caucasian countries, a region that is not only economically dynamic but also strategically important, with direct implications for policy and regional planning. Full article
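A minimal sketch of the hybrid idea, assuming an equal-weight combination of individual forecasts: SARIMA and ETS forecasts of a simulated quarterly series are averaged with statsmodels. The NNAR and TBATS components used in the paper (available in R's forecast package) are omitted, and the weighting scheme is the simplest possible choice.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(1)
# Placeholder quarterly "GDP" series (2010Q1-2024Q2) with trend and seasonality.
idx = pd.period_range("2010Q1", "2024Q2", freq="Q")
t = np.arange(len(idx))
y = pd.Series(100 + 0.8 * t + 5 * np.sin(2 * np.pi * t / 4)
              + rng.normal(0, 1, len(idx)), index=idx)

h = 8  # forecast horizon in quarters
sarima = SARIMAX(y, order=(1, 1, 1), seasonal_order=(1, 0, 0, 4)).fit(disp=False)
ets = ExponentialSmoothing(y, trend="add", seasonal="add",
                           seasonal_periods=4).fit()

fc = pd.DataFrame({"SARIMA": np.asarray(sarima.forecast(h)),
                   "ETS": np.asarray(ets.forecast(h))})
fc["Hybrid"] = fc.mean(axis=1)           # equal-weight hybrid combination
print(fc)
```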
21 pages, 3095 KB  
Article
Volatility Analysis of Returns of Financial Assets Using a Bayesian Time-Varying Realized GARCH-Itô Model
by Pathairat Pastpipatkul and Htwe Ko
Econometrics 2025, 13(3), 34; https://doi.org/10.3390/econometrics13030034 - 9 Sep 2025
Viewed by 441
Abstract
As financial markets become increasingly complex and high-frequency, volatility analysis is a cornerstone of modern financial econometrics, with practical applications in portfolio optimization, derivative pricing, and systematic risk assessment. This paper introduces a novel Bayesian Time-varying Generalized Autoregressive Conditional Heteroskedasticity Itô (BtvGARCH-Itô) model designed to improve the precision and flexibility of volatility modeling in financial markets. The original GARCH-Itô models, while effective in capturing realized volatility and intraday patterns, rely on fixed parameters and are therefore limited in capturing structural change. Our proposed model addresses this limitation by integrating the continuous-time Itô process with time-varying Bayesian inference, allowing parameters to vary over time based on prior beliefs, quantifying uncertainty, and mitigating overfitting, especially in small-sample or high-dimensional settings. In simulation studies with sample sizes of N = 100 and N = 200, the BtvGARCH-Itô model outperformed the original GARCH-Itô model in in-sample fit and out-of-sample forecast accuracy, judged by comparing posterior estimates with the true parameter values and by forecasting-error metrics. For empirical validation, the model is applied to the volatility of the S&P 500 and Bitcoin (BTC) using one-minute data for the S&P 500 (from 3 January 2023 to 31 December 2024) and BTC (from 1 January 2023 to 1 January 2025). The model has potential as a robust tool and a new direction in volatility modeling for financial risk management. Full article
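The BtvGARCH-Itô model itself cannot be reconstructed from the abstract, but the Bayesian ingredient can be illustrated generically. The sketch below runs a random-walk Metropolis sampler for a plain Gaussian GARCH(1,1) under a flat prior on the stationarity region, using simulated returns; it is a toy analogue of the posterior-sampling idea, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(2)

def garch_loglik(params, r):
    """Gaussian GARCH(1,1) log-likelihood; params = (omega, alpha, beta)."""
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return -np.inf
    h = np.empty_like(r)
    h[0] = np.var(r)
    for t in range(1, len(r)):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    return -0.5 * np.sum(np.log(2 * np.pi * h) + r**2 / h)

def log_prior(params):
    """Illustrative weakly informative prior: flat on the stationarity region."""
    omega, alpha, beta = params
    return 0.0 if (0 < omega < 1 and alpha >= 0 and beta >= 0
                   and alpha + beta < 1) else -np.inf

r = rng.standard_normal(1000) * 0.01             # placeholder return series

n_draws = 5000
step = np.array([0.1 * np.var(r), 0.02, 0.02])   # proposal standard deviations
theta = np.array([0.05 * np.var(r), 0.05, 0.90])
post = garch_loglik(theta, r) + log_prior(theta)
draws = np.empty((n_draws, 3))
for i in range(n_draws):
    prop = theta + step * rng.standard_normal(3)
    post_prop = garch_loglik(prop, r) + log_prior(prop)
    if np.log(rng.uniform()) < post_prop - post:  # Metropolis accept/reject
        theta, post = prop, post_prop
    draws[i] = theta

print("Posterior means (omega, alpha, beta):", draws[n_draws // 2:].mean(axis=0))
```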
27 pages, 1290 KB  
Article
Modelling and Forecasting Financial Volatility with Realized GARCH Model: A Comparative Study of Skew-t Distributions Using GRG and MCMC Methods
by Didit Budi Nugroho, Adi Setiawan and Takayuki Morimoto
Econometrics 2025, 13(3), 33; https://doi.org/10.3390/econometrics13030033 - 4 Sep 2025
Viewed by 424
Abstract
Financial time-series data often exhibit statistically significant skewness and heavy tails, and numerous flexible distributions have been proposed to model them. In the context of the Log-linear Realized GARCH model with Skew-t (ST) distributions, our objective is to explore how the choice of prior distributions in the Adaptive Random Walk Metropolis method and of initial parameter values in the Generalized Reduced Gradient (GRG) Solver method affects ST parameter and log-likelihood estimates. An empirical study was conducted using the FTSE 100 index to evaluate model performance. We provide a comprehensive step-by-step tutorial demonstrating how to perform estimation and sensitivity analysis using data tables in Microsoft Excel. Among seven ST distributions—namely, the asymmetric, epsilon, exponentiated half-logistic, Hansen, Jones–Faddy, Mittnik–Paolella, and Rosco–Jones–Pewsey distributions—Hansen’s ST distribution is found to be superior. This study also applies the GRG method to estimate newer specifications, including the Realized Real-Time GARCH, Realized ASHARV, and GARCH@CARR models. An empirical study showed that the GARCH@CARR model with the feedback effect provides the best goodness of fit. Out-of-sample forecasting evaluations further confirm the predictive dominance of models incorporating real-time information, particularly Realized Real-Time GARCH for volatility forecasting and Realized ASHARV for 1% VaR estimation. The findings offer actionable insights for portfolio managers and risk analysts, particularly in improving volatility forecasts and tail-risk assessments during market crises, thereby enhancing risk-adjusted returns and regulatory compliance. Although the GRG method is sensitive to initial values, its availability in spreadsheet software makes it a powerful and promising tool for working with probability density functions that have explicit forms and are unimodal, high-dimensional, and complex, without the need for programming experience. Full article
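To illustrate the kind of direct numerical likelihood maximization the paper carries out with Excel's GRG Solver, the sketch below fits Hansen's (1994) standardized skew-t density, the distribution the paper finds superior, to simulated standardized returns with scipy. The data and starting values are placeholders, and the full Log-linear Realized GARCH structure is not included.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def hansen_skewt_logpdf(z, eta, lam):
    """Log-density of Hansen's (1994) standardized skew-t (eta > 2, |lam| < 1)."""
    c = np.exp(gammaln((eta + 1) / 2) - gammaln(eta / 2)) / np.sqrt(np.pi * (eta - 2))
    a = 4 * lam * c * (eta - 2) / (eta - 1)
    b = np.sqrt(1 + 3 * lam**2 - a**2)
    sign = np.where(z < -a / b, -1.0, 1.0)          # left vs right of the split point
    core = 1 + ((b * z + a) / (1 + sign * lam)) ** 2 / (eta - 2)
    return np.log(b * c) - (eta + 1) / 2 * np.log(core)

def neg_loglik(params, z):
    eta, lam = params
    if eta <= 2.05 or abs(lam) >= 0.99:             # keep the search in the valid region
        return 1e10
    return -np.sum(hansen_skewt_logpdf(z, eta, lam))

rng = np.random.default_rng(3)
z = rng.standard_t(df=6, size=2000)                 # placeholder standardized returns

# Numerical maximization of the log-likelihood (the role GRG plays in the spreadsheet).
res = minimize(neg_loglik, x0=np.array([8.0, 0.0]), args=(z,), method="Nelder-Mead")
eta_hat, lam_hat = res.x
print(f"eta_hat = {eta_hat:.2f}, lambda_hat = {lam_hat:.3f}")
```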
37 pages, 414 KB  
Article
Comparisons Between Frequency Distributions Based on Gini’s Approach: Principal Component Analysis Addressed to Time Series
by Pierpaolo Angelini
Econometrics 2025, 13(3), 32; https://doi.org/10.3390/econometrics13030032 - 13 Aug 2025
Viewed by 800
Abstract
In this paper, time series of length T are seen as frequency distributions. Each distribution is defined with respect to a statistical variable having T observed values. A methodological system based on Gini’s approach is put forward, so the statistical model through which time series are handled is a frequency distribution studied inside a linear system. In addition to the starting frequency distributions that are observed, other frequency distributions are treated. Thus, marginal distributions based on the notion of proportionality are introduced together with joint distributions. Both distributions are statistical models. A fundamental invariance property related to marginal distributions is made explicit in this research work, so one can focus on collections of marginal frequency distributions, identifying multiple frequency distributions. For this reason, the latter is studied via a tensor. As frequency distributions are practical realizations of nonparametric probability distributions over R, one passes from frequency distributions to discrete random variables. In this paper, a mathematical model that generates time series is put forward. It is a stochastic process based on subjective previsions of random variables. A subdivision of the exchangeability of variables of a statistical nature is shown, so a reinterpretation of principal component analysis that is based on the notion of proportionality also characterizes this research work. Full article
33 pages, 415 KB  
Article
A Statistical Characterization of Median-Based Inequality Measures
by Charles M. Beach and Russell Davidson
Econometrics 2025, 13(3), 31; https://doi.org/10.3390/econometrics13030031 - 9 Aug 2025
Viewed by 368
Abstract
For income distributions divided into middle, lower, and higher regions based on scalar median cut-offs, this paper establishes the asymptotic distributional properties—including explicit, empirically applicable variance formulas and hence standard errors—of sample estimates of the proportion of the population within each group, the group’s share of total income, and the group’s mean income. It then applies these results to relative mean income ratios, various polarization measures, and decile-mean income ratios. Since the derived formulas are not distribution-free, the study advises using a density estimation technique proposed by Comte and Genon-Catalot. A shrinking middle-income group with declining relative incomes and marked upper-tail polarization among men’s incomes are all found to be highly statistically significant. Full article
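The paper's contribution is the analytic asymptotic variance formulas; as a purely numerical counterpart, the sketch below computes the basic median-based quantities for a "middle" group (hypothetically defined here as incomes between 75% and 125% of the median) on simulated data and bootstraps their standard errors.

```python
import numpy as np

rng = np.random.default_rng(4)
income = rng.lognormal(mean=10, sigma=0.6, size=5000)   # placeholder income data

def middle_group_stats(x, lo=0.75, hi=1.25):
    """Population share, income share, and mean income of the group between
    lo*median and hi*median."""
    med = np.median(x)
    in_group = (x >= lo * med) & (x <= hi * med)
    return np.array([in_group.mean(),
                     x[in_group].sum() / x.sum(),
                     x[in_group].mean()])

est = middle_group_stats(income)

# Bootstrap standard errors (the paper instead derives explicit variance formulas).
boot = np.array([middle_group_stats(rng.choice(income, size=income.size, replace=True))
                 for _ in range(500)])
se = boot.std(axis=0, ddof=1)

for name, e, s in zip(["population share", "income share", "mean income"], est, se):
    print(f"{name}: {e:,.3f} (s.e. {s:,.3f})")
```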
32 pages, 1187 KB  
Article
Simple Approximations and Interpretation of Pareto Index and Gini Coefficient Using Mean Absolute Deviations and Quantile Functions
by Eugene Pinsky and Qifu Wen
Econometrics 2025, 13(3), 30; https://doi.org/10.3390/econometrics13030030 - 8 Aug 2025
Viewed by 592
Abstract
The Pareto distribution has been widely used to model income distribution and inequality. The tail index and the Gini index are typically computed iteratively using maximum likelihood and are usually interpreted in terms of the Lorenz curve. We propose an alternative method: considering a truncated Pareto distribution, we derive simple closed-form approximations for the tail index and the Gini coefficient in terms of the mean absolute deviation and weighted quartile differences. The resulting expressions can be used for any Pareto distribution, even one without a finite mean or variance. They are resistant to outliers and have a simple geometric and “economic” interpretation in terms of the quantile function and quartiles. Extensive simulations demonstrate that the proposed approximate values for the tail index and the Gini coefficient are within a few percent relative error of the exact values, even for a moderate number of data points. Our paper offers practical and computationally simple methods for analyzing a class of models with Pareto distributions, and the proposed methodology can be extended to many other distributions used in econometrics and related fields. Full article
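The paper's closed-form MAD/quantile approximations are not reproduced here, but the sketch below shows the standard quantities they are compared against: the maximum-likelihood tail index of a simulated Pareto sample, the corresponding theoretical Gini coefficient 1/(2*alpha - 1), and the sample Gini computed from the mean absolute difference.

```python
import numpy as np

rng = np.random.default_rng(5)
alpha_true, x_min = 1.8, 1.0
u = rng.uniform(size=20_000)
x = x_min * (1 - u) ** (-1 / alpha_true)      # Pareto(alpha, x_min) via inverse CDF

# Maximum-likelihood estimate of the tail index.
alpha_hat = x.size / np.sum(np.log(x / x.min()))

# Theoretical Gini coefficient for a Pareto distribution with alpha > 1.
gini_pareto = 1 / (2 * alpha_hat - 1)

# Sample Gini from the mean absolute difference: G = Delta / (2 * mean).
xs = np.sort(x)
i = np.arange(1, x.size + 1)
gini_sample = np.sum((2 * i - x.size - 1) * xs) / (x.size**2 * xs.mean())

print(f"alpha_hat = {alpha_hat:.3f}, Gini (Pareto formula) = {gini_pareto:.3f}, "
      f"Gini (sample) = {gini_sample:.3f}")
```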
36 pages, 2033 KB  
Article
Beyond GDP: COVID-19’s Effects on Macroeconomic Efficiency and Productivity Dynamics in OECD Countries
by Ümit Sağlam
Econometrics 2025, 13(3), 29; https://doi.org/10.3390/econometrics13030029 - 4 Aug 2025
Cited by 1 | Viewed by 1212
Abstract
The COVID-19 pandemic triggered unprecedented economic disruptions, raising critical questions about the resilience and adaptability of macroeconomic productivity across countries. This study examines the impact of COVID-19 on macroeconomic efficiency and productivity dynamics in 37 OECD countries using quarterly data from 2018Q1 to 2024Q4. By employing a Slack-Based Measure Data Envelopment Analysis (SBM-DEA) and the Malmquist Productivity Index (MPI), we decompose total factor productivity (TFP) into efficiency change (EC) and technological change (TC) across three periods: pre-pandemic, during-pandemic, and post-pandemic. Our framework incorporates both desirable (GDP) and undesirable outputs (inflation, unemployment, housing price inflation, and interest rate distortions), offering a multidimensional view of macroeconomic efficiency. Results show broad but uneven productivity gains, with technological progress proving more resilient than efficiency during the pandemic. Post-COVID recovery trajectories diverged, reflecting differences in structural adaptability and innovation capacity. Regression analysis reveals that stringent lockdowns in 2020 were associated with lower productivity in 2023–2024, while more adaptive policies in 2021 supported long-term technological gains. These findings highlight the importance of aligning crisis response with forward-looking economic strategies and demonstrate the value of DEA-based methods for evaluating macroeconomic performance beyond GDP. Full article
(This article belongs to the Special Issue Advancements in Macroeconometric Modeling and Time Series Analysis)
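The paper uses a slack-based DEA model with undesirable outputs together with the Malmquist index; neither is reproduced here. The sketch below shows only the elementary building block, an input-oriented CCR efficiency score solved as a linear program with scipy, on made-up country data.

```python
import numpy as np
from scipy.optimize import linprog

# Made-up data: 6 DMUs (countries), 2 inputs (labour, capital), 1 output (GDP).
X = np.array([[5, 8], [7, 6], [4, 9], [6, 7], [8, 5], [5, 6]], dtype=float)   # inputs
Y = np.array([[10], [12], [9], [11], [13], [10]], dtype=float)                # outputs
n, m = X.shape
s = Y.shape[1]

def ccr_efficiency(k):
    """Input-oriented CCR score of DMU k: min theta s.t. X'lam <= theta*x_k, Y'lam >= y_k."""
    c = np.r_[1.0, np.zeros(n)]                 # decision variables: [theta, lambda_1..n]
    A_in = np.c_[-X[k].reshape(-1, 1), X.T]     # sum_j lam_j x_ij - theta x_ik <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y.T]       # -sum_j lam_j y_rj <= -y_rk
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[k]],
                  bounds=[(None, None)] + [(0, None)] * n,
                  method="highs")
    return res.fun

for k in range(n):
    print(f"DMU {k}: CCR efficiency = {ccr_efficiency(k):.3f}")
```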
18 pages, 1033 KB  
Article
Analyzing the Impact of Carbon Mitigation on the Eurozone’s Trade Dynamics with the US and China
by Pathairat Pastpipatkul and Terdthiti Chitkasame
Econometrics 2025, 13(3), 28; https://doi.org/10.3390/econometrics13030028 - 29 Jul 2025
Viewed by 416
Abstract
This study focuses on the transmission of carbon pricing mechanisms in shaping trade dynamics between the Eurozone and two key partners, the USA and China. Using Bayesian variable selection methods and a Time-Varying Structural Vector Autoregression (TV-SVAR) model, the research identifies the key variables impacting EU carbon emissions over time. The results reveal that manufactured products from the US have a diminishing positive impact on EU carbon emissions, suggesting potential exemption from future regulations. In contrast, manufactured goods from the US and petroleum products from China are expected to increase emissions, indicating a need for stricter trade policies. These findings provide strategic insights for policymakers aiming to balance trade and environmental objectives. Full article
16 pages, 311 KB  
Article
Pseudo-Panel Decomposition of the Blinder–Oaxaca Gender Wage Gap
by Jhon James Mora and Diana Yaneth Herrera
Econometrics 2025, 13(3), 27; https://doi.org/10.3390/econometrics13030027 - 19 Jul 2025
Cited by 1 | Viewed by 1137
Abstract
This article introduces a novel approach to decomposing the Blinder–Oaxaca gender wage gap using pseudo-panel data. In many developing countries, panel data are not available; yet understanding the evolution of the gender wage gap over time requires tracking individuals longitudinally. When the individuals observed change across time periods, estimators tend to be inconsistent and inefficient. To address this issue, and building upon the traditional Blinder–Oaxaca methodology, we propose an alternative procedure that follows cohorts over time rather than individuals. This approach enables the estimation of both the explained and unexplained components—the “endowment effect” and the “remuneration effect”—of the wage gap, along with their respective standard errors, even in the absence of true panel data. We apply this methodology to the case of Colombia, finding a gender wage gap of approximately 15% in favor of male cohorts. Without controls, this gap comprises a −5.6% explained component and a 20% unexplained component. When we control for informality, firm size, and sector, the gap comprises a −3.5% explained component and an 18.7% unexplained component. Full article
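For orientation, a standard cross-sectional Blinder–Oaxaca decomposition on simulated data is sketched below (female coefficients as the reference structure). The paper's innovation, running the decomposition on cohort-level pseudo-panel means with appropriate standard errors, is not reproduced.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)

def simulate(n, beta, mean_x):
    """Simulated log wages: constant plus two covariates (e.g. schooling, experience)."""
    X = sm.add_constant(rng.normal(mean_x, 1.0, size=(n, 2)))
    return X, X @ beta + rng.normal(0, 0.3, n)

X_m, y_m = simulate(3000, beta=np.array([1.6, 0.10, 0.05]), mean_x=[12.0, 8.0])
X_f, y_f = simulate(3000, beta=np.array([1.5, 0.08, 0.05]), mean_x=[11.5, 8.0])

b_m = sm.OLS(y_m, X_m).fit().params
b_f = sm.OLS(y_f, X_f).fit().params
xbar_m, xbar_f = X_m.mean(axis=0), X_f.mean(axis=0)

gap = y_m.mean() - y_f.mean()
explained = (xbar_m - xbar_f) @ b_f      # endowment effect
unexplained = xbar_m @ (b_m - b_f)       # remuneration effect

print(f"total gap = {gap:.3f}, explained = {explained:.3f}, unexplained = {unexplained:.3f}")
```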
11 pages, 346 KB  
Article
Daily Emissions of CO2 in the World: A Fractional Integration Approach
by Luis Alberiko Gil-Alana and Carlos Poza
Econometrics 2025, 13(3), 26; https://doi.org/10.3390/econometrics13030026 - 17 Jul 2025
Viewed by 534
Abstract
In this article, daily CO2 emissions for the years 2019–2022 are examined using fractional integration for Brazil, China, EU-27 (and the UK), India, and the USA. According to the findings, all series exhibit long memory mean-reversion tendencies, with orders of integration ranging between 0.22 in the case of India (with white noise errors) and 0.70 for Brazil (under autocorrelated disturbances). Nevertheless, the differencing parameter estimates are all considerably below 1, which supports the theory of mean reversion and transient shocks. These results suggest the need for a greater intensification of green policies complemented with economic structural reforms to achieve the zero-emissions target by 2050. Full article
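The paper works in a parametric fractional-integration framework; as a simpler illustration of what the memory parameter d (the quantity ranging from 0.22 to 0.70 in the abstract) measures and how it can be estimated, the sketch below applies the semiparametric log-periodogram (GPH) regression to a simulated fractionally integrated series.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_fi_noise(n, d):
    """Fractionally integrated noise: apply the MA(inf) expansion of (1-L)^(-d) to white noise."""
    eps = rng.standard_normal(n)
    psi = np.cumprod(np.r_[1.0, (d + np.arange(n - 1)) / (1 + np.arange(n - 1))])
    return np.convolve(eps, psi)[:n]

def gph_estimate(x, power=0.5):
    """Log-periodogram (GPH) regression estimate of the memory parameter d."""
    n = len(x)
    m = int(n**power)                               # number of Fourier frequencies used
    j = np.arange(1, m + 1)
    lam = 2 * np.pi * j / n
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
    X = -np.log(4 * np.sin(lam / 2) ** 2)
    return np.polyfit(X, np.log(I), 1)[0]           # regression slope estimates d

x = simulate_fi_noise(2000, d=0.4)
print(f"GPH estimate of d: {gph_estimate(x):.2f} (true value 0.4)")
```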
31 pages, 2058 KB  
Article
The Long-Run Impact of Changes in Prescription Drug Sales on Mortality and Hospital Utilization in Belgium, 1998–2019
by Frank R. Lichtenberg
Econometrics 2025, 13(3), 25; https://doi.org/10.3390/econometrics13030025 - 23 Jun 2025
Viewed by 701
Abstract
Objectives: We investigate the long-run impact of changes in prescription drug sales on mortality and hospital utilization in Belgium during the first two decades of the 21st century. Methods: We analyze the correlation across diseases between changes in the drugs used to treat a disease and changes in mortality or hospital utilization from that disease. The measure of the change in prescription drug sales we use is the long-run (1998–2018 or 2000–2019) change in the fraction of post-1999 drugs sold, where a post-1999 drug is one that was not sold during 1989–1999. Results: The 1998–2018 increase in the fraction of post-1999 drugs sold is estimated to have reduced the number of years of life lost before ages 85, 75, and 65 in 2018 by about 438 thousand (31%), 225 thousand (31%), and 114 thousand (32%), respectively. The 1995–2014 increase in the fraction of post-1999 drugs sold is estimated to have reduced the number of hospital days in 2019 by 2.66 million (20%). Conclusions: Even if we ignore the reduction in hospital utilization attributable to changes in pharmaceutical consumption, a conservative estimate of the 2018 cost per life-year gained before age 85 is EUR 6824. We estimate that previous changes in pharmaceutical consumption reduced 2019 expenditure on inpatient curative and rehabilitative care by EUR 3.55 billion, which exceeds the 2018 expenditure on drugs authorized during 1998–2018: EUR 2.99 billion. Full article
31 pages, 1988 KB  
Article
The Effect of Macroeconomic Announcements on U.S. Treasury Markets: An Autometric General-to-Specific Analysis of the Greenspan Era
by James J. Forest
Econometrics 2025, 13(3), 24; https://doi.org/10.3390/econometrics13030024 - 21 Jun 2025
Viewed by 2161
Abstract
This research studies the impact of macroeconomic announcement surprises on daily U.S. Treasury excess returns during the heart of Alan Greenspan’s tenure as Federal Reserve Chair, addressing possible limitations of standard static regression (SSR) models, which may suffer from omitted-variable bias, parameter instability, and poor mis-specification diagnostics. To complement the SSR framework, an automated general-to-specific (Gets) modeling approach, enhanced with modern indicator-saturation methods for robustness, is applied to improve empirical model discovery and mitigate potential biases. By progressively reducing an initially broad set of candidate variables, the Gets methodology steers the model toward congruence, discards unstable parameters, and limits information loss while preserving precision. The findings suggest that U.S. Treasury market responses to macroeconomic news shocks were stable for a core set of announcements that reliably influenced excess returns. In contrast to computationally costless standard static models, the automated Gets-based approach enhances parameter precision and provides a more adaptive structure for identifying relevant predictors. These results demonstrate the potential value of incorporating interpretable automated model selection techniques alongside traditional SSR and Markov-switching approaches to improve empirical insights into macroeconomic announcement effects on financial markets. Full article
(This article belongs to the Special Issue Advancements in Macroeconometric Modeling and Time Series Analysis)
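Autometrics performs a multi-path tree search with diagnostic checking and indicator saturation, which is not reproduced here. The sketch below only illustrates the naive single-path general-to-specific idea: start from a broad candidate set and iteratively drop the least significant regressor until all survivors are significant, using statsmodels OLS on simulated announcement-surprise data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(8)

# Simulated daily excess returns and candidate announcement-surprise regressors;
# only the first two regressors truly matter in this toy data set.
n, k = 500, 10
X = pd.DataFrame(rng.standard_normal((n, k)),
                 columns=[f"surprise_{i}" for i in range(k)])
y = 0.5 * X["surprise_0"] - 0.3 * X["surprise_1"] + rng.normal(0, 1, n)

def gets_backward(y, X, alpha=0.05):
    """Single-path general-to-specific: drop the least significant regressor until
    every remaining one is significant at level alpha."""
    cols = list(X.columns)
    while cols:
        fit = sm.OLS(y, sm.add_constant(X[cols])).fit()
        pvals = fit.pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] <= alpha:
            return fit
        cols.remove(worst)
    return sm.OLS(y, np.ones(len(y))).fit()   # nothing survives: constant-only model

final = gets_backward(y, X)
print(final.params)
```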