Challenging statements have appeared in recent years in the literature on advances in computational procedures. Examples are statements like "Tapping the supercomputer under your desk" and "It is trivial to parallelize a value function iteration using Graphics Processing Units" (e.g., see [1,2,3]). These statements refer to the fact that massively parallel computing is becoming an accessible and revolutionary tool for handling more complex economic issues using large sets of data (e.g., see [4,5]) in combination with inferential, forecasting and decision methods that require a tremendous speed-up in computations. Applications are spreading rapidly in many fields, but have so far appeared in only a few areas of economics and finance.
The computational revolution in simulation techniques has, however, become a key ingredient in the field of Bayesian econometrics and has opened new possibilities for studying complex economic and financial phenomena. Applications include risk measurement, forecasting, and the assessment of policy effectiveness in macroeconomics, finance, marketing and monetary economics (see [6], among others).
This special issue aims to contribute to this literature by collecting a set of carefully evaluated papers grouped under three topics: parallelized Bayesian algorithms; the combination of inference and decision analysis in economics and finance; and the handling of large sets of macroeconomic series for forecasting.
The first set of papers focuses on the use of parallelized, simulation-based Bayesian econometric methods to analyze complex econometric problems. Recent advances in computational procedures make it possible to speed up computations in several ways. Computations can be distributed over multiple cores or clusters by running independent parts of the code in parallel; such parts may involve the same econometric model applied to different data sets, or the non-recursive pieces of the main algorithm. Alternatively, operations can be carried out on Graphics Processing Unit (GPU) multiprocessor architectures, which handle large matrix operations efficiently and provide massive parallel processing capacity.
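As a concrete, if simplified, illustration of the first pattern, the following sketch runs the same estimation routine on several independent data sets across CPU cores. The model (an ordinary least squares fit on simulated series) and the helper functions `estimate` and `simulate_dataset` are illustrative placeholders, not taken from any of the papers in this issue.

```python
# A minimal sketch of core-level parallelism: the same estimation routine is
# applied to several independent data sets, one task per data set.
from concurrent.futures import ProcessPoolExecutor

import numpy as np


def estimate(dataset):
    """Fit a simple linear regression by least squares and return the slope."""
    x, y = dataset
    slope, intercept = np.polyfit(x, y, deg=1)
    return slope


def simulate_dataset(seed, n=500):
    """Generate one illustrative data set with y = 2x + noise."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n)
    y = 2.0 * x + rng.normal(scale=0.5, size=n)
    return x, y


if __name__ == "__main__":
    datasets = [simulate_dataset(seed) for seed in range(8)]
    # Each estimation is independent, so the calls can be distributed over the
    # available cores without any communication between them.
    with ProcessPoolExecutor() as pool:
        slopes = list(pool.map(estimate, datasets))
    print(slopes)
```

Because the estimations share no state, the speed-up scales (up to overhead) with the number of available cores; the GPU route follows the same logic but exploits fine-grained parallelism within large matrix operations.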
The opening paper, by John Geweke, explains the Sequentially Adaptive Bayesian Learning (SABL) algorithm and the corresponding SABL software (see [7,8]). The paper exploits methodological innovations in SABL, including the optimization of irregular and multi-modal functions, and produces the conventional maximum likelihood asymptotic variance matrix as a by-product. The practical application analyzes the secular and cyclical half-life and the cycle period of real gross domestic product.
The next paper, by Nalan Baştürk, Stefano Grassi, Lennart Hoogerheide and Herman K. van Dijk, reports practical experience with four canonical econometric models using the parallelized Mixture of t distributions estimated by an Importance Sampling weighted Expectation-Maximization (ParMitISEM) algorithm. The results show that the parallelization of the MitISEM algorithm on GPUs and multi-core Central Processing Units (CPUs) is straightforward and fast to program in MATLAB; moreover, the GPU version is substantially faster than the CPU one. The econometric applications include inference for a mixture GARCH model, an Instrumental Variable regression model and a New Keynesian Phillips curve model. These models involve highly non-standard posterior distributions in high dimensions.
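To indicate why this kind of algorithm parallelizes so naturally, the following sketch (in Python rather than the MATLAB used in the paper, and with a simple stand-in target density) evaluates importance sampling weights for a large batch of Student-t candidate draws in one vectorized pass; this embarrassingly parallel weight computation is the type of operation that benefits from GPU and multi-core hardware.

```python
# Illustrative importance sampling with a Student-t candidate density.
# The bimodal target below is a placeholder, not a model from the paper.
import numpy as np
from scipy import stats


def log_target(theta):
    """Illustrative bimodal target (mixture of two normals), unnormalized."""
    return np.logaddexp(stats.norm.logpdf(theta, loc=-2.0, scale=0.7),
                        stats.norm.logpdf(theta, loc=2.0, scale=0.7))


rng = np.random.default_rng(0)
candidate = stats.t(df=5, loc=0.0, scale=3.0)        # heavy-tailed proposal
draws = candidate.rvs(size=100_000, random_state=rng)

# Importance weights: target over candidate, computed in one vectorized pass.
log_w = log_target(draws) - candidate.logpdf(draws)
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Weighted posterior mean and an effective-sample-size diagnostic.
posterior_mean = np.sum(w * draws)
ess = 1.0 / np.sum(w ** 2)
print(posterior_mean, ess)
```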
The third paper, by Arnaud Dufays, deals with an evolutionary sequential Monte Carlo sampler applied to the analysis of change-point models. A Tempered and Time (TNT) algorithm is developed that combines (off-line) tempered Sequential Monte Carlo (SMC) inference with on-line SMC inference to draw realizations from many sequential posterior distributions without experiencing particle degeneracy. Furthermore, the paper introduces a new MCMC rejuvenation step that is generic, automated and well suited to multi-modal distributions. This update is inspired by the heuristic differential evolution literature in optimization, and many extensions of the algorithm are possible. The algorithm is applied to the marginal likelihood calculation of change-point GARCH models.
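For readers unfamiliar with tempered SMC, the sketch below gives the generic skeleton on which such samplers build: particles move through a sequence of tempered posteriors, are reweighted, resampled when degeneracy threatens, and refreshed by an MCMC step. The Gaussian target and random-walk rejuvenation used here are simple placeholders, not the TNT sampler or its differential-evolution update.

```python
# Generic tempered-SMC skeleton with reweighting, resampling and rejuvenation.
import numpy as np

rng = np.random.default_rng(1)


def log_prior(theta):
    """Wide Gaussian prior, N(0, 5^2), also the initial particle distribution."""
    return -0.5 * (theta / 5.0) ** 2


def log_lik(theta):
    """Illustrative 'likelihood': a standard normal log-density (unnormalized)."""
    return -0.5 * theta ** 2


n_particles = 2_000
temperatures = np.linspace(0.0, 1.0, 21)              # tempering schedule
particles = rng.normal(scale=5.0, size=n_particles)   # draws from the prior
log_w = np.zeros(n_particles)

for phi_prev, phi in zip(temperatures[:-1], temperatures[1:]):
    # Reweight by the incremental tempered likelihood.
    log_w += (phi - phi_prev) * log_lik(particles)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()

    # Resample when the effective sample size collapses (particle degeneracy).
    if 1.0 / np.sum(w ** 2) < n_particles / 2:
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles, log_w = particles[idx], np.zeros(n_particles)

    # Rejuvenate with one Metropolis random-walk step on the current tempered posterior.
    proposal = particles + rng.normal(scale=0.5, size=n_particles)
    log_ratio = (log_prior(proposal) + phi * log_lik(proposal)
                 - log_prior(particles) - phi * log_lik(particles))
    accept = np.log(rng.uniform(size=n_particles)) < log_ratio
    particles = np.where(accept, proposal, particles)

# Weighted posterior mean under the final (phi = 1) posterior.
w = np.exp(log_w - log_w.max())
w /= w.sum()
print(np.sum(w * particles))
```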
The second set of papers concerns the combination of inference and decision analysis in economics and finance, involving complex models and computational challenges. The first paper, by Urbi Garay, Enrique ter Horst, German Molina and Abel Rodriguez, deals with Bayesian nonparametric measurement of factor betas and with the clustering of financial returns. The authors propose a dynamic and self-adjusting mixture of Gaussian Graphical Models to cluster financial returns, and provide a new method for extracting nonparametric estimates of dynamic alphas (excess returns) and betas (with respect to a chosen set of explanatory factors) in a multivariate setting. The clusters obtained as a by-product, which are used for shrinkage and information borrowing, can help to determine relationships around specific events. The performance of the approach is illustrated through simulation studies and an application to hedge fund returns. The approach adapts quickly to abrupt changes in the parameters, especially in periods that can be identified as times of stressful market events, reflecting the dynamic positioning of hedge fund portfolio managers.
In the next paper, David Ardia, Lukasz T. Gatarek, Lennart Hoogerheide and Herman K. van Dijk explore the return and risk of pairs trading strategies using a simulation-based Bayesian procedure for predicting stable ratios of stock prices in a cointegration model. Two pairs trading strategies are investigated: a conditional statistical arbitrage method and an implicit statistical arbitrage method. The paper presents the effect that an encompassing prior under an orthogonal normalization has on the selection of pairs of cointegrated stock prices and on the estimation and prediction of the spread between them and its uncertainty. The model is applied to the stocks in the Dow Jones Composite Average index, where it is shown that the orthogonal normalization is important for the estimation and prediction of the spread.
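For orientation, the sketch below constructs the central object of this paper in the simplest possible way: a spread between two simulated cointegrated price series, estimated here by an ordinary least squares hedge ratio, and a naive threshold trading rule on it. Both the estimator and the thresholds are textbook placeholders rather than the Bayesian procedure with an encompassing prior used in the paper.

```python
# Toy cointegration spread and threshold pairs-trading signal.
import numpy as np

rng = np.random.default_rng(2)

# Simulate two cointegrated (log-)price series sharing one random walk.
n = 1_000
common = np.cumsum(rng.normal(size=n))
p1 = common + rng.normal(scale=0.5, size=n)
p2 = 0.8 * common + rng.normal(scale=0.5, size=n)

# Hedge ratio by OLS and the resulting (stationary) spread.
beta = np.polyfit(p2, p1, deg=1)[0]
spread = p1 - beta * p2
z = (spread - spread.mean()) / spread.std()

# Naive statistical-arbitrage rule: short the spread when it is stretched
# upward, long when stretched downward, flat otherwise.
position = np.where(z > 2, -1, np.where(z < -2, 1, 0))
print(beta, position.mean())
```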
The third paper in this set, by Samuel Malone, Robert Gramacy and Enrique ter Horst, deals with timing foreign exchange markets. The authors use predictable and traded foreign exchange market risk factors as fundamentals to improve short-horizon exchange rate forecasts. In addition, they employ Bayesian treed Gaussian process (BTGP) models to handle non-linear, time-varying relationships between the fundamentals and exchange rates. The paper explains how, through a model averaging Monte Carlo scheme, the BTGP is able to simultaneously exploit smoothness and abrupt breaks in the dynamics between variables. It is shown that trading strategies based on ex ante BTGP forecasts deliver the highest out-of-sample risk-adjusted returns for the median currency, as well as for both predictable, traded risk factors.
The fourth paper in this set, by Haroon Mumtaz, deals with the evolving transmission of uncertainty shocks in the United Kingdom. The paper investigates whether the impact of uncertainty shocks on the UK economy has changed over time. For this purpose, an extended time-varying VAR model is proposed that simultaneously allows the estimation of a measure of uncertainty, encompassing volatility from the real and financial sectors of the economy, and of its time-varying impact on key macroeconomic and financial variables. For the UK data, it is shown that the impact of uncertainty shocks on these variables has declined over time.
The last paper in this special issue deals with handling large sets of macroeconomic series for forecasting and structural analysis. In this paper, Roberto Casarin, Giulia Mantoan and Francesco Ravazzolo introduce a Bayesian calibration method using generalized pools of predictive distributions. Combining several opinions and calibrating them to maximize forecast accuracy is shown to be a crucial issue in many economic problems. The authors apply a Bayesian beta mixture model to derive a combined and calibrated density function. They compare linear, harmonic and logarithmic pooling schemes using simulation experiments and an empirical application to a large database of stock data. The sequential forecasting in the financial application is based on a parallel implementation of the estimation algorithms. The simulation experiments show that, in a beta-mixture calibration framework, the three combination schemes are substantially equivalent. The financial application shows, by contrast, that linear pooling combined with beta mixture calibration achieves the best results in terms of calibrated forecasts.
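As a point of reference, the following sketch evaluates the three pooling schemes compared in the paper for two illustrative Gaussian predictive densities on a grid; the weights, densities and grid are placeholders, and the Bayesian beta mixture calibration step itself is not reproduced.

```python
# Linear, harmonic and logarithmic pools of predictive densities on a grid.
import numpy as np
from scipy import stats


def pools(grid, densities, weights):
    """Return linear, harmonic and logarithmic pools of the given densities."""
    p = np.array([d.pdf(grid) for d in densities])   # one row per forecaster
    w = np.asarray(weights)[:, None]

    linear = np.sum(w * p, axis=0)
    harmonic = 1.0 / np.sum(w / p, axis=0)
    logarithmic = np.exp(np.sum(w * np.log(p), axis=0))

    # The harmonic and logarithmic pools are only proportional to densities,
    # so renormalize them numerically on the grid.
    dx = grid[1] - grid[0]
    harmonic /= np.sum(harmonic) * dx
    logarithmic /= np.sum(logarithmic) * dx
    return linear, harmonic, logarithmic


grid = np.linspace(-6, 6, 1_201)
forecasters = [stats.norm(loc=-1.0, scale=1.0), stats.norm(loc=1.5, scale=0.8)]
linear, harmonic, logarithmic = pools(grid, forecasters, weights=[0.5, 0.5])
print(linear.max(), harmonic.max(), logarithmic.max())
```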
The guest editors wish to thank all referees for a speedy and high-quality evaluation procedure.