Editorial

Computational Complexity and Parallelization in Bayesian Econometric Analysis

by Nalan Baştürk 1, Roberto Casarin 2, Francesco Ravazzolo 3 and Herman K. Van Dijk 4,5,6,*

1 Department of Quantitative Economics, School of Business and Economics, Maastricht University, Maastricht, 6211 LM, The Netherlands
2 Department of Economics, University Ca’ Foscari of Venice, Venice, 31022, Italy
3 Faculty of Economics and Management, Free University of Bozen-Bolzano, Bolzano, 39100, Italy
4 Faculty of Economics and Business Administration, Vrije Universiteit Amsterdam, Amsterdam, 1081 HV, The Netherlands
5 Tinbergen Institute, Amsterdam, 1082 MS, The Netherlands
6 Econometric Institute, Erasmus School of Economics, Erasmus University, Rotterdam, 3062 PA, The Netherlands
* Author to whom correspondence should be addressed.
Econometrics 2016, 4(1), 9; https://doi.org/10.3390/econometrics4010009
Submission received: 27 January 2016 / Revised: 28 January 2016 / Accepted: 3 February 2016 / Published: 22 February 2016
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
Challenging statements have appeared in recent years in the literature on advances in computational procedures, for example: “Tapping the supercomputer under your desk” and “It is trivial to parallelize a value function iteration using Graphics Processing Units” (e.g., see [1,2,3]). These statements refer to the fact that massively parallel computing is becoming an easy-to-use and revolutionary tool for handling more complex economic issues using large data sets (e.g., see [4,5]) in combination with inferential, forecasting and decision methods that require a tremendous speed-up in computations. Applications are spreading rapidly in many fields, but have so far reached only a few areas of economics and finance.
The computational revolution in simulation techniques has, however, proved to be a key ingredient in the field of Bayesian econometrics and has opened new possibilities to study complex economic and financial phenomena. Applications include risk measurement, forecasting, and the assessment of policy effectiveness in macroeconomics, finance, marketing and monetary economics (see [6], among others).
This special issue aims to contribute to this literature by collecting a set of carefully evaluated papers grouped under three topics: parallelized Bayesian algorithms; combining inference and decision analysis in economics and finance; and handling large sets of macroeconomic series for forecasting.
The focus in the first set of papers is on the use of parallelized, simulation-based Bayesian econometric methods to analyze complex econometric problems. Recent advances in computational procedures make it possible to speed up computations in several ways. Computations can be performed on multiple cores or clusters by running independent parts of the code in parallel; such parts may involve the same econometric model applied to different data sets, or the non-recursive parts of the main algorithm. Additionally, operations can be performed on a GPU multiprocessor architecture, which is efficient at handling large matrix operations and offers massive parallel-processing capability.
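To make the notion of running independent parts of the code in parallel concrete, here is a minimal sketch (our own illustration, not code from any of the papers): the same conjugate posterior computation is mapped over independent data sets with the Python standard-library process pool, so each data set is handled by a separate core.

```python
# Minimal sketch: embarrassingly parallel estimation over independent data sets.
from concurrent.futures import ProcessPoolExecutor

import numpy as np

def posterior_mean(data, prior_mean=0.0, prior_prec=1.0, noise_prec=1.0):
    """Posterior mean of a normal mean under a conjugate normal prior (illustrative model)."""
    post_prec = prior_prec + len(data) * noise_prec
    return (prior_prec * prior_mean + noise_prec * data.sum()) / post_prec

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    datasets = [rng.normal(loc=mu, size=500) for mu in (0.0, 1.0, 2.0, 3.0)]
    # The four estimations are independent, hence trivially parallelizable.
    with ProcessPoolExecutor() as pool:
        means = list(pool.map(posterior_mean, datasets))
    print(means)  # each close to its data set's true mean
```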
The opening paper, by John Geweke, explains the Sequentially Adaptive Bayesian Learning (SABL) algorithm and the corresponding SABL software (see [7,8]). The paper utilizes methodological innovations in SABL, including the optimization of irregular and multi-modal functions, and produces the conventional maximum likelihood asymptotic variance matrix as a by-product. The practical application analyzes the secular and cyclical half-life and the cycle period of US real gross domestic product.
The next paper, by Nalan Baştürk, Stefano Grassi, Lennart Hoogerheide and Herman K. van Dijk, deals with practical experience with four canonical econometric models using the Parallelized Mixture of t Distributions estimated by an Importance Sampling weighted Expectation-Maximization (ParMitISEM) algorithm. The results show that parallelizing the MitISEM algorithm on Graphics Processing Units and multi-core Central Processing Units is straightforward and fast to program using MATLAB, and that the Graphics Processing Unit version is substantially faster than the Central Processing Unit one. The econometric applications include inference for a mixture GARCH model, an instrumental variable regression model and a new Keynesian Phillips curve model. These models involve highly non-standard distributions in high dimensions.
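As background on the building block that MitISEM refines, the sketch below (an illustration under our own assumptions, not the ParMitISEM code) shows plain importance sampling with a single Student-t proposal for a bimodal target; MitISEM replaces the single t density with an adaptively fitted mixture of t densities.

```python
# Minimal sketch: importance sampling with a heavy-tailed Student-t proposal.
import numpy as np
from scipy import stats

def log_target(theta):
    # Illustrative non-Gaussian target: an equal-weight mixture of two normals.
    return np.logaddexp(stats.norm.logpdf(theta, -2, 0.7),
                        stats.norm.logpdf(theta, 2, 0.7)) - np.log(2)

rng = np.random.default_rng(1)
proposal = stats.t(df=5, loc=0.0, scale=3.0)   # heavy tails cover both modes
draws = proposal.rvs(size=50_000, random_state=rng)

log_w = log_target(draws) - proposal.logpdf(draws)
w = np.exp(log_w - log_w.max())                # stabilize before normalizing
w /= w.sum()

print("IS estimate of E[theta]:", np.sum(w * draws))   # approximately 0
print("Effective sample size:", 1.0 / np.sum(w**2))    # weight-degeneracy diagnostic
```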
The third paper, by Arnaud Dufays, deals with an evolutionary sequential Monte Carlo sampler applied to the analysis of change-point models. In this paper a tempered and time (TNT) algorithm is developed which combines (off-line) tempered Sequential Monte Carlo (SMC) inference with on-line SMC inference for drawing realizations from many sequential posterior distributions without experiencing a particle degeneracy problem. Furthermore, it introduces a new MCMC rejuvenation step that is generic, automated and well-suited for multi-modal distributions. This update is inspired by the heuristic differential evolution optimization literature, and many extensions of the algorithm are possible. The algorithm is applied to the marginal likelihood calculation of change-point GARCH models.
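The tempering idea at the core of such samplers can be illustrated in a few lines. The sketch below (our simplified illustration, not the TNT algorithm itself) moves a particle population from the prior to the posterior through the bridge densities p(θ)L(θ)^φ, alternating reweighting, resampling and a random-walk Metropolis rejuvenation step; Dufays's contribution replaces the rejuvenation step with a differential-evolution-inspired update.

```python
# Minimal sketch: likelihood-tempered SMC with a fixed tempering schedule.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
data = rng.normal(1.5, 1.0, size=200)

def log_like(theta):
    return stats.norm.logpdf(data[None, :], theta[:, None], 1.0).sum(axis=1)

def log_prior(theta):
    return stats.norm.logpdf(theta, 0.0, 10.0)

n = 2000
theta = rng.normal(0.0, 10.0, size=n)          # draw from the prior (phi = 0)
phis = np.linspace(0.0, 1.0, 21)               # bridge from prior to posterior

for phi_prev, phi in zip(phis[:-1], phis[1:]):
    # Reweight by the incremental tempered likelihood.
    log_w = (phi - phi_prev) * log_like(theta)
    w = np.exp(log_w - log_w.max()); w /= w.sum()
    # Resample to fight particle degeneracy.
    theta = theta[rng.choice(n, size=n, p=w)]
    # Rejuvenate with one Metropolis step targeting the current bridge density.
    prop = theta + 0.2 * rng.normal(size=n)
    log_acc = (log_prior(prop) + phi * log_like(prop)
               - log_prior(theta) - phi * log_like(theta))
    accept = np.log(rng.uniform(size=n)) < log_acc
    theta = np.where(accept, prop, theta)

print("posterior mean ~", theta.mean())        # close to data.mean()
```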
The second set of papers refers to the combination of inference and decision analysis in economics and finance, involving complex models and computational challenges. The first paper, by Urbi Garay, Enrique ter Horst, German Molina and Abel Rodriguez, deals with Bayesian nonparametric measurement of factor betas and with clustering of financial returns. The authors propose a dynamic, self-adjusting mixture of Gaussian graphical models to cluster financial returns, and provide a new method for extracting nonparametric estimates of dynamic alphas (excess returns) and betas (with respect to a chosen set of explanatory factors) in a multivariate setting. The clusters, obtained as a by-product and used for shrinkage and information borrowing, can help determine relationships around specific events. The performance of the approach is illustrated through simulation studies and an application to hedge fund returns. The approach adapts quickly to abrupt changes in the parameters, especially in periods of stressful market events, reflecting the dynamic positioning of hedge fund portfolio managers.
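For readers who want a concrete anchor, the sketch below (an illustrative static regression on simulated data, not the authors' Bayesian nonparametric model) shows the factor regression r_t = α + β′f_t + ε_t whose intercept and slopes are the alpha and betas that the paper generalizes to dynamic, cluster-shrunk estimates.

```python
# Minimal sketch: static alpha and factor betas by ordinary least squares.
import numpy as np

rng = np.random.default_rng(3)
T, K = 500, 2
factors = rng.normal(size=(T, K))                      # explanatory factor returns
returns = 0.001 + factors @ np.array([0.8, -0.3]) + 0.01 * rng.normal(size=T)

X = np.column_stack([np.ones(T), factors])             # intercept column = alpha
coef, *_ = np.linalg.lstsq(X, returns, rcond=None)
alpha, betas = coef[0], coef[1:]
print("alpha:", alpha, "betas:", betas)
```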
In the next paper, the return and risk of pairs trading strategies are explored by David Ardia, Lukasz T. Gatarek, Lennart Hoogerheide and Herman K. van Dijk using a simulation-based Bayesian procedure for predicting stable ratios of stock prices in a cointegration model. Two pairs trading strategies are investigated: a conditional statistical arbitrage method and an implicit statistical arbitrage method. The paper presents the effect that an encompassing prior under an orthogonal normalization has on the selection of pairs of cointegrated stock prices and on the estimation and prediction of the spread between cointegrated stock prices and its uncertainty. The model is applied to the stocks in the Dow Jones Composite Average index, where it is shown that orthogonal normalization is important for the estimation and prediction of the spread.
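The mechanics of a pairs trade can be sketched as follows (a deliberately simple least-squares illustration on simulated prices; the paper instead uses Bayesian inference in a cointegration model with an encompassing prior): estimate the cointegrating ratio, form the spread, and trade its deviations from the mean.

```python
# Minimal sketch: cointegrating ratio, spread, and a threshold trading signal.
import numpy as np

rng = np.random.default_rng(4)
T = 1000
common = np.cumsum(rng.normal(size=T))                 # shared stochastic trend
p1 = common + rng.normal(scale=0.5, size=T)
p2 = 0.5 * common + rng.normal(scale=0.5, size=T)

gamma = np.linalg.lstsq(p2[:, None], p1, rcond=None)[0][0]
spread = p1 - gamma * p2                               # stationary if cointegrated

z = (spread - spread.mean()) / spread.std()
position = np.where(z > 1, -1, np.where(z < -1, 1, 0)) # short/long the spread
print("cointegrating ratio:", gamma, "time in market:", (position != 0).mean())
```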
The third paper in this set, by Samuel Malone, Robert Gramacy and Enrique ter Horst, deals with timing foreign exchange markets. The authors use predictable, traded foreign exchange market risk factors as fundamentals to improve short-horizon exchange rate forecasts, and employ Bayesian treed Gaussian process (BTGP) models to handle non-linear, time-varying relationships between the fundamentals and exchange rates. The paper explains how, through a model-averaging Monte Carlo scheme, the BTGP is able to exploit both smoothness and rough breaks in between-variable dynamics. It is shown that trading strategies based on ex ante BTGP forecasts deliver the highest out-of-sample risk-adjusted returns for the median currency, as well as for both predictable, traded risk factors.
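As a point of reference, the sketch below (our illustration of the underlying building block, not the BTGP implementation) computes the posterior predictive mean of a standard Gaussian process regression with a squared-exponential kernel; the treed extension partitions the input space and fits such a GP within each leaf, which is how rough breaks are accommodated.

```python
# Minimal sketch: Gaussian process regression posterior predictive mean.
import numpy as np

def sq_exp_kernel(a, b, length=1.0, signal=1.0):
    d2 = (a[:, None] - b[None, :]) ** 2
    return signal**2 * np.exp(-0.5 * d2 / length**2)

rng = np.random.default_rng(5)
x = np.sort(rng.uniform(-3, 3, size=40))
y = np.sin(x) + 0.1 * rng.normal(size=40)
x_star = np.linspace(-3, 3, 100)

K = sq_exp_kernel(x, x) + 0.1**2 * np.eye(len(x))      # add observation noise
K_star = sq_exp_kernel(x_star, x)
mean = K_star @ np.linalg.solve(K, y)                  # posterior predictive mean
print(mean[:5])                                        # should track sin(x_star)
```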
The fourth paper in this set, by Haroon Mumtaz, deals with the evolving transmission of uncertainty shocks in the United Kingdom. The author investigates whether the impact of uncertainty shocks on the UK economy has changed over time. For this purpose, he proposes an extended time-varying VAR model that allows the simultaneous estimation of a measure of uncertainty, encompassing volatility from the real and financial sectors of the economy, and of its time-varying impact on key macroeconomic and financial variables. For the UK data, it is shown that the impact of uncertainty shocks on these variables has declined over time.
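As a fixed-coefficient point of comparison (the paper's model instead lets coefficients and volatilities drift over time), the sketch below estimates a VAR(1) by least squares on simulated data and traces an impulse response; all names and numbers are our own illustration.

```python
# Minimal sketch: VAR(1) estimation and impulse response to a unit shock.
import numpy as np

rng = np.random.default_rng(6)
T, k = 400, 2
A_true = np.array([[0.5, 0.1], [0.3, 0.4]])
y = np.zeros((T, k))
for t in range(1, T):
    y[t] = y[t - 1] @ A_true.T + rng.normal(scale=0.1, size=k)

A_hat, *_ = np.linalg.lstsq(y[:-1], y[1:], rcond=None)  # regress y_t on y_{t-1}
A_hat = A_hat.T

shock = np.array([1.0, 0.0])                            # unit shock to variable 1
irf = [shock]
for _ in range(10):
    irf.append(A_hat @ irf[-1])                         # propagate the shock
print("response of variable 2:", [round(v[1], 3) for v in irf])
```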
The last paper in this special issue deals with handling large sets of macroeconomic series for forecasting and structural analysis. In the paper by Roberto Casarin, Giulia Mantoan and Francesco Ravazzolo, a Bayesian calibration method using generalized pools of predictive distributions is introduced. Combining multiple opinions and calibrating them to maximize forecast accuracy is shown to be a crucial issue in several economic problems. The authors apply a Bayesian beta mixture model to derive a combined and calibrated density function, and compare linear, harmonic and logarithmic pooling schemes using simulation experiments and an empirical application to a large database of stock data. The sequential forecasting in the financial application is based on a parallel implementation of the estimation algorithms. The simulation experiments show that, in a beta-mixture calibration framework, the three combination schemes are substantially equivalent. The financial application shows, on the contrary, that linear pooling together with beta mixture calibration achieves the best results in terms of calibrated forecasts.
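The three pooling schemes compared in the paper have standard forms: the linear pool averages densities, the harmonic pool averages their reciprocals, and the logarithmic pool takes a weighted geometric mean. The sketch below (our illustration with two Gaussian predictive densities; weights and grid are arbitrary, and it omits the paper's beta-mixture calibration step) computes all three on a grid.

```python
# Minimal sketch: linear, harmonic and logarithmic pools of two densities.
import numpy as np
from scipy import stats

y = np.linspace(-6, 6, 2001)
dy = y[1] - y[0]
p1 = stats.norm.pdf(y, -1.0, 1.0)
p2 = stats.norm.pdf(y, 1.5, 0.8)
w = np.array([0.6, 0.4])                               # arbitrary pool weights

def normalize(density):
    return density / (density.sum() * dy)              # renormalize on the grid

linear = normalize(w[0] * p1 + w[1] * p2)
harmonic = normalize(1.0 / (w[0] / p1 + w[1] / p2))
logarithmic = normalize(p1**w[0] * p2**w[1])

for name, pool in [("linear", linear), ("harmonic", harmonic),
                   ("logarithmic", logarithmic)]:
    print(name, "pool mean:", (y * pool).sum() * dy)
```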
The guest editors thank all referees for a speedy and high-quality evaluation procedure.

References

  1. A. Lee, C. Yau, M.B. Giles, A. Doucet, and C.C. Holmes. “On the Utility of Graphics Cards to Perform Massively Parallel Simulation of Advanced Monte Carlo Methods.” J. Comput. Graph. Stat. 19 (2010): 769–789.
  2. E.M. Aldrich, J. Fernández-Villaverde, A.R. Gallant, and J.F. Rubio-Ramírez. “Tapping the Supercomputer Under Your Desk: Solving Dynamic Equilibrium Models with Graphics Processors.” J. Econ. Dyn. Control 35 (2011): 386–393.
  3. S. Morozov, and S. Mathur. “Massively Parallel Computation Using Graphics Processors with Application to Optimal Experimentation in Dynamic Control.” Comput. Econ. 40 (2012): 151–182.
  4. L. Einav, and J. Levin. “Economics in the Age of Big Data.” Science 346 (2014): 715–718.
  5. H. Varian. “Big Data: New Tricks for Econometrics.” J. Econ. Perspect. 28 (2014): 3–28.
  6. R. Casarin, S. Grassi, F. Ravazzolo, and H.K. van Dijk. Dynamic Predictive Density Combinations for Large Data Sets in Economics and Finance. Technical Report 2015-084/III; Amsterdam and Rotterdam, The Netherlands: Tinbergen Institute, 2015.
  7. J. Geweke, and G. Durham. “Massively Parallel Sequential Monte Carlo for Bayesian Inference.” 2011. Available online: http://dx.doi.org/10.2139/ssrn.1964731 (accessed on 5 February 2016).
  8. J. Geweke, H. Xu, B. Peng, and S. Yin. MATLAB and SABL Toolbox. Version 2015a; Sydney, Australia: University of Technology Sydney, 2015.
