Econometrics, Volume 6, Issue 1 (March 2018) – 14 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open it.
18 pages, 761 KiB  
Article
Statistical Inference on the Canadian Middle Class
by Russell Davidson
Econometrics 2018, 6(1), 14; https://doi.org/10.3390/econometrics6010014 - 13 Mar 2018
Cited by 1 | Viewed by 7089
Abstract
Conventional wisdom says that the middle classes in many developed countries have recently suffered losses, in terms of both the share of the total population belonging to the middle class and their share in total income. Here, distribution-free methods are developed for inference on these shares, by deriving expressions for the asymptotic variances of the sample estimates and for the covariance between them. Asymptotic inference can be undertaken based on asymptotic normality. Bootstrap inference can be expected to be more reliable, and appropriate bootstrap procedures are proposed. As an illustration, samples of individual earnings drawn from Canadian census data are used to test various hypotheses about the middle-class shares, and confidence intervals for them are computed. It is found that, for the earlier censuses, sample sizes are large enough for asymptotic and bootstrap inference to be almost identical, but that, in the twenty-first century, the bootstrap fails on account of a strange phenomenon whereby many presumably different incomes in the data are rounded to one and the same value. Another difference between the centuries is the appearance of heavy right-hand tails in the income distributions of both men and women. Full article
(This article belongs to the Special Issue Econometrics and Income Inequality)
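As a rough illustration of the quantities estimated in the article above, the sketch below computes the two middle-class shares from a sample and percentile-bootstrap confidence intervals for them. The middle-class band (50%–150% of the median income), the function names and the synthetic data are illustrative assumptions, not the paper's definitions or procedures.

```python
import numpy as np

def middle_class_shares(income, lo=0.5, hi=1.5):
    """Population and income shares of the 'middle class', defined here
    (illustrative assumption) as incomes between lo and hi times the median."""
    income = np.asarray(income, dtype=float)
    m = np.median(income)
    in_band = (income >= lo * m) & (income <= hi * m)
    return in_band.mean(), income[in_band].sum() / income.sum()

def bootstrap_ci(income, n_boot=999, alpha=0.05, seed=0):
    """Percentile bootstrap confidence intervals for both shares."""
    rng = np.random.default_rng(seed)
    n = len(income)
    stats = np.array([middle_class_shares(rng.choice(income, n, replace=True))
                      for _ in range(n_boot)])
    return np.quantile(stats, [alpha / 2, 1 - alpha / 2], axis=0)

# usage with synthetic lognormal "earnings"
sample = np.random.default_rng(1).lognormal(mean=10.0, sigma=0.8, size=5000)
print(middle_class_shares(sample))
print(bootstrap_ci(sample))
```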
21 pages, 352 KiB  
Article
An Overview of Modified Semiparametric Memory Estimation Methods
by Marie Busch and Philipp Sibbertsen
Econometrics 2018, 6(1), 13; https://doi.org/10.3390/econometrics6010013 - 12 Mar 2018
Cited by 3 | Viewed by 7134
Abstract
Several modified estimation methods for the memory parameter have been introduced in recent years. They aim to decrease the upward bias of the memory parameter estimate in the presence of low-frequency contaminations or an additive noise component, especially when a short-memory process is contaminated. In this paper, we provide an overview and compare the performance of nine semiparametric estimation methods. Among them are two standard methods, four modified approaches to account for low-frequency contaminations and three procedures developed for perturbed fractional processes. We conduct an extensive Monte Carlo study for a variety of parameter constellations and several DGPs. Furthermore, an empirical application to the log-absolute return series of the S&P 500 shows that the estimation results, combined with a long-memory test, indicate a spurious long-memory process. Full article
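For orientation, the sketch below implements a simplified version of one of the baseline semiparametric estimators that the modified methods build on: a log-periodogram (GPH-type) regression for the memory parameter d. The bandwidth rule and the use of −2·log(λ_j) as regressor are illustrative simplifications, not the modified estimators compared in the paper.

```python
import numpy as np

def gph_estimate(x, bandwidth_exponent=0.65):
    """Simplified log-periodogram (GPH-type) estimate of the memory parameter d:
    regress log I(lambda_j) on -2*log(lambda_j) over the first m Fourier frequencies."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    m = int(n ** bandwidth_exponent)             # bandwidth (assumed rule of thumb)
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    dft = np.fft.fft(x - x.mean())[1:m + 1]
    periodogram = (np.abs(dft) ** 2) / (2 * np.pi * n)
    X = np.column_stack([np.ones(m), -2 * np.log(freqs)])   # slope on -2*log(lambda) is d
    beta, *_ = np.linalg.lstsq(X, np.log(periodogram), rcond=None)
    return beta[1]                                # estimated d

# usage: white noise should give an estimate of d close to 0
print(gph_estimate(np.random.default_rng(0).standard_normal(4096)))
```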
17 pages, 1121 KiB  
Article
Response-Based Sampling for Binary Choice Models With Sample Selection
by Maria Felice Arezzo and Giuseppina Guagnano
Econometrics 2018, 6(1), 12; https://doi.org/10.3390/econometrics6010012 - 07 Mar 2018
Cited by 6 | Viewed by 7482
Abstract
Sample selection models attempt to correct for non-randomly selected data in a two-model hierarchy where, on the first level, a binary selection equation determines whether a particular observation will be available for the second level (outcome equation). If the non-random selection mechanism induced by the selection equation is ignored, the coefficient estimates in the outcome equation may be severely biased. When the selection mechanism leads to many censored observations, few data are available for the estimation of the outcome equation parameters, giving rise to computational difficulties. In this context, the main reference is Greene (2008), who extends the results obtained by Manski and Lerman (1977) and develops an estimator that requires knowledge of the true proportion of occurrences in the outcome equation. We develop a method that exploits the advantages of response-based sampling schemes in the context of binary response models with sample selection, relaxing this assumption. Estimation is based on a weighted version of Heckman’s likelihood, where the weights take into account the sampling design. In a simulation study, we found that, for the outcome equation, the results obtained with our estimator are comparable to Greene’s in terms of mean square error. Moreover, in a real-data application, it is preferable in terms of the percentage of correct predictions. Full article
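The response-based (choice-based) sampling weights that enter such a weighted likelihood can be illustrated as follows: each observation is weighted by the ratio of the population share of its outcome to its share in the sample, in the spirit of Manski and Lerman (1977). This is a minimal sketch with hypothetical names and data; the paper's estimator applies weights of this kind inside Heckman's likelihood rather than in isolation.

```python
import numpy as np

def response_based_weights(y, population_share):
    """Manski-Lerman style weights for a response-based sample:
    w_i = Q(y_i) / H(y_i), the population share of outcome y_i divided by its
    share in the sample. `population_share` is the assumed true proportion of y = 1."""
    y = np.asarray(y, dtype=int)
    sample_share_1 = y.mean()
    q = np.where(y == 1, population_share, 1 - population_share)
    h = np.where(y == 1, sample_share_1, 1 - sample_share_1)
    return q / h

# usage: an over-sampled rare outcome (about 40% of the sample vs. 10% of the population)
y = np.random.default_rng(0).binomial(1, 0.4, size=1000)
w = response_based_weights(y, population_share=0.10)
print(w[:5], w.mean())
```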
28 pages, 371 KiB  
Article
Jackknife Bias Reduction in the Presence of a Near-Unit Root
by Marcus J. Chambers and Maria Kyriacou
Econometrics 2018, 6(1), 11; https://doi.org/10.3390/econometrics6010011 - 05 Mar 2018
Cited by 2 | Viewed by 7110
Abstract
This paper considers the specification and performance of jackknife estimators of the autoregressive coefficient in a model with a near-unit root. The limit distributions of sub-sample estimators that are used in the construction of the jackknife estimator are derived, and the joint moment generating function (MGF) of two components of these distributions is obtained and its properties explored. The MGF can be used to derive the weights for an optimal jackknife estimator that removes fully the first-order finite sample bias from the estimator. The resulting jackknife estimator is shown to perform well in finite samples and, with a suitable choice of the number of sub-samples, is shown to reduce the overall finite sample root mean squared error, as well as bias. However, the optimal jackknife weights rely on knowledge of the near-unit-root parameter and of a quantity related to the long-run variance of the disturbance process, both of which are typically unknown in practice; this dependence is therefore characterised fully, and the issues that arise in the most general settings are discussed. Full article
(This article belongs to the Special Issue Celebrated Econometricians: Peter Phillips)
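A generic sub-sample jackknife for the AR(1) coefficient looks as sketched below, using the standard weights m/(m−1) and −1/(m−1); the optimal weights derived in the paper for the near-unit-root case differ and depend on the localising parameter and the long-run variance. The names, the choice m = 2 and the simulated data are illustrative.

```python
import numpy as np

def ols_ar1(y):
    """OLS estimate of rho in y_t = rho * y_{t-1} + u_t (no intercept)."""
    y0, y1 = y[:-1], y[1:]
    return (y0 @ y1) / (y0 @ y0)

def jackknife_ar1(y, m=2):
    """Sub-sample jackknife: combine the full-sample estimate with the average of
    m non-overlapping sub-sample estimates using the standard stationary-case
    weights m/(m-1) and -1/(m-1) (the optimal near-unit-root weights differ)."""
    y = np.asarray(y, dtype=float)
    full = ols_ar1(y)
    subs = [ols_ar1(chunk) for chunk in np.array_split(y, m)]
    return (m / (m - 1)) * full - (1 / (m - 1)) * np.mean(subs)

# usage: near-unit-root AR(1) with rho = 0.98
rng = np.random.default_rng(0)
y = np.zeros(400)
for t in range(1, 400):
    y[t] = 0.98 * y[t - 1] + rng.standard_normal()
print(ols_ar1(y), jackknife_ar1(y, m=2))
```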
16 pages, 1527 KiB  
Article
Top Incomes, Heavy Tails, and Rank-Size Regressions
by Christian Schluter
Econometrics 2018, 6(1), 10; https://doi.org/10.3390/econometrics6010010 - 02 Mar 2018
Cited by 3 | Viewed by 7133
Abstract
In economics, rank-size regressions provide popular estimators of tail exponents of heavy-tailed distributions. We discuss the properties of this approach when the tail of the distribution is regularly varying rather than strictly Pareto. The estimator then over-estimates the true value in the leading parametric income models (so the upper income tail is less heavy than estimated), which leads to test size distortions and undermines inference. For practical work, we propose a sensitivity analysis based on regression diagnostics in order to assess the likely impact of the distortion. The methods are illustrated using data on top incomes in the UK. Full article
(This article belongs to the Special Issue Econometrics and Income Inequality)
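The rank-size regression itself is straightforward to sketch: regress log rank (shifted by 1/2 in the spirit of Gabaix and Ibragimov's small-sample correction) on log size over the k largest observations, the slope giving minus the tail exponent. The cut-off k below and the Pareto test data are arbitrary illustrative choices; the paper's point is precisely that this estimator can be distorted when the tail is regularly varying rather than strictly Pareto.

```python
import numpy as np

def rank_size_tail_exponent(x, k=200):
    """Rank-size estimate of the Pareto tail exponent alpha from the k largest
    observations: regress log(rank - 1/2) on log(size); the slope is -alpha."""
    top = np.sort(np.asarray(x, dtype=float))[-k:][::-1]   # largest k, descending
    ranks = np.arange(1, k + 1)
    X = np.column_stack([np.ones(k), np.log(top)])
    beta, *_ = np.linalg.lstsq(X, np.log(ranks - 0.5), rcond=None)
    return -beta[1]

# usage: Pareto sample with true tail exponent alpha = 2
rng = np.random.default_rng(0)
pareto = (1 / rng.uniform(size=100_000)) ** (1 / 2.0)
print(rank_size_tail_exponent(pareto, k=500))
```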
15 pages, 287 KiB  
Article
A Spatial-Filtering Zero-Inflated Approach to the Estimation of the Gravity Model of Trade
by Rodolfo Metulini, Roberto Patuelli and Daniel A. Griffith
Econometrics 2018, 6(1), 9; https://doi.org/10.3390/econometrics6010009 - 22 Feb 2018
Cited by 25 | Viewed by 9689
Abstract
Nonlinear estimation of the gravity model with Poisson-type regression methods has become popular for modelling international trade flows, because it permits a better accounting for zero flows and extreme values in the distribution tail. Nevertheless, as trade flows are not independent of each other due to spatial and network autocorrelation, these methods may lead to biased parameter estimates. To overcome this problem, eigenvector spatial filtering (ESF) variants of the Poisson/negative binomial specifications have been proposed in the literature on gravity modelling of trade. However, no specific treatment has been developed for cases in which many zero flows are present. This paper contributes to the literature in two ways. First, we employ a stepwise selection criterion for spatial filters that is based on robust (sandwich) p-values and does not require likelihood-based indicators; to this end, we develop an ad hoc backward stepwise function in R. Second, using this function, we select a reduced set of spatial filters that properly accounts for importer-side and exporter-side specific spatial effects, as well as network effects, in both the count and the logit processes of zero-inflated methods. Applying this estimation strategy to a cross-section of bilateral trade flows between 64 countries for the year 2000, we find that our specification outperforms the benchmark models in terms of model fit, both with respect to the AIC and in predicting zero (and small) flows. Full article
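For readers unfamiliar with zero-inflated count regressions, the sketch below fits a plain zero-inflated Poisson gravity equation with statsmodels on synthetic data. It omits the eigenvector spatial filters that are the paper's contribution, and all variable names, coefficient values and the data-generating process are made up for illustration.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

# synthetic bilateral flows: log GDPs and log distance (illustrative data only)
rng = np.random.default_rng(0)
n = 2000
lgdp_o, lgdp_d = rng.normal(0, 1, n), rng.normal(0, 1, n)
ldist = rng.normal(0, 1, n)
mu = np.exp(0.5 + 0.8 * lgdp_o + 0.8 * lgdp_d - 1.0 * ldist)
flows = rng.poisson(mu) * rng.binomial(1, 0.7, n)   # extra zeros from a logit-type process

X = sm.add_constant(np.column_stack([lgdp_o, lgdp_d, ldist]))
zip_model = ZeroInflatedPoisson(flows, X, exog_infl=X, inflation='logit')
res = zip_model.fit(method='bfgs', maxiter=500, disp=False)
print(res.params)   # count-process coefficients plus logit inflation coefficients
```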
24 pages, 341 KiB  
Article
Lasso Maximum Likelihood Estimation of Parametric Models with Singular Information Matrices
by Fei Jin and Lung-fei Lee
Econometrics 2018, 6(1), 8; https://doi.org/10.3390/econometrics6010008 - 22 Feb 2018
Cited by 4 | Viewed by 7388
Abstract
An information matrix of a parametric model being singular at a certain true value of a parameter vector is irregular. The maximum likelihood estimator in the irregular case usually has a rate of convergence slower than the √n-rate in a regular case. We propose to estimate such models by the adaptive lasso maximum likelihood and propose an information criterion to select the involved tuning parameter. We show that the penalized maximum likelihood estimator has the oracle properties. The method can implement model selection and estimation simultaneously and the estimator always has the usual √n-rate of convergence. Full article
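In the linear-model special case, the adaptive lasso can be sketched with the usual column-rescaling trick: penalise each coefficient by the inverse of an initial estimate raised to a power γ, which is equivalent to running an ordinary lasso on rescaled regressors. This is a simplified illustration using scikit-learn and made-up data, not the penalized maximum likelihood estimator or the information criterion proposed in the paper.

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

def adaptive_lasso(X, y, alpha=0.1, gamma=1.0):
    """Adaptive lasso via column rescaling: penalise coefficient j with weight
    1/|beta_init_j|^gamma, implemented by scaling column j by |beta_init_j|^gamma,
    running an ordinary lasso, and mapping the result back."""
    beta_init = LinearRegression().fit(X, y).coef_
    w = np.abs(beta_init) ** gamma + 1e-8          # avoid division by zero
    lasso = Lasso(alpha=alpha).fit(X * w, y)       # ordinary lasso on rescaled columns
    return lasso.coef_ * w                         # back to the original scale

# usage: sparse truth, last three coefficients are zero
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 6))
y = X @ np.array([2.0, -1.0, 0.5, 0.0, 0.0, 0.0]) + rng.standard_normal(500)
print(adaptive_lasso(X, y).round(3))
```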
27 pages, 581 KiB  
Article
A Multivariate Kernel Approach to Forecasting the Variance Covariance of Stock Market Returns
by Ralf Becker, Adam Clements and Robert O'Neill
Econometrics 2018, 6(1), 7; https://doi.org/10.3390/econometrics6010007 - 17 Feb 2018
Cited by 1 | Viewed by 7794
Abstract
This paper introduces a multivariate kernel-based forecasting tool for the prediction of variance-covariance matrices of stock returns. The method introduced allows for the incorporation of macroeconomic variables into the forecasting process of the matrix without resorting to a decomposition of the matrix. The model makes use of similarity forecasting techniques, and it is demonstrated that several popular techniques can be thought of as a subset of this approach. A forecasting experiment demonstrates the potential for the technique to improve the statistical accuracy of forecasts of variance-covariance matrices. Full article
(This article belongs to the Special Issue Volatility Modeling)
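The similarity-forecasting idea can be sketched as a kernel-weighted average of past realized covariance matrices, with weights determined by how close past macroeconomic conditions are to the current ones. The Gaussian kernel, the bandwidth and all names and data below are illustrative assumptions rather than the paper's specification.

```python
import numpy as np

def similarity_forecast(cov_history, state_history, current_state, bandwidth=1.0):
    """Forecast the next covariance matrix as a kernel-weighted average of past
    realized covariance matrices, with Gaussian weights based on how similar
    past macro states are to the current one."""
    states = np.asarray(state_history, dtype=float)
    dist2 = ((states - current_state) ** 2).sum(axis=1)
    w = np.exp(-0.5 * dist2 / bandwidth ** 2)
    w /= w.sum()
    return np.tensordot(w, np.asarray(cov_history), axes=1)   # weighted matrix average

# usage: 100 past 3x3 realized covariance matrices and a 2-variable macro state
rng = np.random.default_rng(0)
covs = np.array([np.cov(rng.standard_normal((3, 50))) for _ in range(100)])
states = rng.standard_normal((100, 2))
print(similarity_forecast(covs, states, current_state=np.array([0.2, -0.1])))
```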
20 pages, 360 KiB  
Article
Estimating Unobservable Inflation Expectations in the New Keynesian Phillips Curve
by Francesca Rondina
Econometrics 2018, 6(1), 6; https://doi.org/10.3390/econometrics6010006 - 05 Feb 2018
Cited by 3 | Viewed by 8305
Abstract
This paper uses an econometric model and Bayesian estimation to reverse engineer the path of inflation expectations implied by the New Keynesian Phillips Curve and the data. The estimated expectations roughly track the patterns of a number of common measures of expected inflation available from surveys or computed from financial data. In particular, they exhibit the strongest correlation with the inflation forecasts of the respondents in the University of Michigan Survey of Consumers. The estimated model also shows evidence of the anchoring of long run inflation expectations to a value that is in the range of the target inflation rate. Full article
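The "reverse engineering" logic can be illustrated in its simplest form: solving a purely forward-looking New Keynesian Phillips Curve for the expectation term gives E_t[π_{t+1}] = (π_t − κ·x_t)/β, up to the error term. The sketch below uses illustrative parameter values and made-up data; it is not the Bayesian estimation procedure of the paper.

```python
import numpy as np

def implied_expectations(inflation, output_gap, beta=0.99, kappa=0.1):
    """Back out the expectation term from a purely forward-looking NKPC,
    pi_t = beta * E_t[pi_{t+1}] + kappa * x_t, ignoring the error term:
    E_t[pi_{t+1}] = (pi_t - kappa * x_t) / beta.  Parameter values are illustrative."""
    return (np.asarray(inflation) - kappa * np.asarray(output_gap)) / beta

# usage with made-up quarterly data (percent)
pi = np.array([2.1, 2.4, 1.9, 2.0])
x = np.array([0.5, 1.0, -0.3, 0.1])
print(implied_expectations(pi, x).round(2))
```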
19 pages, 7486 KiB  
Article
Assessing News Contagion in Finance
by Paola Cerchiello and Giancarlo Nicola
Econometrics 2018, 6(1), 5; https://doi.org/10.3390/econometrics6010005 - 03 Feb 2018
Cited by 23 | Viewed by 11539
Abstract
The analysis of news in the financial context has attracted prominent interest in recent years, because of the possible predictive power of such content, especially in terms of the associated sentiment/mood. In this paper, we focus on a specific aspect of financial news analysis: how the covered topics change across the space and time dimensions. To this purpose, we employ a modified version of the LDA topic model, the so-called Structural Topic Model (STM), which takes covariates into account as well. Our aim is to study the possible evolution of topics extracted from two well-known news archives—Reuters and Bloomberg—and to investigate a causal effect in the diffusion of the news by means of a Granger causality test. Our results show that both the temporal dynamics and the spatial differentiation matter in the news contagion. Full article
(This article belongs to the Special Issue Big Data in Economics and Finance)
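The two building blocks of such an analysis can be sketched as follows: a topic model fitted to news texts, and a Granger causality test between two daily topic-share series. The sketch uses plain LDA from scikit-learn rather than the STM with covariates used in the paper, and the tiny corpus and synthetic series are placeholders.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from statsmodels.tsa.stattools import grangercausalitytests

# 1) topic extraction on a toy corpus (plain LDA; the paper uses STM with covariates)
docs = ["bank earnings beat forecasts", "central bank raises rates",
        "oil prices fall sharply", "rate hike hits bank stocks"]
counts = CountVectorizer().fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
print(lda.transform(counts).round(2))          # per-document topic proportions

# 2) Granger causality between two daily topic-share series (synthetic placeholders)
rng = np.random.default_rng(0)
reuters_share = rng.uniform(0, 1, 120)
bloomberg_share = np.roll(reuters_share, 1) + 0.1 * rng.standard_normal(120)  # lags Reuters
# tests whether the second column (Reuters) Granger-causes the first (Bloomberg)
grangercausalitytests(np.column_stack([bloomberg_share, reuters_share]), maxlag=2)
```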
20 pages, 367 KiB  
Article
From the Classical Gini Index of Income Inequality to a New Zenga-Type Relative Measure of Risk: A Modeller’s Perspective
by Francesca Greselin and Ričardas Zitikis
Econometrics 2018, 6(1), 4; https://doi.org/10.3390/econometrics6010004 - 25 Jan 2018
Cited by 16 | Viewed by 28125
Abstract
The underlying idea behind the construction of indices of economic inequality is based on measuring deviations of various portions of low incomes from certain references or benchmarks, which could be point measures like the population mean or median, or curves like the hypotenuse of the right triangle into which every Lorenz curve falls. In this paper, we argue that, by appropriately choosing population-based references (called societal references) and distributions of personal positions (called gambles, which are random), we can meaningfully unify classical and contemporary indices of economic inequality, and various measures of risk. To illustrate the herein proposed approach, we put forward and explore a risk measure that takes into account the relativity of large risks with respect to small ones. Full article
(This article belongs to the Special Issue Econometrics and Income Inequality)
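As a point of reference, the classical sample Gini index that the paper takes as its starting point can be computed directly from ordered incomes. The sketch below uses the standard sample formula and synthetic lognormal data; it is not the Zenga-type relative measure of risk proposed in the paper.

```python
import numpy as np

def gini(income):
    """Sample Gini index from ordered incomes x_(1) <= ... <= x_(n):
    G = 2 * sum_i i * x_(i) / (n * sum_i x_(i)) - (n + 1) / n."""
    x = np.sort(np.asarray(income, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    return 2 * (i * x).sum() / (n * x.sum()) - (n + 1) / n

# usage: lognormal incomes with sigma = 0.8 (theoretical Gini is about 0.43)
sample = np.random.default_rng(0).lognormal(mean=10.0, sigma=0.8, size=100_000)
print(round(gini(sample), 3))
```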
15 pages, 1010 KiB  
Article
Spurious Seasonality Detection: A Non-Parametric Test Proposal
by Aurelio F. Bariviera, Angelo Plastino and George Judge
Econometrics 2018, 6(1), 3; https://doi.org/10.3390/econometrics6010003 - 19 Jan 2018
Cited by 3 | Viewed by 7178
Abstract
This paper offers a general and comprehensive definition of the day-of-the-week effect. Using symbolic dynamics, we develop a unique test based on ordinal patterns in order to detect it. This test uncovers the fact that the so-called “day-of-the-week” effect is partly an artifact of the hidden correlation structure of the data. We also present simulations based on artificial time series: while time series generated with long memory are prone to exhibit daily seasonality, pure white-noise signals exhibit no pattern preference. Since ours is a non-parametric test, it requires no assumptions about the distribution of returns, so it could be a practical alternative to conventional econometric tests. We also provide an exhaustive application of the proposed technique to 83 stock indexes around the world. Finally, the paper highlights the relevance of symbolic analysis in economic time series studies. Full article
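The ordinal-pattern idea can be sketched as follows: map each trading week's five returns to the permutation that sorts them and test the observed pattern frequencies against the uniform distribution over all 5! = 120 patterns. This chi-square version is a simplified illustration on synthetic data, not the authors' exact test statistic.

```python
import numpy as np
from itertools import permutations
from scipy.stats import chisquare

def weekday_pattern_test(returns):
    """Map each block of 5 daily returns (one trading week) to its ordinal pattern
    (the permutation given by argsort) and chi-square test the observed pattern
    frequencies against the uniform distribution over all 120 patterns."""
    r = np.asarray(returns, dtype=float)
    weeks = r[:len(r) - len(r) % 5].reshape(-1, 5)
    pattern_index = {p: k for k, p in enumerate(permutations(range(5)))}
    observed = np.zeros(len(pattern_index))
    for week in weeks:
        observed[pattern_index[tuple(np.argsort(week))]] += 1
    return chisquare(observed)    # H0: all patterns equally likely (no weekday effect)

# usage: white-noise returns should not reject the null
print(weekday_pattern_test(np.random.default_rng(0).standard_normal(5 * 2000)))
```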
2 pages, 145 KiB  
Editorial
Acknowledgement to Reviewers of Econometrics in 2017
by Econometrics Editorial Office
Econometrics 2018, 6(1), 2; https://doi.org/10.3390/econometrics6010002 - 10 Jan 2018
Viewed by 5751
Abstract
Peer review is an essential part of the publication process, ensuring that Econometrics maintains high quality standards for its published papers. In 2017, a total of 47 papers were published in the journal.[...] Full article
139 KiB  
Editorial
Recent Developments in Cointegration
by Katarina Juselius
Econometrics 2018, 6(1), 1; https://doi.org/10.3390/econometrics6010001 - 31 Dec 2017
Cited by 4 | Viewed by 6978
(This article belongs to the Special Issue Recent Developments in Cointegration)