
Entropy Application for Forecasting

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (20 August 2019) | Viewed by 39612

Printed Edition Available!
A printed edition of this Special Issue is available here.

Special Issue Editors


Guest Editor
Department of Applied Economics, University of Oviedo, Campus del Cristo s/n, 33006 Oviedo, Asturias, Spain
Interests: inequality and poverty; economic forecasting; inequality of opportunity; sustainable growth

Guest Editor
Department of Applied Economics, University of Oviedo, Campus del Cristo s/n, 33006 Oviedo, Asturias, Spain
Interests: economic modeling and forecasting; economic inequality; forecasting evaluation

Special Issue Information

Dear Colleagues,

The increasing availability of forecasts and the ongoing debate about the relative advantages of alternative forecasting methods suggest the need for further research in this field, including both theoretical developments and innovative applications. Within this context, Information Theory provides a suitable framework for the analysis of forecasting uncertainty.

This special issue of Entropy emphasizes research that addresses forecasting problems using Information Theory. Theoretical and empirical contributions are welcome, including, but not limited to, forecasting techniques, forecast uncertainty, comparison and blending of forecasts, forecasting evaluation and quality, scenario-based forecasting, and other related areas.

Prof. Dr. Ana Jesús López-Menéndez
Prof. Dr. Rigoberto Pérez-Suárez
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • information theory
  • uncertainty
  • forecasting methods
  • forecasting evaluation
  • accuracy
  • M-competition
  • combined forecasts
  • scenarios

Published Papers (11 papers)


Editorial


2 pages, 168 KiB  
Editorial
Entropy Application for Forecasting
by Ana Jesús López-Menéndez and Rigoberto Pérez-Suárez
Entropy 2020, 22(6), 604; https://doi.org/10.3390/e22060604 - 29 May 2020
Cited by 3 | Viewed by 2177
Abstract
The information theory developed by Shannon [...]
(This article belongs to the Special Issue Entropy Application for Forecasting)

Research


18 pages, 7712 KiB  
Article
Time Series Complexities and Their Relationship to Forecasting Performance
by Mirna Ponce-Flores, Juan Frausto-Solís, Guillermo Santamaría-Bonfil, Joaquín Pérez-Ortega and Juan J. González-Barbosa
Entropy 2020, 22(1), 89; https://doi.org/10.3390/e22010089 - 10 Jan 2020
Cited by 15 | Viewed by 6824
Abstract
Entropy is a key concept in the characterization of uncertainty for any given signal, and its extensions, such as Spectral Entropy and Permutation Entropy, have been used to measure the complexity of time series. However, these measures are subject to the discretization employed to study the states of the system. In this paper we identify the relationship between an entropy-based complexity framework and the forecasting error of four methods that participated in the M4 Competition (Smyl, Theta, ARIMA, and ETS); this relationship allows deciding in advance which algorithm is adequate. Moreover, we present a framework extension based on the Emergence, Self-Organization, and Complexity paradigm. Experimentation with both synthetic and M4 Competition time series shows that the feature space induced by the complexity measures visually constrains the performance of the forecasting methods to specific regions: where the logarithm of the error metric is poorest, the Complexity based on Emergence and Self-Organization is maximal.
(This article belongs to the Special Issue Entropy Application for Forecasting)
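
As a rough illustration of the entropy-based complexity measures mentioned above, the Python sketch below computes a normalized permutation entropy for a univariate series. The function name and parameters are illustrative assumptions; this is not the authors' implementation of their complexity framework.

    import numpy as np
    from collections import Counter
    from math import factorial

    def permutation_entropy(x, order=3, normalize=True):
        """Shannon entropy of the distribution of ordinal patterns (rankings)
        observed in sliding windows of length `order` of the series x."""
        x = np.asarray(x, dtype=float)
        n = len(x) - order + 1
        counts = Counter(tuple(np.argsort(x[i:i + order])) for i in range(n))
        p = np.array(list(counts.values()), dtype=float) / n
        h = float(-(p * np.log2(p)).sum())
        return h / np.log2(factorial(order)) if normalize else h

Higher values indicate a more irregular (complex) series, which is the kind of feature the paper relates to forecasting error.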

13 pages, 261 KiB  
Article
An Entropy-Based Machine Learning Algorithm for Combining Macroeconomic Forecasts
by Carles Bretó, Priscila Espinosa, Penélope Hernández and Jose M. Pavía
Entropy 2019, 21(10), 1015; https://doi.org/10.3390/e21101015 - 19 Oct 2019
Cited by 6 | Viewed by 3282
Abstract
This paper applies a Machine Learning approach with the aim of providing a single aggregated prediction from a set of individual predictions. Departing from the well-known maximum-entropy inference methodology, a new factor capturing the distance between the true and the estimated aggregated predictions introduces a new estimation problem. Algorithms such as ridge, lasso, or elastic net help in finding a methodology to tackle this issue. We carry out a simulation study to evaluate the performance of the procedure and apply it to forecast and measure predictive ability using a dataset of predictions of Spanish gross domestic product.
(This article belongs to the Special Issue Entropy Application for Forecasting)
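
The abstract above combines maximum-entropy reasoning with regularized regression. As a simplified, hedged sketch (not the authors' algorithm), the function below estimates combination weights on the simplex with an entropy penalty that pulls the solution toward the uniform, maximum-entropy weighting; the function name and the penalty strength gamma are illustrative assumptions.

    import numpy as np
    from scipy.optimize import minimize

    def combine_forecasts(F, y, gamma=0.1):
        """Estimate simplex-constrained combination weights for the forecasts in
        the columns of F against the realized values y, with an entropy penalty
        pulling the weights toward the uniform (maximum-entropy) combination."""
        n, k = F.shape
        def objective(w):
            mse = np.mean((y - F @ w) ** 2)
            entropy = -np.sum(w * np.log(w + 1e-12))
            return mse - gamma * entropy          # reward more even (higher-entropy) weights
        w0 = np.full(k, 1.0 / k)
        res = minimize(objective, w0, method="SLSQP",
                       bounds=[(0.0, 1.0)] * k,
                       constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
        return res.x

With gamma = 0 this reduces to constrained least squares; larger gamma moves the weights toward the simple arithmetic mean.
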
19 pages, 1990 KiB  
Article
Demand Forecasting Approaches Based on Associated Relationships for Multiple Products
by Ming Lei, Shalang Li and Shasha Yu
Entropy 2019, 21(10), 974; https://doi.org/10.3390/e21100974 - 05 Oct 2019
Cited by 4 | Viewed by 3195
Abstract
As product variety is an important feature of modern enterprises, multi-product demand forecasting is essential to support order decision-making and inventory management. However, well-established forecasting approaches for multi-dimensional time series, such as Vector Autoregression (VAR) or the dynamic factor model (DFM), cannot deal well with time series of high or ultra-high dimensionality, especially when the series are short. Considering that, besides the demand trends in historical data, the demand of associated products (highly correlated products or products with significant causal links) can also provide rich information for prediction, we propose new forecasting approaches for multiple products. The demand of associated products is treated as a set of predictors added to an AR model to improve its prediction accuracy. When many time series are associated with the target product, we introduce two schemes to reduce the number of variables and avoid over-fitting. Procurement data from a grid company in China are then used to test the forecasting performance of the proposed approaches. The empirical results reveal that, compared with four conventional models, namely single exponential smoothing (SES), autoregression (AR), VAR, and DFM, the new approaches perform better in terms of forecasting errors and inventory simulation performance, and can provide more effective guidance for actual operational activities.
(This article belongs to the Special Issue Entropy Application for Forecasting)
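
To make the idea of adding associated products' demand to an autoregressive model concrete, here is a minimal least-squares sketch; the function name and the lag-1 treatment of associated demand are assumptions for illustration, not the paper's exact specification.

    import numpy as np

    def fit_arx(y, X_assoc, p=2):
        """Fit, by least squares, an AR(p) model for the demand series y augmented
        with the previous-period demand of associated products (rows of X_assoc)."""
        y = np.asarray(y, dtype=float)
        T = len(y)
        rows, targets = [], []
        for t in range(p, T):
            own_lags = y[t - p:t][::-1]          # y_{t-1}, ..., y_{t-p}
            assoc = X_assoc[t - 1]               # associated products' demand at t-1
            rows.append(np.concatenate(([1.0], own_lags, assoc)))
            targets.append(y[t])
        beta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
        return beta                              # intercept, own-lag and associated-product coefficients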

28 pages, 3133 KiB  
Article
A Statistical Method for Estimating Activity Uncertainty Parameters to Improve Project Forecasting
by Mario Vanhoucke and Jordy Batselier
Entropy 2019, 21(10), 952; https://doi.org/10.3390/e21100952 - 28 Sep 2019
Cited by 9 | Viewed by 4725
Abstract
Just like any physical system, projects have entropy that must be managed by spending energy. The entropy is the project's tendency to move to a state of disorder (schedule delays, cost overruns), and the energy process is an inherent part of any project management methodology. In order to manage the inherent uncertainty of these projects, accurate estimates (for durations, costs, resources, etc.) are crucial to make informed decisions. Without these estimates, managers have to fall back on their own intuition and experience, which are undoubtedly crucial for making decisions but are often subject to biases and hard to quantify. This paper builds further on two published calibration methods that aim to extract data from real projects and calibrate them to better estimate the parameters of the probability distributions of activity durations. Both methods rely on the lognormal distribution model to estimate uncertainty in activity durations and perform a sequence of statistical hypothesis tests that take the possible presence of two human biases into account. Based on these two existing methods, a new so-called statistical partitioning heuristic is presented that integrates the best elements of both to further improve the accuracy of estimating the distribution of activity duration uncertainty. A computational experiment was carried out on an empirical database of 83 projects. The experiment shows that the new statistical partitioning method performs at least as well as, and often better than, the two existing calibration methods. The improvement will allow a better quantification of the activity duration uncertainty, which will eventually lead to a better prediction of the project schedule and more realistic expectations about the project outcomes. Consequently, the project manager will be able to better cope with the inherent uncertainty (entropy) of projects with a minimum of managerial effort (energy).
(This article belongs to the Special Issue Entropy Application for Forecasting)
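
The calibration methods discussed above fit a lognormal model to activity durations. A bare-bones sketch of that core step (illustrative only; it omits the paper's hypothesis tests and bias handling) could look as follows:

    import numpy as np

    def fit_lognormal_duration_model(actual, planned):
        """Maximum-likelihood estimates of the lognormal parameters for the
        ratio of actual to planned activity durations."""
        ratios = np.asarray(actual, dtype=float) / np.asarray(planned, dtype=float)
        log_r = np.log(ratios)
        return log_r.mean(), log_r.std(ddof=1)   # mu and sigma of ln(actual/planned)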

23 pages, 4417 KiB  
Article
Evolved-Cooperative Correntropy-Based Extreme Learning Machine for Robust Prediction
by Wenjuan Mei, Zhen Liu, Yuanzhang Su, Li Du and Jianguo Huang
Entropy 2019, 21(9), 912; https://doi.org/10.3390/e21090912 - 19 Sep 2019
Cited by 3 | Viewed by 2584
Abstract
In recent years, the correntropy, instead of the mean squared error, has been widely adopted as a powerful tool for enhancing robustness against noise and outliers by forming local similarity measurements. However, most correntropy-based models either use too simple a description of the correntropy or require too many parameters to be tuned in advance, which is likely to cause poor performance when the correntropy fails to reflect the probability distributions of the signals. Therefore, in this paper, a novel correntropy-based extreme learning machine (ELM) called ECC-ELM is proposed to provide a more robust training strategy based on a newly developed multi-kernel correntropy whose parameters are generated using cooperative evolution. To achieve an accurate description of the correntropy, the method adopts a cooperative evolution that optimizes the bandwidths using switching delayed particle swarm optimization (SDPSO) and generates the corresponding influence coefficients that minimize the minimum integrated error (MIE) to adaptively provide the best solution. Simulated experiments and real-world applications show that the cooperative evolution achieves the optimal solution, providing an accurate description of the probability distribution of the current error in the model. The multi-kernel correntropy built with this optimal solution is therefore more robust against noise and outliers when training the model, which increases the accuracy of the predictions compared with other methods.
(This article belongs to the Special Issue Entropy Application for Forecasting)
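
For reference, a multi-kernel correntropy of the kind the training strategy builds on can be estimated from samples as a weighted sum of Gaussian-kernel similarities of the errors. The sketch below is a generic estimator with hypothetical argument names; the bandwidths and influence coefficients are exactly what ECC-ELM tunes by cooperative evolution.

    import numpy as np

    def multi_kernel_correntropy(predictions, targets, bandwidths, coefficients):
        """Sample estimate of a multi-kernel correntropy: a coefficient-weighted
        sum of Gaussian-kernel similarities evaluated on the prediction errors."""
        e = np.asarray(predictions, dtype=float) - np.asarray(targets, dtype=float)
        kernels = [np.exp(-e ** 2 / (2.0 * s ** 2)).mean() for s in bandwidths]
        return float(np.dot(coefficients, kernels))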

18 pages, 1373 KiB  
Article
A Neutrosophic Forecasting Model for Time Series Based on First-Order State and Information Entropy of High-Order Fluctuation
by Hongjun Guan, Zongli Dai, Shuang Guan and Aiwu Zhao
Entropy 2019, 21(5), 455; https://doi.org/10.3390/e21050455 - 01 May 2019
Cited by 17 | Viewed by 3162
Abstract
In time series forecasting, information presentation directly affects prediction efficiency. Most existing time series forecasting models follow logical rules according to the relationships between neighboring states, without considering the inconsistency of fluctuations over a related period. In this paper, we propose a new perspective on the prediction problem, in which inconsistency is quantified and regarded as a key characteristic of prediction rules. First, a time series is converted to a fluctuation time series by comparing each observation with the corresponding previous one. Then, the upward trend of each fluctuation is mapped to the truth-membership of a neutrosophic set, while a falsity-membership is used for the downward trend. The information entropy of the high-order fluctuation time series is introduced to describe the inconsistency of historical fluctuations and is mapped to the indeterminacy-membership of the neutrosophic set. Finally, an existing similarity measurement for neutrosophic sets is used to find similar states during the forecasting stage, and a weighted arithmetic averaging (WAA) aggregation operator is applied to obtain the forecast according to the corresponding similarity. Compared to existing forecasting models, the neutrosophic forecasting model based on information entropy (NFM-IE) can represent both fluctuation trend and fluctuation consistency information. To test its performance, we used the proposed model to forecast several real time series, such as the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX), the Shanghai Stock Exchange Composite Index (SHSECI), and the Hang Seng Index (HSI). The experimental results show that the proposed model predicts stably across different datasets, and comparing its prediction error with other approaches demonstrates outstanding prediction accuracy and universality.
(This article belongs to the Special Issue Entropy Application for Forecasting)
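
The indeterminacy-membership described above is driven by the information entropy of recent fluctuations. As a small illustrative sketch (the function name and window handling are assumptions), the entropy of the up/equal/down pattern over the last few transitions can be computed as:

    import numpy as np
    from collections import Counter

    def fluctuation_entropy(series, order=5):
        """Shannon entropy of the up (+1) / equal (0) / down (-1) fluctuation
        pattern over the last `order` transitions of the series."""
        diffs = np.sign(np.diff(np.asarray(series, dtype=float)[-(order + 1):]))
        counts = np.array(list(Counter(diffs).values()), dtype=float)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())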

22 pages, 982 KiB  
Article
Assessing the Performance of Hierarchical Forecasting Methods on the Retail Sector
by José Manuel Oliveira and Patrícia Ramos
Entropy 2019, 21(4), 436; https://doi.org/10.3390/e21040436 - 24 Apr 2019
Cited by 14 | Viewed by 4310
Abstract
Retailers need demand forecasts at different levels of aggregation in order to support a variety of decisions along the supply chain. To ensure aligned decision-making across the hierarchy, it is essential that forecasts at the most disaggregated level add up to forecasts at the aggregate levels above. It is not clear whether these aggregate forecasts should be generated independently or by using a hierarchical forecasting method, which ensures coherent decision-making at the different levels but does not necessarily guarantee the same accuracy. To give guidelines on this issue, our empirical study investigates the relative performance of independent and reconciled forecasting approaches using real data from a Portuguese retailer. We consider two alternative model families for generating the base forecasts, namely state space models and ARIMA. Appropriate models from both families are chosen for each time series by minimising the bias-corrected Akaike information criterion. The results show significant improvements in forecast accuracy, providing valuable information to support management decisions. Reconciled forecasts using the Minimum Trace Shrinkage estimator (MinT-Shrink) generally improve on the accuracy of the ARIMA base forecasts for all levels and for the complete hierarchy, across all forecast horizons. The accuracy gains generally increase with the horizon, varying between 1.7% and 3.7% for the complete hierarchy. The gains are also more substantial at the higher levels of aggregation, which means that the information about the individual dynamics of the series, lost due to aggregation, is brought back from the lower levels to the higher levels by the reconciliation process, substantially improving forecast accuracy over the base forecasts.
(This article belongs to the Special Issue Entropy Application for Forecasting)
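
To illustrate the reconciliation idea (though not MinT-Shrink itself, which additionally shrinks the forecast-error covariance), the following sketch performs the simpler OLS reconciliation with a summing matrix S; the two-series hierarchy in the example is hypothetical.

    import numpy as np

    def reconcile_ols(S, base_forecasts):
        """OLS reconciliation: project base forecasts onto coherent forecasts via
        y_tilde = S (S'S)^{-1} S' y_hat. MinT-Shrink replaces the identity
        weighting with a shrunk estimate of the base-forecast error covariance."""
        S = np.asarray(S, dtype=float)
        P = np.linalg.solve(S.T @ S, S.T)        # (S'S)^{-1} S'
        return S @ (P @ np.asarray(base_forecasts, dtype=float))

    # Hypothetical two-level hierarchy: total = A + B
    S = [[1, 1], [1, 0], [0, 1]]
    print(reconcile_ols(S, [105.0, 40.0, 62.0]))  # incoherent base forecasts (105 != 40 + 62)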

12 pages, 283 KiB  
Article
A Data-Weighted Prior Estimator for Forecast Combination
by Esteban Fernández-Vázquez, Blanca Moreno and Geoffrey J.D. Hewings
Entropy 2019, 21(4), 429; https://doi.org/10.3390/e21040429 - 23 Apr 2019
Cited by 4 | Viewed by 2857
Abstract
Forecast combination methods reduce the information in a vector of forecasts to a single combined forecast by using a set of combination weights. Although there are several methods, a typical strategy is to use the simple arithmetic mean to obtain the combined forecast. A priori, the use of this mean could be justified when all the forecasters have had the same performance in the past or when there is not enough information about their performance. In this paper, we explore the use of entropy econometrics as a procedure for combining forecasts that allows discriminating between bad and good forecasters, even with little information. For this purpose, the data-weighted prior (DWP) estimator proposed by Golan (2001) is used for forecaster selection and simultaneous parameter estimation in linear statistical models. In particular, we examine the ability of the DWP estimator to effectively select relevant forecasts among all available forecasts. We test the accuracy of the proposed model with a simulation exercise and compare its ex ante forecasting performance with other methods used to combine forecasts. The results suggest that the proposed method dominates other combining methods, such as equal-weight averages or ordinary least squares methods, among others.
(This article belongs to the Special Issue Entropy Application for Forecasting)
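
The DWP estimator itself follows Golan (2001) and is not reproduced here. As a much simpler stand-in that also discriminates between good and bad forecasters with little information, the sketch below weights each forecaster inversely to its historical mean squared error (an illustrative heuristic, not the paper's method):

    import numpy as np

    def inverse_mse_weights(past_forecasts, past_actuals):
        """Combination weights proportional to the inverse historical MSE of
        each forecaster (columns of past_forecasts); a simple alternative to
        equal weighting when past performance differs across forecasters."""
        errors = np.asarray(past_forecasts, dtype=float) - np.asarray(past_actuals, dtype=float)[:, None]
        inv_mse = 1.0 / np.mean(errors ** 2, axis=0)
        return inv_mse / inv_mse.sum()
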
13 pages, 518 KiB  
Article
Soft Randomized Machine Learning Procedure for Modeling Dynamic Interaction of Regional Systems
by Yuri S. Popkov
Entropy 2019, 21(4), 424; https://doi.org/10.3390/e21040424 - 20 Apr 2019
Cited by 4 | Viewed by 2899
Abstract
The paper suggests a randomized model for the dynamic migratory interaction of regional systems. The locally stationary states of migration flows in the basic and immigration systems are described by corresponding entropy operators. A soft randomization procedure that defines the optimal probability density functions of system parameters and measurement noises is developed. The advantages of soft randomization with approximate empirical data balance conditions are demonstrated; it considerably reduces algorithmic complexity and the demand for computational resources. An example of migratory interaction modeling and testing is given.
(This article belongs to the Special Issue Entropy Application for Forecasting)

17 pages, 954 KiB  
Article
Acknowledging Uncertainty in Economic Forecasting. Some Insight from Confidence and Industrial Trend Surveys
by Ana Jesús López-Menéndez and Rigoberto Pérez-Suárez
Entropy 2019, 21(4), 413; https://doi.org/10.3390/e21040413 - 18 Apr 2019
Cited by 5 | Viewed by 2902
Abstract
The role of uncertainty has become increasingly important in economic forecasting, for both theoretical and empirical reasons. Although the traditional practice consisted of reporting point predictions without specifying the attached probabilities, uncertainty about the prospects deserves increasing attention, and recent literature has tried to quantify the level of uncertainty perceived by different economic agents, also examining its effects and determinants. In this context, the present paper analyzes uncertainty in economic forecasting, paying attention to qualitative perceptions from confidence and industrial trend surveys and making use of the related ex-ante probabilities. With this objective, two entropy-based measures (Shannon's and quadratic entropy) are computed, providing significant evidence about the perceived level of uncertainty. Our empirical findings show that survey respondents are able to distinguish between current and prospective uncertainty and between general and personal uncertainty. Furthermore, we find that uncertainty negatively affects economic growth.
(This article belongs to the Special Issue Entropy Application for Forecasting)
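
The two uncertainty measures used in the paper are straightforward to compute from the ex-ante probabilities attached to the survey response categories. A minimal sketch, assuming three hypothetical response shares (increase / no change / decrease) and Shannon entropy in bits:

    import numpy as np

    def uncertainty_measures(probabilities):
        """Shannon entropy (in bits) and quadratic (Gini-Simpson) entropy of the
        ex-ante probabilities attached to the survey response categories."""
        p = np.asarray(probabilities, dtype=float)
        p = p[p > 0] / p.sum()
        shannon = float(-(p * np.log2(p)).sum())
        quadratic = float(1.0 - (p ** 2).sum())
        return shannon, quadratic

    # Hypothetical shares: 50% expect an increase, 30% no change, 20% a decrease
    print(uncertainty_measures([0.5, 0.3, 0.2]))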
