Forecasting the Demand for Container Throughput Using a Mixed-Precision Neural Architecture Based on CNN–LSTM
Abstract
1. Introduction
2. Literature Review
3. Methods
3.1. Proposed Method
3.1.1. Input Layer
3.1.2. CNN Layer
3.1.3. LSTM Layer
3.2. Baseline Models
3.2.1. Random Forest Regression (RFR)
3.2.2. Linear Regression (LR)
3.2.3. Adaptive Boosting (AdaBoost)
3.2.4. Support Vector Regression (SVR)
3.3. Rolling Forecasting
3.3.1. Parameter Settings
3.3.2. Forecasting Performance Criteria
4. Results and Discussion
4.1. Data Source
4.2. Comparison of Models
5. Conclusions and Future Work
Author Contributions
Funding
Conflicts of Interest
References
- Guerrero, D.; Rodrigue, J.-P. The waves of containerization: Shifts in global maritime transportation. J. Transp. Geogr. 2014, 34, 151–164.
- Bassan, S. Evaluating seaport operation and capacity analysis—Preliminary methodology. Marit. Policy Manag. 2007, 34, 3–19.
- Jeevan, J.; Ghaderi, H.; Bandara, Y.M.; Saharuddin, A.; Othman, M. The Implications of the Growth of Port Throughput on the Port Capacity: The Case of Malaysian Major Container Seaports. Int. J. E-Navig. Marit. Econ. 2015, 3, 84–98.
- Schulze, P.M.; Prinz, A. Forecasting container transshipment in Germany. Appl. Econ. 2009, 41, 2809–2815.
- Wang, J.; Jia, R.; Zhao, W.; Wu, J.; Dong, Y. Application of the largest Lyapunov exponent and non-linear fractal extrapolation algorithm to short-term load forecasting. Chaos Solitons Fractals 2012, 45, 1277–1287.
- Akay, D.; Atak, M. Grey prediction with rolling mechanism for electricity demand forecasting of Turkey. Energy 2007, 32, 1670–1675.
- Guo, Y.; Nazarian, E.; Ko, J.; Rajurkar, K. Hourly cooling load forecasting using time-indexed ARX models with two-stage weighted least squares regression. Energy Convers. Manag. 2014, 80, 46–53.
- Xiao, J.; Xiao, Y.; Fu, J.; Lai, K.K. A transfer forecasting model for container throughput guided by discrete PSO. J. Syst. Sci. Complex. 2014, 27, 181–192.
- Mo, L.; Xie, L.; Jiang, X.; Teng, G.; Xu, L.; Xiao, J. GMDH-based hybrid model for container throughput forecasting: Selective combination forecasting in nonlinear subseries. Appl. Soft Comput. 2018, 62, 478–490.
- Niu, M.; Hu, Y.; Sun, S.; Liu, Y. A novel hybrid decomposition-ensemble model based on VMD and HGWO for container throughput forecasting. Appl. Math. Model. 2018, 57, 163–178.
- Chen, S.-H.; Chen, J.-N. Forecasting container throughputs at ports using genetic programming. Expert Syst. Appl. 2010, 37, 2054–2058.
- Syafi'i; Kuroda, K.; Takebayashi, M. Forecasting the Demand of Container Throughput in Indonesia. Mem. Constr. Eng. Res. Inst. 2005, 47, 69–78.
- Guo, Z.; Song, X.; Ye, J. A Verhulst model on time series error corrected for port throughput forecasting. J. East. Asia Soc. Transp. Stud. 2005, 6, 881–891.
- Gosasang, V.; Chandraprakaikul, W.; Kiattisin, S. A Comparison of Traditional and Neural Networks Forecasting Techniques for Container Throughput at Bangkok Port. Asian J. Shipp. Logist. 2011, 27, 463–482.
- Geng, J.; Li, M.-W.; Dong, Z.-H.; Liao, Y.-S. Port throughput forecasting by MARS-RSVR with chaotic simulated annealing particle swarm optimization algorithm. Neurocomputing 2015, 147, 239–250.
- Bengio, Y. Learning Deep Architectures for AI. Found. Trends Mach. Learn. 2009, 2, 1–127.
- Bengio, Y.; Courville, A.; Vincent, P. Representation Learning: A Review and New Perspectives. IEEE Trans. Pattern Anal. Mach. Intell. 2013, 35, 1798–1828.
- Schmidhuber, J. Deep learning in neural networks: An overview. Neural Netw. 2015, 61, 85–117.
- Yang, C.-H.; Moi, S.-H.; Ou-Yang, F.; Chuang, L.-Y.; Hou, M.-F.; Lin, Y.-D. Identifying Risk Stratification Associated With a Cancer for Overall Survival by Deep Learning-Based CoxPH. IEEE Access 2019, 7, 67708–67717.
- Stoean, R.; Stoean, C.; Atencia, M.; Rodríguez-Labrada, R.; Joya, G. Ranking Information Extracted from Uncertainty Quantification of the Prediction of a Deep Learning Model on Medical Time Series Data. Mathematics 2020, 8, 1078.
- Tsai, Y.-S.; Hsu, L.-H.; Hsieh, Y.-Z.; Lin, S.-S. The Real-Time Depth Estimation for an Occluded Person Based on a Single Image and OpenPose Method. Mathematics 2020, 8, 1333.
- Jinsakul, N.; Tsai, C.-F.; Tsai, C.-E.; Wu, P. Enhancement of Deep Learning in Image Classification Performance Using Xception with the Swish Activation Function for Colorectal Polyp Preliminary Screening. Mathematics 2019, 7, 1170.
- Fawaz, H.I.; Forestier, G.; Weber, J.; Idoumghar, L.; Muller, P.-A. Deep learning for time series classification: A review. Data Min. Knowl. Discov. 2019, 33, 917–963.
- Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780.
- Gers, F.A.; Schmidhuber, J.; Cummins, F. Learning to forget: Continual prediction with LSTM. In Proceedings of the 9th International Conference on Artificial Neural Networks (ICANN 1999), Edinburgh, UK, 7–10 September 1999.
- Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32.
- Nathans, L.L.; Oswald, F.L.; Nimon, K. Interpreting multiple linear regression: A guidebook of variable importance. Pract. Assess. Res. Eval. 2012, 17, 1–19.
- Freund, Y.; Schapire, R.; Abe, N. A short introduction to boosting. J. Jpn. Soc. Artif. Intell. 1999, 14, 771–780.
- Vapnik, V. The Nature of Statistical Learning Theory; Springer Science & Business Media: New York, NY, USA, 2013.
- Zendehboudi, A.; Baseer, M.; Saidur, R. Application of support vector machine models for forecasting solar and wind energy resources: A review. J. Clean. Prod. 2018, 199, 272–285.
- Drucker, H.; Burges, C.J.; Kaufman, L.; Smola, A.J.; Vapnik, V. Support vector regression machines. In Proceedings of the Advances in Neural Information Processing Systems 9 (NIPS 1996), Denver, CO, USA, 2–5 December 1996; pp. 155–161.
- Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O. Scikit-learn: Machine Learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830.
- Kingma, D.P.; Ba, J. Adam: A Method for Stochastic Optimization. arXiv 2014, arXiv:1412.6980.
- Nair, V.; Hinton, G.E. Rectified linear units improve restricted Boltzmann machines. In Proceedings of the 27th International Conference on Machine Learning (ICML-10), Haifa, Israel, 21–24 June 2010; pp. 807–814.
- Hyndman, R.J.; Athanasopoulos, G. Forecasting: Principles and Practice; OTexts: Melbourne, Australia, 2013.
- Chen, C.; Twycross, J.; Garibaldi, J.M. A new accuracy measure based on bounded relative error for time series forecasting. PLoS ONE 2017, 12, e0174202.
- Hyndman, R.J.; Koehler, A.B. Another look at measures of forecast accuracy. Int. J. Forecast. 2006, 22, 679–688.
- Diebold, F.X.; Mariano, R.S. Comparing Predictive Accuracy. J. Bus. Econ. Stat. 2002, 20, 134–144.
- Derrac, J.; García, S.; Molina, D.; Herrera, F. A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evol. Comput. 2011, 1, 3–18.
- Wilcoxon, F. Individual Comparisons by Ranking Methods. In Breakthroughs in Statistics; Springer: New York, NY, USA, 1992; pp. 196–202.
- Siegel, S.; Castellan, N.J. Nonparametric Statistics for the Behavioral Sciences, 2nd ed.; McGraw-Hill: New York, NY, USA, 1988.
- Hong, W.-C.; Li, M.-W.; Geng, J.; Zhang, Y. Novel chaotic bat algorithm for forecasting complex motion of floating platforms. Appl. Math. Model. 2019, 72, 425–443.
- Fan, G.-F.; Peng, L.-L.; Hong, W.-C. Short term load forecasting based on phase space reconstruction algorithm and bi-square kernel regression model. Appl. Energy 2018, 224, 13–33.
- Dong, Y.; Zhang, Z.; Hong, W.-C. A Hybrid Seasonal Mechanism with a Chaotic Cuckoo Search Algorithm with a Support Vector Regression Model for Electric Load Forecasting. Energies 2018, 11, 1009.
Notation | Explanation |
---|---|
LSTM Input | |
x_t | Current input vector, the deep features extracted by the CNN |
c_{t−1} | Memory cell state from the previous LSTM unit |
h_{t−1} | Hidden state output from the previous LSTM unit |
LSTM Output | |
c_t | Newly updated memory cell state |
h_t | Current hidden state output |
y_t | Current output |
LSTM Nonlinearities | |
σ | Activation function: sigmoid |
tanh | Activation function: hyperbolic tangent |
LSTM Vector Operations | |
⊙ | Pointwise multiplication |
⊕ | Pointwise addition |
Weights | |
W_f, W_i, W_o, W_c | Recurrent weight vectors for the forget gate, input gate, output gate, and memory cell |
U_f, U_i, U_o, U_c | Input weight vectors for the forget gate, input gate, output gate, and memory cell |
b_f, b_i, b_o, b_c | Bias vectors for the forget gate, input gate, output gate, and memory cell |
Gates | |
f_t | Forget gate: determines whether old information is retained |
i_t | Input gate: determines whether the input is adopted |
o_t | Output gate: determines whether the output is adopted |
Loss Function | |
θ | The parameters to be learned by the LSTM |
y | Ground-truth value |
ŷ | Predicted value |
ŷ_i | The 12 months of container throughput to be predicted, where i = 1, 2, …, 12 |
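For reference, this notation combines into the standard LSTM update equations of Hochreiter and Schmidhuber, with the forget gate of Gers et al.; the sketch below uses the textbook convention for placing the recurrent (W) and input (U) weight matrices, which may differ cosmetically from the paper's own equations:

```latex
\begin{aligned}
f_t &= \sigma\left(W_f h_{t-1} + U_f x_t + b_f\right) && \text{(forget gate)}\\
i_t &= \sigma\left(W_i h_{t-1} + U_i x_t + b_i\right) && \text{(input gate)}\\
o_t &= \sigma\left(W_o h_{t-1} + U_o x_t + b_o\right) && \text{(output gate)}\\
\tilde{c}_t &= \tanh\left(W_c h_{t-1} + U_c x_t + b_c\right) && \text{(candidate memory)}\\
c_t &= \left(f_t \odot c_{t-1}\right) \oplus \left(i_t \odot \tilde{c}_t\right) && \text{(new memory cell state)}\\
h_t &= o_t \odot \tanh(c_t) && \text{(current hidden state)}
\end{aligned}
```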
Port | Max | Min | Mean | Median | Q1 | Q3 | IQR | SD | CV (%) |
---|---|---|---|---|---|---|---|---|---|
Kaohsiung | 189.126 | 29.504 | 71.283 | 53.171 | 53.171 | 113.914 | 60.743 | 36.385 | 51.0% |
Hualien | 166.882 | 28.328 | 84.094 | 75.604 | 56.852 | 107.491 | 50.638 | 32.222 | 38.3% |
Taichung | 67.146 | 25.114 | 44.416 | 43.797 | 38.408 | 48.918 | 10.510 | 8.531 | 19.2% |
Anping | 97.937 | 4.878 | 23.047 | 13.194 | 9.621 | 33.245 | 23.624 | 18.764 | 81.4% |
Suao | 24.467 | 3.118 | 12.357 | 12.201 | 9.649 | 14.848 | 5.199 | 3.968 | 32.1% |
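As an illustration, the summary statistics in the table above can be reproduced from a monthly throughput series with a short script; the file and column names below are placeholders, not the study's actual data pipeline:

```python
import pandas as pd

def describe_throughput(series: pd.Series) -> dict:
    """Compute the summary statistics reported in the table above."""
    q1, q3 = series.quantile(0.25), series.quantile(0.75)
    sd = series.std()  # sample standard deviation (ddof=1)
    return {
        "Max": series.max(),
        "Min": series.min(),
        "Mean": series.mean(),
        "Median": series.median(),
        "Q1": q1,
        "Q3": q3,
        "IQR": q3 - q1,
        "SD": sd,
        "CV (%)": 100.0 * sd / series.mean(),  # coefficient of variation
    }

# Hypothetical usage with one port's monthly series:
# df = pd.read_csv("throughput.csv")           # placeholder file
# print(describe_throughput(df["Kaohsiung"]))  # placeholder column
```

The coefficient of variation (CV = SD / mean × 100) is what marks Anping (81.4%) as the most volatile series and Taichung (19.2%) as the most stable.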
Port | Metric | LR | AdaBoost | RFR | SVR | CNN–LSTM |
---|---|---|---|---|---|---|
Kaohsiung | MAPE (%) | 10.93 | 9.92 | 9.85 | 8.40 | 5.78 * |
 | SMAPE (%) | 96.90 | 9.20 | 9.20 | 7.50 | 5.70 * |
 | MAAPE (%) | 10.70 | 9.70 | 9.70 | 8.10 | 5.60 * |
 | RMSE | 128.21 | 114.39 | 113.90 | 108.05 | 72.55 * |
 | MASE | 0.96 | 0.87 | 0.87 | 0.72 | 0.87 |
 | MBRAE | 0.51 | 0.51 | 0.52 | 0.48 | 0.51 |
Hualien | MAPE (%) | 14.27 | 14.15 | 12.87 | 12.43 | 7.46 * |
 | SMAPE (%) | 14.00 | 13.60 | 12.30 | 11.00 | 9.60 * |
 | MAAPE (%) | 14.00 | 13.80 | 12.60 | 11.90 | 9.70 * |
 | RMSE | 11.71 | 11.65 | 10.81 | 10.57 | 6.67 * |
 | MASE | 0.68 | 0.67 | 0.61 | 0.54 | 0.62 |
 | MBRAE | 0.39 | 0.40 | 0.37 | 0.36 | 0.37 |
Taichung | MAPE (%) | 19.26 | 19.02 | 32.15 | 11.18 | 9.39 * |
 | SMAPE (%) | 18.90 | 18.50 | 19.10 | 10.70 | 10.40 * |
 | MAAPE (%) | 18.10 | 17.90 | 18.40 | 10.90 | 10.80 * |
 | RMSE | 143.50 | 143.21 | 141.35 | 79.73 | 68.98 * |
 | MASE | 1.45 | 1.43 | 1.47 | 0.84 | 0.75 * |
 | MBRAE | 0.53 | 0.53 | 0.53 | 0.47 | 0.43 * |
Anping | MAPE (%) | 22.23 | 21.85 | 21.86 | 16.78 | 12.47 * |
 | SMAPE (%) | 20.20 | 20.70 | 19.90 | 14.70 | 11.50 * |
 | MAAPE (%) | 20.90 | 20.60 | 20.70 | 15.90 | 12.30 * |
 | RMSE | 3.16 | 3.27 | 3.09 | 2.42 | 1.85 * |
 | MASE | 1.05 | 1.079 | 1.04 | 0.76 | 0.72 * |
 | MBRAE | 0.45 | 0.46 | 0.46 | 0.38 | 0.39 |
Suao | MAPE (%) | 29.01 | 28.69 | 28.88 | 26.88 | 18.65 * |
 | SMAPE (%) | 29.30 | 29.50 | 29.20 | 24.20 | 20.10 * |
 | MAAPE (%) | 27.30 | 26.90 | 27.20 | 25.50 | 18.50 * |
 | RMSE | 13.43 | 13.79 | 13.42 | 10.55 | 9.67 * |
 | MASE | 0.85 | 0.85 | 0.85 | 0.72 | 0.57 * |
 | MBRAE | 0.46 | 0.45 | 0.46 | 0.42 | 0.35 * |
Avg | MAPE (%) | 19.14 | 18.73 | 21.12 | 15.14 | 10.75 * |
 | SMAPE (%) | 35.86 | 18.30 | 17.94 | 13.62 | 11.46 * |
 | MAAPE (%) | 18.20 | 17.78 | 17.72 | 14.46 | 11.38 * |
 | RMSE | 60.00 | 57.26 | 56.52 | 42.26 | 31.94 * |
 | MASE | 1.00 | 0.98 | 0.97 | 0.71 | 0.70 * |
 | MBRAE | 0.48 | 0.47 | 0.47 | 0.42 | 0.41 * |
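The six error measures reported above can be implemented as follows. This is a sketch using the common definitions (MASE after Hyndman and Koehler; MBRAE after Chen et al.); using the naive forecast as the MASE scaling series and as the MBRAE benchmark is an assumption, not necessarily the paper's exact protocol:

```python
import numpy as np

def mape(y, f):
    """Mean absolute percentage error, in percent."""
    y, f = np.asarray(y, float), np.asarray(f, float)
    return 100.0 * np.mean(np.abs((y - f) / y))

def smape(y, f):
    """Symmetric MAPE, in percent."""
    y, f = np.asarray(y, float), np.asarray(f, float)
    return 100.0 * np.mean(2.0 * np.abs(f - y) / (np.abs(y) + np.abs(f)))

def maape(y, f):
    """Mean arctangent absolute percentage error, in percent; a bounded
    variant of MAPE that stays finite for values near zero."""
    y, f = np.asarray(y, float), np.asarray(f, float)
    return 100.0 * np.mean(np.arctan(np.abs((y - f) / y)))

def rmse(y, f):
    """Root mean squared error."""
    y, f = np.asarray(y, float), np.asarray(f, float)
    return float(np.sqrt(np.mean((y - f) ** 2)))

def mase(y, f, y_train):
    """MASE (Hyndman & Koehler, 2006): test-set MAE scaled by the
    in-sample MAE of the one-step naive forecast."""
    y, f, y_train = (np.asarray(a, float) for a in (y, f, y_train))
    scale = np.mean(np.abs(np.diff(y_train)))
    return float(np.mean(np.abs(y - f)) / scale)

def mbrae(y, f, f_bench):
    """Mean bounded relative absolute error (Chen et al., 2017):
    BRAE_t = |e_t| / (|e_t| + |e*_t|), where e* is the error of a
    benchmark forecast f_bench (e.g., the naive forecast)."""
    y, f, f_bench = (np.asarray(a, float) for a in (y, f, f_bench))
    e, e_star = np.abs(y - f), np.abs(y - f_bench)
    return float(np.mean(e / (e + e_star)))
```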
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).