Exploring the Advancements and Future Research Directions of Artificial Neural Networks: A Text Mining Approach
Abstract
1. Introduction
2. Background
2.1. Artificial Neural Network
2.2. Applications of ANN-Based Approaches
2.3. Text Mining for ANN
3. Research Methodology
- What does this data set reveal about the most common ANN research directions?
- What trends in ANN research directions can be observed over the January 2000–2020 sample period?
- Which areas of ANN research are most likely to shape the field in the future?
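To address these questions, the study mines article metadata with clustering and word-frequency analysis (Section 4.2). The snippet below is a minimal, hypothetical sketch of such a pipeline, TF-IDF vectorization followed by k-means using scikit-learn; the corpus, feature limit, and cluster count are illustrative assumptions and not the authors' exact configuration.

```python
# Hypothetical sketch of the paper's text-mining step (Section 4.2): TF-IDF
# vectorization of abstracts followed by k-means clustering. The corpus,
# feature limit, and cluster count below are illustrative assumptions only.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

abstracts = [
    "Deep learning for image classification with convolutional networks",
    "Recurrent neural networks for multivariate time series prediction",
    "Dopamine signalling, deep brain stimulation and Parkinson's disease",
]  # placeholder documents standing in for the harvested Springer abstracts

# Represent each abstract as a TF-IDF vector, dropping English stop words.
vectorizer = TfidfVectorizer(stop_words="english", max_features=5000)
X = vectorizer.fit_transform(abstracts)

# Group the documents into clusters; the number of clusters is arbitrary here.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)

# Interpret each cluster via its highest-weighted terms.
terms = vectorizer.get_feature_names_out()
for c in range(kmeans.n_clusters):
    top_idx = kmeans.cluster_centers_[c].argsort()[::-1][:5]
    print(f"cluster {c}:", [terms[i] for i in top_idx])
```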
4. Data Analysis and Discussion
4.1. Descriptive Analysis
4.2. Text Mining Analysis (Clustering)
4.2.1. Clustering Results
4.2.2. Word Frequency Distribution
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Advantages | Disadvantages | Limitations |
|---|---|---|
| High accuracy in modeling complex systems | Overfitting | Sensitivity to noise and irrelevant inputs |
| Capable of handling large amounts of data | Slow convergence in training | Lack of interpretability |
| Can handle non-linear relationships | Prone to local minima | Requires careful selection of architecture and hyperparameters |
| Can perform unsupervised learning | Can be computationally intensive | Can be sensitive to the choice of activation function |
| Can perform feature extraction | Can be prone to overfitting | Requires large amounts of computational resources |
| Can handle imbalanced data | Can be prone to overtraining | Requires extensive preprocessing and feature engineering |
| Can model complex, non-linear relationships | Can be sensitive to the choice of loss function | Requires careful initialization of weights |
| Can perform online learning | Can have poor generalization performance | Can be sensitive to the choice of optimization algorithm |
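Several of the disadvantages listed above (overfitting, overtraining, hyperparameter sensitivity) are commonly mitigated with regularization and early stopping. The snippet below is an illustrative sketch only, using scikit-learn's MLPClassifier on synthetic data; it is not a method taken from the surveyed literature.

```python
# Illustrative only: a small MLP with L2 regularization and early stopping,
# two common responses to the overfitting/overtraining issues in the table.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = MLPClassifier(
    hidden_layer_sizes=(64, 32),   # architecture choice is itself a hyperparameter
    alpha=1e-3,                    # L2 penalty to curb overfitting
    early_stopping=True,           # stop when the validation score plateaus
    max_iter=500,
    random_state=0,
)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```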
| Journal Name | Total Articles |
|---|---|
| Neural Processing Letters | 1414 |
| Journal of Neural Transmission | 2429 |
| Neural Computing and Applications | 4747 |
| Artificial Intelligence Review | 782 |
| Optical Memory and Neural Networks | 440 |
| Journal of Computational Neuroscience | 799 |
| Total | 10,661 |
| Rank | Keyword | Keyword Count |
|---|---|---|
| 1 | Neural network | 678 |
| 2 | Artificial neural network | 392 |
| 3 | Classification | 222 |
| 4 | Machine learning | 177 |
| 5 | Deep learning | 169 |
| 6 | Genetic algorithm | 163 |
| 7 | Support vector machine | 153 |
| 8 | Particle swarm optimization | 145 |
| 9 | Feature extraction | 128 |
| 10 | Optimization | 125 |
| 11 | Feature selection | 123 |
| 12 | Recurrent neural network | 110 |
| 13 | Extreme learning machine | 106 |
| 14 | Clustering | 103 |
| 15 | Dopamine | 101 |
| 16 | Face recognition | 98 |
| 17 | Pattern recognition | 90 |
| 18 | Data mining | 98 |
| 19 | Parkinson’s disease | 87 |
| 20 | Schizophrenia | 85 |
| 21 | Support vector machines | 80 |
| 22 | Synchronization | 78 |
| 23 | Depression | 76 |
| 24 | Genetic algorithm | 75 |
| 25 | Fuzzy logic | 65 |
| 26 | Hippocampus | 64 |
| 27 | Prediction | 64 |
| 28 | Dimensionality reduction | 59 |
| 29 | Exponential stability | 58 |
| 30 | Cognition | 58 |
| 31 | Artificial intelligence | 57 |
| 32 | Adaptive control | 56 |
| 33 | Stability | 54 |
| 34 | Dementia | 53 |
| 35 | Deep brain stimulation | 52 |
| 36 | Oxidative stress | 52 |
| 37 | Electroencephalogram | 51 |
| 38 | Swarm intelligence | 51 |
| 39 | Support vector regression | 46 |
| 40 | Forecasting | 46 |
| 41 | Serotonin | 46 |
| 42 | Polymorphism | 46 |
| 43 | Linear matrix inequality | 45 |
| 44 | Time-varying delay | 45 |
| 45 | Reinforcement learning | 44 |
| 46 | Botulinum toxin | 43 |
| 47 | Parkinson’s disease | 42 |
| 48 | ADHD | 42 |
| 49 | Modeling | 41 |
| 50 | Evolutionary algorithm | 41 |
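A ranking such as the one above corresponds to the word-frequency analysis of Section 4.2.2 and can be produced with a simple keyword tally. The sketch below assumes each article record exposes an author-keyword list; the record structure is a hypothetical placeholder for whatever the harvested metadata actually provides.

```python
# Hypothetical sketch of the keyword-frequency tally behind a ranking like the
# one above; the record structure is a placeholder for the harvested metadata.
from collections import Counter

records = [
    {"keywords": ["Neural network", "Classification"]},
    {"keywords": ["neural network", "Deep learning"]},
]  # placeholder article records

# Normalize keywords (trim whitespace, lower-case) before counting.
keyword_counts = Counter(
    kw.strip().lower() for rec in records for kw in rec["keywords"]
)

# Print the top-50 keywords with their ranks and counts.
for rank, (keyword, count) in enumerate(keyword_counts.most_common(50), start=1):
    print(rank, keyword, count)
```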
| Dimension | 2007–2011 | 2012–2015 | 2016–2019 |
|---|---|---|---|
| Mental disease | 142 | 138 | 153 |
| Mental curing | 57 | 37 | 35 |
| Brain | 84 | 70 | 110 |
| Deep learning | 544 | 702 | 1110 |
| Algorithm | 91 | 218 | 392 |
| Data mining | 195 | 278 | 501 |
| Stability | 22 | 32 | 58 |
| Reasoning | 29 | 32 | 62 |
| Transmission | 53 | 47 | 47 |
| Machine learning | 30 | 75 | 178 |

(The original table also included a "Trend Lines" column of per-dimension sparkline plots, omitted here.)
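Per-period counts like those in the table can be obtained by bucketing matched records into the three time windows. The sketch below assumes each record carries a publication year and an assigned dimension label; both field names are illustrative, not the authors' actual schema.

```python
# Hypothetical sketch: bucket (dimension, year) records into the three
# periods used in the trend table above.
from collections import defaultdict

records = [
    {"dimension": "Deep learning", "year": 2013},
    {"dimension": "Machine learning", "year": 2018},
]  # placeholder records

periods = [("2007–2011", 2007, 2011), ("2012–2015", 2012, 2015), ("2016–2019", 2016, 2019)]
counts = defaultdict(lambda: {label: 0 for label, _, _ in periods})

# Assign each record to the period that contains its publication year.
for rec in records:
    for label, start, end in periods:
        if start <= rec["year"] <= end:
            counts[rec["dimension"]][label] += 1
            break

for dimension, by_period in counts.items():
    print(dimension, by_period)
```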
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Kariri, E.; Louati, H.; Louati, A.; Masmoudi, F. Exploring the Advancements and Future Research Directions of Artificial Neural Networks: A Text Mining Approach. Appl. Sci. 2023, 13, 3186. https://doi.org/10.3390/app13053186