An Automated Recognition of Work Activity in Industrial Manufacturing Using Convolutional Neural Networks
Abstract
1. Introduction
2. Research Literature
3. Materials and Methods
3.1. The Service Used as a Case Study
- stopping the furnace from operating,
- checking the solid fuel tank,
- checking the gear motor and auger,
- assembling the auger and the gear motor, and
- tightening the mounting screws of the gear motor and mounting the cleanout.
3.2. New Approach to the Automatic Generation of Work Instructions
Algorithm 1:

    initialization;
    Input ← (k, l); // data from Algorithm 6
    z ← 0;
    for i = 1 to 3 do begin
        inc(z);
        if (l + z − 1 == size(I[k][])) then begin
            inc(k); z = 0; l = 1;
        end;
        send → (I[k, l + z]);
    end;
    end
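As a worked illustration, the pseudocode above can be sketched in Python. This is an assumed reading of the algorithm, not the authors' implementation: starting from activity `k` and step `l`, it emits the next three instruction images `I[k, l+z]`, rolling over to the first image of the next activity when the current one is exhausted. The names `I`, `k`, `l`, and `send` mirror the pseudocode; its indices are 1-based, so the sketch subtracts 1 when indexing Python lists.

```python
def send_next_instructions(I, k, l, send):
    """Send three consecutive instruction images, wrapping across activities.

    I is a list of activities, each a list of instruction images; k and l are
    the 1-based activity and step indices from the pseudocode.
    """
    z = 0
    for _ in range(3):
        z += 1
        # Past the last image of activity k: advance to the next activity.
        if l + z - 1 == len(I[k - 1]):
            k += 1
            z = 0
            l = 1
        send(I[k - 1][l + z - 1])  # the pseudocode's send -> I[k, l+z]

# Tiny illustrative instruction set: activity 1 has 2 images, activity 2 has 3.
I = [["a1", "a2"], ["b1", "b2", "b3"]]
sent = []
send_next_instructions(I, k=1, l=1, send=sent.append)
# sent is now ["a2", "b1", "b2"]
```

Note that the images emitted start at `l + 1`, i.e. the step *after* the current one, and that the wrap-around fires exactly when the running index reaches the size of the current activity's image list.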
Algorithm 2:

    initialization;
    activity = get(activity);
    stage = get(stage);
    time = find(act_time, next(activity, stage));
    if time == 0 then
        time = find(act_time, next(activity, stage));
    a = time_now;
    while (a < time) begin
        send → activity, stage;
    end;
    send → next(activity, stage);
    end
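A hedged Python sketch of Algorithm 2, under the same caveat that this is one plausible reading rather than the authors' code: the current (activity, stage) pair is broadcast repeatedly until the clock reaches the end time recorded for the next pair, after which that next pair is announced. Here `act_time`, `send`, and the successor function `next_pair` are stand-ins for the paper's `act_time`, `send →`, and `next(activity, stage)` primitives.

```python
import time

def run_stage(activity, stage, act_time, send, next_pair, now=time.time):
    """Broadcast the current pair until its deadline, then announce the next."""
    end = act_time.get(next_pair(activity, stage), 0)
    if end == 0:
        # The pseudocode retries the lookup once when no time is found.
        end = act_time.get(next_pair(activity, stage), 0)
    # The pseudocode samples time_now once; polling it on each pass here
    # preserves the intent while avoiding an infinite loop.
    while now() < end:
        send((activity, stage))        # keep showing the current instruction
    send(next_pair(activity, stage))   # hand over to the next activity/stage

# Illustrative run: the deadline is already in the past, so the next
# activity/stage pair is announced immediately.
sent = []
step = lambda a, s: (a, s + 1)         # hypothetical successor function
run_stage("mount auger", 1, {("mount auger", 2): 0.0}, sent.append, step)
# sent is now [("mount auger", 2)]
```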
4. Research Results—An Integrated System to Support the Automatic Training of New Employees
4.1. Preparing the System to Work
4.2. The Correct Operation of the System
4.3. The Results of the Experiments
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Jimeno-Morenilla, A.; Azariadis, P.; Molina-Carmona, R.; Kyratzi, S.; Moulianitis, V. Technology enablers for the implementation of Industry 4.0 to traditional manufacturing sectors: A review. Comput. Ind. 2021, 125, 103390. [Google Scholar] [CrossRef]
- Forkan, A.R.M.; Montori, F.; Georgakopoulos, D.; Jayaraman, P.P.; Yavari, A.; Morshed, A. An industrial IoT solution for evaluating workers’ performance via activity recognition. In Proceedings of the International Conference on Distributed Computing Systems 2019, Richardson, TX, USA, 7–9 July 2019; pp. 1393–1403. [Google Scholar] [CrossRef]
- Wang, B.; Xue, Y.; Yan, J.; Yang, X.; Zhou, Y. Human-Centered Intelligent Manufacturing: Overview and Perspectives. Chin. J. Eng. Sci. 2020, 22, 139. [Google Scholar] [CrossRef]
- Maekawa, T.; Nakai, D.; Ohara, K.; Namioka, Y. Toward practical factory activity recognition: Unsupervised understanding of repetitive assembly work in a factory. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, UbiComp 2016, Heidelberg, Germany, 12–16 September 2016; pp. 1088–1099. [Google Scholar] [CrossRef] [Green Version]
- Stisen, A.; Mathisen, A.; Sorensen, S.K.; Blunck, H.; Kjargaard, M.B.; Prentow, T.S. Task phase recognition for highly mobile workers in large building complexes. In Proceedings of the 2016 IEEE International Conference on Pervasive Computing and Communications, PerCom 2016, Sydney, Australia, 14–18 March 2016. [Google Scholar] [CrossRef] [Green Version]
- Yin, S.; Li, X.; Gao, H.; Kaynak, O. Data-Based Techniques Focused on Modern Industry: An Overview. IEEE Trans. Ind. Electron. 2015, 62, 657–667. [Google Scholar] [CrossRef]
- Luo, X.; Li, H.; Yang, X.; Yu, Y.; Cao, D. Capturing and Understanding Workers’ Activities in Far-Field Surveillance Videos with Deep Action Recognition and Bayesian Nonparametric Learning. Comput. Civ. Infrastruct. Eng. 2019, 34, 333–351. [Google Scholar] [CrossRef]
- Yu, Y.; Yang, X.; Li, H.; Luo, X.; Guo, H.; Fang, Q. Joint-Level Vision-Based Ergonomic Assessment Tool for Construction Workers. J. Constr. Eng. Manag. 2019, 145, 04019025. [Google Scholar] [CrossRef]
- Yu, Y.; Guo, H.; Ding, Q.; Li, H.; Skitmore, M. An experimental study of real-time identification of construction workers’ unsafe behaviors. Autom. Constr. 2017, 82, 193–206. [Google Scholar] [CrossRef] [Green Version]
- Felsberger, A.; Reiner, G. Sustainable Industry 4.0 in Production and Operations Management: A Systematic Literature Review. Sustainability 2020, 12, 7982. [Google Scholar] [CrossRef]
- Patalas-Maliszewska, J.; Halikowski, D. A Deep Learning-Based Model for the Automated Assessment of the Activity of a Single Worker. Sensors 2020, 20, 2571. [Google Scholar] [CrossRef]
- Simonyan, K.; Zisserman, A. Two-stream convolutional networks for action recognition in videos. Adv. Neural Inf. Process Syst. 2014, 1, 568–576. [Google Scholar]
- Limin, Y.; Yue, L.I.; Bin, D.U.; Hao, P. Dynamic gesture recognition based on key feature points trajectory. Optoelectron. Technol. 2015, 35, 187–190. [Google Scholar]
- He, K.; Gkioxari, G.; Dollár, P.; Girshick, R. Mask R-CNN. IEEE Trans. Pattern Anal. Mach. Intell. 2018, 99, 1. [Google Scholar]
- Toshev, A.; Szegedy, C. DeepPose: Human pose estimation via deep neural networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 1653–1660. [Google Scholar]
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788. [Google Scholar]
- Mourtzis, D.; Angelopoulos, J.; Panopoulos, N. A Framework for Automatic Generation of Augmented Reality Maintenance Repair Instructions based on Convolutional Neural Networks. Procedia CIRP 2020, 93, 977–982. [Google Scholar] [CrossRef]
- Pham, Q.T.; Pham-Nguyen, A.; Misra, S.; Damaševičius, R. Increasing innovative working behaviour of information technology employees in Vietnam by knowledge management approach. Computers 2020, 9, 61. [Google Scholar] [CrossRef]
- Ašeriškis, D.; Damaševičius, R. Gamification of a project management system. In Proceedings of the 7th International Conference on Advances in Computer-Human Interactions, ACHI 2014, Barcelona, Spain, 23–27 March 2014; pp. 200–207. [Google Scholar]
- Al-Amin, M.; Qin, R.; Tao, W.; Doell, D.; Lingard, R.; Yin, Z.; Leu, M.C. Fusing and refining convolutional neural network models for assembly action recognition in smart manufacturing. Proc. Inst. Mech. Eng. Part C J. Mech. Eng. Sci. 2020. [Google Scholar] [CrossRef]
- Calvetti, D.; Mêda, P.; Gonçalves, M.C.; Sousa, H. Worker 4.0: The future of sensored construction sites. Buildings 2020, 10, 169. [Google Scholar] [CrossRef]
- Rebmann, A.; Emrich, A.; Fettke, P. Enabling the discovery of manual processes using a multi-modal activity recognition approach. In Business Process Management Workshops; Lecture Notes in Business Information Processing; Di Francescomarino, C., Dijkman, R., Zdun, U., Eds.; Springer: Cham, Switzerland, 2019; Volume 362. [Google Scholar] [CrossRef]
- Knoch, S.; Ponpathirkoottam, S.; Fettke, P.; Loos, P. Technology-enhanced process elicitation of worker activities in manufacturing. In Business Process Management Workshops; BPM 2017; Lecture Notes in Business Information Processing; Teniente, E., Weidlich, M., Eds.; Springer: Cham, Switzerland, 2017; Volume 308. [Google Scholar] [CrossRef]
- Zou, H.; Zhou, Y.; Yang, J.; Spanos, C.J. Towards occupant activity driven smart buildings via WiFi-enabled IoT devices and deep learning. Energy Build. 2018, 177, 12–22. [Google Scholar] [CrossRef]
- Ijjina, E.P.; Chalavadi, K.M. Human action recognition in RGB-D videos using motion sequence information and deep learning. Pattern Recogn. 2017, 72, 504–516. [Google Scholar] [CrossRef]
- Chen, Z.; Jiang, C.; Xie, L. Building occupancy estimation and detection: A review. Energy Build. 2018, 169, 260–270. [Google Scholar] [CrossRef]
- Mannhardt, F.; Bovo, R.; Oliveira, M.F.; Julier, S. A taxonomy for combining activity recognition and process discovery in industrial environments. In Intelligent Data Engineering and Automated Learning—IDEAL 2018; IDEAL 2018; Lecture Notes in Computer Science; Yin, H., Camacho, D., Novais, P., Tallón-Ballesteros, A., Eds.; Springer: Cham, Switzerland, 2018; Volume 11315. [Google Scholar] [CrossRef] [Green Version]
- Tadeusiewicz, R. Image Recognition; PWN: Warszawa, Poland, 1991. [Google Scholar]
- Ge, H.; Zhu, Z.; Lou, K.; Wei, W.; Liu, R.; Damaševičius, R.; Woźniak, M. Classification of infrared objects in manifold space using kullback-leibler divergence of gaussian distributions of image points. Symmetry 2020, 12, 434. [Google Scholar] [CrossRef] [Green Version]
- Zhou, B.; Duan, X.; Ye, D.; Wei, W.; Woźniak, M.; Połap, D.; Damaševičius, R. Multi-level features extraction for discontinuous target tracking in remote sensing image monitoring. Sensors 2019, 19, 4855. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Goodfellow, I.; Bengio, Y.; Courville, A. Deep Learning; MIT Press: Cambridge, MA, USA, 2016; Volume 1. [Google Scholar]
- Hassan, M.M.; Uddin, M.Z.; Mohamed, A.; Almogren, A. A robust human activity recognition system using smartphone sensors and deep learning. Future Gener. Comput. Syst. 2018, 81, 307–313. [Google Scholar] [CrossRef]
- Tao, W.; Lai, Z.H.; Leu, M.C.; Yin, Z. Worker activity recognition in smart manufacturing using IMU and sEMG signals with convolutional neural networks. Procedia Manuf. 2018, 26, 1159–1166. [Google Scholar] [CrossRef]
- Zhang, F.; Niu, K.; Xiong, J.; Jin, B.; Gu, T.; Jiang, Y.; Zhang, D. Towards a diffraction-based sensing approach on human activity recognition. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2019, 3, 1–25. [Google Scholar] [CrossRef]
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef] [PubMed]
- Zheng, X.; Wang, M.; Ordieres-Meré, J. Comparison of data preprocessing approaches for applying deep learning to human activity recognition in the context of industry 4.0. Sensors 2018, 18, 2146. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Tao, W.; Leu, M.C.; Yin, Z. Multi-modal recognition of worker activity for human-centered intelligent manufacturing. Eng. Appl. Artif. Intell. 2020, 95, 103868. [Google Scholar] [CrossRef]
- Ryselis, K.; Petkus, T.; Blažauskas, T.; Maskeliūnas, R.; Damaševičius, R. Multiple kinect based system to monitor and analyze key performance indicators of physical training. Hum.-Cent. Comput. Inf. Sci. 2020, 10, 51. [Google Scholar] [CrossRef]
- Rude, D.J.; Adams, S.; Beling, P.A. Task recognition from joint tracking data in an operational manufacturing cell. J. Intell. Manuf. 2018, 29, 1203–1217. [Google Scholar] [CrossRef]
- Kulikajevas, A.; Maskeliunas, R.; Damaševičius, R. Detection of sitting posture using hierarchical image composition and deep learning. PeerJ Comput. Sci. 2021, 7, e442. [Google Scholar] [CrossRef]
- Zhang, M.; Chen, S.; Zhao, X.; Yang, Z. Research on construction workers’ activity recognition based on smartphone. Sensors 2018, 18, 2667. [Google Scholar] [CrossRef] [Green Version]
- Xia, Q.; Korpela, J.; Namioka, Y.; Maekawa, T. Robust unsupervised factory activity recognition with body-worn accelerometer using temporal structure of multiple sensor data motifs. Proc. ACM Interact. Mobile Wearable Ubiquitous Technol. 2020, 4, 1–30. [Google Scholar] [CrossRef]
- Menolotto, M.; Komaris, D.; Tedesco, S.; O’flynn, B.; Walsh, M. Motion capture technology in industrial applications: A systematic review. Sensors 2020, 20, 5687. [Google Scholar] [CrossRef]
- Yang, J.; Liu, Y.; Liu, Z.; Wu, Y.; Li, T.; Yang, Y. A Framework for Human Activity Recognition Based on WiFi CSI Signal Enhancement. Int. J. Antennas Propag. 2021, 6654752, 1–18. [Google Scholar] [CrossRef]
- Afza, F.; Khan, M.A.; Sharif, M.; Kadry, S.; Manogaran, G.; Saba, T.; Ashraf, I.; Damaševičius, R. A framework of human action recognition using length control features fusion and weighted entropy-variances based feature selection. Image Vis. Comput. 2021, 106, 104090. [Google Scholar] [CrossRef]
- Helmi, A.M.; Al-qaness, M.A.A.; Dahou, A.; Damaševičius, R.; Krilavičius, T.; Elaziz, M.A. A Novel Hybrid Gradient-Based Optimizer and Grey Wolf Optimizer Feature Selection Method for Human Activity Recognition Using Smartphone Sensors. Entropy 2021, 23, 1065. [Google Scholar] [CrossRef]
- Priya, S.J.; Rani, A.J.; Subathra, M.S.P.; Mohammed, M.A.; Damaševičius, R.; Ubendran, N. Local Pattern Transformation Based Feature Extraction for Recognition of Parkinson’s Disease Based on Gait Signals. Diagnostics 2021, 11, 1395. [Google Scholar] [CrossRef]
- Wozniak, M.; Wieczorek, M.; Silka, J.; Polap, D. Body pose prediction based on motion sensor data and recurrent neural network. IEEE Trans. Ind. Inform. 2021, 17, 2101–2111. [Google Scholar] [CrossRef]
- Li, M.; Jiang, Z.; Liu, Y.; Chen, S.; Wozniak, M.; Scherer, R.; Damasevicius, R.; Wei, W.; Li, Z.; Li, Z. Sitsen: Passive sitting posture sensing based on wireless devices. Int. J. Distrib. Sens. Netw. 2021, 17, 15501477211024846. [Google Scholar] [CrossRef]
- Mujahid, A.; Awan, M.J.; Yasin, A.; Mohammed, M.A.; Damaševičius, R.; Maskeliūnas, R.; Abdulkareem, K.H. Real-Time Hand Gesture Recognition Based on Deep Learning YOLOv3 Model. Appl. Sci. 2021, 11, 4164. [Google Scholar] [CrossRef]
- Maskeliunas, R.; Damaševicius, R.; Segal, S. A review of internet of things technologies for ambient assisted living environments. Future Internet 2019, 11, 259. [Google Scholar] [CrossRef] [Green Version]
- Dammalapati, H.; Swamy Das, M. An efficient criminal segregation technique using computer vision. In Proceedings of the IEEE 2021 International Conference on Computing, Communication, and Intelligent Systems, ICCCIS 2021, Greater Noida, India, 19–20 February 2021; pp. 636–641. [Google Scholar] [CrossRef]
- Raudonis, V.; Maskeliūnas, R.; Stankevičius, K.; Damaševičius, R. Gender, Age, Colour, Position and Stress: How They Influence Attention at Workplace. In Computational Science and Its Applications—ICCSA 2017; ICCSA 2017; Lecture Notes in Computer Science; Gervasi, O., Murgante, B., Misra, S., Borruso, G., Torre, C.M., Rocha, A.M.A.C., Apduhan, B.O., Stankova, E., Cuzzocrea, A., Eds.; Springer: Cham, Switzerland, 2017; Volume 10408. [Google Scholar] [CrossRef]
- Akhavian, R.; Behzadan, A.H. Smartphone-based construction workers’ activity recognition and classification. Autom. Constr. 2016, 71 Pt 2, 198–209. [Google Scholar] [CrossRef]
- Al Jassmi, H.; Al Ahmad, M.; Ahmed, S. Automatic recognition of labor activity: A machine learning approach to capture activity physiological patterns using wearable sensors. Constr. Innov. 2021, 2, 555–575. [Google Scholar] [CrossRef]
- Yu, Y.; Li, H.; Cao, J.; Luo, X. Three-dimensional working pose estimation in industrial scenarios with monocular camera. IEEE Internet Things J. 2021, 8, 1740–1748. [Google Scholar] [CrossRef]
- Sherafat, B.; Ahn, C.R.; Akhavian, R.; Behzadan, A.H.; Golparvar-Fard, M.; Kim, H.; Azar, E.R. Automated methods for activity recognition of construction workers and equipment: State-of-the-art review. J. Constr. Eng. Manag. 2020, 146, 03120002. [Google Scholar] [CrossRef]
- Angah, O.; Chen, A.Y. Tracking multiple construction workers through deep learning and the gradient based method with re-matching based on multi-object tracking accuracy. Autom. Constr. 2020, 119, 103308. [Google Scholar] [CrossRef]
- Hu, H.; Cheng, K.; Li, Z.; Chen, J.; Hu, H. Workflow recognition with structured two-stream convolutional networks. Pattern Recognit. Lett. 2020, 130, 267–274. [Google Scholar] [CrossRef]
- Ding, L.; Fang, W.; Luo, H.; Love, P.E.; Zhong, B.; Ouyang, X. A deep hybrid learning model to detect unsafe behavior: Integrating convolution neural networks and long short-term memory. Autom. Constr. 2018, 86, 118–124. [Google Scholar] [CrossRef]
- Zhao, J.; Obonyo, E. Convolutional long short-term memory model for recognizing construction workers’ postures from wearable inertial measurement units. Adv. Eng. Inform. 2020, 46, 101177. [Google Scholar] [CrossRef]
- Yang, K.; Ahn, C.R.; Kim, H. Deep learning-based classification of work-related physical load levels in construction. Adv. Eng. Inform. 2020, 45, 101104. [Google Scholar] [CrossRef]
- Sakalle, A.; Tomar, P.; Bhardwaj, H.; Acharya, D.; Bhardwaj, A. A LSTM based deep learning network for recognizing emotions using wireless brainwave driven system. Expert Syst. Appl. 2021, 173, 114516. [Google Scholar] [CrossRef]
- Gong, F.; Ma, Y.; Zheng, P.; Song, T. A deep model method for recognizing activities of workers on offshore drilling platform by multistage convolutional pose machine. J. Loss Prev. Process Ind. 2020, 64, 104043. [Google Scholar] [CrossRef]
- Zeng, M.; Nguyen, L.T.; Yu, B.; Mengshoel, O.J.; Zhu, J.; Wu, P.; Zhang, J.Y. Convolutional Neural Networks for Human Activity Recognition using Mobile Sensors. In Proceedings of the 6th International Conference on Mobile Computing, Applications and Services, Austin, TX, USA, 6–7 November 2014; pp. 197–205. [Google Scholar]
- Jaouedi, N.; Boujnah, N.; Bouhlel, M.S. Deep Learning Approach for Human Action Recognition Using Gated Recurrent Unit Neural Networks and Motion Analysis. J. Comput. Sci. 2019, 15, 1040–1049. [Google Scholar] [CrossRef] [Green Version]
- Pohlt, C.; Schlegl, T.; Wachsmuth, S. Human work activity recognition for working cells in industrial production contexts. In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, Bari, Italy, 6–9 October 2019; pp. 4225–4230. [Google Scholar] [CrossRef]
- Tao, W.; Al-Amin, M.; Chen, H.; Leu, M.C.; Yin, Z.; Qin, R. Real-time assembly operation recognition with fog computing and transfer learning for human-centered intelligent manufacturing. Procedia Manuf. 2020, 48, 926–931. [Google Scholar] [CrossRef]
- Son, H.; Choi, H.; Seong, H.; Kim, C. Detection of construction workers under varying poses and changing background in image sequences via very deep residual networks. Autom. Constr. 2019, 99, 27–38. [Google Scholar] [CrossRef]
- Sun, H.; Ning, G.; Zhao, Z.; Huang, Z.; He, Z. Automated work efficiency analysis for smart manufacturing using human pose tracking and temporal action localization. J. Vis. Commun. Image Represent. 2020, 73, 102948. [Google Scholar] [CrossRef]
- Maliszewska, P.; Halikowski, D.; Patalas-Maliszewska, J. A Model for Generating Workplace Procedures Using a CNN-SVM Architecture. Symmetry 2019, 11, 1151. [Google Scholar] [CrossRef] [Green Version]
- Wogu, I.A.P.; Misra, S.; Assibong, P.A.; Olu-Owolabi, E.F.; Maskeliūnas, R.; Damasevicius, R. Artificial intelligence, smart classrooms and online education in the 21st century: Implications for human development. J. Cases Inf. Technol. 2019, 21, 66–79. [Google Scholar] [CrossRef] [Green Version]
- Roberts, D.; Torres Calderon, W.; Tang, S.; Golparvar-Fard, M. Vision-based construction worker activity analysis informed by body posture. J. Comput. Civ. Eng. 2020, 34, 04020017. [Google Scholar] [CrossRef]
- Neuhausen, M.; Pawlowski, D.; König, M. Comparing classical and modern machine learning techniques for monitoring pedestrian workers in top-view construction site video sequences. Appl. Sci. 2020, 10, 8466. [Google Scholar] [CrossRef]
- Sharma, R.; Pachori, R.B.; Sircar, P. Automated emotion recognition based on higher order statistics and deep learning algorithm. Biomed. Signal Process. Control 2020, 58, 101867. [Google Scholar] [CrossRef]
- Sathyanarayana, A.; Joty, S.; Fernandez-Luque, L.; Ofli, F.; Srivastava, J.; Elmagarmid, A.; Taheri, S.; Arora, T. Impact of Physical Activity on Sleep: A Deep Learning Based Exploration; Cornell University: Ithaca, NY, USA, 2016. [Google Scholar]
Paper | Method | Dataset | Type | Accuracy (%)
---|---|---|---|---
[60] | CNN + LSTM | original dataset created by the authors | safe actions/unsafe actions | 97/92
[34] | CNN + SVM | bearing vibration dataset, Case Western Reserve University | iSAX-based features | 88
[33] | CNN, RNN | original dataset created by the authors | — | 97
[75] | CNN + LSTM | DEAP dataset/SEED dataset | 2 classes (Hv/Lv)/3 classes | 84.16/90.81
[66] | RNN + Gated Recurrent Unit | UCF Sports, UCF101, KTH | KTH (GMM + KF) | 71.1
[39] | Markov models, Naive Bayes, K-means | original dataset created by the authors | point-by-point/point-by-point with FSHMM | 70/65.8
This paper | CNN, CNN + SVM/YOLOv3 | original dataset created by the authors | steps and objects of the service procedure | 94.01/73.15
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Patalas-Maliszewska, J.; Halikowski, D.; Damaševičius, R. An Automated Recognition of Work Activity in Industrial Manufacturing Using Convolutional Neural Networks. Electronics 2021, 10, 2946. https://doi.org/10.3390/electronics10232946