AI-Based Algorithms in IoT-Edge Computing

A special issue of Algorithms (ISSN 1999-4893). This special issue belongs to the section "Evolutionary Algorithms and Machine Learning".

Deadline for manuscript submissions: closed (15 February 2023) | Viewed by 14836

Special Issue Information

Dear Colleagues,

In the 5G era, Internet-of-Things (IoT) applications will increasingly become part of people’s daily lives, and IoT-Edge Computing (IEC) is a promising technology for supporting this growth. IEC equipment is deployed in proximity to IoT users to provide low-latency computation. The efficiency and effectiveness of IoT edge computing are strongly correlated with user behavior: the dynamics and variety of behaviors observed across digitally connected environments influence operators’ decision-making and the deployment of IEC equipment. A holistic analysis of user behavior is therefore desirable for improving the efficiency and effectiveness of IoT edge computing.

Artificial intelligence (AI) algorithms have recently been applied across research domains including computer vision, natural language processing, and voice recognition. Combined with IoT-edge computing, AI-based algorithms have become a key technical direction for achieving high efficiency and adaptability in a variety of new applications, such as smart wearable devices in healthcare, the smart automotive industry, recommender systems, and financial analysis. AI algorithms have also emerged in the edge networking and IoT application domain: the design and application of AI techniques for edge IoT network management, operations, and automation can improve how we address networking today, including topology discovery, network measurement, network monitoring, network modeling, and network control. Network design and optimization for AI applications addresses a complementary topic, namely the support of AI-based systems through novel networking techniques, including new architectures and performance models for IoT edge computing. The networking research community regards these challenges as opportunities in the machine learning era for edge computing applications in the IoT.

The main aim of this Special Issue is to bring together novel AI-enabled algorithms for IoT edge computing, with a focus on their performance evaluation and comparison with existing solutions.

Topics of interest include, but are not limited to, the following:

  • AI-enabled algorithms for edge computing architectures, frameworks, platforms, and protocols for IoT;
  • Machine learning techniques in edge computing for IoT;
  • Edge network architecture and optimization for AI applications at scale;
  • AI algorithms for dynamic and large-scale topology discovery;
  • AI algorithms for wireless network resource management and control;
  • Energy-efficient edge network operations via AI algorithms;
  • Deep learning and reinforcement learning in network control and management;
  • Self-learning and adaptive networking protocols and algorithms;
  • AI modeling and performance analysis in edge computing for IoT.

Prof. Dr. Arun Kumar Sangaiah
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Algorithms is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • internet of things (IoT)
  • artificial intelligence (AI)
  • edge computing
  • machine learning algorithms

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (4 papers)

Research

17 pages, 1669 KiB  
Article
A Cognitive Model for Technology Adoption
by Fariborz Sobhanmanesh, Amin Beheshti, Nicholas Nouri, Natalia Monje Chapparo, Sandya Raj and Richard A. George
Algorithms 2023, 16(3), 155; https://doi.org/10.3390/a16030155 - 10 Mar 2023
Cited by 5 | Viewed by 4558
Abstract
The widespread adoption of advanced technologies, such as Artificial Intelligence (AI), Machine Learning, and Robotics, is rapidly increasing across the globe. This accelerated pace of change is drastically transforming various aspects of our lives and work, resulting in what is now known as Industry 4.0. As businesses integrate these technologies into their daily operations, the technologies significantly impact work tasks and required skill sets. The approach to technological transformation varies depending on location, industry, and organization, yet there are no published methods that can adequately forecast the adoption of technology and its impact on society. It is essential to prepare for the future impact of Industry 4.0, and this requires policymakers and business leaders to be equipped with scientifically validated models and metrics. Data-driven scenario planning and decision-making can lead to better outcomes in every area of the business, from learning and development to technology investment. However, the current literature falls short in identifying effective and globally applicable strategies to predict the adoption rate of emerging technologies. Therefore, this paper proposes a novel parametric mathematical model for predicting the adoption rate of emerging technologies through a unique data-driven pipeline. This approach utilizes global indicators for countries to predict the technology adoption curves for each country and industry. The model is thoroughly validated, and the paper outlines highly promising evaluation results. The practical implications of this approach are significant because it provides policymakers and business leaders with valuable insights for decision-making and scenario planning.
(This article belongs to the Special Issue AI-Based Algorithms in IoT-Edge Computing)
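The abstract above does not specify the functional form of the proposed parametric model, so the sketch below is only a generic illustration of fitting an S-shaped adoption curve to country-level data: a logistic curve fitted with SciPy to hypothetical yearly adoption shares. The logistic form, the adoption_share values, and the parameter names are assumptions for illustration, not the authors' model or data.

```python
# Hedged sketch: fit a generic logistic S-curve to hypothetical adoption data.
# This is NOT the parametric model proposed in the paper; it only illustrates
# the general idea of fitting an adoption curve to observed country-level data.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, k, t0, r):
    """Generic S-curve: k = saturation level, t0 = midpoint year, r = growth rate."""
    return k / (1.0 + np.exp(-r * (t - t0)))

# Hypothetical adoption shares (fraction of firms using a technology per year).
years = np.array([2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022], dtype=float)
adoption_share = np.array([0.02, 0.04, 0.07, 0.13, 0.22, 0.35, 0.48, 0.60])

# Fit the curve; p0 gives rough starting guesses for (k, t0, r).
params, _ = curve_fit(logistic, years, adoption_share, p0=[1.0, 2020.0, 0.5])
k, t0, r = params

# Extrapolate the fitted curve a few years ahead.
future = np.arange(2023, 2028, dtype=float)
forecast = logistic(future, k, t0, r)
print(dict(zip(future.astype(int), forecast.round(3))))
```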

20 pages, 3553 KiB  
Article
An Energy-Aware Load Balancing Method for IoT-Based Smart Recycling Machines Using an Artificial Chemical Reaction Optimization Algorithm
by Sara Tabaghchi Milan, Mehdi Darbandi, Nima Jafari Navimipour and Senay Yalcın
Algorithms 2023, 16(2), 115; https://doi.org/10.3390/a16020115 - 14 Feb 2023
Cited by 2 | Viewed by 1940
Abstract
Recycling is essential for a sustainable and clean environment, and both developed and developing countries face waste management and recycling challenges. The Internet of Things (IoT), meanwhile, is a widely used infrastructure for connecting physical devices; it has been researched and deployed extensively in recent years and promises to positively influence several industries, including recycling and waste management. The impact of the IoT on recycling and waste management is examined using standard operating practices in recycling. Recycling facilities, for instance, can use the IoT to manage and monitor the recycling situation in various places while allocating the logistics for transportation and distribution so as to minimize recycling costs and lead times. Companies can thus use historical patterns to track usage trends in their service regions, assess their accessibility to gather resources, and plan their activities accordingly. Energy is also a significant aspect of the IoT: with many devices linked to the internet, the devices, sensors, nodes, and objects are all energy-restricted. Because the devices are constrained by nature, the load-balancing protocol is crucial in an IoT ecosystem. Accordingly, this study presents an energy-aware load-balancing method for IoT-based smart recycling machines using an artificial chemical reaction optimization algorithm. The experimental results indicated that the proposed solution could achieve excellent performance. According to the obtained results, the imbalance degree (5.44%), energy consumption (11.38%), and delay time (9.05%) were reduced using the proposed method.
(This article belongs to the Special Issue AI-Based Algorithms in IoT-Edge Computing)
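As a loose illustration of the kind of search such a method performs, the sketch below runs a heavily simplified, single-candidate local search over task-to-node assignments with a cost that combines load imbalance and energy use. The task loads, energy costs, weighting, and greedy acceptance rule are all assumptions for illustration; a full artificial chemical reaction optimization algorithm maintains a population of molecules and several reaction operators not shown here.

```python
# Hedged sketch: a heavily simplified, single-molecule variant of a chemical-
# reaction-style local search for assigning tasks to IoT nodes. It is NOT the
# authors' artificial chemical reaction optimization algorithm; it only
# illustrates the cost terms (load imbalance + energy) the abstract mentions.
import random

N_TASKS, N_NODES = 40, 5
random.seed(0)
task_load = [random.uniform(1.0, 5.0) for _ in range(N_TASKS)]          # hypothetical work units
node_energy_cost = [random.uniform(0.5, 1.5) for _ in range(N_NODES)]   # energy per work unit

def cost(assignment):
    """Weighted sum of load imbalance across nodes and total energy use."""
    loads = [0.0] * N_NODES
    energy = 0.0
    for task, node in enumerate(assignment):
        loads[node] += task_load[task]
        energy += task_load[task] * node_energy_cost[node]
    imbalance = max(loads) - min(loads)
    return imbalance + 0.1 * energy

# Start from a random assignment ("molecule") and apply random "collisions".
best = [random.randrange(N_NODES) for _ in range(N_TASKS)]
best_cost = cost(best)
for _ in range(5000):
    candidate = best[:]
    candidate[random.randrange(N_TASKS)] = random.randrange(N_NODES)  # perturb one task
    c = cost(candidate)
    if c < best_cost:                      # greedy acceptance; real CRO also accepts
        best, best_cost = candidate, c     # worse states via kinetic-energy buffers
print(f"best cost found: {best_cost:.2f}")
```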

12 pages, 4118 KiB  
Article
Intrusion Detection for Electric Vehicle Charging Systems (EVCS)
by Mohamed ElKashlan, Heba Aslan, Mahmoud Said Elsayed, Anca D. Jurcut and Marianne A. Azer
Algorithms 2023, 16(2), 75; https://doi.org/10.3390/a16020075 - 31 Jan 2023
Cited by 14 | Viewed by 5071
Abstract
The market for Electric Vehicles (EVs) has expanded tremendously, as seen at the recent Conference of the Parties 27 (COP27) held at Sharm El Sheikh, Egypt, in November 2022. This growth requires an ecosystem that is user-friendly and secure. Internet-connected Electric Vehicle Charging Stations (EVCSs) provide a rich user experience and add-on services, and are ultimately connected to an Electric Vehicle Charging Station Management System (EVCSMS). Remote cyberattacks against the EVCS ecosystem are rising at the same rate as physical attacks and vandalism against the stations themselves, and a cyberattack is more severe than a physical attack because it may affect thousands of EVCSs at the same time. Intrusion detection is vital in defending against diverse types of attacks and unauthorized activities. Fundamentally, the Intrusion Detection System (IDS) problem is a classification problem: the IDS tries to determine whether each traffic stream is legitimate or malicious (binary classification) and can further identify the type of malicious traffic (multiclass classification). In this paper, we address IoT security issues in EVCSs by using different machine learning techniques on a native IoT dataset to discover fraudulent traffic in EVCSs, which has not been performed in any previous research. We also compare different machine learning classifier algorithms for detecting Distributed Denial of Service (DDoS) attacks in the EVCS network environment. A typical Internet of Things (IoT) dataset obtained from actual IoT traffic is used in the paper, and we compare classification algorithms placed in line with traffic that contains DDoS attacks targeting the EVCS network. The results of this research improve the stability of the EVCS system and significantly reduce the number of cyberattacks that could disrupt the daily activities associated with the EVCS ecosystem.
(This article belongs to the Special Issue AI-Based Algorithms in IoT-Edge Computing)
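To make the binary-classification framing in the abstract concrete, the sketch below trains and compares two off-the-shelf classifiers on synthetic per-flow features labeled legitimate (0) or DDoS (1). The features, the labeling rule, and the choice of classifiers are assumptions for illustration; the paper's IoT dataset and feature set are not reproduced here.

```python
# Hedged sketch: comparing binary classifiers on labeled IoT traffic features,
# mirroring the "legitimate vs. malicious" framing in the abstract above. The
# feature matrix is synthetic, not the paper's dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score

rng = np.random.default_rng(42)
n = 2000
# Hypothetical per-flow features: packet rate, mean packet size, flow duration.
X = rng.normal(size=(n, 3))
# Hypothetical label rule: high packet rate + short flows look like DDoS (1).
y = ((X[:, 0] > 0.5) & (X[:, 2] < 0.0)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("logistic regression", LogisticRegression(max_iter=1000)),
                  ("random forest", RandomForestClassifier(n_estimators=100))]:
    clf.fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    print(f"{name}: accuracy={accuracy_score(y_te, pred):.3f}, "
          f"F1={f1_score(y_te, pred):.3f}")
```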

19 pages, 723 KiB  
Article
An Actor-Based Formal Model and Runtime Environment for Resource-Bounded IoT Services
by Ahmed Abdelmoamen Ahmed
Algorithms 2022, 15(11), 390; https://doi.org/10.3390/a15110390 - 23 Oct 2022
Cited by 3 | Viewed by 1829
Abstract
With sensors becoming increasingly ubiquitous, there is tremendous potential for Internet of Things (IoT) services that take advantage of the data collected by these sensors. Although a growing number of technologies focus on IoT services, there is relatively limited foundational work on them, partly because of the lack of precise understanding, specification, and analysis of such services and, consequently, limited platform support for programming them. In this paper, we present a formal model for understanding and reasoning about distributed IoT services. The paper first studies the key properties of IoT services in depth and then develops an approach for fine-grained resource coordination and control for such services. The resource model identifies the core mechanisms underlying IoT services, informing design and implementation decisions when they are implemented over a middleware or a platform. We take a multi-agent systems approach to representing IoT services, broadly founded in the actor model of concurrency; actor-based services can be built by composing simpler services. Furthermore, we create a proximity model to represent an appropriate notion of IoT proximity. This model captures the dynamically evolving relationship between a service's sensing and acting capabilities and the environments in which these capabilities are exercised. The paper also presents the design of a runtime environment to support the implementation of IoT services, with the key mechanisms required by such services implemented in a distributed middleware.
(This article belongs to the Special Issue AI-Based Algorithms in IoT-Edge Computing)
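As a minimal sketch of the actor-style view of IoT services described above, the code below builds services as message-passing actors on top of Python threads and queues, with an aggregator composed from simpler sensor services. The class names and message format are assumptions for illustration; the paper's formal model, proximity model, and distributed middleware are not reproduced here.

```python
# Hedged sketch: a minimal actor-style IoT service using threads and mailboxes.
# It only illustrates "services as composable message-passing actors"; it is
# not the runtime environment or middleware described in the paper.
import queue
import threading
import time

class Actor:
    """An actor processes messages from its mailbox one at a time."""
    def __init__(self, name):
        self.name = name
        self.mailbox = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def send(self, msg):
        self.mailbox.put(msg)

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:            # poison pill shuts the actor down
                break
            self.receive(msg)

    def receive(self, msg):
        raise NotImplementedError

class SensorActor(Actor):
    """Forwards hypothetical sensor readings to a downstream aggregator."""
    def __init__(self, name, downstream):
        self.downstream = downstream
        super().__init__(name)

    def receive(self, msg):
        self.downstream.send((self.name, msg))

class AggregatorActor(Actor):
    """Composes simpler services: collects readings from several sensors."""
    def receive(self, msg):
        sensor, reading = msg
        print(f"{self.name} got {reading} from {sensor}")

if __name__ == "__main__":
    agg = AggregatorActor("aggregator")
    sensors = [SensorActor(f"sensor-{i}", agg) for i in range(3)]
    for i, s in enumerate(sensors):
        s.send({"temperature": 20 + i})
    time.sleep(0.5)   # let the daemon threads drain their mailboxes
```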
