Article

Development of IoT-Based Hybrid Autonomous Networked Robots

by Maki K. Habib 1,2,* and Chimsom I. Chukwuemeka 1

1 Robotics, Control and Smart Systems (RCSS), School of Sciences and Engineering, American University in Cairo (AUC), New Campus, New Cairo 11835, Egypt
2 Mechanical Engineering Department, American University in Cairo (AUC), New Campus, New Cairo 11835, Egypt
* Author to whom correspondence should be addressed.
Technologies 2025, 13(5), 168; https://doi.org/10.3390/technologies13050168
Submission received: 5 February 2025 / Revised: 4 April 2025 / Accepted: 10 April 2025 / Published: 23 April 2025
(This article belongs to the Special Issue IoT-Enabling Technologies and Applications)

Abstract:
Autonomous Networked Robot (ANR) systems feature multi-robot systems (MRSs) and wireless sensor networks (WSNs). These systems help to extend coverage, maximize efficiency in data routing, and provide practical and reliable task management, among other benefits. This article presents the development and implementation of an IoT-based hybrid ANR system integrated with different cloud platforms. The system comprises two main components, the physical hybrid ANR and the simulation development environment (SDE) with hardware in the loop (HIL), together with the necessary core interfaces. Both are integrated to facilitate system component development, simulation, testing, monitoring, and validation. The operational environment (local and/or distributed) of the designed system is divided into zones, and each zone comprises static IoT-based sensor nodes (SSNs) and a mobile robot with an integrated onboard IoT-based sensor node (O-SSN), together called a mobile robot sensor node (MRSN). Global MRSNs (G-MRSNs) navigate spaces not covered by any zone. The mobile robots navigate within/around their designated spaces and to any of their SSNs. The SSNs and the O-SSN of each zone are supported by the ZigBee protocol, forming a WSN. The MRSNs and G-MRSNs communicate their collected data from the different zones to the base station (BS) through the IoT base station gateway (IoT-BSG) using a wireless serial protocol. The base station analyzes and visualizes the received data through GUIs and communicates data through the IoT/cloud using the Wi-Fi protocol. The developed system is demonstrated for event detection and surveillance. Experimental results of the implemented/simulated ANR system and HIL experiments validate the performance of the developed IoT-based hybrid architecture.

1. Introduction

Networked robotics is an evolving interdisciplinary research field that synergizes research and applications from robotics, sensor networking, computer science, artificial intelligence (AI), and machine learning [1]. This field has been applied in various domains, such as mining for search and rescue operations [2], military target detection and tracking [3], healthcare and monitoring [4], and data harvesting and aggregation systems [5]. In recent years, its applications have extended to include cloud robotics and big data, supported by advanced AI technologies such as deep reinforcement learning (DRL), large language models (LLMs), computer vision, and data analytics, which have significantly enhanced robotic intelligence and operational capabilities [6,7,8].
Autonomous Networked Robotics (ANR) systems represent a primary focus within the field of networked robotics. These systems involve the cooperation of multi-robot systems (MRSs) and wireless sensor networks (WSNs) to execute and coordinate tasks in both local and distributed environments. ANR systems offer multiple benefits, including extended perception, enhanced spatial coverage and exploration capabilities, efficient data collection and handling, reliable routing, and effective task decomposition and management [5]. Recent advancements in collaborative frameworks, such as semantic mapping, have improved the precision and speed of environmental perception, which is critical for large-scale ANR operations [9].
Despite these advancements, ANR systems face significant challenges, including the need for scalable and capable architectures for WSNs and MRSs, efficient communication protocols to manage dynamic and distributed networks of robots and sensor nodes, low-cost sensor deployments, and reliable task navigation and decomposition mechanisms. Advances in system identification techniques, which integrate data-driven and physics-based approaches, have contributed to bridging the gap between simulation and real-world robotics [10]. Additionally, improvements in middleware and communication protocols, such as those used in multi-robot mesh networks, have facilitated ANR systems’ deployment in extreme environments [11]. The foundational development of the hybrid ANR system, including its early conceptual design and simulation tests, has been extensively documented in prior work [12], with further implementation details provided in [13,14]. Furthermore, integrating multidisciplinary approaches is crucial for addressing the growing complexity of ANR systems. Combining unmanned aerial vehicles (UAVs), embedded platforms, and advanced sensing technologies has improved scalability, flexibility, and system reliability [15].
Given these challenges, this paper presents the development of a hybrid ANR system with IoT/cloud integration. The developed system has been demonstrated within a large-scale surveillance application. Initial conceptual development was published in foundational work [12], and the basic simulation and testing were presented in [13,14]. The remainder of this article is structured as follows: Section 2 reviews related work on ANR systems, while Section 3 discusses the research challenges and problem statement. Section 4 presents the architecture of the developed system. Section 5 introduces the simulation development environment (SDE), which includes hardware-in-the-loop (HIL) components. Section 6 describes the hardware implementation, and Section 7 concludes with final remarks and future research directions.

2. Related Works in ANR

In [16], deployment, localization, path planning, and task allocation are identified as the four significant functional activities of the MRS/WSN components within ANR systems. In [17], an ANR-based system comprising mobile sensor networks (MSNs) was used to detect empowered intruders in an intrusion detection application. The work utilized the k-barrier coverage probability method, demonstrating improved performance in covering and tracking intruders through simulation. Future extensions of this work propose adopting probabilistic sensing models for real-world implementations. In data collection applications, the authors in [18] developed a collaborative and compressed mobile sensing (CCMS) algorithm for efficient data collection and exchange in building scalar maps. Simulation and experimental results demonstrated reduced power consumption and improved area coverage. Similarly, [19] presented an AUV-based linear sensor network (ALSN) architecture for underwater pipeline monitoring featuring three node types: linear sensor nodes, AUVs, and a surface sink. The AUVs efficiently collect and deliver data while reducing multi-hop communication issues such as collisions and data loss. Recent advancements like SGSR-Net improve SLAM accuracy through structure-guided LiDAR super-resolution techniques for enhanced localization and mapping [20]. In factory automation applications, WSNs and automated guided vehicles (AGVs) were deployed under the Wireless Networks for Industrial Automation–Factory Automation (WIA-FA) framework [21]. This framework improves logistical sorting performance using centralized management and star network topologies. Modern extensions, including 5G-enabled systems, provide near-zero latency and reliable communication, significantly advancing AGV operations [22]. The techniques and metrics discussed in this paragraph are summarized in Table 1.
In addition, a distributed coverage algorithm controlling networked robots using connected hexagonal lattices was presented in [25]. Each robot within the network communicates with the others, and an event-triggered four-level hierarchical distributed control system coordinates the whole system. The robots tracked and occupied virtual targets within the network through experimentation. In [26], an indoor air monitoring and quality improvement system was introduced, utilizing distributed sensors and mobile robots. The sensors monitored air quality and identified areas requiring improvement, determining the best robot positions to influence environmental conditions. SLAM techniques such as Adaptive Monte Carlo Localization (AMCL) were used for mobile robot positioning and navigation, with validation through simulations. Future work proposed enhancing control strategies for real-world deployments. A similar environment regulation system employing mobile robots and WSNs was presented in [27]. The system used a Gaussian model based on Expectation Maximization (EM) to estimate real environmental distributions and applied gradient descent algorithms to control robots for better area coverage. Simulations showed faster movement of the multi-robot system, improving environmental coverage and regulatory effectiveness. Recent advancements in distributed robotics frameworks, such as XBot2D, dynamically balance computational loads and optimize energy consumption in real-time monitoring applications [24]. A cooperative robotic system for plume tracking, comprising a sensing robot and a tracking robot, was introduced in [28]. Plume dynamics were modeled using an advection–diffusion partial differential equation (PDE). Numerical simulations validated the system for effective monitoring. Advanced LiDAR-based SLAM frameworks, such as SGSR-Net, have improved robot localization and environmental mapping accuracy [20].
The advancements in distributed robotics and SLAM technologies are detailed in Table 2.
Furthermore, in [32], a hybrid target-search strategy comprising a network of static sensors supported by mobile search efforts was proposed to facilitate a target-search application. Mobile robot teams were utilized for time-phased deployment of static sensors at optimal locations, informed by extensive planning from given search scenario information. After deployment, static and mobile sensors collaboratively tracked targets and developed navigation trajectories for robots to reach the targets. Simulation results validated the proposed hybrid track-and-search model, demonstrating its feasibility in complex environments. In [33], a mobile robot navigation technique based on static sensors positioned as tags in the operational environment defined target points for inspection tasks. This technique proved effective for equipment monitoring and alarm detection in industrial environments using static wireless sensors combined with mobile robot inspections. Research in [34] deployed multi-robot systems (MRSs) for monitoring environmental pollution. The system successfully coordinated the detection of polluted areas without requiring IoT or cloud infrastructure. Recent advancements in distributed robotics architectures, such as digital twins, enable real-time application planning and simulation, improving deployment efficiency and scalability [35]. In a similar environmental monitoring application, the authors in [36] developed a robotic-aided automated IoT network. The system featured an unmanned ground vehicle (UGV) navigating specific areas and utilizing IoT-enabled nodes for detecting environmental parameters. Recent innovations, such as UAV-supported IoT systems, further enhance these networks by enabling aerial monitoring and data collection, reducing latency, and increasing area coverage [37].
In surveillance applications, a two-tier network (6LoWPAN and IEEE 802.11) ANR system for indoor intruder surveillance was presented in [38]. The system featured a market-based task allocation scheme and a hybrid GA/ACO path planning algorithm for multiple robots. Simulations conducted in Stage/Player environments demonstrated effective intruder detection and robot path tracking. Furthermore, an ANR surveillance system for heat and lighting conditions was developed [39]. This system utilized a single ZigBee network to connect sensor nodes and robots, with implementation results showcasing reliable communication between the robot and sensor nodes. However, robot control in this system relied on a PC interface, highlighting the need for further system autonomy. The surveillance applications discussed here, including UAV-assisted IoT networks, are elaborated in Table 2.
An indoor surveillance ANR-based application for tracking abnormal human behaviors has been developed, combining mobile robots, RFID, and camera-based WSNs. The system was successfully simulated and implemented, demonstrating efficient intruder tracking as communicated by the WSN. UAV-based monitoring systems have further expanded surveillance capabilities. For instance, [40] presented UAVs equipped with motes, child motes, and infrared cameras for patrolling and environmental monitoring. Experimental results revealed that motes on UAVs significantly reduced overhead network traffic, improved data retrieval, and minimized packet loss. Recent advancements in UAV-supported networks, such as those utilizing AI-driven target recognition systems, have enhanced intruder detection efficiency and coverage [41]. In [42], an ANR system for urban surveillance was introduced, featuring static and mobile sensor nodes capable of detecting outdoor conditions such as combustible gases, temperature, and humidity. A human–machine interface (HMI) base station provided real-time data visualization. Implementation results validated the system’s ability to effectively detect, communicate, and visualize data. Modern IoT-integrated ANR systems with digital twins now offer enhanced scalability and visualization capabilities for urban surveillance applications [35].

3. Research Challenges, Problem Statement, and Research Contributions

Despite the advances in ANR systems research and applications presented in Section 2, several areas for growth and advancement remain. The design of MRSs and WSNs can be improved and flexibly adapted to different applications, such as surveillance. Therefore, this section briefly discusses some open areas in ANR systems research and introduces the developed hybrid ANR system.

3.1. Open Areas in ANR Research

The following open areas have been identified in MRSs:
  • Task decomposition and management.
  • Effective deployment and coverage by the robot team.
  • Integration of advanced SLAM techniques to enhance robot mapping, localization, and path planning in static and dynamic environments.
For WSNs, the following open areas have been identified:
  • Effective real-time sensor data processing and decision-making techniques.
  • Data loss due to network overload and interference.
In addition, integrating an MRS and WSNs as an ANR system presents additional challenges:
  • Effective architectures for MRS/WSN integration and cooperation.
  • Scalable task distribution and dynamic management to handle large operational environments.
  • Incorporation of IoT-based technologies for further data processing and analytics.
  • Development interfaces supporting the system simulation, hardware-in-loop (HIL) simulation, and implementation.
  • Seamless integration of edge computing frameworks to reduce latency in data processing and analytics.
  • Enhanced interoperability between IoT nodes and cloud platforms to enable cross-platform deployment.

3.2. Problem Statement, Expected Contributions, and System Design Operational Flow of the Selected Application

Based on the challenges mentioned, an effective architecture for ANR systems is needed; this need represents the knowledge gap in the field. The architecture should feature cooperative interaction between the MRS and the integrated WSNs, enabling their robustness and scalability. It should also feature interactive interfaces to visualize detected events and track robot navigation. Finally, it should integrate the system with IoT nodes and cloud technologies for further data collection, processing, and analytics.
By assessing what has been done in the field and understanding the technical challenges and the development needs, this paper introduces an IoT-based hybrid ANR architecture characterized by the following:
  • Distributed WSNs. Each consists of static IoT-based sensor nodes (SSNs) and an onboard SSN (O-SSN) attached to each mobile robot. These WSNs can detect events and collect, analyze, and communicate data.
  • The MRS comprises multiple robots, each integrated with one O-SSN; a robot together with its O-SSN is classified as an MRSN (and similarly for each G-MRSN). Both MRSNs and G-MRSNs are capable of conducting navigation and coordination tasks. All O-SSNs communicate their data to an IoT base station, forming another WSN.
  • The WSNs are a two-tier network consisting of a ZigBee (2.4 GHz) protocol network supporting communication between each SSN in the WSN of a zone to the O-SSN of that zone and a wireless serial (sub-1 GHz) protocol network supporting the communication of all O-SSNs on mobile robots to the base station (BS) through its IoT base station gateway (IoT-BSG).
  • An IoT base station (BS) comprising an IoT base station gateway (IoT-BSG) for sensor node data aggregation, integration of low-power wide-area network (LPWAN) protocols for extended coverage, and graphic user interfaces (GUIs) for sensor node data to visualize robot navigation and integrate local data storage and analytics. In addition, the base station can extend storage and analytics capabilities through the IoT/cloud, supported by the Wi-Fi communication protocol, to help integrate different cloud platform services.
The presented IoT-based hybrid ANR system architecture and its designed features contribute the following:
  • The development of an IoT-based hybrid ANR system architecture. This architecture consists of the following:
    (i) Flexible modular designs for seamless scalability and interoperability.
    (ii) The physical development and implementation of new ANR systems with their associated features and requirements.
    (iii) The simulation development environment (SDE), which facilitates simulation-based development, simulation, and testing of new ANR systems. This environment also supports the integration and testing of HIL and provides GUIs for monitoring the activities of any physically implemented ANR system.
    (iv) Concurrent integration between the physically implemented ANR system and the same system developed using the SDE part of the architecture, facilitating concurrent monitoring, testing, validation, and effective deployment.
    (v) Integration of the developed ANR system architecture with cloud technologies through the IoT/cloud to extend data storage and analytics.
Points (iii) and (iv) are implemented at the base station (BS) level, where all of the core interfaces are implemented to support the system's GUIs and the IoT/cloud, and to facilitate the integration of the simulation-based and physically implemented ANR systems with cloud services.
  • The developed IoT-based hybrid ANR system architecture was used to demonstrate the development of the physical and simulation levels for surveillance and intruder detection applications in a large operational environment, integrating MRS, WSNs, IoT, and cloud technologies.
A surveillance and event detection application in a large environment (local and/or distributed) is used to demonstrate the developed IoT-based hybrid ANR system architecture:
  • The operational environment is divided into zones. Each zone contains multiple SSNs that form a WSN within the zone space. These SSNs are distributed to ensure proper coverage. Additionally, zone boundaries are dynamically configurable based on operational needs, enabling flexibility in resource allocation. Each zone is assigned one MRSN for navigation and event response.
  • One or more G-MRSNs are assigned to navigate areas within the operational environment that are not covered by any of the zones’ spaces. G-MRSNs leverage adaptive algorithms for path planning to ensure efficient navigation in unstructured environments.
  • The sensor nodes in each zone space contain non-imaging fire and intruder event detection sensors. The zone SSNs detect events and communicate the data to the O-SSN of the zone-assigned MRSN. The O-SSN relays the detected events to the BS via the IoT-BSG. For enhanced reliability, redundant communication protocols are implemented to prevent data loss during transmission. Additionally, the O-SSN of each G-MRSN detects and communicates its data to the BS via the IoT-BSG.
  • The mobile robots assigned to the zones can coordinate, plan, and navigate paths to the known SSN locations when emergency alerts are issued by any of the SSNs of the relevant zone. To improve response times, mobile robots utilize real-time decision-making models for task prioritization during emergencies. When there is no such emergency, the robot regularly navigates a defined assigned path within/around its zones and spaces.
  • The BS features the development of the core interfaces for the ANR system simulation, HIL simulation, and physical ANR development/implementation. This includes the integration of BS GUIs and IoT/cloud platforms. The BS also incorporates modular interface designs for seamless expansion to support future upgrades and additional sensors.
The system design operational flow of the developed system is illustrated in Figure 1.
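The relay path in the flow above (a zone SSN alert, forwarded by the zone O-SSN through the IoT-BSG to the BS) can be illustrated with a minimal sketch; the class and field names here are hypothetical, not the system's actual software interfaces:

```python
class BaseStation:
    """Collects event frames relayed from zone O-SSNs via the IoT-BSG (sketch)."""
    def __init__(self):
        self.log = []

    def receive(self, frame):
        self.log.append(frame)


class OSSN:
    """Onboard node of a zone's MRSN: relays SSN alerts toward the base station."""
    def __init__(self, zone, base_station):
        self.zone = zone
        self.base_station = base_station

    def relay(self, ssn_id, event):
        # Tag the frame with its zone before forwarding through the IoT-BSG.
        self.base_station.receive({"zone": self.zone, "ssn": ssn_id, "event": event})


# Illustrative relay: SSN 2 in zone "A" raises a critical alert.
bs = BaseStation()
ossn_a = OSSN("A", bs)
ossn_a.relay(2, "critical")
```

In the implemented system, the two hops correspond to the ZigBee link (SSN to O-SSN) and the wireless serial link (O-SSN to IoT-BSG) described above.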

4. IoT-Based Hybrid ANR System Architecture Development

The IoT-based hybrid ANR system architecture is the framework upon which the principal conceptual design of the system, with its various components, is developed and through which those components interact to realize the system's operational structure. As shown in Figure 2, the system architecture features the core development environments, comprising the BS GUIs and IoT/cloud platforms, which can exchange data with components from the simulation environment only, from the physical environment only, or from HIL test cases. The architectures of the sensor nodes (SSNs and O-SSNs), the robots (MRSNs and G-MRSNs), and the BS are presented below.
Unlike conventional IoT-based monitoring and robotic coordination systems, the proposed IoT-based hybrid ANR system introduces key innovations that improve real-time event detection, network resilience, and multi-robot scalability. The following aspects differentiate this system from existing architectures:
  • Adaptive Multi-Tier Wireless Sensor Network (WSN) Integration—Unlike traditional single-tier WSN architectures, this system incorporates a two-tier WSN structure, where static sensor nodes (SSNs) handle local event detection, and mobile sensor nodes (O-SSNs) extend coverage dynamically. This approach reduces latency and communication overhead, ensuring more efficient real-time response.
  • Decentralized Multi-Robot Coordination Framework—Instead of a centralized task allocation model, our system employs a hybrid negotiation-based approach, combining the Contract Net Protocol (CNP) with event-driven task prioritization. This enhances scalability in multi-zone deployments, ensuring efficient coordination without overwhelming a single control node.
  • Hybrid Simulation and Hardware-in-the-Loop (HIL) Validation—Traditional IoT-based robotic systems are often validated either in simulation or real-world experiments. This system uniquely integrates a Hybrid Development Framework, where SDE-based simulation is used for pre-deployment optimization, and HIL testing ensures practical feasibility in real-world conditions.
  • Cloud-Integrated Analytics for Real-Time Data Processing—Conventional IoT-based systems often rely on onboard or local edge computing. Our system enhances real-time monitoring by incorporating a cloud-based analytics layer, where event data are visualized and processed remotely, ensuring improved scalability and remote access.
The structural components illustrated in Figures 2, 3, 5 and 7 have been designed specifically to address challenges such as network congestion, multi-robot coordination complexity, and real-time event prioritization. These improvements enable a more adaptable, scalable, and computationally efficient IoT-driven robotic network.

4.1. Design Considerations Based on Identified Challenges

The system architecture was designed to address key challenges identified in Section 3, including network reliability, real-time responsiveness, and scalability. To ensure real-time monitoring, a two-tier WSN architecture was implemented to enhance data aggregation and reduce centralized dependencies. To improve multi-robot coordination, a hybrid task allocation framework combining distributed event-based control and centralized mission assignment was adopted. For scalability, modular deployment zones were designed, utilizing ZigBee for localized WSN communication and a long-range wireless serial protocol for base station connectivity. These design choices were made to address the identified challenges and ensure system adaptability to varying deployment environments.

4.2. Sensor Node Architecture

The sensor nodes’ operational architecture features their structure and event classification system.

4.2.1. Sensor Node Structure

The IoT-based SSN and O-SSN structures are based on the generic sensor node structure [43]. Each SSN comprises analog sensors (gas sensors for carbon monoxide (CO) and smoke, and a temperature/relative humidity (T/RH) sensor) and digital sensors (flame and motion), a sensor set standard to both SSNs and O-SSNs. The SSNs also contain a ZigBee communication interface, along with LCDs and LEDs for event display.
The O-SSNs carry similar sensors but feature three communication interfaces: ZigBee for communication with the SSNs, wireless serial for communication with the IoT-BSG, and wired serial for connection to the robot motion control. Only LEDs are used as indicators.
Both sensor node types feature processors that process and classify the sensor input data, and both are powered from 5 V DC supplies. Figure 3 shows the SSN structure.
To maintain accuracy, sensor nodes undergo initial factory calibration followed by periodic recalibration every three months. The calibration process includes offset correction for temperature and humidity sensors and multi-point verification for gas concentration sensors against standardized reference gases. Additionally, an automatic drift compensation algorithm ensures long-term stability by adjusting sensor baselines based on historical data.
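The paper does not detail the drift compensation algorithm's internals, so the following is only a plausible sketch of a rolling-baseline scheme: quiescent readings near the current baseline are allowed to update it, and every reading is reported relative to it. The window length and quiescence band are assumptions:

```python
from collections import deque


class DriftCompensator:
    """Illustrative rolling-baseline drift compensation for one analog sensor.

    Keeps a window of recent quiescent readings and re-centers the baseline
    on their mean. Window size and the quiescence test are assumptions,
    not the paper's actual implementation.
    """

    def __init__(self, baseline, window=100, quiescent_band=2.0):
        self.baseline = baseline
        self.band = quiescent_band
        self.history = deque(maxlen=window)

    def compensate(self, raw):
        """Return the drift-corrected reading for one raw sample."""
        # Only near-baseline samples are treated as quiescent and are
        # allowed to move the baseline estimate.
        if abs(raw - self.baseline) <= self.band:
            self.history.append(raw)
            if len(self.history) == self.history.maxlen:
                self.baseline = sum(self.history) / len(self.history)
        return raw - self.baseline
```

Event readings far from the baseline (e.g., a CO spike) bypass the update step, so genuine alerts are not absorbed into the baseline.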

4.2.2. Sensor Node Event Classification System

The sensor nodes’ event classification system involves a fuzzy logic decision-making algorithm for the analog sensors (T/RH, CO gas, and smoke) combined with the digital inputs of the digital sensors (flame and motion). A fuzzy set was generated for each analog sensor, comprising three membership functions (MFs): standard, high, and critical. The fuzzy set diagrams and MFs are shown in Figure 4.
The mapping of the analog sensor inputs to the fuzzy sets/MFs generated a seven-rule fuzzy inference system that classifies events into critical, high, or standard alert events. The rules are as follows:
Critical Alert Events
1. IF temperature AND humidity are critical, THEN the event alert is critical.
2. IF CO AND smoke concentrations are critical, THEN the event alert is critical.
High Alert Events
3. IF temperature OR humidity is critical, THEN the event alert is high.
4. IF temperature AND humidity are high, THEN the event alert is high.
5. IF CO OR smoke concentration is critical, THEN the event alert is high.
6. IF CO AND smoke concentrations are high, THEN the event alert is high.
Standard Alert Events
7. IF temperature, humidity, CO, and smoke concentrations are all standard, THEN the event alert is standard.
The final event classification, presented in Table 3, combined the fuzzy logic decision-making system output with input from the sensors. Table 3 presents selected representative combinations and is not intended to be an exhaustive list of all possible sensor input states, which are thoroughly addressed through the fuzzy rule set.
After event classification, each sensor node communicates its detected data and classified event in the following format: <node number, temperature, RH, CO gas concentration (conc.), smoke concentration (conc.), event classification>, where critical events are encoded as the integer ‘1’ and high alert events as the integer ‘0’.
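The seven-rule classifier and the message frame can be sketched in a few lines. The numeric thresholds below are illustrative assumptions (the actual fuzzy sets are defined by the MFs in Figure 4), and the sketch reduces each fuzzy set to its dominant label rather than computing graded memberships:

```python
def fuzzify(value, high_t, critical_t):
    """Map an analog reading to its dominant fuzzy label (thresholds assumed)."""
    if value >= critical_t:
        return "critical"
    if value >= high_t:
        return "high"
    return "standard"


def classify_event(temp_c, rh_pct, co_ppm, smoke_ppm):
    """Apply the seven fuzzy rules to yield a critical/high/standard alert."""
    # Hypothetical high/critical thresholds for each sensor.
    t = fuzzify(temp_c, 45, 60)
    h = fuzzify(rh_pct, 70, 85)
    co = fuzzify(co_ppm, 50, 200)
    sm = fuzzify(smoke_ppm, 100, 300)

    # Rules 1-2: critical alert events.
    if t == "critical" and h == "critical":
        return "critical"
    if co == "critical" and sm == "critical":
        return "critical"
    # Rules 3-6: high alert events.
    if t == "critical" or h == "critical":
        return "high"
    if t == "high" and h == "high":
        return "high"
    if co == "critical" or sm == "critical":
        return "high"
    if co == "high" and sm == "high":
        return "high"
    # Rule 7: standard alert.
    return "standard"


def node_message(node_id, temp_c, rh_pct, co_ppm, smoke_ppm):
    """Encode the <node, T, RH, CO, smoke, event> frame; critical=1, high=0."""
    event = classify_event(temp_c, rh_pct, co_ppm, smoke_ppm)
    # The text only specifies integer codes for critical and high alerts;
    # standard events keep their label here (an assumption).
    tag = {"critical": 1, "high": 0}.get(event, event)
    return f"<{node_id},{temp_c},{rh_pct},{co_ppm},{smoke_ppm},{tag}>"
```

For example, `node_message(3, 65, 90, 300, 400)` triggers Rule 1 and emits a frame tagged with the critical code ‘1’.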
The fuzzy logic event classification system was designed to capture correlated sensor readings for improved event detection accuracy. The simultaneous inclusion of temperature and humidity, as well as CO and smoke, was intentional, as these variables tend to co-vary during hazardous events, such as fires or gas leaks. However, reducing redundancy by selecting a single dominant variable per category could optimize computational efficiency. Future implementations could explore hierarchical fuzzy classification, where dependent variables are considered sequentially rather than simultaneously.
Regarding Table 3, the fuzzy rules were structured to classify critical alert conditions based on dominant signals, but certain combinations (e.g., low flame and low motion) are currently not explicitly addressed. Expanding the fuzzy rule base to account for edge-case scenarios could improve classification accuracy.
While the current system relies on predefined fuzzy membership functions, future enhancements could introduce adaptive fuzzy inference systems, including hybrid fuzzy–neural network (NN) models, whose membership functions adjust dynamically based on real-time sensor feedback [44]. Although machine learning-based classifiers such as NNs provide adaptive learning capabilities, they typically require large training datasets and higher computational resources, making them less practical for real-time embedded applications. Fuzzy logic therefore remains an attractive choice: it classifies events without large training datasets, with low processing overhead and real-time decision-making capability in resource-constrained IoT environments.

4.2.3. Task Allocation and Path Planning Methods

The Contract Net Protocol (CNP) was selected for task allocation due to its scalability and decentralized decision-making, making it well suited for IoT-driven multi-robot systems. Similarly, A* path planning was chosen for its computational efficiency and determinism. Alternative methods, such as Particle Swarm Optimization (PSO) for navigation or reinforcement learning-based task allocation, could improve adaptability in dynamic environments. However, they introduce higher computational costs and training requirements, which are limiting factors for real-time embedded implementations. Future work will explore hybrid task allocation and path planning frameworks to improve dynamic response adaptability.

4.3. Mobile Robots Architecture

The operational architecture of the mobile robots (MRSNs and G-MRSNs) features their structure and their coordination and navigation algorithms.

4.3.1. Mobile Robot Structure

In addition to its O-SSN, each robot includes a motion control subsystem that enables its coordination and navigation within the operational environment. The general operational structure of the MRSN, including its O-SSN and motion control, is illustrated in Figure 5.

4.3.2. Mobile Robots’ Coordination Algorithm

Each MRSN can navigate an assigned path within/around its zone. When an SSN within the same zone issues an alert, the MRSN receives it and calculates its distance to the alerting SSN. If this distance is within the reference navigation distance, the MRSN navigates to the alerting SSN. Otherwise, the MRSN announces the task to all the G-MRSNs and selects the closest one to navigate to the alerting SSN location. The coordination between the MRSN and the selected G-MRSN is based on the Contract Net Protocol (CNP) proposed by Reid G. Smith in 1980 [45]. The algorithm is illustrated in Figure 6.
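The coordination rule above can be sketched compactly: the zone MRSN services nearby alerts itself, and otherwise announces the task so that each G-MRSN "bids" its distance and the closest bid wins. The reference distance value and the Euclidean metric are illustrative assumptions.

```python
# Minimal sketch of the CNP-style coordination: handle close alerts locally,
# award distant ones to the closest bidding G-MRSN. Threshold and metric
# are assumed for illustration.
import math

REF_NAV_DISTANCE = 5.0  # reference navigation distance (assumed, metres)

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def handle_alert(mrsn_pos, ssn_pos, g_mrsn_positions):
    """Return which robot is awarded the task of visiting the alerting SSN."""
    if dist(mrsn_pos, ssn_pos) <= REF_NAV_DISTANCE:
        return "MRSN"  # the zone robot services the alert itself
    # Task announcement: each G-MRSN bids its distance; the closest bid wins.
    bids = {name: dist(pos, ssn_pos) for name, pos in g_mrsn_positions.items()}
    return min(bids, key=bids.get)
```

The dictionary of bids mirrors the CNP announce/bid/award cycle in its simplest single-round form.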

4.3.3. Mobile Robot Navigation Algorithm

The ability of mobile robots to autonomously navigate within their assigned zones is crucial for effective system operation. Each MRSN utilizes a path planning algorithm to determine the most efficient route to a target location, such as an alerting SSN or a designated patrol area.
To achieve efficient and deterministic path planning, the system employs A* path planning, which guarantees optimum route selection in structured environments [46].
A* path planning was selected due to its computational efficiency and guaranteed optimality in structured environments [46]. While RRT* provides better adaptability in dynamic environments, it is computationally expensive and introduces non-deterministic behavior, making it unsuitable for real-time navigation in resource-constrained IoT robotic applications. Recent studies have explored advancements in RRT-based approaches to enhance computational efficiency and practical applicability in dynamic environments [44,47]. Future implementations may explore RRT* for applications requiring dynamic re-planning, provided computational constraints can be managed.
The A* path planning algorithm is presented in Algorithm 1.
Algorithm 1: A* path planning
1: Input: Occupancy grid map with n nodes;
2:      f(n), g(n), and h(n); O(n): open list; C(n): closed list
3: Output: Least costly path to the goal location
4: while O(n) ≠ ∅
5:     select n_best from O(n) | f(n_best) ≤ f(n), ∀ n ∈ O(n)
6:     C(n) ← n_best
7:     if n_best = goal(n), exit
8:     expand n_best: ∀ x ∈ Star(n_best) that are not in C(n)
9:     if x ∉ O(n) then
10:        add x → O(n)
11:    else if g(n_best) + h(n_best) < g(x) then
12:        n_best ← x
13:    end if
14: return the least costly path
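Algorithm 1 maps directly onto a priority-queue implementation. The sketch below assumes a 4-connected binary occupancy grid with unit step costs and a Manhattan heuristic; neither of these choices is specified in Algorithm 1, so they are stated assumptions.

```python
# A* on a binary occupancy grid (0 = free, 1 = occupied); a sketch of
# Algorithm 1 assuming 4-connectivity, unit costs, and a Manhattan heuristic.
import heapq

def astar(grid, start, goal):
    """Return the least-cost path as a list of (row, col) cells, or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda n: abs(n[0] - goal[0]) + abs(n[1] - goal[1])  # heuristic h(n)
    open_list = [(h(start), 0, start, None)]  # entries: (f, g, node, parent)
    g_cost, parents, closed = {start: 0}, {}, set()
    while open_list:                          # while O(n) is not empty
        f, g, node, parent = heapq.heappop(open_list)  # best n from O(n)
        if node in closed:
            continue
        closed.add(node)                      # C(n) <- n_best
        parents[node] = parent
        if node == goal:                      # goal reached: rebuild the path
            path = []
            while node is not None:
                path.append(node)
                node = parents[node]
            return path[::-1]
        r, c = node
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt                      # expand n_best over Star(n_best)
            if (0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0
                    and nxt not in closed
                    and g + 1 < g_cost.get(nxt, float("inf"))):
                g_cost[nxt] = g + 1
                heapq.heappush(open_list, (g + 1 + h(nxt), g + 1, nxt, node))
    return None  # open list exhausted: no path exists
```

With an admissible heuristic such as Manhattan distance on a unit-cost grid, the first time the goal is popped from the open list the path is optimal, which is the determinism/optimality property cited in Section 4.3.3.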

4.4. IoT Base Station (BS) Operational Architecture

The BS features the IoT-BSG, the BS GUIs, and cloud platform integration for storage and analytics. Figure 7 shows its operational structure.
  • The Sensor Node SM Development using the PCD-SE
The sensor node SMs were developed in the PCD-SE using various part libraries according to their structures. The SSN, O-SSN, and IoT-BSG SMs were created for the system simulation. Figure 8 shows an example SSN SM as designed in the PCD-SE.

4.4.1. The IoT Base Station Gateway (IoT-BSG)

The IoT-BSG receives and aggregates sensor nodes’ data communicated by the O-SSNs. The data are then sent to the BS sensor node GUI and uploaded directly to one of the cloud platforms. It features a wireless serial interface for communication with the O-SSNs, a Wi-Fi development interface for direct connection to online cloud platforms, a processor for data pre-processing, and a UART interface for connection to the BS sensor node data GUI.
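The IoT-BSG's forwarding step can be pictured as parsing a frame received over the wireless serial link and building a cloud update request from it. The CSV frame format and field mapping below are assumptions for illustration; the `/update` endpoint with `api_key` and `fieldN` parameters is ThingSpeak's documented REST interface.

```python
# Sketch of the IoT-BSG forwarding step: parse an assumed CSV frame from an
# O-SSN and build a ThingSpeak update URL (no network I/O is performed here).
from urllib.parse import urlencode

def parse_frame(line):
    """Assumed frame format: 'SSN3,29.5,41.0,120,15,0,1' -> node id + readings."""
    node, *values = line.strip().split(",")
    return node, [float(v) for v in values]

def thingspeak_update_url(api_key, readings):
    """ThingSpeak accepts up to eight data fields per channel update."""
    params = {"api_key": api_key}
    params.update({f"field{i + 1}": v for i, v in enumerate(readings[:8])})
    return "https://api.thingspeak.com/update?" + urlencode(params)
```

In deployment, the gateway would issue this request over its Wi-Fi interface while also relaying the parsed readings to the SND GUI over UART.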

4.4.2. The BS GUIs

The BS GUIs are used to visualize the results from the sensor node detection and robot navigation. There are two BS GUIs:
  • The sensor node data (SND) GUI, which aggregates and visualizes data and detected events from the sensor nodes, and then uploads these data to one of the cloud platforms
  • The MRS GUI visualizes and tracks the robots’ navigation paths and locations.

4.4.3. The IoT/Cloud Platforms

The cloud platforms aggregate, visualize, and store the sensor nodes’ data. Data analytics are also performed on the stored data to evaluate the conditions within the operational environment. Two cloud platforms are used; they include the following:
  • The Microsoft® (MS) cloud platform consists of SQL data to store the sensor node data and a web application to visualize the stored data. The MS cloud platform receives sensor node data from the SND GUI.
  • The ThingSpeak® cloud online platform features visualization channels that display up to eight data fields. A data channel was created for each sensor node. The cloud platform also facilitates data analytics through its integration with Matlab® toolboxes (2022a).
The ThingSpeak cloud platform was the preferred platform for analytics because of its Matlab® toolbox integration.

5. SDE-Based System Simulations

The IoT-based hybrid ANR system simulation involves the simulation development environment (SDE) with HIL support, which provides the framework for the system architecture simulation.

5.1. The SDE

The SDE comprises the integrated development of the sensor node and mobile robot simulation models (SMs), their respective simulation environments (SEs), a network emulator, and GUIs for data visualization, supported by hardware-in-the-loop (HIL) module-based integration. The SDE structure is illustrated in Figure 9. The SDE features four main development capabilities:
  • The sensor node SM development using the Proteus Circuit Design SE (PCD-SE) and Arduino IDE.
  • The mobile robots SM and navigation environment (NE) development using the LabVIEW® Robotics Environment Simulator (LRES) and MRS GUI.
  • The BS sensor data aggregation and visualization, performed by integrating the IoT-BSG, SND GUIs, and cloud platforms.
  • The HIL module-based testing.

5.1.1. Mobile Robot SM Development Using the LRES/MRS GUI

The mobile robots’ SMs were developed in the LRES/MRS GUI using CAD models of LabVIEW DaNI 1.0 robots [48]. The structural model was developed in the LRES, while its control and navigation algorithms were developed in the MRS GUI; both environments were integrated. The operational environment SM involved a four-zoned environment with representations of the sensor nodes and obstacles, designed in SolidWorks® and imported into the LRES. The zones constituted zone 1 (SSN1, SSN3, and MRSN1), zone 2 (SSN5, SSN7, and MRSN2), zone 3 (SSN2, SSN4, and MRSN3), and zone 4 (SSN6, SSN8, and MRSN4). A 2D occupancy grid map used by the robots for path planning and navigation was generated from the operational environment SM, constituting a 40 × 30 cell (0.5 × 0.5 m per cell) grid, representing an example large environment such as a warehouse. Figure 10a shows a robot SM, while Figure 10b shows the operational environment SM with the default mobile robots’ positions and the generated occupancy grid map with the SSN locations.
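Since the 40 × 30 grid uses 0.5 m × 0.5 m cells, converting between world coordinates (metres) and grid indices is a simple scaling. The sketch below assumes the grid origin sits at the environment's lower-left corner, which is an illustrative convention rather than a detail stated in the text.

```python
# World <-> grid conversion for the 40 x 30 occupancy grid (0.5 m cells).
# The lower-left origin is an assumed convention for illustration.
CELL = 0.5        # metres per cell
COLS, ROWS = 40, 30  # 20 m x 15 m mapped area

def world_to_cell(x, y):
    """Map a world position (metres) to its (row, col) grid cell."""
    col, row = int(x / CELL), int(y / CELL)
    if not (0 <= col < COLS and 0 <= row < ROWS):
        raise ValueError("position outside the mapped environment")
    return row, col

def cell_to_world(row, col):
    """Return the centre of a grid cell in world coordinates (metres)."""
    return (col + 0.5) * CELL, (row + 0.5) * CELL
```

A planner such as A* operates on the (row, col) indices, and the resulting path is converted back to world-frame waypoints for the robot controller.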

5.1.2. The Network Emulator

The network emulator facilitated the two-tier network of the ANR system and other necessary connections to enable the system simulation. It created virtual com ports connecting the following SMs through Matlab: SSNs to O-SSNs, O-SSNs to the IoT-BSG, O-SSNs to robot control, and the IoT-BSG to the ThingSpeak cloud.
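The virtual-port bridging performed by the network emulator can be pictured as paired message queues: whatever one endpoint writes appears at the other endpoint's read side. The in-memory stand-in below mirrors that behaviour; the actual setup uses virtual COM ports created through Matlab, so this is purely an illustrative model.

```python
# In-memory stand-in for one virtual COM port pair created by the network
# emulator (e.g., an SSN <-> O-SSN link). Illustrative only.
from queue import Queue

class _Port:
    def __init__(self, tx, rx):
        self._tx, self._rx = tx, rx

    def write(self, msg):
        self._tx.put(msg)       # appears at the peer's read side

    def read(self):
        return self._rx.get_nowait()

class VirtualLink:
    """Bidirectional link between two simulated endpoints."""
    def __init__(self):
        self._a_to_b, self._b_to_a = Queue(), Queue()

    def endpoint_a(self):
        return _Port(self._a_to_b, self._b_to_a)

    def endpoint_b(self):
        return _Port(self._b_to_a, self._a_to_b)
```

One such link is instantiated per connection listed above (SSN to O-SSN, O-SSN to IoT-BSG, and so on), which is what lets the SMs exchange frames as if over real serial hardware.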

5.1.3. The HIL Support and Integration

The SDE’s HIL support feature enables HIL simulation. The roles and advantages of HIL simulations are presented in [49,50]. The HIL simulation involved integrating whole zone modules (SSNs and assigned MRSNs) into the SEs for testing and validation. This integration was facilitated by HIL BSG development, which consists of the hardware and SM IoT-BSGs, with the setup illustrated in Figure 11.

5.2. Simulation Result

The presented results consist of sensor node event detection, classification, and communication, the MRSN/G-MRSN CNP coordination, and the SND GUI visualizations of an SSN both as an SM and when tested in the HIL setup. Additional results are presented in the hardware implementation.
The system simulation involves three SEs: the PCD-SE for detecting and communicating sensor node SMs, the LRES/MRS GUI for mobile robot SM navigation, and the BS SND GUI for sensor data visualization.

5.2.1. PCD-SE Simulation Results

For the SSN, O-SSN, and IoT-BSG SMs operating in the PCD-SE, pre-collected data within the PCD-SE part libraries, representing the various sensors constituting the sensor nodes, are processed, classified, and displayed on the SM LCDs and LEDs. The detected data and classified events are communicated through the PCD-SE communication interface libraries, with the network emulator linking them to the receiving sensor node SMs, robot control, SND GUI, or ThingSpeak cloud platform. Figure 12 shows one SSN LCD and a virtual monitor interface displaying the detected data and classified events.
During the HIL simulation, zone 1 and 2 SSN and O-SSN SMs were disconnected in the PCD-SE, and their corresponding hardware was incorporated after the HIL BSG was set up. Figure 13 shows the SSN3 hardware during the HIL simulation.
Note: The display in Figure 13 focuses on key event parameters rather than showing all sensor values simultaneously. More parameters are displayed in Figure 12, though the system selectively presents data based on the detected event.

5.2.2. LRES/MRS GUI Simulation Results

For the mobile robot SMs, when the SSNs issue no alerts, they navigate the assigned paths within/around their zones or spaces. When a warning is issued, the zone MRSN navigates to the alerting SSN if its distance to the alerting SSN is less than a reference navigation distance; otherwise, it coordinates with other G-MRSNs. The coordination example between MRSN3 and the G-MRSN is presented in Figure 14.
During the HIL simulation, the MRSN1 or MRSN2 SM represents its corresponding hardware in an operational SM designed according to the actual lab environment with the hardware SSNs. Hence, the algorithms implemented on the mobile robot could be tested in the LRES/MRS GUI SE. Figure 15 shows the actual lab environment SM and the hardware MRSN connected to the LRES/MRS GUI SE. Figure 16 shows MRSN1 SM navigation to the SSN3 location.

5.2.3. BS SND GUI Visualizations

The IoT-BSG SM communicates the data detected by the sensor nodes and the classified events to the SND GUI. The SND GUI aggregates and visualizes the data and uploads them to the MS Cloud platform. Figure 17a,b show SSN3 SND data visualization during the system and HIL simulations. The difference in the amount of data aggregated by the same SSN as an SM and as real hardware is attributed to the calibration time required by hardware gas sensors and the mode of operation. Both figures show the latest detected data and the classified event from the PCD-SE and the lab environment, respectively.
The neural network was trained using a dataset of 5000 labeled sensor readings collected over multiple test scenarios. Model validation was conducted using 10-fold cross-validation, achieving an average accuracy of 92.5% for %EMC prediction and 89.7% for PI classification. Real-time testing confirmed that the model maintains stable accuracy under deployment conditions.

5.3. Multi-Robot Task Allocation and Coverage Strategies

Effective multi-robot coordination is crucial for optimizing task allocation and ensuring efficient coverage in dynamic environments. Regarding multi-robot task allocation and coverage, advanced methods such as Obstacle-Avoiding Voronoi Cells for collaborative hunting and Immune–Endocrine-inspired maritime patrolling strategies have been proposed in the recent literature. These strategies optimize robot distribution and efficiency in complex, cluttered environments. While our work primarily employs the Contract Net Protocol (CNP) for decentralized task allocation, future implementations could integrate adaptive swarm intelligence techniques to enhance multi-robot coverage and mission efficiency in large-scale IoT networks.

6. IoT-Based Hybrid ANR System Implementation

The IoT-based hybrid ANR system was implemented in a lab environment to test and validate the system architecture in real time. The developed hardware SSNs, O-SSNs, MRSNs, and IoT-BSG are presented together with the lab environment layout for the system implementation. The testing results, including IoT/cloud visualization and analytics, are presented.

6.1. Hardware Sensor Nodes and Mobile Robots

The hardware sensor nodes and robots were developed using components as represented in their structures. For example, the SSN comprises an Arduino® microcontroller (MCU) as the data processor, a ZigBee module, a DHT-11 T/RH sensor, MQ2 and MQ7 sensors for smoke and CO detection, respectively, IR flame and PIR motion sensors, LEDs, and an LCD. The O-SSNs are similar to the SSNs, except that their Arduino MCU has more communication ports to accommodate the wireless serial module. The IoT-BSG has a NodeMCU Wi-Fi development board for direct connection to the ThingSpeak cloud, in addition to its Arduino MCU and wireless serial module. Figure 18 shows the SSN, O-SSN, and IoT-BSG hardware modules.
The hardware mobile robots (MRSNs) comprise the O-SSN on the mobile robot platform (DaNI 1.0 robots), which has a Wi-Fi router that enables the robots’ programming and communication with the LabVIEW MRS GUI. Figure 19 shows the MRSN and its communication types to the IoT BS.

6.2. Test Environment Layout

Figure 20 shows the lab environment for the implementation of the IoT-based ANR system and its occupancy grid map.

6.3. Implementation Results

The experimental setup was run for 1 h and 30 min, during which the lab environment door was opened several times to allow external disturbances to influence the change in temperature and humidity within the lab.

6.3.1. Sensor Node Detection Results

Figure 21 shows examples of sensor nodes before and after event detection. The LCDs show the detected data and the classified event, with the LEDs changing from green (normal) before detection to red (high alert) after detection.

6.3.2. Mobile Robot Navigation Results

With only two zones, the robots’ navigation within/around their zones and navigation to alerting SSNs are demonstrated. Figure 22 shows MRSN1 navigation within its zone, while Figure 23 shows MRSN2 navigation to alerting SSN5 in zone 2.

6.3.3. ThingSpeak Data Analytics

Various types of analytics can be performed on the cloud data in ThingSpeak. In this work, a preservation metrics analysis [51] was performed to predict the storage qualities of the various zones within the operational environment. The preservation metrics included the following.

Dewpoint

Dewpoint offers insight into how much moisture (water vapor) is contained within an environment. It is calculated from the temperature (T, °C) and relative humidity (RH, %) within the zone environment using the Magnus formula [52], expressed in (1).

$$T_{DP} = \frac{c\left(\ln\frac{RH}{100} + \frac{bT}{c+T}\right)}{b - \ln\frac{RH}{100} - \frac{bT}{c+T}} \quad (1)$$

where b = 17.67 and c = 243.5 °C are the empirical Magnus coefficients [52].
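The Magnus relation in (1) can be evaluated directly. The sketch below assumes the natural logarithm and the standard Magnus (Bolton) coefficient values, 17.67 (dimensionless) and 243.5 °C.

```python
# Dewpoint from the Magnus formula; coefficient values follow the standard
# Bolton parameterization (an assumption about the paper's exact constants).
import math

def dewpoint(T, RH, b=17.67, c=243.5):
    """Dewpoint (deg C) from temperature T (deg C) and relative humidity RH (%)."""
    gamma = math.log(RH / 100.0) + b * T / (c + T)
    return c * gamma / (b - gamma)
```

A quick sanity check: at RH = 100% the dewpoint equals the air temperature, and it drops as the air dries.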

Equilibrium Moisture Content (%EMC)

%EMC can be used to predict the risk of environmentally induced metal corrosion. It is expressed in (2) [53].
$$\%EMC = \frac{1800}{W}\left[\frac{K\,RH}{1 - K\,RH} + \frac{K_1 K\,RH + 2K_1 K_2 K^2 RH^2}{1 + K_1 K\,RH + K_1 K_2 K^2 RH^2}\right] \quad (2)$$
where W, K, K1, and K2 are the absorption model coefficients, given as functions of temperature (T) and relative humidity (RH). The results are shown in Figure 24.
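Equation (2) is the Hailwood–Horrobin sorption model. The paper does not list its coefficient polynomials, so the sketch below uses the published Forest Products Laboratory (Wood Handbook) metric coefficients, where W, K, K1, and K2 are polynomials in temperature; treat these as stand-in assumptions.

```python
# Hailwood-Horrobin %EMC per Equation (2); coefficient polynomials are the
# FPL Wood Handbook metric form (an assumption, not the paper's values).
def emc_percent(T, RH):
    """%EMC from temperature T (deg C) and relative humidity RH (%)."""
    h = RH / 100.0  # RH as a fraction
    W = 349 + 1.29 * T + 0.0135 * T ** 2
    K = 0.805 + 0.000736 * T - 0.00000273 * T ** 2
    K1 = 6.27 - 0.00938 * T - 0.000303 * T ** 2
    K2 = 1.91 + 0.0407 * T - 0.000293 * T ** 2
    kh = K * h  # recurring K*RH term of Equation (2)
    return (1800.0 / W) * (
        kh / (1 - kh)
        + (K1 * kh + 2 * K1 * K2 * kh ** 2) / (1 + K1 * kh + K1 * K2 * kh ** 2)
    )
```

At room conditions (about 20 °C and 50% RH) this yields roughly 9% EMC, consistent with published sorption tables, and the value rises with humidity.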

Preservation Index

The PI estimates the rate of environmentally induced chemical decay in organic materials that causes aging. It is expressed in (3) [54,55].
$$PI = \frac{1}{365}\,e^{\frac{E - 134.9\,RH}{8.314\,T} + 0.0284\,RH - 28.023} \quad (3)$$

where E is the activation energy (E = 95,220 J/mol) and T is the absolute temperature in K.
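Equation (3) follows the Arrhenius form, so the temperature must enter in Kelvin for the 8.314 J/(mol·K) gas constant to be dimensionally consistent; the Celsius-to-Kelvin conversion in the sketch below is an assumption made on that basis.

```python
# Preservation Index per Equation (3); T is converted to Kelvin, which is
# assumed from the Arrhenius form (8.314 is the molar gas constant).
import math

def preservation_index(T_celsius, RH, E=95220.0):
    """Estimated preservation index (years) from T (deg C) and RH (%)."""
    T = T_celsius + 273.15  # absolute temperature in K
    exponent = (E - 134.9 * RH) / (8.314 * T) + 0.0284 * RH - 28.023
    return math.exp(exponent) / 365.0
```

The function reproduces the expected behaviour: warmer or more humid zones decay faster, so the PI falls as either T or RH rises.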
First, a regression analysis of the %EMC and PI against the dewpoint is performed for each zone to verify their patterns and relationships, since the dewpoint is one of the leading causes of metal corrosion and organic degradation. The regression results in Figure 25 show that the %EMC remains relatively constant as the dewpoint changes across the two zones, while the PI decreases as the dewpoint increases across both zones.
Then, to accurately predict the %EMC and PI within the different zones, a neural network (NN) was trained on the T/RH data from each zone. The NN structure is shown in Figure 26. After the NN is trained, the %EMC and PI are predicted by inputting each zone’s running average of the T/RH data into the trained network.
The results of the NN prediction are presented in Table 4 based on the measure of risk associated with the %EMC and PI values, as reported in [51,56]. From the results, it can be deduced that the zones favor storing metallic materials more than organic materials if the temperature and RH within the zones remain relatively constant. These analytics can help determine the storage time and type of material to be stored within each zone. Generally, the conditions within the zones favor a warehouse storage facility with a constant movement of stored goods in and out of the facility.

6.4. Scalability Considerations and Future Enhancements

While the experimental validation focused on a two-zone setup, the system was designed with scalability as a core consideration. The modular nature of the architecture ensures that additional zones can be seamlessly integrated without requiring significant modifications. The system achieves scalability through the following:
  • Multi-Tier Wireless Sensor Networks (WSNs): the two-tier WSN minimizes centralized communication dependencies, reducing the risk of bottlenecks as the number of sensor nodes increases.
  • Decentralized Task Allocation: the Contract Net Protocol (CNP) enables dynamic task assignment across multiple robots, distributing workloads effectively in multi-zone environments.
  • Cloud-Based Processing: the IoT base station (BS) with cloud analytics offloads computational demands from local devices, enabling real-time monitoring and scalable data processing.
As deployments scale beyond 50 zones, potential challenges such as network congestion, task allocation efficiency, and increased response latency must be carefully managed. Future work will focus on conducting scalability simulations and analyzing the following:
  • Network performance in large-scale deployments, including communication delays and congestion effects.
  • Task allocation efficiency as zones increase, assessing the impact on robot coordination.
  • Latency optimization strategies, such as adaptive routing and edge-based task delegation, to maintain real-time responsiveness in large networks.

6.5. Wireless Communication Performance

Wireless communication performance was evaluated in terms of packet loss rates, interference resilience, and bandwidth utilization. These metrics were assessed using real-time network performance monitoring tools integrated into the IoT base station, with data logged and analyzed via ZigBee diagnostic tools and protocol analyzers. ZigBee (2.4 GHz) exhibited an average packet loss of 2.5%, increasing to 7% in high-interference environments, while the sub-1 GHz protocol maintained a stable 1.2% packet loss rate. These values align with established performance benchmarks for WSN communication, where ZigBee networks are more susceptible to interference from Wi-Fi and Bluetooth operating in the same frequency band [57].
Bandwidth utilization peaked at 65% during multi-robot operations, ensuring sufficient headroom for additional data streams. This estimate is based on typical network traffic models for multi-robot IoT systems, where data-intensive operations such as event detection and control signaling dynamically influence available bandwidth [58].
To mitigate interference in overlapping zones, the system employs adaptive frequency hopping and real-time channel selection based on interference monitoring. These techniques, widely used in IoT-based wireless networks, ensure stable communication and minimize packet collisions in multi-zone deployments [59]. Future optimizations will focus on enhancing real-time interference detection and adaptive routing strategies to further improve network performance in large-scale implementations.

6.6. System Stability Considerations

The system is expected to maintain stable response times and classification accuracy over extended deployments based on its observed performance during testing. However, long-term stability may be influenced by periodic sensor drift and network congestion under high-load conditions. To ensure continued reliability, scheduled sensor recalibration and dynamic congestion management strategies will be necessary in real-world deployments. Future testing will focus on evaluating system performance over extended durations in varying environmental conditions.

6.7. Cybersecurity Considerations

Given the sensitive nature of surveillance data, the system incorporates basic security measures and is designed to support future enhancements for robust cybersecurity. The following strategies are considered for protecting sensor nodes and robots against cyberattacks:
  • Encryption and Secure Communication: Future iterations will implement AES-128 encryption for securing data transmission between sensor nodes, robots, and the IoT base station. Secure communication protocols such as TLS/SSL for cloud integration will be explored.
  • Authentication and Access Control: The system will incorporate mutual authentication mechanisms using hashed authentication keys (HMACs) and Public Key Infrastructure (PKI) to prevent unauthorized access to sensor and robot networks.
  • Data Integrity Verification: To ensure the authenticity of transmitted data, Message Authentication Codes (MACs) and cryptographic hashing (SHA-256) will be used to detect potential tampering.
  • Network Security and Intrusion Detection: Future implementations will integrate firewalls, anomaly detection, and intrusion detection systems (IDSs) to monitor and mitigate potential cyber threats.
  • Edge-Based Security Mechanisms: Lightweight security models at the edge layer will enhance protection against denial-of-service (DoS) attacks and spoofing attempts targeting sensor nodes.

7. Conclusions

The development of an IoT-based hybrid ANR system platform was presented and successfully demonstrated. This system integrates the implementation of a physical IoT-based ANR system with a simulation development environment (SDE), enabling seamless development, simulation, and testing. Both components were effectively combined with two cloud technologies, highlighting a robust hybrid architecture.

7.1. Concluded Achievements

The system has demonstrated the following significant achievements:
  • Reliable IoT-based modules were developed as nodes for wireless sensor networks (WSNs), consisting of static sensor nodes (SSNs) for each zone and Onboard Static Sensor Nodes (O-SSNs) for mobile robots. These nodes successfully detected and classified events using a fuzzy logic decision-making system and communicated the results to the base station (BS) through a two-tier network.
  • Multi-robot system (MRS) capabilities were validated, with robots categorized into Mobile Robot Sensor Nodes (MRSNs) and Global MRSNs (G-MRSNs). Both groups exhibited effective coordination, planning, and navigation in dynamic environments.
  • IoT base station (BS) functionalities included GUIs for real-time visualization of sensor data and robot navigation and cloud integration for advanced data storage and analytics.
  • The simulation development environment (SDE) proved flexible in enabling system simulations, hardware-in-the-loop (HIL) testing, and real-world implementations, providing a robust algorithm development and validation platform.
  • Cloud analytics supported real-time decision-making by processing zone data, offering actionable insights and long-term analytics.
  • Dynamic path planning and localization algorithms were successfully implemented, demonstrating robustness in dynamic environments.

7.2. Future Enhancements

Building upon these achievements, the following enhancements are proposed to address identified opportunities for improvement:
  • Enhanced sensing capabilities by equipping robots with additional sensors, such as cameras, to capture richer and more detailed event data, improving detection accuracy and reliability within relevant zones.
  • Autonomous action triggers based on cloud analytics outputs, enabling real-time responses at the zone level to enhance system adaptability and responsiveness.
  • Targeted HIL testing by expanding HIL simulation capabilities to validate individual hardware components comprehensively, ensuring system robustness before deployment.
  • Advanced learning models by integrating machine learning techniques, such as reinforcement learning, to improve robot coordination, adaptive decision-making, and path optimization in complex, unstructured environments.
  • Swarm coordination algorithms to enable decentralized, large-scale collaboration among robots, making the system suitable for tasks like disaster response, environmental monitoring, and resource mapping.
This work establishes a foundational architecture for IoT-based hybrid ANR systems, offering a comprehensive platform for multi-robot coordination, IoT integration, and real-time analytics. The proposed enhancements extend the system’s capabilities, adaptability, and scalability, ensuring relevance across diverse and dynamic operational environments.

Author Contributions

Methodology, M.K.H. and C.I.C.; Software, C.I.C.; Validation, C.I.C.; Investigation, M.K.H.; Writing—original draft, M.K.H. and C.I.C.; Supervision, M.K.H. The authors contributed equally to this work. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The necessary data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

Acronym	Definition
ANR	Autonomous Networked Robot
MRS	Multi-Robot System
WSN	Wireless Sensor Network
IoT	Internet of Things
HIL	Hardware-in-the-Loop
SDE	Simulation Development Environment
SSN	Static Sensor Node
O-SSN	Onboard Static Sensor Node
MRSN	Mobile Robot Sensor Node
G-MRSN	Global Mobile Robot Sensor Node
BS	Base Station
IoT-BSG	IoT Base Station Gateway
CNP	Contract Net Protocol
GUI	Graphical User Interface
LRES	LabVIEW Robotics Environment Simulator
PCD-SE	Proteus Circuit Design Simulation Environment
NE	Navigation Environment
LPWAN	Low-Power Wide-Area Network
RFID	Radio-Frequency Identification
AI	Artificial Intelligence
AMCL	Adaptive Monte Carlo Localization
SLAM	Simultaneous Localization and Mapping
PSO	Particle Swarm Optimization
HMI	Human–Machine Interface
UAV	Unmanned Aerial Vehicle
UGV	Unmanned Ground Vehicle
SQL	Structured Query Language
ROS	Robot Operating System
NN	Neural Network
EMC	Equilibrium Moisture Content
PI	Preservation Index
DRL	Deep Reinforcement Learning
LLM	Large Language Model
SGSR-Net	Structure Semantics Guided LiDAR Super-Resolution Network

References

  1. Parker, L.E.; Rus, D.; Sukhatme, G.S. Multiple Mobile Robot Systems. In Springer Handbook of Robotics; Siciliano, B., Khatib, O., Eds.; Springer: Cham, Switzerland, 2016; pp. 1109–1134. [Google Scholar]
  2. Weiss, M.D.; Peak, J.; Schwengler, T. A Statistical Radio Range Model for a Robot MANET in a Subterranean Mine. IEEE Trans. Veh. Technol. 2008, 57, 2658–2666. [Google Scholar] [CrossRef]
  3. Hsieh, M.A.; Cowley, A.; Keller, J.F.; Chaimowicz, L.; Grocholsky, B.; Kumar, V.; Taylor, C.J.; Endo, Y.; Arkin, R.C.; Jung, B.; et al. Adaptive Teams of Autonomous Aerial and Ground Robots for Situational Awareness. J. Field Robot. 2007, 24, 991–1014. [Google Scholar] [CrossRef]
  4. Chen, M.; Ma, Y.; Ullah, S.; Cai, W.; Song, E. ROCHAS: Robotics and Cloud-assisted Healthcare System for Empty Nester. In Proceedings of the 8th International Conference on Body Area Networks ICST, Boston, MA, USA, 30 September–2 October 2013; pp. 217–220. [Google Scholar]
  5. Qadori, H.Q.; Zukarnain, Z.A.; Hanapi, Z.M.; Subramaniam, S. FuMAM: Fuzzy-Based Mobile Agent Migration Approach for Data Gathering in Wireless Sensor Networks. IEEE Access 2018, 6, 15643–15652. [Google Scholar] [CrossRef]
  6. Waibel, M.; Beetz, M.; Civera, J.; d’Andrea, R.; Elfring, J.; Galvez-Lopez, D.; Häussermann, K.; Janssen, R.; Montiel, J.M.; Perzylo, A.; et al. RoboEarth. IEEE Robot. Autom. Mag. 2011, 18, 69–82. [Google Scholar] [CrossRef]
  7. Kamei, K.; Sato, M.; Nishio, S.; Hagita, N. Cloud networked robotics. IEEE Netw. 2012, 26, 28–34. [Google Scholar]
  8. Banjanović-Mehmedović, L.; Husaković, A.; Ribić, A.G.; Prljaca, N.; Karabegović, I. Advancements in robotic intelligence: The role of computer vision, DRL, transformers, and LLMs. In Artificial Intelligence in Industry 4.0: The Future that Comes True; Academy of Sciences and Arts of Bosnia and Herzegovina: Sarajevo, Bosnia and Herzegovina, 2024. [Google Scholar] [CrossRef]
  9. Hu, K.; Zhan, L.; Zou, L.; Han, Y.; Bi, T.; Muntean, G.-M. CoSAR: Multi-Robot Collaborative Semantic Mapping over Wireless Networks. In Proceedings of the 2023 IEEE International Symposium on Broadband Multimedia Systems and Broadcasting (BMSB), Beijing, China, 14–16 June 2023; pp. 1–6. [Google Scholar] [CrossRef]
  10. Lee, T.; Kwon, J.; Wensing, P.M.; Park, F.C. Robot Model Identification and Learning: A Modern Perspective. Annu. Rev. Control. Robot. Auton. Syst. 2024, 7, 311–334. [Google Scholar] [CrossRef]
  11. Chovet, L.; Garcia, G.; Bera, A.; Richard, A.; Yoshida, K.; Olivares-Mendez, M. Performance Comparison of ROS2 Middlewares for Multi-robot Mesh Networks in Planetary Exploration. arXiv 2024. [Google Scholar] [CrossRef]
  12. Chukwuemeka, C.I.; Habib, M. Development of Autonomous Networked Robots (ANR) for Surveillance: Conceptual Design and Requirements. In Proceedings of the IECON 2018—44th Annual Conference of the IEEE Industrial Electronics Society, Washington, DC, USA, 21–23 October 2018; pp. 3757–3763. [Google Scholar]
  13. Chukwuemeka, C.I.; Habib, M.K. Design of a Two-Tier WSN-based IoT Surveillance System with Cloud Integration. In Proceedings of the 2019 20th International Conference on Research and Education in Mechatronics (REM), Wels, Austria, 23–24 May 2019. [Google Scholar]
  14. Chukwuemeka, C.I.; Habib, M.K. Integrated Development of Collaborative Mobile Robots and WSNs Supported by Cloud Service. In Proceedings of the 2020 6th International Conference on Mechatronics and Robotics Engineering (ICMRE), Barcelona, Spain, 12–15 February 2020; pp. 26–31. [Google Scholar]
  15. Nakib, S.; Jawhar, I.; Sindian, S.; Wu, J. Networking of multi-robot systems: Issues and requirements. Int. J. Sens. Netw. 2023, 43, 88–98. [Google Scholar] [CrossRef]
  16. Wichmann, A.; Okkalioglu, B.D.; Korkmaz, T. The Integration of Mobile (Tele) Robotics and Wireless Sensor Networks: A Survey. Comput. Commun. 2014, 51, 21–35. [Google Scholar]
  17. Huang, H.; Gong, T.; Zhang, R.; Yang, L.-L.; Zhang, J.; Xiao, F. Intrusion Detection Based on k-Coverage in Mobile Sensor Networks with Empowered Intruders. IEEE Trans. Veh. Technol. 2018, 67, 12109–12123. [Google Scholar] [CrossRef]
  18. Nguyen, M.T.; La, H.M.; Teague, K.A. Collaborative and Compressed Mobile Sensing for Data Collection in Distributed Robotic Networks. IEEE Trans. Control Netw. Syst. 2018, 5, 1729–1740. [Google Scholar] [CrossRef]
  19. Jawhar, I.; Mohamed, N.; Al-Jaroodi, J.; Zhang, S. An Architecture for Using Autonomous Underwater Vehicles in Wireless Sensor Networks for Underwater Pipeline Monitoring. IEEE Trans. Ind. Inf. 2019, 15, 1329–1340. [Google Scholar] [CrossRef]
  20. Chen, C.; Jin, A.; Wang, Z.; Zheng, Y.; Yang, B.; Zhou, J.; Xu, Y.; Tu, Z. SGSR-Net: Structure Semantics Guided LiDAR Super-Resolution Network for Indoor LiDAR SLAM. IEEE Trans. Multimed. 2024, 26, 1842–1854. [Google Scholar] [CrossRef]
  21. Liang, W.; Zheng, M.; Zhang, J.; Shi, H.; Yu, H.; Yang, Y.; Liu, S.; Yang, W.; Zhao, X. WIA-FA and Its Applications to Digital Factory: A Wireless Network Solution for Factory Automation. Proc. IEEE 2019, 107, 1053–1073. [Google Scholar] [CrossRef]
  22. Luong, T.; Barros, G.; Boshoff, M.; Moldovan, C.; Schuster, D.; Gruhn, V.; Kuhlenkötter, B. Investigating the 5G Handover in Autonomous Mobile Robotic Applications. In Proceedings of the 2023 3rd International Conference on Robotics, Automation and Artificial Intelligence (RAAI), Singapore, 14–16 December 2023; pp. 200–205. [Google Scholar] [CrossRef]
  23. Baeg, S.H.; Park, J.H.; Koh, J.; Park, K.W.; Baeg, M.H. RoboMaidHome: A Sensor Network-based Smart Home Environment for Service Robots. In Proceedings of the RO-MAN 2007-The 16th IEEE International Symposium on Robot and Human Interactive Communication, Jeju, Republic of Korea, 26–29 August 2007; pp. 182–187. [Google Scholar]
  24. Muratore, L.; Tsagarakis, N. XBot2D: Towards a robotics hybrid cloud architecture for field robotics. Front. Robot. AI 2023, 10, 1168694. [Google Scholar] [CrossRef] [PubMed]
  25. Hung, P.D.; Vinh, T.Q.; Ngo, T.D. Distributed coverage control for networked multi-robot systems in any environments. In Proceedings of the 2016 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Banff, AB, Canada, 12–15 July 2016; pp. 1067–1072. [Google Scholar]
  26. Wu, W.-Y.; Liu, Y.-C. Autonomous Guided Robotic Systems in Regulating Indoor Environmental Quality. In Proceedings of the 2018 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Auckland, New Zealand, 9–12 July 2018; pp. 188–193. [Google Scholar]
  27. Lin, M.-T.; Liu, Y.-C. Effective Control for Wireless Sensor and Mobile Actuator Network in Regulation of Environmental Density Function. In Proceedings of the 2018 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Auckland, New Zealand, 9–12 July 2018; pp. 1190–1195. [Google Scholar]
  28. Wang, J.-W.; Guo, Y.; Fahad, M.; Bingham, B. Dynamic Plume Tracking by Cooperative Robots. IEEE/ASME Trans. Mechatron. 2019, 24, 609–620. [Google Scholar] [CrossRef]
  29. Hsu, K.C.; Hu, H.; Fisac, J. The Safety Filter: A Unified View of Safety-Critical Control in Autonomous Systems. Annu. Rev. Control. Robot. Auton. Syst. 2023, 7, 47–72. [Google Scholar] [CrossRef]
  30. Ciucu-Durnoi, A.-N.; Delcea, C.; Stănescu, A.; Teodorescu, C.; Vargas, V.M. Beyond Industry 4.0: Tracing the Path to Industry 5.0. Sustainability 2024, 16, 5251. [Google Scholar] [CrossRef]
  31. Lan, G.; Hao, Q. End-to-end planning of autonomous driving in industry and academia. arXiv 2023. [Google Scholar] [CrossRef]
  32. Kashino, Z.; Nejat, G.; Benhabib, B. A Hybrid Strategy for Target Search Using Static and Mobile Sensors. IEEE Trans. Cybern. 2020, 50, 856–868. [Google Scholar] [CrossRef]
  33. Banjanovic-Mehmedovic, L.; Zukic, M.; Mehmedovic, F. Alarm detection and monitoring in an industrial environment using hybrid wireless sensor network. SN Appl. Sci. 2019, 1, 263. [Google Scholar] [CrossRef]
  34. Ahmed, S.A.; Popov, V.L.; Topalov, A.V.; Shakev, N.G. Environmental monitoring using a robotized wireless sensor network. AI Soc. 2018, 33, 207–214. [Google Scholar] [CrossRef]
  35. Subramaniam, A.; Prajapati, D.H.; Alremeithi, K.; Sealy, W. Enhancing Autonomous Robotics Through Cloud Computing. In Proceedings of the 2024 IEEE International Conference on Industry 4.0, Artificial Intelligence, and Communications Technology (IAICT), Bali, Indonesia, 4–6 July 2024; pp. 259–265. [Google Scholar] [CrossRef]
  36. Valecce, G.; Micoli, G.; Boccadoro, P.; Petitti, A.; Colella, R.; Milella, A.; Grieco, L.A. Robotic-aided IoT: Automated deployment of a 6TiSCH network using a UGV. IET Wirel. Sens. Syst. 2019, 9, 438–446. [Google Scholar] [CrossRef]
  37. Datta, S.; Baul, A.; Sarker, G.C.; Sadhu, P.K.; Hodges, D.R. A Comprehensive Review of the Application of Machine Learning in Fabrication and Implementation of Photovoltaic Systems. IEEE Access 2023, 11, 77750–77778. [Google Scholar] [CrossRef]
  38. Anis, K.; Sahar, T.; Imen, C. Indoor Surveillance Application using Wireless Robots and Sensor Networks: Coordination and Path Planning. In Mobile Ad Hoc Robots and Wireless Robotic Systems: Design and Implementation; Santos, R.A., Lengerke, O., Edwards-Block, A., Eds.; IGI Global: Hershey, PA, USA, 2013. [Google Scholar]
  39. Pascual, O.; Brunete, A.; Abderrahim, M. Integration of low-cost supervisory mobile robots in domestic Wireless Sensor Networks. In Proceedings of the 2014 International Conference on Robotics and Emerging Allied Technologies in Engineering (iCREATE), Islamabad, Pakistan, 22–24 April 2014; pp. 259–264. [Google Scholar]
  40. Scilimati, V.; Petitti, A.; Boccadoro, P.; Colella, R.; Di Paola, D.; Milella, A.; Grieco, L.A. Industrial Internet of things at work: A case study analysis in robotic-aided environmental monitoring. IET Wirel. Sens. Syst. 2017, 7, 155–162. [Google Scholar] [CrossRef]
  41. Zhang, Q.; Bai, J.; Chang, X. Pioneering Testing and Assessment Framework For AI-Powered SAR ATR Systems. In Proceedings of the 2024 IEEE 7th International Conference on Big Data and Artificial Intelligence (BDAI), Beijing, China, 5–7 July 2024; pp. 230–234. [Google Scholar] [CrossRef]
  42. Bertiz, C.A.S.; Lozano, J.J.F.; Gomez-Ruiz, J.A.; García-Cerezo, A. Integration of a Mobile Node into a Hybrid Wireless Sensor Network for Urban Environments. Sensors 2019, 19, 215. [Google Scholar] [CrossRef] [PubMed]
  43. Karray, F.; Jmal, M.W.; Abid, M.; BenSaleh, M.S.; Obeid, A.M. A review on wireless sensor node architectures. In Proceedings of the 2014 9th International Symposium on Reconfigurable and Communication-Centric Systems-On-Chip (ReCoSoC), Montpellier, France, 26–28 May 2014; pp. 1–8. [Google Scholar]
  44. Xu, T. Recent advances in Rapidly-exploring Random Tree: A review. Heliyon 2024, 10, e32451. [Google Scholar] [CrossRef]
  45. Smith, R.G. The Contract Net Protocol: High-Level Communication and Control in a Distributed Problem Solver. IEEE Trans. Comput. 1980, 29, 1104–1113. [Google Scholar] [CrossRef]
  46. Hart, P.E.; Nilsson, N.J.; Raphael, B. A Formal Basis for the Heuristic Determination of Minimum Cost Paths. IEEE Trans. Syst. Sci. Cybern. 1968, 4, 100–107. [Google Scholar] [CrossRef]
  47. Karaman, S.; Frazzoli, E. Sampling-based algorithms for optimal motion planning. Int. J. Robot. Res. 2011, 30, 846–894. [Google Scholar] [CrossRef]
  48. Browne, A.F.; Conrad, J.M. A Versatile Approach for Teaching Autonomous Robot Control to Multi-Disciplinary Undergraduate and Graduate Students. IEEE Access 2018, 6, 25060–25065. [Google Scholar] [CrossRef]
  49. Kelemenová, T.; Kelemen, M.; Miková, L.; Maxim, V.; Prada, E.; Lipták, T.; Menda, F. Model-Based Design and HIL Simulations. Am. J. Mech. Eng. 2013, 1, 276–281. [Google Scholar]
  50. Sarhadi, P.; Yousefpour, S. State of the art: Hardware in the loop modeling and simulation with its applications in design, development, and implementation of system and control software. Int. J. Dyn. Control 2015, 3, 470–479. [Google Scholar] [CrossRef]
  51. Nishimura, D.W. Understanding Preservation Metrics; Image Permanence Institute (IPI), Rochester Institute of Technology: Rochester, NY, USA, 2011. [Google Scholar]
  52. Sonntag, D.L.P. Important new values of the physical constants of 1986, vapor pressure formulations based on the ITS-90, and psychrometer formulae. Z. Meteorol. 1990, 70, 340–344. [Google Scholar]
  53. Simpson, W.T. Equilibrium Moisture Content of Wood in Outdoor Locations in the United States and Worldwide; U.S. Department of Agriculture, Forest Service, Forest Products Laboratory: Madison, WI, USA, 1998. [CrossRef]
  54. Padfield, T. The Preservation Index and the Time Weighted Preservation Index, Conservation Physics. 2004. Available online: http://www.conservationphysics.org/twpi/twpi_01.html (accessed on 3 April 2025).
  55. Strlič, M.; Kolar, J. Ageing and Stabilisation of Paper; National and University Library: Ljubljana, Slovenia, 2005. [Google Scholar]
  56. Zhang, W.; Wang, S.; Guo, H. Influence of Relative Humidity on the Mechanical Properties of Palm Leaf Manuscripts: Short-Term Effects and Long-Term Aging. Molecules 2024, 29, 5644. [Google Scholar] [CrossRef]
  57. Popescu, C.-A.; Georgescu, V.-C.; Popescu, E.-M. Experimental Research Regarding the Use of ZigBee Technology in Industrial Logistics Applications. In Proceedings of the 2021 12th International Symposium on Advanced Topics in Electrical Engineering (ATEE), Bucharest, Romania, 25–27 March 2021; pp. 1–7. [Google Scholar] [CrossRef]
  58. Palattella, M.R.; Dohler, M.; Grieco, A.; Rizzo, G.; Torsner, J.; Engle, T.; Ladid, L. Internet of Things in the 5G Era: Enablers, Architecture, and Business Models. IEEE J. Sel. Areas Commun. 2016, 34, 510–527. [Google Scholar] [CrossRef]
  59. Ulema, M. Wireless sensor networks: Architectures, protocols, and management. In Proceedings of the 2004 IEEE/IFIP Network Operations and Management Symposium (IEEE Cat. No.04CH37507), Seoul, Republic of Korea, 19–23 April 2004; Volume 1, p. 931. [Google Scholar] [CrossRef]
Figure 1. IoT-based hybrid ANR system operational structure.
Figure 2. IoT-based hybrid ANR system architecture.
Figure 3. SSN structure.
Figure 4. Fuzzy sets and member functions (MFs).
Figure 5. MRSN operational structure.
Figure 6. Coordination algorithm for mobile robot response using Contract Net Protocol (CNP) between MRSN and G-MRSNs.
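The coordination scheme of Figure 6 follows Smith's Contract Net Protocol [45]: an alerting robot announces a response task, candidate G-MRSNs bid with an estimated cost, and the announcer awards the task to the lowest bidder. A minimal sketch of the award step, where the function name, robot identifiers, and the purely distance-based cost are illustrative assumptions (the system may also weigh battery level or current load):

```python
import math

def contract_net_award(task_location, bidders):
    """Award a response task to the robot bidding the lowest cost.

    bidders: dict mapping robot id -> (x, y) current position.
    Cost here is Euclidean distance to the alerting node; a real
    bid could combine distance with battery level and workload.
    """
    bids = {rid: math.dist(pos, task_location) for rid, pos in bidders.items()}
    winner = min(bids, key=bids.get)       # lowest-cost bidder wins
    return winner, bids[winner]

# Example: an SSN raises an alert at (4, 2); three G-MRSNs bid.
winner, cost = contract_net_award((4.0, 2.0), {
    "G-MRSN1": (0.0, 0.0),
    "G-MRSN2": (5.0, 2.0),
    "G-MRSN3": (4.0, 6.0),
})
```

With these positions, G-MRSN2 wins at cost 1.0, since it is closest to the task location.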
Figure 7. IoT BS structure.
Figure 8. SSN simulation model.
Figure 9. SDE structure.
Figure 10. Mobile robot and operational environment SMs.
Figure 11. HIL simulation setup.
Figure 12. SSN LCD and virtual monitor interface. The communicated data are <3, 25, 20, 15, 20, 0>.
Figure 13. SSN3 hardware during HIL simulation.
Figure 14. MRSN3 and G-MRSN CNP coordination and A* path planning to SSN4 location.
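The path planning in Figure 14 uses A* [46], which searches an occupancy grid by expanding the node with the lowest cost-so-far plus an admissible heuristic estimate to the goal. A self-contained sketch on a 4-connected grid (the grid, function name, and unit move cost are illustrative assumptions, not the paper's actual map):

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (0 = free, 1 = occupied).

    Manhattan distance is admissible for unit-cost 4-connected
    moves, so the first path found to the goal is optimal.
    """
    def h(p):
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_set = [(h(start), 0, start, None)]   # (f = g + h, g, cell, parent)
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:                  # already expanded via a cheaper route
            continue
        came_from[cur] = parent
        if cur == goal:                       # reconstruct path back to start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0
                    and g + 1 < g_cost.get(nxt, float("inf"))):
                g_cost[nxt] = g + 1
                heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, cur))
    return None                               # goal unreachable

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))  # detours around the occupied middle row
```

The returned path steps right along the top row, down the free right column, and back left to the goal, the only unit-cost route around the obstacle.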
Figure 15. Lab environment SM and hardware MRSN connection to LRES/MRS GUI during HIL simulation.
Figure 16. MRSN1 SM navigation to SSN3 location during HIL simulation.
Figure 17. SSN3 sensor node data (SND) GUI visualizations during system and HIL simulations: (a) SSN3 simulation mode (SM) SND visualization, showing higher event density due to immediate data availability; (b) SSN3 hardware SND visualization, reflecting calibration delay and real-world sensor behavior.
Figure 18. SSN, O-SSN, and IoT-BSG hardware.
Figure 19. MRSN assembly with communication types to IoT BS.
Figure 20. Lab environment setup and its 2D occupancy grid map.
Figure 21. SSN1, SSN5, and MRSN1 O-SSN before and after event detection.
Figure 22. MRSN1 continuous navigation within its zone.
Figure 23. MRSN2 navigation to alerting SSN5 location.
Figure 24. ThingSpeak SSN3 hardware channel data visualization.
Figure 25. Zones 1 and 2 %EMC, PI, and dewpoint regression analysis.
Figure 26. Neural network (NN) structure for training and prediction.
Table 1. Comparison of techniques and metrics in ANR systems.
| Attribute/Metric | MSNs [17] | CCMS [18] | ALSN [19] | AI-Driven Robotics [8] | IoT-Aided ANR [23] | Cloud Robotics [24] | 5G for AMRs [22] |
|---|---|---|---|---|---|---|---|
| Key Application | Intrusion detection | Data collection and mapping | Underwater monitoring | Perception and decision-making | Urban surveillance | Distributed field robotics | Industrial automation |
| Network Architecture | Mobile sensor network | Collaborative sensing | Linear sensor network | Modular and data-driven | IoT-integrated | Hybrid cloud | 5G-based communication |
| Energy Efficiency | Moderate | High | High | Moderate | High | Improved through offloading | High |
| Data Communication | Simulation only | Experimental validation | Simulated and tested | AI-enhanced | Real-time detection | Real-time | Seamless |
| Scalability | Moderate | High | Moderate | High | High | High | High |
| Validation Type | Simulation | Simulation and Experiment | Simulation | AI-enhanced simulations | Experimental | Simulation | Experimental |
Table 2. Recent advancements in ANR technologies.
| Attribute/Metric | Probabilistic Models [29] | LiDAR SLAM [20] | Digital Twin for Swarms [30] | 5G Networks [31] |
|---|---|---|---|---|
| Key Application | Safety-critical control | Indoor navigation | Swarm coordination and real-time monitoring | Industrial automation |
| Technology Type | Unified Safety Filter Framework | Structure-Semantics-Guided LiDAR | Digital twin models | 5G-based communication |
| Latency Impact | Moderate | Low-latency | Real-time | Seamless |
| Fault Tolerance | High | High | High | Moderate |
| Collaboration Framework | Decentralized | Centralized | Decentralized | Centralized |
| Energy Efficiency | Moderate | High | High | High |
| Validation Type | Simulation | Simulation and Experiment | Simulation and Deployment | Experimental |
| Scalability | Moderate | High | High | High |
| Environmental Adaptability | Urban and Suburban | Indoor | Mixed Urban–Field | Industrial Environments |
| Real-World Applications | Autonomous Vehicles | Indoor mapping | Robotics swarms for disaster response | IoT-enabled factories |
Table 3. Digital sensors’ input combination with fuzzy logic system results.
| Fuzzy Rules Output | Flame Sensor | Motion Sensor | Event Classification |
|---|---|---|---|
| Critical | High | High | Critical Alert |
| Critical | High | Low | Critical Alert |
| High | Low | High | High Alert |
| Low | Low | Low | Normal Alert |
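Table 3 can be read as a lookup that combines the fuzzy system's output level with the two digital sensor states. A minimal sketch of that combination logic, where the function name and boolean encoding of the sensor readings are illustrative assumptions:

```python
def classify_event(fuzzy_output, flame_high, motion_high):
    """Map the fuzzy output level and digital flame/motion sensor
    states to an event classification, following Table 3."""
    if fuzzy_output == "Critical" and flame_high:
        return "Critical Alert"     # rows 1-2: flame presence dominates
    if fuzzy_output == "High" and motion_high and not flame_high:
        return "High Alert"         # row 3: motion-only event
    if fuzzy_output == "Low" and not flame_high and not motion_high:
        return "Normal Alert"       # row 4: quiescent state
    return "Unclassified"           # combinations outside Table 3
```

For example, `classify_event("Critical", True, False)` yields "Critical Alert", matching the second row of the table.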
Table 4. %EMC and PI NN predictions per zone.
%EMC Predictions per Zone
| Zones | %EMC | Status/Color Code | Metal Corrosion Risk | Average Dewpoint |
|---|---|---|---|---|
| Zone 1 | 7.8168 | OK | Limited risk of corrosion | 3.6162 |
| Zone 2 | 8.3270 | OK | Limited risk of corrosion | 3.4151 |

PI Predictions per Zone
| Zones | PI (years) | Status/Color Code | Organic Degradation Risk | Average Dewpoint |
|---|---|---|---|---|
| Zone 1 | 28.7899 | RISK | Accelerated rate of organic decay | 3.6162 |
| Zone 2 | 26.6957 | RISK | Accelerated rate of organic decay | 3.4151 |
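The paper's neural network predicts %EMC and dewpoint, but both quantities also have well-known closed forms that make Table 4 interpretable: %EMC from temperature and relative humidity via Simpson's fit of the Hailwood-Horrobin model [53], and dewpoint via the Magnus approximation. A sketch with coefficients taken from those standard sources, not from this paper:

```python
import math

def emc_percent(temp_f, rh_percent):
    """Wood equilibrium moisture content (%), Simpson's fit of the
    Hailwood-Horrobin model [53]; temperature in deg F, RH in %."""
    t, h = temp_f, rh_percent / 100.0
    w = 330 + 0.452 * t + 0.00415 * t ** 2
    k = 0.791 + 4.63e-4 * t - 8.44e-7 * t ** 2
    k1 = 6.34 + 7.75e-4 * t - 9.35e-5 * t ** 2
    k2 = 1.09 + 2.84e-2 * t - 9.04e-5 * t ** 2
    kh = k * h
    return (1800 / w) * (kh / (1 - kh)
                         + (k1 * kh + 2 * k1 * k2 * kh ** 2)
                         / (1 + k1 * kh + k1 * k2 * kh ** 2))

def dewpoint_c(temp_c, rh_percent):
    """Dewpoint (deg C) via the Magnus approximation
    (a = 17.62, b = 243.12 deg C)."""
    a, b = 17.62, 243.12
    gamma = a * temp_c / (b + temp_c) + math.log(rh_percent / 100.0)
    return b * gamma / (a - gamma)
```

As a sanity check, at 70 deg F and 50% RH the EMC formula gives roughly 9.2%, the textbook table value; at 20 deg C and 50% RH the dewpoint is about 9.3 deg C.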

Share and Cite

MDPI and ACS Style

Habib, M.K.; Chukwuemeka, C.I. Development of IoT-Based Hybrid Autonomous Networked Robots. Technologies 2025, 13, 168. https://doi.org/10.3390/technologies13050168

