Search Results (2,747)

Search Parameters:
Keywords = industrial internet of things

30 pages, 24743 KB  
Article
EACCO: Optimizing the Computation and Communication in Resource-Constrained IoT Devices for Energy-Efficient Swarm Robotics
by Amir Ijaz, Hashem Haghbayan, Ethiopia Nigussie, Abdul Malik and Juha Plosila
Sensors 2026, 26(9), 2839; https://doi.org/10.3390/s26092839 - 1 May 2026
Abstract
Energy consumption is a critical concern for Internet of Things (IoT) platforms lacking abundant resources, particularly for swarm robotic systems that rely on numerous devices operating collaboratively over extended periods. This study presents a comprehensive design strategy for improving processing and communication to enhance system efficiency and reduce energy consumption. We incorporate energy harvesting (photovoltaic and RF), dynamic power management, and energy-efficient communication protocols (e.g., duty cycle, power control, data compression) into two complementary platforms built for swarm robotics: MCU-based nodes (TI MSP430 with LoRa transceiver), which serve as the experimental prototype for validating energy-aware communication, compression, and scheduling mechanisms; and edge platforms (Jetson Nano and TX2), which are used for high-level power profiling and system-level evaluation, particularly for computation-intensive workloads and comparative analysis. Our technique involves analyzing the device’s energy usage and harvesting processes, developing efficient communication protocols, and validating the system through simulations and hardware prototypes. Experimental results under outdoor and indoor conditions show that the device maintains an energy neutrality ratio well above unity, even with limited ambient energy. Key findings include significant reductions in energy per bit transmitted and reliable long-term operation. These insights pave the way for deploying swarms of autonomous IoT-based robots with minimal maintenance and maximal longevity. Full article
(This article belongs to the Section Internet of Things)
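The energy neutrality ratio mentioned in the abstract has a simple definition worth making explicit: harvested energy divided by consumed energy over an observation window, with values above 1.0 indicating self-sustaining operation. A minimal sketch (the function name and the trace values are illustrative, not taken from the paper):

```python
def energy_neutrality_ratio(harvested_mj, consumed_mj):
    """Ratio of harvested to consumed energy over a window; a value
    above 1.0 means the node gains more energy than it spends."""
    if consumed_mj <= 0:
        raise ValueError("consumed energy must be positive")
    return harvested_mj / consumed_mj

# Hypothetical per-hour traces (mJ): PV + RF harvesting vs. a duty-cycled load
harvested = [120.0, 150.0, 90.0, 60.0]
consumed = [40.0, 45.0, 42.0, 41.0]

enr = energy_neutrality_ratio(sum(harvested), sum(consumed))
print(enr > 1.0)  # energy-neutral over this window
```

In practice the window would span day/night cycles so that low-harvest periods are averaged against high-harvest ones.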
23 pages, 7922 KB  
Article
Hardware-Assisted Security Enhancements for an FPGA-ARM Embedded Vision System in IoT Applications
by Tomyslav Sledevič and Darius Andriukaitis
Electronics 2026, 15(9), 1887; https://doi.org/10.3390/electronics15091887 - 29 Apr 2026
Abstract
Embedded Field-Programmable Gate Array (FPGA)-Advanced RISC Machine (ARM) systems used in industrial and Internet of Things (IoT) environments increasingly operate as network-connected edge devices. While such connectivity enables distributed processing and remote monitoring, it also exposes embedded vision nodes to security threats, including command injection, frame replay, data tampering, and abnormal communication traffic. This paper presents a hardware-assisted security architecture for an FPGA-ARM embedded vision system designed for high-speed image acquisition and network streaming. The proposed solution integrates several lightweight protection mechanisms directly into the FPGA processing pipeline, including frame replay detection, cyclic redundancy check (CRC)-based frame integrity verification, frame sequence monitoring, authenticated command execution, communication anomaly monitoring, and hardware-rooted trust primitives, such as a ring-oscillator physical unclonable function (PUF) and a pseudo-random generator. Optional secure communication is provided via a lightweight ASCON-authenticated encryption core. The architecture was implemented on a Cyclone V System-on-Chip (SoC) platform using an industrial Camera Link camera and evaluated in a low-latency image-acquisition setup operating at 100 fps, with data throughput exceeding 1 Gbps. Experimental results demonstrate that the proposed security architecture introduces only about 1.6% additional FPGA logic utilization while maintaining full real-time acquisition performance. The presented approach demonstrates that practical hardware-level security mechanisms can be integrated into FPGA-based embedded vision nodes with minimal architectural modifications and negligible performance overhead. Full article
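One of the listed mechanisms, CRC-based frame integrity verification, is easy to illustrate in software. A hedged sketch using CRC-32 (the paper's FPGA core may well use a different CRC width or polynomial; the function names are mine):

```python
import zlib


def append_crc(frame: bytes) -> bytes:
    """Append a CRC-32 checksum over the payload, as a sender would."""
    return frame + zlib.crc32(frame).to_bytes(4, "big")


def verify_crc(framed: bytes) -> bool:
    """Recompute the CRC over the payload and compare with the trailer."""
    payload, rx_crc = framed[:-4], framed[-4:]
    return zlib.crc32(payload).to_bytes(4, "big") == rx_crc


frame = append_crc(b"\x01\x02frame-payload")
print(verify_crc(frame))  # True for an intact frame
```

Any single-bit corruption of the payload or trailer makes `verify_crc` return False, which is the property the hardware pipeline exploits to reject tampered frames.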
31 pages, 2825 KB  
Article
IIoT-Based Remote Monitoring System for Temperature, Current, and Vibration Using PLC and Node-RED in a Data Center Cooling Compressor: A Condition-Based Maintenance Framework
by Jefferson Damián Pinza Apolo, Jonathan Lizandro Bravo Robles, José Luis Dumán Zhicay, Ramiro Xavier Cazares Guerrero, Wilmer Fabian Albarracin Guarochico and Paul Francisco Baldeón Egas
Sensors 2026, 26(9), 2772; https://doi.org/10.3390/s26092772 - 29 Apr 2026
Abstract
Climate control systems are critical to ensuring the continuous operation of data centers, as they maintain the environmental conditions required by sensitive electronic equipment. In this context, continuous supervision of refrigeration compressors is essential to prevent failures that may compromise thermal stability. This work presents the design, implementation, and experimental validation of a remote monitoring and condition-based maintenance framework built on Industrial Internet of Things (IIoT) technologies for air-conditioning compressors used in data centers. The proposed architecture integrates industrial-grade sensors for temperature, electric current, and vibration, a Siemens LOGO! programmable logic controller (PLC) for signal acquisition and scaling, a Node-RED middleware layer for data flow management, and the ThingSpeak cloud platform for remote storage and analysis. The novel contributions of this work are: (i) a fully integrated low-cost IIoT stack validated on a Copeland ZR144KCE-TF5 scroll compressor under real operating conditions over a continuous 49-day monitoring period; (ii) a hybrid anomaly detection model that combines Z-score statistical baselines with moving-average prediction error to reduce false positives from transient events; and (iii) a condition-based maintenance decision framework that maps the three monitored variables to ISO 10816-3 vibration severity zones and manufacturer-referenced thermal and electrical thresholds, producing recommended maintenance actions. The framework was applied to the acquired dataset, confirming predominantly stable operation (93.4% of samples in ISO 10816-3 Zones A–B) while detecting an emergent mechanical-wear trend (5.64% of samples in Zone C) concentrated in the final days of the monitoring period and demonstrating the feasibility of the proposed architecture as a scalable and replicable solution for condition monitoring and maintenance decision support in critical technological infrastructures. 
Full article
(This article belongs to the Section Industrial Sensors)
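The hybrid detector described in contribution (ii), a Z-score statistical baseline combined with moving-average prediction error, can be sketched as follows. Requiring both tests to fire is one plausible way to suppress false positives from transients; the thresholds and the AND-combination here are my assumptions, not the paper's exact rule:

```python
from statistics import mean, stdev


def hybrid_anomaly_flags(series, window=5, z_thresh=3.0, err_thresh=2.0):
    """Flag a sample only when BOTH detectors fire:
    (1) its |z-score| against the global baseline exceeds z_thresh, and
    (2) its absolute error vs. the moving average of the prior `window`
        samples exceeds err_thresh * baseline stdev.
    Requiring agreement suppresses false alarms from brief transients."""
    mu, sigma = mean(series), stdev(series)
    flags = []
    for i, x in enumerate(series):
        z_hit = abs((x - mu) / sigma) > z_thresh
        if i >= window:
            ma = mean(series[i - window:i])
            ma_hit = abs(x - ma) > err_thresh * sigma
        else:
            ma_hit = False  # not enough history yet
        flags.append(z_hit and ma_hit)
    return flags


# Hypothetical vibration trace (arbitrary units) with one wear-like spike
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 5.0, 1.0, 1.02]
print(hybrid_anomaly_flags(vibration, z_thresh=2.5).index(True))  # sample 7
```

A production version would compute the baseline over a sliding training period rather than the whole series, so the spike itself does not inflate the statistics.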
28 pages, 2920 KB  
Article
NIDS-Mamba: Lightweight Network Intrusion Detection for IoT Sensor Networks via State Space Models
by Zixiang Ding, Jiahao Zheng and Xianyun Wu
Sensors 2026, 26(9), 2766; https://doi.org/10.3390/s26092766 - 29 Apr 2026
Abstract
The ubiquity of resource-constrained Internet of Things (IoT) nodes creates an urgent demand for network intrusion detection systems (NIDSs) optimized for edge devices with limited computing power. In this paper, we propose NIDS-Mamba, a new NIDS based on Mamba. NIDS-Mamba uses dynamic sparse attention and a lightweight state space model to jointly learn short-term anomaly and long-term attack patterns. We use the standardized NF-UNSW-NB15 and NF-CSE-CIC-IDS2018 datasets to verify the model’s effectiveness. On the NF-CSE-CIC-IDS2018 dataset, the model achieves 98.32% accuracy, a 96.98% F1-score, and an AUC of 0.9996. Most notably, the model is very robust to the extreme class imbalance of the NF-UNSW-NB15 dataset, achieving a 97.03% G-Mean, 0.7915 MCC, and 0.9983 AUC, far exceeding the other baseline models. Compared to Transformer-based baselines, NIDS-Mamba achieves nearly an order-of-magnitude improvement in throughput while maintaining a parameter footprint compatible with edge deployment constraints. The proposed architecture effectively mitigates the quadratic complexity and memory wall inherent in standard Transformers, ensuring compatibility with limited RAM and strict energy constraints. The proposed model achieves a compact design with 1.12 million parameters and a peak inference memory of 5.4 MB, ensuring its feasibility for edge-based IoT nodes. These properties make NIDS-Mamba a strong candidate for deployment on IoT gateways and edge sensor nodes in smart home, industrial IoT, and critical infrastructure scenarios. Full article
(This article belongs to the Section Intelligent Sensors)
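For readers unfamiliar with the imbalance-robust metrics quoted above, G-Mean and MCC can be computed directly from a binary confusion matrix. A small sketch (the counts below are made up for illustration, not from the paper):

```python
import math


def g_mean(tp, tn, fp, fn):
    """Geometric mean of sensitivity and specificity; high only when
    BOTH classes are detected well, so it resists class imbalance."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return math.sqrt(sensitivity * specificity)


def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient in [-1, 1]; 0 for a
    degenerate denominator by convention."""
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den if den else 0.0


# Hypothetical imbalanced test set: 100 attacks vs. 1000 benign flows
print(round(g_mean(90, 950, 50, 10), 3))  # 0.925
```

Unlike plain accuracy, both metrics stay low if a model trivially predicts the majority class, which is why they are preferred for datasets like NF-UNSW-NB15.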
50 pages, 29943 KB  
Systematic Review
Hybrid Approaches of Machine Learning Algorithms in Predictive Maintenance: A Systematic Literature Review
by Jorge Paredes, Danilo Chavez, Ramiro Isa-Jara and Diego Vargas
Appl. Syst. Innov. 2026, 9(5), 90; https://doi.org/10.3390/asi9050090 - 29 Apr 2026
Abstract
The advent of Industry 4.0 has precipitated the digitization of myriad industrial processes, a feat attributable to the implementation of sophisticated digital enablers such as artificial intelligence (AI) and the Internet of Things (IoT). These technological advances have facilitated the implementation of various innovative applications, especially in the field of predictive maintenance. This approach facilitates more precise estimation of the remaining useful life (RUL) of equipment, determination of the health index (HI) of machinery, and planning of effective maintenance schedules that circumvent unexpected and costly shutdowns in industrial operations. The employment of hybrid approaches founded on machine learning algorithms in the domain of predictive maintenance signifies a perpetually evolving field of research, wherein novel techniques, methodologies, and strategies are proposed to enhance maintenance efficiency and reliability. In order to furnish a substantial and exhaustive compendium of information, a methodical literature review is hereby presented, offering a meticulous survey of the hybrid approaches utilized within this domain. The study analyzed 77 papers from the 914 papers found on the topic, to find and organize the body of knowledge, and presents a lucid taxonomy, the primary algorithms employed in hybrid approaches, the most prevalent datasets, the applicable technology architectures, and the maturity level of these solutions. This study provides a robust conceptual foundation for future research, underscoring the significance of hybrid approaches as a promising field of study, with considerable potential for advancement in the realm of industrial predictive maintenance. Full article
33 pages, 1749 KB  
Article
LLM-Conductor: A Closed-Loop Resource-Adaptive Architecture for Secure LLM Deployment in Industrial Sensor Networks and IIoT Systems
by Kai Xu, Diming Zhang and Xuguo Wang
Sensors 2026, 26(9), 2733; https://doi.org/10.3390/s26092733 - 28 Apr 2026
Abstract
To address the bottlenecks of a missing decision-making closed loop, insufficient experience reuse, and decoupled resource scheduling in industrial LLM deployment, this paper proposes LLM-Conductor, a three-layer collaborative architecture that enables monitoring-feedback autonomous decision-making, structured policy memory, and joint policy-resource optimization. Through ablation studies, horizontal comparisons with ISOLATEGPT and ReAct, and graded resource-reduction experiments across six tiers, the results demonstrate that the security risk incidence rate is reduced from 70.6% to 1.3%, the multi-application collaborative task completion rate reaches 100%, and token utilization improves to 88.9%. Under constraints of at least 512 MB memory and at least 0.5 GHz CPU, the core task completion rate remains above 95%. By deeply coupling decision-making with resource scheduling, this architecture provides an integrated pathway toward efficient, secure, and reliable LLM deployment in Industrial Internet of Things scenarios. Current validation focuses on software-layer interaction patterns under simulated resource-constrained environments, with physical-layer industrial integration reserved for future work. Full article
(This article belongs to the Section Intelligent Sensors)
27 pages, 3983 KB  
Article
Low-Latency DDoS Detection for IIoT and SCADA Networks Using Proximal Policy Optimisation and Deep Reinforcement Learning
by Mikiyas Alemayehu, Mohamed Chahine Ghanem, Hamza Kheddar, Dipo Dunsin, Chaker Abdelaziz Kerrache and Geetanjali Rathee
Information 2026, 17(5), 412; https://doi.org/10.3390/info17050412 - 26 Apr 2026
Abstract
Industrial Internet of Things (IIoT) and SCADA-connected networks are increasingly vulnerable to Distributed Denial of Service (DDoS) attacks, which can disrupt time-sensitive industrial processes and compromise operational continuity. Effective mitigation requires accurate and low-latency attack detection at the network edge, where industrial gateways operate under strict constraints in computation, memory, and energy. This study investigates Deep Reinforcement Learning (DRL) for real-time binary DDoS detection and proposes a detector based on Proximal Policy Optimisation (PPO) for deployment in resource-constrained IIoT environments. Four DRL agents, namely Deep Q-Network (DQN), Double DQN, Dueling DQN, and PPO, are trained and evaluated within a unified experimental pipeline incorporating automatic label mapping, numerical feature selection, robust scaling, and class balancing. Experiments are conducted on three representative benchmark datasets: CIC-DDoS2019, Edge-IIoTset, and CICIoT23. Performance is assessed using accuracy, precision, recall, F1-score, false positive rate, false negative rate, and CPU inference latency. The reward function is asymmetric: +1 for correct classification, −1 for false positive, and −2 for false negative, penalising missed attacks more heavily for IIoT safety. The results show that PPO provides a competitive accuracy–latency tradeoff across all three datasets, achieving the highest mean accuracy of 97.65% and ranking first on CIC-DDoS2019 with a score of 95.92%, while remaining competitive on Edge-IIoTset (99.11%) and CICIoT23 (97.92%). PPO also converges faster than the value-based baselines. Inference latency is below 0.8 ms per sample on a standard CPU (Intel i7-11800H), confirming real-time feasibility. To support practical deployment, the trained PPO policies are exported to ONNX format (≈9 KB per model), enabling lightweight and PyTorch-independent inference on industrial edge gateways. Full article
(This article belongs to the Special Issue Reinforcement Learning for Cyber Security: Methods and Applications)
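The asymmetric reward described above maps directly to a few lines of code. This sketch reproduces the stated scheme (+1 for a correct classification, -1 for a false positive, -2 for a false negative), with 1 denoting "attack":

```python
def reward(action: int, label: int) -> int:
    """Asymmetric DDoS-detection reward: missed attacks (false
    negatives) are penalised twice as heavily as false alarms,
    reflecting IIoT safety priorities.  1 = attack, 0 = benign."""
    if action == label:
        return 1
    if action == 1 and label == 0:
        return -1  # false positive: benign flow flagged as attack
    return -2      # false negative: attack missed


print([reward(a, l) for a, l in [(1, 1), (1, 0), (0, 1)]])  # [1, -1, -2]
```

During training, this shaping pushes the PPO policy toward higher recall at the cost of a slightly higher false-positive rate, which matches the paper's stated design intent.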
38 pages, 6938 KB  
Article
DeepSense: An Adaptive Scalable Ensemble Framework for Industrial IoT Anomaly Detection
by Amir Firouzi and Ali A. Ghorbani
Sensors 2026, 26(9), 2662; https://doi.org/10.3390/s26092662 - 24 Apr 2026
Abstract
The Industrial Internet of Things (IIoT) has become a cornerstone of modern industrial automation, enabling real-time monitoring, intelligent decision-making, and large-scale connectivity across cyber–physical systems. However, the growing scale, heterogeneity, and dynamic behavior of IIoT environments significantly expand the attack surface and challenge the effectiveness of conventional security mechanisms. In this paper, we propose DeepSense, a hybrid and adaptive anomaly and intrusion detection framework specifically designed for resource-constrained and heterogeneous IIoT deployments. DeepSense integrates three complementary components: DataSense, a realistic data pipeline and experimental testbed supporting synchronized sensor and network data processing; RuleSense, a lightweight rule-based detection layer that provides fast, deterministic, and interpretable anomaly screening at the edge; and NeuroSense, a learning-driven detection module comprising an adaptive ensemble of 22 machine learning and deep learning models spanning classical, neural, hybrid, and Transformer-based architectures. NeuroSense operates as a second detection stage that validates suspicious events flagged by RuleSense and enables both coarse-grained and fine-grained attack classification. To support rigorous and practical assessment, this work further introduces a comprehensive performance evaluation framework that extends beyond accuracy-centric metrics by jointly considering detection quality, latency, resource efficiency, and detection coverage, alongside an optimization-based process for selecting Pareto-optimal model ensembles under realistic IIoT constraints. Extensive experiments across diverse detection scenarios demonstrate that DeepSense exhibits strong generalization, lower false positive rates, and robust performance under evolving attack behaviors. 
The proposed framework provides a scalable and efficient IIoT security solution that meets the operational requirements of Industry 4.0 and the resilience-oriented objectives of Industry 5.0. Full article
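The RuleSense → NeuroSense division of labour amounts to a two-stage screen: cheap deterministic rules at the edge gate which events reach the heavier ML ensemble. A toy sketch (all names, rules, and thresholds here are hypothetical illustrations, not DeepSense's actual API):

```python
def two_stage_detect(sample, rule_checks, ml_classify):
    """Stage 1: fast, interpretable rule checks run on every sample.
    Stage 2: only samples flagged by some rule are passed to the
    (more expensive) ML classifier for validation."""
    for check in rule_checks:
        if check(sample):
            return ml_classify(sample)  # second-stage validation
    return "benign"                     # no rule fired; skip the ML stage


# Hypothetical rule and classifier for network-flow features
rules = [lambda s: s["pkt_rate"] > 1000]
classify = lambda s: "ddos" if s["entropy"] < 0.2 else "benign"

print(two_stage_detect({"pkt_rate": 5000, "entropy": 0.1}, rules, classify))
```

The design choice is about cost: most traffic never reaches the ML stage, which is what makes the scheme viable on resource-constrained IIoT gateways.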
40 pages, 1948 KB  
Systematic Review
Edge–Cloud Collaboration for Machine Condition Monitoring: A Comprehensive Review of Mechanisms, Models, and Applications
by Liyuan Yu, Jitao Fang, Qiuyan Wang, Fajia Li and Haining Liu
Machines 2026, 14(5), 476; https://doi.org/10.3390/machines14050476 - 24 Apr 2026
Abstract
Machine condition monitoring increasingly depends on distributed sensing, edge intelligence, and cloud analytics, yet timely and trustworthy health assessment remains constrained by latency, bandwidth, privacy, and reliability requirements. Cloud-only architectures provide scalable computation and historical data integration but often fail to satisfy real-time industrial needs, whereas edge-only deployments are limited by restricted computing resources and fragmented local knowledge. Edge–cloud collaboration has, therefore, emerged as a practical architecture for distributing perception, inference, learning, and coordination across hierarchical industrial systems. This review examines 147 publications on edge–cloud collaboration for machine condition monitoring published between 2019 and February 2026. A four-dimensional taxonomy is developed to organize the literature into model-centric, data-centric, resource and task-centric, and architecture and trust-centric mechanisms, while 13 survey and review papers are considered separately for contextual comparison. On this basis, the review analyzes representative collaboration mechanisms and enabling technologies, with particular attention to federated learning, transfer learning, knowledge distillation, digital twins, and deep reinforcement learning, and surveys their deployment in manufacturing, energy, transportation, and infrastructure monitoring scenarios. The literature remains dominated by model-centric collaboration, while architecture and trust-centric studies increasingly provide the system foundations required for practical deployment. 
The review further identifies major open challenges, including robust generalization under changing operating conditions, efficient data transmission, real-time resource coordination, interoperability, and trustworthy large-scale deployment, and outlines future directions in foundation-model-based edge–cloud collaboration, continual learning, dual digital twins, trustworthy collaboration, and privacy-preserving industrial ecosystems. Full article
15 pages, 1316 KB  
Article
Study of Graphene-Based Strain Sensing Output Signals Under External Electromagnetic Interference Conditions
by Furong Kang, Shuqi Han, Kaixi Bi, Jian He and Xiujian Chou
Nanomaterials 2026, 16(9), 509; https://doi.org/10.3390/nano16090509 - 23 Apr 2026
Abstract
Graphene possesses exceptional mechanical strength, high electrical conductivity, and a stable lattice structure, making it an ideal material for sensors in advanced manufacturing. However, these sensors face stability challenges due to complex electromagnetic interference (EMI) environments generated by electrical equipment. Therefore, investigating the influence of EMI on sensor performance is of significant importance. In this study, simulations were performed to analyze electrical parameter perturbations of intrinsic graphene films under EMI conditions. The Magnetic Fields, Solid Mechanics, and Electrostatics modules in COMSOL Multiphysics were employed to construct a coupled model of a three-phase power transformer and a graphene-based pressure sensor. The results indicate that EMI can induce baseline drift on the order of ~5% full scale (FS) in the graphene current density, accompanied by degradation in signal-to-noise ratio (SNR) exceeding ~15 dB under typical simulation conditions. Graphene in direct contact with metal electrodes shows enhanced sensitivity to EMI, with more pronounced noise amplification due to interfacial coupling effects. In contrast, cavity-suspended graphene configurations exhibit relatively improved robustness, suggesting that suspended membrane architectures can mitigate EMI by reducing parasitic coupling and enhancing mechanical isolation. Compared with previous studies, this work highlights the role of multiphysics coupling and membrane suspension in influencing EMI-induced perturbations, providing theoretical guidance for the design of graphene-based sensors in power system and industrial Internet of Things (IoT) applications. Full article
(This article belongs to the Section Nanoelectronics, Nanosensors and Devices)
18 pages, 604 KB  
Review
A Narrative Review on Internet of Things and Artificial Intelligence for Poultry Production
by Anjan Dhungana, Bidur Paneru, Samin Dahal and Lilong Chai
Animals 2026, 16(9), 1285; https://doi.org/10.3390/ani16091285 - 22 Apr 2026
Abstract
Recently, poultry production has increased worldwide to address the growing demand for affordable animal-sourced protein. To meet this requirement, poultry production operations have become more concentrated, introducing management challenges related to disease control, productivity, and animal welfare. Manual flock monitoring and management have become impractical at this scale, creating a need for automated, data-driven management approaches. In this context, the Internet of Things (IoT) has emerged as a potential technological solution for continuous flock monitoring, data sharing, and decision-making. Despite this, its adoption in poultry production is limited compared with its widespread use in crop production, transportation, and manufacturing. Furthermore, advanced analytical techniques such as artificial intelligence (AI), applied to data gathered by IoT-enabled devices, have shown promising results by generating actionable information. The existing literature suggests that the integration of IoT and AI can address the major challenges associated with modern large-scale poultry production systems. While most applications remain at the research scale, such technologies have the potential to improve flock monitoring, enhance productivity, and ensure proper animal welfare. This narrative review examines the current state of IoT- and AI-based technologies, applied together or in part, and identifies the limitations, research gaps, and opportunities for future development. Full article
(This article belongs to the Section Poultry)
43 pages, 646 KB  
Review
TinyML in Industrial IoT: A Systematic Review of Applications, System Components, and Methodologies
by Shahad Alharthi, Muhammad Rashid and Malak Aljabri
Sensors 2026, 26(8), 2550; https://doi.org/10.3390/s26082550 - 21 Apr 2026
Abstract
Tiny Machine Learning (TinyML) enables Machine Learning (ML) models to run on resource-constrained devices, which is critical for Industrial Internet of Things (IIoT) systems requiring low latency, energy efficiency, and local decision-making. Nevertheless, deploying TinyML in IIoT remains challenging due to diverse applications, hardware, frameworks, and deployment methodologies, highlighting the need for a structured and focused review. Existing review articles mainly address general IoT or edge AI, leaving a critical gap in a unified and systematic understanding of TinyML applications, system components, and methodologies within IIoT contexts. Consequently, this systematic literature review (SLR) addresses this gap by analyzing 35 peer-reviewed studies published between 2018 and 2026, offering a comprehensive and structured synthesis of TinyML-enabled IIoT systems. The selected works are synthesized across three major dimensions: applications, system components, and methodologies. In terms of applications, TinyML is primarily used for predictive maintenance, equipment monitoring, anomaly detection, energy management, and general-purpose applications. The general category captures cross-domain solutions that do not fit into a single industrial application. A comparative analysis of all application categories is conducted in terms of accuracy, latency, memory, and energy. For system components, a structured comparison shows how hardware, software, and sensing choices shape performance and applicability. Hardware platforms are grouped by microcontroller families, highlighting dominant types. Software frameworks are summarized, showing the widespread use of lightweight toolchains for on-device inference. Sensor types are categorized, with vibration sensing most common. They are supported by other sensing methods such as vision, sound (acoustic), and environmental sensors. 
Finally, the methodologies examined in this SLR provide a comprehensive view of the data foundations, model selection, and optimization strategies. In short, this SLR converges diverse TinyML–IIoT applications, microcontroller-based hardware, lightweight software frameworks, sensing modalities, varied datasets, and optimization strategies, while also identifying challenges and future research directions. Full article
26 pages, 1940 KB  
Article
Industry 4.0 in the Sustainable Maritime Sector: A Componential Evaluation with Bayesian BWM
by Mahmut Mollaoglu, Bukra Doganer, Hakan Demirel, Abit Balin and Emre Akyuz
Sustainability 2026, 18(8), 4078; https://doi.org/10.3390/su18084078 - 20 Apr 2026
Abstract
The rapid diffusion of Industry 4.0 technologies has substantially transformed the maritime transportation sector by enabling data-driven operations, enhanced connectivity, and more intelligent decision-making processes. Digital technologies such as the Internet of Things (IoT), simulation systems, and advanced data analytics are increasingly reshaping operational structures in maritime logistics, positioning technological transformation as a strategic priority for firms. However, the weighting and prioritization of components emerging with Industry 4.0 technologies remain an underexplored area in the literature. The primary motivation of this study is to determine the weights of these Industry 4.0 components using the Bayesian Best Worst Method (BWM) and to reveal their corresponding credal ranking levels. In this context, the present study aims to evaluate and prioritize the critical Industry 4.0 components influencing technological transformation processes using the Bayesian BWM. Bayesian BWM is preferred over alternative Multi-Criteria Decision-Making (MCDM) approaches due to its ability to explicitly model uncertainty within a probabilistic framework, generate more consistent weighting results, and flexibly incorporate decision-makers' judgments. The findings reveal that safety and security (0.2945) constitute the most influential main component, underscoring the necessity of robust digital infrastructures and reliable systems within highly digitalized operational environments. Among the sub-components, data privacy (0.1301) demonstrates the highest global weight, highlighting the growing importance of safeguarding sensitive information in data-intensive digital systems. The results further indicate that autonomous operation and coordination play significant roles in facilitating efficient digital operations, particularly through real-time equipment monitoring and IoT-based operational visibility.
Moreover, sustainability (0.1968) emerges as the second most important component, suggesting that organizations increasingly assess technological investments not only in terms of operational efficiency but also with respect to long-term resilience. Within this dimension, continuous training (0.0614) is identified as the most influential component, indicating that the success of digital transformation depends not only on technological infrastructure but also on the development of human capabilities. With the increasing digitalization of the maritime industry, protection against cyber threats has become essential for ensuring operational continuity and safeguarding data integrity. In this regard, adopting proactive cybersecurity strategies and continuously monitoring and updating systems are of critical importance. In the digital transformation of maritime transportation, integrating sustainability considerations is essential to ensure long-term operational efficiency and environmental responsibility. These practical implications are particularly relevant for policymakers, port authorities, and shipping companies seeking to enhance both digital capabilities and sustainable performance. Full article
(This article belongs to the Section Sustainable Oceans)
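The study's Bayesian BWM infers criterion weights probabilistically (typically via MCMC over the decision-makers' pairwise judgments). The deterministic intuition behind BWM, however, can be sketched simply: criteria compared against the "best" criterion receive weights roughly inversely proportional to their best-to-others scores. The function and example values below are a teaching aid under that simplification, not the paper's method or its data.

```python
# Illustrative approximation of Best Worst Method (BWM) weighting: weights
# proportional to the inverse of the best-to-others comparison scores.
# The published Bayesian BWM instead infers a posterior over weights;
# criterion names and scores here are hypothetical.

def approx_bwm_weights(best_to_others):
    """best_to_others: dict criterion -> preference of the best criterion
    over it on a 1..9 scale (the best criterion itself scores 1)."""
    inv = {c: 1.0 / a for c, a in best_to_others.items()}
    total = sum(inv.values())
    return {c: v / total for c, v in inv.items()}

# Hypothetical judgments: "safety" is the best criterion.
weights = approx_bwm_weights({"safety": 1, "sustainability": 2, "cost": 4})
```

In this toy example "safety" receives the largest normalized weight, mirroring how the paper's dominant component (safety and security) emerges from judgments that consistently favor it.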

29 pages, 3416 KB  
Article
Enhancing Collaborative AI Learning: A Blockchain-Secured, Edge-Enabled Platform for Multimodal Education in IIoT Environments
by Ahsan Rafiq, Eduard Melnik, Alexey Samoylov, Alexander Kozlovskiy and Irina Safronenkova
Big Data Cogn. Comput. 2026, 10(4), 123; https://doi.org/10.3390/bdcc10040123 - 17 Apr 2026
Abstract
As industries deploy more connected devices in factories, warehouses, and smart facilities, the need for artificial intelligence (AI) systems that can operate securely in distributed, data-intensive environments is growing. Traditional centralized learning and online education platforms struggle when students and systems have to process real-time streams (sensors, video, text) with strict latency and privacy requirements. To address this challenge, a blockchain-secured, edge-enabled multimodal federated learning framework tailored for Industrial IoT (IIoT) environments is proposed. The model integrates four key layers: (i) a blockchain layer that provides credentialing, transparency, and token-based incentives; (ii) a multimodal community layer that supports group formation, peer consensus, and cross-modal learning across text, images, audio, and sensor data; (iii) an edge computing layer that enables low-latency task offloading and secure training within Intel SGX enclaves; and (iv) a data layer that applies pre-processing, differential privacy, and synthetic augmentation to safeguard sensitive information. Experiments on industrial multimodal datasets demonstrate 42% faster model aggregation, 78.9% multimodal accuracy, and only 1.9% accuracy loss under ε = 1.0 differential privacy. This shows a scalable and practical path for decentralized AI training in next-generation IIoT systems and confirms that such a platform can technically support educational processes, although its pedagogical effectiveness remains to be validated. Full article
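The framework's core aggregation step, federated averaging with differential-privacy noise, can be sketched in a few lines. This is a generic FedAvg illustration under assumed simplifications (plain Gaussian noise on the aggregate, no clipping, no blockchain or SGX components), not the paper's implementation; all names and parameters are hypothetical.

```python
import random

# Minimal sketch of federated averaging (FedAvg) with optional Gaussian
# noise for differential privacy. Omits gradient clipping, secure
# aggregation, and the paper's blockchain/SGX layers.

def fedavg(client_updates, sizes, noise_std=0.0, rng=None):
    """Size-weighted average of client parameter vectors, plus DP noise.

    client_updates: list of equal-length parameter lists, one per client.
    sizes: number of local samples per client (aggregation weights).
    """
    rng = rng or random.Random(0)
    total = sum(sizes)
    dim = len(client_updates[0])
    agg = [0.0] * dim
    for update, n in zip(client_updates, sizes):
        for i, w in enumerate(update):
            agg[i] += w * (n / total)
    return [w + rng.gauss(0.0, noise_std) for w in agg]

# Two equally-sized clients, no noise: the result is the plain mean.
global_model = fedavg([[1.0, 2.0], [3.0, 4.0]], sizes=[10, 10])
```

Raising `noise_std` strengthens the privacy guarantee at the cost of accuracy, which is the same trade-off the abstract quantifies as a 1.9% accuracy loss at ε = 1.0.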

13 pages, 747 KB  
Article
Uplink-Centric DUDe for IoT and Industry 4.0
by Charalampos Chatzigeorgiou, Christos Bouras, Vasileios Kokkinos, Apostolos Gkamas and Philippos Pouyioutas
Electronics 2026, 15(8), 1680; https://doi.org/10.3390/electronics15081680 - 16 Apr 2026
Abstract
This study investigates Downlink/Uplink Decoupling (DUDe) in 5G networks, a framework that allows user equipment to select its uplink serving cell independently of the downlink anchor. This approach is designed to alleviate the “macro bias” and pathloss issues that typically degrade performance for Internet of Things (IoT) traffic. We propose a framework managed by Mobile Edge Computing (MEC) that operates on a per-Transmission Time Interval (TTI) basis, incorporating stability mechanisms such as hysteresis and Time to Trigger to prevent frequent, unnecessary handovers. The performance is evaluated using a system-level simulator across two scenarios: a high-density urban IoT deployment and an Industry 4.0 smart factory environment. Our results demonstrate that the proposed framework significantly improves uplink throughput and reduces tail latency compared to traditional coupled association methods. Furthermore, an ablation study confirms that these performance gains are derived from the structural decoupling of links, providing a scalable path for improving connectivity in 5G and beyond. Full article
(This article belongs to the Special Issue Feature Papers in Networks: 2025–2026 Edition)
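The decoupled uplink association with hysteresis that the abstract describes can be sketched compactly: pick the minimum-pathloss cell for the uplink, but switch away from the current serving cell only when the candidate is better by more than a margin, which suppresses ping-pong handovers. The margin value, cell names, and pathloss figures below are illustrative assumptions, and the paper's additional Time-to-Trigger timer is omitted for brevity.

```python
# Sketch of DUDe-style uplink cell selection with a hysteresis margin.
# A cell is adopted only if its pathloss advantage over the current
# serving cell exceeds hysteresis_db; values here are hypothetical.

def select_uplink_cell(current, pathloss_db, hysteresis_db=3.0):
    """Return the uplink serving cell for this TTI.

    pathloss_db: dict cell_id -> estimated uplink pathloss in dB.
    """
    best = min(pathloss_db, key=pathloss_db.get)
    if current is None:
        return best  # initial attach: take the lowest-pathloss cell
    if pathloss_db[current] - pathloss_db[best] > hysteresis_db:
        return best  # candidate clears the margin: hand over
    return current   # otherwise stay put to avoid ping-pong
```

Because selection uses uplink pathloss rather than downlink received power, an IoT device near a small cell can transmit to it even while its downlink anchor remains the high-power macro, which is the "macro bias" the study targets.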
