Search Results (375)

Search Parameters:
Keywords = next-generation IoT

21 pages, 1354 KB  
Article
Chaos Theory with AI Analysis in IoT Network Scenarios
by Antonio Francesco Gentile and Maria Cilione
Cryptography 2026, 10(2), 25; https://doi.org/10.3390/cryptography10020025 - 10 Apr 2026
Abstract
While general network dynamics have been extensively modeled using stochastic methods, the emergence of dense Internet of Things (IoT) ecosystems demands a more specialized analytical framework. IoT environments are characterized by extreme non-linearity and sensitivity to initial conditions, where traditional models often fail to account for chaotic latency and packet loss. This paper introduces a specialized approach that integrates Chaos Theory with the innovative paradigm of Vibe Coding, an AI-assisted development and analysis methodology that allows for the 'encoding' and interpretation of the dynamic 'vibe', or signature, of network fluctuations in real time. By categorizing network behavior into four distinct scenarios (quiescent, perturbed, attacked, and perturbed–attacked), the proposed framework utilizes deep learning to transform chaotic signals into actionable intelligence. Our findings demonstrate that this specialized synergy between chaos analysis and Vibe Coding provides superior classification of adversarial threats, such as DoS and injection attacks, fostering intelligent native security for next-generation IoT infrastructures. Full article
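The abstract does not spell out the analysis pipeline, but one plausible ingredient can be sketched: estimating a largest-Lyapunov-exponent proxy from a latency trace and mapping it, together with packet loss, onto the four scenario labels named above. All thresholds and function names below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def lyapunov_proxy(x, lag=1, horizon=10):
    """Rosenstein-style divergence slope of a 1-D series (chaos signature)."""
    n = len(x) - horizon
    emb = np.column_stack([x[:n], x[lag:n + lag]])        # 2-D delay embedding
    d = np.linalg.norm(emb[:, None] - emb[None, :], axis=2)
    idx = np.arange(n)
    d[np.abs(idx[:, None] - idx[None, :]) < 5] = np.inf   # skip temporal neighbours
    nn = d.argmin(axis=1)                                 # nearest neighbour per point
    div = []
    for k in range(1, horizon):
        ok = (idx + k < len(x)) & (nn + k < len(x))
        gap = np.abs(x[idx[ok] + k] - x[nn[ok] + k])
        div.append(np.log(gap[gap > 0]).mean())           # mean log-divergence at step k
    return np.polyfit(range(1, horizon), div, 1)[0]       # slope ~ divergence rate

def classify(latency_ms, loss_rate, chaos_thr=0.05, loss_thr=0.02):
    """Map (chaos signature, packet loss) onto the four scenario labels."""
    chaotic = lyapunov_proxy(np.asarray(latency_ms, dtype=float)) > chaos_thr
    attacked = loss_rate > loss_thr
    return {(False, False): "quiescent", (True, False): "perturbed",
            (False, True): "attacked", (True, True): "perturbed-attacked"}[(chaotic, attacked)]

rng = np.random.default_rng(0)
print(classify(20 + rng.normal(0, 5, 400).cumsum() * 0.1, loss_rate=0.03))
```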
30 pages, 1189 KB  
Systematic Review
Intelligent Evaporative Cooling Systems for Post-Harvest Fruit and Vegetable Preservation: A Systematic Literature Review
by Rabiu Omeiza Isah, Segun Emmanuel Adebayo, Bello Kontagora Nuhu, Eustace Manayi Dogo, Buhari Ugbede Umar, Danlami Maliki, Ibrahim Mohammed Abdullahi, Olayemi Mikail Olaniyi and James Agajo
AgriEngineering 2026, 8(4), 150; https://doi.org/10.3390/agriengineering8040150 - 9 Apr 2026
Abstract
Post-harvest losses of fruits and vegetables are an important bottleneck in food systems around the world, with 30–50% of perishable food items lost between farm and consumer; smallholder farmers in low- and middle-income countries (LMICs) with poor cold chain infrastructure face a disproportionate burden. Evaporative cooling (EC) is a low-cost and energy-efficient alternative to mechanical refrigeration; however, traditional systems operate at a fixed point and are climate-dependent, which restricts their performance. The combination of Internet of Things (IoT) sensing, machine learning (ML), and advanced control theory has made intelligent evaporative cooling systems (IECS) adaptive, data-driven platforms that can regulate the environment in real time and optimise autonomously. This systematic literature review, carried out according to PRISMA 2020, summarises 94 peer-reviewed articles published in 2018–2025 to map the technological landscape, performance indicators, and research directions of post-harvest fruit and vegetable preservation using IECS. Findings indicate that IECS can considerably lower storage temperatures, increase shelf life by 50–200%, and reduce energy consumption by 75–90% compared to traditional refrigeration, with payback periods as short as 1.2 years. In arid conditions, ML models predict accurately, with an R² of 0.98. The research gaps identified are a lack of validation in wet climatic conditions, non-existent standardised Ag-IoT protocols, inadequate Food–Energy–Water (FEW) nexus calculation, and the absence of explainable AI (XAI) interfaces. A synthesised four-layer conceptual framework is proposed to direct next-generation IECS research and implementation. Full article
23 pages, 4282 KB  
Article
FPGA-Accelerated Machine Learning for Computational Environmental Information Processing in IoT-Integrated High-Density Nanosensor Networks
by Alaa Kamal Yousif Dafhalla, Fawzia Awad Elhassan Ali, Asma Ibrahim Gamar Eldeen, Ikhlas Saad Ahmed, Ameni Filali, Amel Mohamed essaket Zahou, Amal Abdallah AlShaer, Suhier Bashir Ahmed Elfaki, Rabaa Mohammed Eltayeb and Tijjani Adam
Information 2026, 17(4), 354; https://doi.org/10.3390/info17040354 - 8 Apr 2026
Abstract
This study presents a nanosensor network system for autonomous microclimate optimization in precision horticulture, leveraging a field-programmable gate array (FPGA)-based control architecture integrated with edge-level machine learning inference. Unlike conventional greenhouse automation systems, which exhibit thermal and hygroscopic hysteresis often exceeding 32 °C and 78% relative humidity, the proposed framework embeds a random forest regression (RFR) model directly within the Altera DE2-115 FPGA fabric to enable predictive environmental regulation. The model achieved an R² of 0.985 and a root mean square error (RMSE) of 0.28 °C, allowing proactive compensation for thermodynamic disturbances from high-intensity light-emitting diode (LED) lighting with a 120 s predictive horizon. Real-time monitoring and remote supervision were supported via a NodeMCU-based IoT gateway, achieving 140 ms mean communication latency and 99.8% packet delivery reliability. Preliminary validation using lettuce (Lactuca sativa) optimized the environmental parameters, while subsequent experiments with pepper (Capsicum annuum), a commercially important and environmentally sensitive crop, demonstrated system performance under real-world conditions. The control system maintained temperature and humidity within ±0.3 °C and ±1.2% of the setpoints, respectively, and outperformed the baseline rule-based control with a 28% increase in fresh biomass, a 22% improvement in dry matter accumulation, a 25% reduction in actuator duty-cycle switching, and an 18% decrease in overall energy consumption. These results highlight the efficacy of FPGA-integrated edge intelligence combined with low-latency IoT telemetry as a scalable, energy-efficient, and high-fidelity solution for sub-degree environmental control in next-generation controlled-environment and vertical farming systems. Full article
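As a rough illustration of the predictive-regulation idea (not the authors' FPGA design), the sketch below trains a random forest regressor on a synthetic temperature history to forecast 120 s ahead and acts before the tolerance band is breached; the window size, setpoint, band, and actions are all assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
temp = 24 + 2 * np.sin(np.arange(5000) / 300) + rng.normal(0, 0.1, 5000)  # synthetic 1 Hz trace

W, H = 60, 120                                   # history window (s), predictive horizon (s)
X = np.array([temp[i - W:i] for i in range(W, len(temp) - H)])
y = np.array([temp[i + H] for i in range(W, len(temp) - H)])
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X[:4000], y[:4000])

SETPOINT, BAND = 24.0, 0.3                       # target and tolerance band (hypothetical)
for forecast in model.predict(X[4000:4005]):     # act on the 120 s-ahead forecast, not the present
    action = "vent" if forecast > SETPOINT + BAND else \
             "heat" if forecast < SETPOINT - BAND else "hold"
    print(f"forecast {forecast:.2f} °C -> {action}")
```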
25 pages, 852 KB  
Article
Hardware Implementation-Based Lightweight Privacy-Preserving Authentication Scheme for Internet of Drones Using Physically Unclonable Function
by Razan Alsulieman, Eduardo Hernandez Escobar, Richard Swilley, Ahmed Sherif, Kasem Khalil, Mohamed Elsersy and Rabab Abdelfattah
Sensors 2026, 26(7), 2224; https://doi.org/10.3390/s26072224 - 3 Apr 2026
Abstract
The Internet of Drones (IoD) has emerged as a critical extension of the Internet of Things, enabling unmanned aerial vehicles to support diverse applications, including precision agriculture, logistics, disaster monitoring, and security surveillance. Despite its rapid growth, securing IoD communications remains a significant challenge due to the open wireless environment, high drone mobility, and strict computational and energy constraints. Existing authentication mechanisms either rely on computationally expensive cryptographic operations or remain validated only at the protocol or simulation level, leaving a critical gap in practical, hardware-validated solutions suitable for resource-constrained drone platforms. This gap motivates the need for a lightweight, privacy-preserving authentication scheme that is both theoretically sound and experimentally deployable on real hardware. To address this, we propose a Physically Unclonable Function (PUF)-assisted lightweight authentication scheme for IoD environments that binds cryptographic keys to each drone’s intrinsic hardware characteristics via PUFs. The scheme employs dynamically generated pseudo-identities to conceal permanent drone identities and prevent tracking, while authentication and key agreement are achieved using efficient symmetric cryptographic primitives, including SHA-256 for key derivation and updates, AES-256 for secure communication, and lightweight XOR operations to minimize overhead. Forward secrecy is ensured through rolling key updates, and periodic renewal of PUF challenges enhances resistance to replay and modeling attacks. To validate practicality, both software-based and hardware-based implementations were developed and evaluated. The software evaluation demonstrates a low communication overhead of 708.5 bytes and an average computation time of 18.87 ms. The hardware implementation on a Nexys A7-100T FPGA operates at 100 MHz with only 12.49% LUT utilization and low dynamic power consumption of approximately 182.5 mW. These results confirm that the proposed framework achieves an effective balance between security, privacy, and efficiency. The significance of this work lies in providing a fully hardware-validated, PUF-based authentication framework specifically tailored to the real-world constraints of IoD environments, offering a practical foundation for securing next-generation drone networks. Full article
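The primitives the abstract names (SHA-256 key derivation, lightweight XOR masking, rolling key updates, pseudo-identities) can be sketched as follows. This is a simplified stand-in, not the published protocol: a dict imitates the physical PUF, and the message layout is invented for illustration.

```python
import hashlib, os, secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

puf = {}                                         # dict stands in for the physical PUF
def puf_response(challenge: bytes) -> bytes:
    return puf.setdefault(challenge, hashlib.sha256(b"device-secret" + challenge).digest())

# Enrollment: the server stores one (challenge, response) pair per drone;
# the drone itself stores no long-term secret.
challenge = os.urandom(32)
session_key = hashlib.sha256(puf_response(challenge) + b"kdf-v1").digest()

# Drone side: regenerate the key from the PUF, mask the nonce with a
# lightweight XOR, and derive a rolling pseudo-identity plus a proof.
drone_key = hashlib.sha256(puf_response(challenge) + b"kdf-v1").digest()
nonce = os.urandom(16)
masked_nonce = xor(nonce, drone_key[:16])
proof = hashlib.sha256(drone_key + nonce).digest()
pseudo_id = hashlib.sha256(drone_key + b"pid" + nonce).hexdigest()[:16]

# Server side: unmask the nonce and verify the proof in constant time.
nonce_rx = xor(masked_nonce, session_key[:16])
assert secrets.compare_digest(proof, hashlib.sha256(session_key + nonce_rx).digest())

# Forward secrecy: both sides roll the key after each session.
session_key = hashlib.sha256(session_key + nonce_rx + b"roll").digest()
print("authenticated; current pseudo-ID:", pseudo_id)
```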
15 pages, 789 KB  
Article
EdgeRescue: Lightweight AI-Based Self-Healing for Energy-Constrained IoT Meshes
by Haifa A. Alanazi, Abdulaziz G. Alanazi and Nasser S. Albalawi
Computation 2026, 14(4), 84; https://doi.org/10.3390/computation14040084 - 3 Apr 2026
Abstract
As the scale and complexity of Internet of Things (IoT) deployments increase, maintaining resilience in resource-constrained mesh networks becomes a significant challenge. Frequent node failures due to battery depletion, environmental interference, or hardware degradation can disrupt data flows and lead to operational downtime. To address this, we propose EdgeRescue, a novel lightweight AI-driven framework for self-healing in energy-constrained IoT mesh environments. EdgeRescue enables each node to perform local anomaly detection using compact 1D Convolutional Neural Networks (1D-CNNs) and initiates distributed, energy-aware routing reconfiguration when faults are detected. Unlike cloud-dependent methods, EdgeRescue operates entirely at the edge, requiring minimal computation, memory, and communication overhead. Extensive simulations on a 100-node testbed demonstrate that EdgeRescue improves packet delivery by 13.2%, reduces recovery latency by 57%, and lowers average node energy consumption by 18.8% compared to state-of-the-art baselines. These results establish EdgeRescue as a scalable and practical solution for achieving real-time resilience in next-generation IoT mesh networks. Full article
(This article belongs to the Section Computational Engineering)
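For a sense of scale, a compact 1D-CNN anomaly detector of the kind the abstract describes might look like the following; the architecture, input channels, and decision threshold are guesses, not the EdgeRescue model.

```python
import torch
import torch.nn as nn

class TinyAnomalyCNN(nn.Module):
    """Compact 1D-CNN sized for a microcontroller-class node (illustrative)."""
    def __init__(self, channels: int = 3, hidden: int = 8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(channels, hidden, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(hidden, 2 * hidden, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(2 * hidden, 1),            # single anomaly logit
        )

    def forward(self, x):                        # x: (batch, channels, window)
        return self.net(x)

model = TinyAnomalyCNN()
print(sum(p.numel() for p in model.parameters()), "parameters")  # a few hundred
window = torch.randn(1, 3, 64)                   # e.g. RSSI, battery voltage, queue depth
fault_suspected = torch.sigmoid(model(window)) > 0.5  # would trigger local rerouting
```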
13 pages, 1799 KB  
Proceeding Paper
Cooling Tower Decision Support Web System: A Case Study
by Hao-Yu Lien, Wen-Hao Chen and Yen-Jen Chen
Eng. Proc. 2026, 134(1), 7; https://doi.org/10.3390/engproc2026134007 - 30 Mar 2026
Abstract
Conventional cooling tower operations often rely on the operator’s experience for fan-switching control, lacking precise decision support and real-time monitoring capabilities. This makes it challenging to maintain water temperature within an optimal range, thereby affecting industrial process efficiency. Using a case study approach, we integrate a Long Short-Term Memory (LSTM) model for temperature prediction with a Reinforcement Learning (RL) model to develop a web-based decision support system for cooling tower operations. The system uses an LSTM model to predict the trend of return water temperature for the next 15 min. This prediction, along with environmental conditions and historical data, is then fed into the RL model. Through a reward mechanism, the model receives a higher score when the predicted temperature is close to the benchmark of 30.5 °C and a lower score otherwise, enabling it to learn the optimal fan control strategy. Based on the evaluation results, the system automatically determines the optimal action (turning the fan on, turning it off, or maintaining its current state) and provides specific fan operation suggestions and a decision-making basis to the operator via a web interface. The system is designed with a layered architecture comprising functional modules such as a real-time monitoring dashboard, historical data query, and AI model management. Through visual elements such as temperature trend line charts, fan status indicators, and a decision suggestion interface, it provides operators with real-time water temperature status, predicted temperature trends, and specific operational recommendations. The system has been deployed in an actual manufacturing factory, where the AI model generates predictions and decision outputs every 15 min, assisting operators in adjusting fan control. This has successfully stabilized the outlet water temperature within the target range of 30–31 °C, thereby enhancing the efficiency of cooling water temperature regulation. The case demonstrates the practical application of AI technology in a manufacturing control scenario through a web-based decision support system, providing a concrete example of smart manufacturing transformation within an Industrial IoT environment. Full article
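The reward mechanism is concrete enough to sketch: the agent scores higher the closer the predicted temperature sits to the 30.5 °C benchmark. The functional form and the deadband policy below are assumptions, since the abstract does not specify them.

```python
BENCHMARK_C = 30.5   # target return-water temperature from the abstract

def reward(predicted_temp_c: float, scale: float = 1.0) -> float:
    """Higher score the closer the forecast sits to the benchmark (assumed form)."""
    return -scale * abs(predicted_temp_c - BENCHMARK_C)

def choose_action(predicted_temp_c: float, fan_on: bool, deadband: float = 0.5) -> str:
    """Rule-of-thumb stand-in for the learned policy: act only outside a deadband."""
    if predicted_temp_c > BENCHMARK_C + deadband and not fan_on:
        return "turn fan on"
    if predicted_temp_c < BENCHMARK_C - deadband and fan_on:
        return "turn fan off"
    return "maintain current state"

print(round(reward(30.6), 2), "|", choose_action(31.2, fan_on=False))  # -0.1 | turn fan on
```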
20 pages, 543 KB  
Article
EdgeGuard-AI: Zero-Trust and Load-Aware Federated Scheduling for Secure and Low-Latency IoT Edge Networks
by Abdulaziz G. Alanazi and Haifa A. Alanazi
Sensors 2026, 26(6), 1989; https://doi.org/10.3390/s26061989 - 23 Mar 2026
Abstract
Edge computing is now widely used to support real-time and safety-critical IoT services. However, current edge schedulers usually optimize only performance, while security verification and trust assessment are handled as separate modules. This separation creates a practical risk: tasks may be assigned to lightly loaded but compromised edge nodes, or secure nodes may become overloaded, violating latency requirements. We propose EdgeGuard-AI, a unified trust-driven and load-aware scheduling framework inspired by zero-trust security principles for next-generation IoT edge networks. The framework jointly learns dynamic node trust and short-term workload patterns from distributed edge data and integrates both signals into scheduling decisions. Experimental results on a realistic IoT edge security dataset show a task success rate of 97.3%, an average scheduling latency of 58.1 ms during stress periods, unsafe offloading below 2%, and a trust discrimination AUC of 0.971. Full article
(This article belongs to the Section Internet of Things)
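The abstract gives no fusion formula, so the following sketch simply gates out nodes below a trust floor (the zero-trust step) and ranks the rest by a weighted blend of trust and predicted spare capacity; the weights and floor are invented.

```python
from dataclasses import dataclass

@dataclass
class EdgeNode:
    name: str
    trust: float           # learned trust score in [0, 1]
    predicted_load: float  # short-term forecast utilisation in [0, 1]

def schedule(nodes: list[EdgeNode], trust_floor: float = 0.6,
             w_trust: float = 0.7) -> EdgeNode:
    """Zero-trust gate first, then rank by blended trust and spare capacity."""
    eligible = [n for n in nodes if n.trust >= trust_floor]   # never trust by default
    if not eligible:
        raise RuntimeError("no node passes the zero-trust gate")
    return max(eligible, key=lambda n: w_trust * n.trust
                                       + (1 - w_trust) * (1 - n.predicted_load))

nodes = [EdgeNode("a", trust=0.95, predicted_load=0.80),
         EdgeNode("b", trust=0.55, predicted_load=0.10),   # idle but untrusted: gated out
         EdgeNode("c", trust=0.82, predicted_load=0.35)]
print(schedule(nodes).name)                                 # "c": trusted and lightly loaded
```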
19 pages, 1184 KB  
Article
Hardware-Accelerated Cryptographic Random Engine for Simulation-Oriented Systems
by Meera Gladis Kurian and Yuhua Chen
Electronics 2026, 15(6), 1297; https://doi.org/10.3390/electronics15061297 - 20 Mar 2026
Abstract
Modern computing platforms increasingly rely on random number generators (RNGs) for modeling probabilistic processes in simulation, probabilistic computing, and system validation. They are also essential for cryptographic operations such as key generation, authenticated encryption, and digital signatures. Deterministic Random Bit Generators (DRBGs), as specified in the National Institute of Standards and Technology (NIST) Special Publication (SP) 800-90A, provide a standardized method for expanding entropy into cryptographically strong pseudorandom sequences. This work presents the design and Field Programmable Gate Array (FPGA) implementation of a hash-based DRBG using Ascon-Hash256, a lightweight, quantum-resistant hash function from the NIST-standardized Ascon cryptographic suite. The design implements hash-based derivation, instantiation, generation, and reseeding via iterative hash invocations and state updates. Leveraging Ascon’s sponge-based structure, it achieves efficient entropy absorption and diffusion while maintaining an area-efficient FPGA architecture, making it well suited for resource-constrained platforms. The diffusion properties of the proposed DRBG are evaluated through avalanche and reproducibility analyses, confirming strong sensitivity to input variations and secure, repeatable operation. Moreover, Monte Carlo and stochastic-diffusion evaluation of the generated bitstreams demonstrates correct convergence and statistically consistent behavior. These results confirm that the proposed hash-based DRBG provides reproducible, hardware-efficient, and cryptographically secure random numbers suitable for next-generation neuromorphic systems, probabilistic computing systems, and Internet of Things (IoT) devices. Full article
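The instantiate/generate/reseed flow of a hash-based DRBG can be reduced to a short sketch. SHA-256 stands in for Ascon-Hash256, which is not in the Python standard library, and the state handling is a teaching simplification of the NIST SP 800-90A pattern, not a compliant implementation.

```python
import hashlib, os

class HashDRBG:
    """Simplified hash-DRBG: instantiate, generate, reseed via iterated hashing."""
    def __init__(self, entropy: bytes, personalization: bytes = b""):
        self.v = hashlib.sha256(b"instantiate" + entropy + personalization).digest()
        self.reseed_counter = 0

    def reseed(self, entropy: bytes) -> None:
        self.v = hashlib.sha256(b"reseed" + self.v + entropy).digest()
        self.reseed_counter = 0

    def generate(self, n_bytes: int) -> bytes:
        out, block = b"", self.v
        while len(out) < n_bytes:                             # iterate the hash to expand
            block = hashlib.sha256(b"gen" + block).digest()
            out += block
        self.v = hashlib.sha256(b"update" + self.v).digest()  # forward-secure state update
        self.reseed_counter += 1
        return out[:n_bytes]

drbg = HashDRBG(os.urandom(32), b"sim-rng")
print(drbg.generate(16).hex())
drbg.reseed(os.urandom(32))   # periodic reseeding refreshes the internal state
```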
16 pages, 686 KB  
Article
Design of Network Traffic Analysis Models Based on Deep Neural Networks
by Jiantao Cui and Yixiang Zhao
Future Internet 2026, 18(3), 152; https://doi.org/10.3390/fi18030152 - 16 Mar 2026
Abstract
The proliferation of next-generation Internet infrastructures and the Internet of Things (IoT) has exponentially increased network traffic complexity. While deep learning (DL)-based intrusion detection systems (IDSs) show immense potential, they persistently suffer from challenges including high computational overhead, vanishing gradients in deep architectures, and acute sensitivity to noise. Consequently, these issues impede their real-time deployment in resource-constrained edge computing environments. To overcome these limitations, we propose a novel, lightweight, and robust intrusion detection framework based on deep neural networks (DNNs). Initially, we employ a Robust Scaler-based statistical preprocessing strategy to replace traditional Z-score standardization, effectively mitigating the adverse impacts of outliers and burst traffic noise. Subsequently, we design an advanced architecture that integrates self-normalizing residual blocks with a channel attention mechanism. Leveraging compressed hidden layers alongside the Scaled Exponential Linear Unit (SELU) activation function, this architecture not only mitigates the vanishing gradient problem but also amplifies critical traffic features. Concurrently, it achieves a substantial reduction in both parameter count and inference latency. Furthermore, we introduce a cosine annealing strategy to dynamically adjust the learning rate during training, thereby facilitating the model’s escape from local optima and accelerating convergence. Extensive experiments on standard benchmark datasets demonstrate that our proposed framework achieves superior detection accuracy while maintaining exceptional computational efficiency compared to state-of-the-art baselines. Full article
(This article belongs to the Section Cybersecurity)
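Two of the named ingredients are easy to demonstrate: RobustScaler preprocessing (median/IQR instead of mean/std, so burst outliers do not dominate) and a SELU-activated residual block with a compressed hidden layer. Dimensions are illustrative; this is not the paper's architecture.

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.preprocessing import RobustScaler

# Median/IQR scaling: the burst-traffic outlier in the last row barely
# shifts how the typical flows are normalised (unlike Z-score scaling).
flows = np.array([[100.0, 1.0], [120.0, 1.2], [90.0, 0.9], [50000.0, 40.0]])
print(RobustScaler().fit_transform(flows)[:3])

class SELUResidualBlock(nn.Module):
    """Self-normalising residual block with a compressed hidden layer."""
    def __init__(self, dim: int, bottleneck: int):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(dim, bottleneck), nn.SELU(),
                                  nn.Linear(bottleneck, dim))
    def forward(self, x):
        return nn.functional.selu(x + self.body(x))   # skip path eases gradient flow

print(SELUResidualBlock(64, 16)(torch.randn(8, 64)).shape)  # torch.Size([8, 64])
```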
31 pages, 453 KB  
Review
Neuromorphic Computing for Long-Term Cardiac Health: A Review of Spiking Neural Networks in Low-Power Wearable Electronics
by Sadiq Alinsaif
Electronics 2026, 15(6), 1179; https://doi.org/10.3390/electronics15061179 - 12 Mar 2026
Abstract
The integration of Artificial Intelligence (AI) into Internet of Things (IoT) medical devices has revolutionized arrhythmia monitoring. However, the high computational and power demands of traditional Deep Learning (DL) models pose significant challenges for long-term, battery-operated smart electronics. Spiking Neural Networks (SNNs), inspired by the biological efficiency of the human brain, offer a promising solution. This paper reviews the intersection of SNNs, low-power IoT hardware, and biomedical signal processing. I examine the transition from frame-based to event-driven processing, and discuss the hardware–software co-design necessary for next-generation cardiac wearables. Full article
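As background for readers new to SNNs (this sketch is not drawn from the review), a single leaky integrate-and-fire neuron processes an event stream as follows; computation occurs only when spikes arrive, which is where the energy savings the review discusses come from. Constants are arbitrary.

```python
import math

def lif_neuron(spike_times, tau=20.0, weight=0.6, threshold=1.0):
    """One leaky integrate-and-fire neuron over input spike times (ms)."""
    v, last_t, out = 0.0, 0.0, []
    for t in spike_times:                   # event-driven: iterate spikes, not timesteps
        v *= math.exp(-(t - last_t) / tau)  # membrane potential leaks between events
        v += weight                         # integrate the incoming spike
        if v >= threshold:
            out.append(t)                   # fire an output spike...
            v = 0.0                         # ...and reset the membrane
        last_t = t
    return out

print(lif_neuron([1, 2, 3, 50, 51, 52, 53]))  # bursts cross threshold; lone gaps decay
```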
55 pages, 17048 KB  
Review
The Evolution of Visualization Technologies in Healthcare: A Bibliometric Analysis of Studies Published from 1994 to 2025
by Fangzhong Cheng, Chun Yang and Rong Deng
Information 2026, 17(3), 281; https://doi.org/10.3390/info17030281 - 11 Mar 2026
Abstract
Healthcare visualization has become a crucial approach for interpreting complex medical data, supporting informed clinical decision-making, and enhancing public health management. However, existing reviews tend to focus on specific technologies or application scenarios, offering limited insight into the field’s overall knowledge structure, developmental trajectory, and interdisciplinary integration. To address this gap, this study systematically reviews 1121 publications from 1994 to 2025 indexed in the Web of Science Core Collection. By combining bibliometric analysis with qualitative assessment, it maps the field’s evolution and underlying research paradigms. The findings reveal a clear shift from early innovation in technical tools toward the realization of clinical value, giving rise to an integrated research system that connects technology, data, clinical practice, and public health. Recent research has progressed beyond initial explorations of medical imaging, standalone devices, and isolated techniques, moving instead toward core domains such as immersive medical visualization, medical data visualization and analytics, health information systems and decision support, AI-assisted epidemic prediction and diagnosis, and integrated IoT-based healthcare frameworks. Looking ahead, an assessment of future trends suggests that, among other directions, the deep integration of explainable artificial intelligence (XAI) with visualization analysis, the development of IoT-driven real-time interactive systems, and the extension of visualization-enabled services from clinical applications toward inclusive population-level health coverage represent core driving forces for the future development of this field. These insights offer strategic guidance for future research, inform the design principles of next-generation visualization systems, and provide new models of interdisciplinary collaboration. The results also offer evidence-based support for health resource planning, technological innovation, and policy formulation. Full article
(This article belongs to the Special Issue Medical Data Visualization)
30 pages, 2010 KB  
Article
On the Convergence of Internet of Things and Decentralized Finance: Security Challenges and Future Directions
by Prasannakumaran Sarasijanayanan, Nithya Nedungadi and Sriram Sankaran
Sensors 2026, 26(6), 1740; https://doi.org/10.3390/s26061740 - 10 Mar 2026
Abstract
The rapid convergence of the Internet of Things (IoT) and decentralized finance (DeFi) is reshaping the digital economy by enabling autonomous, trustless, and value-driven interactions among connected devices. This paper provides a comprehensive survey of the emerging paradigm that combines IoT’s pervasive sensing and communication capabilities with DeFi’s programmable financial infrastructure. We first discuss the motivation behind this convergence and explore key opportunities, including autonomous machine-to-machine (M2M) payments, decentralized data marketplaces, and trustless IoT service provisioning. Despite its potential, IoT–DeFi integration introduces significant security and privacy challenges related to smart contract vulnerabilities, consensus protocol risks, oracle manipulation, and constrained device capabilities. We review existing mitigation approaches such as lightweight cryptography, secure contract design, and decentralized identity management, and critically assess their limitations in heterogeneous, resource-limited environments. Building on this analysis, we identify research gaps and propose future directions emphasizing formal verification of IoT-integrated smart contracts, robust oracle design, interoperability frameworks, and privacy-preserving trust models. This survey systematically maps opportunities, threats, and open issues. In doing so, it guides researchers and practitioners toward building secure, scalable, and energy-efficient IoT–DeFi ecosystems for next-generation decentralized applications. Full article
(This article belongs to the Special Issue Advances in Security for Emerging Intelligent Systems)
39 pages, 1767 KB  
Systematic Review
Advanced Hardware Security on Embedded Processors: A 2026 Systematic Review
by Ali Kia, Aaron W. Storey and Masudul Imtiaz
Electronics 2026, 15(5), 1135; https://doi.org/10.3390/electronics15051135 - 9 Mar 2026
Abstract
The proliferation of Internet of Things (IoT) devices and embedded processors has recently spurred rapid advances in hardware-level security. This paper systematically reviews developments in securing microcontroller units (MCUs) and constrained embedded platforms from 2020 to 2026, a period marked by the finalization of NIST’s post-quantum cryptography standards and accelerated commercial deployment of hardware security primitives. Through analysis of the peer-reviewed literature, industry implementations, and standardization efforts, we survey five critical areas: post-quantum cryptography (PQC) implementations on resource-constrained hardware, physically unclonable functions (PUFs) for device authentication, hardware Roots of Trust and secure boot mechanisms, side-channel attack mitigations, and Trusted Execution Environments (TEEs) for microcontroller-class devices. For each domain, we analyze technical mechanisms, deployment constraints (power, memory, cost), security guarantees, and commercial maturity. Our review distinguishes itself through its integration perspective, examining how these primitives must be composed to secure real-world embedded systems, and its emphasis on post-standardization PQC developments. We highlight critical gaps including PQC memory overhead challenges, ML-resistant PUF designs, and TEE developer friction, while documenting commercial progress such as PSA Level 3 certified components and 500+ million PUF-enabled devices deployed. This synthesis provides practitioners with practical guidance for securing the next generation of IoT and embedded systems. Full article
9 pages, 1065 KB  
Proceeding Paper
Reconfigurable Metasurface-Enabled AIoT Framework for Intelligent and Sustainable Smart Cities
by Shubham Gupta and Suhaib Ahmed
Eng. Proc. 2026, 124(1), 59; https://doi.org/10.3390/engproc2026124059 - 9 Mar 2026
Abstract
The fast growth of smart city systems requires sensing and intelligence platforms that are dynamic, power-efficient, and capable of real-time decision-making. Traditional IoT-based smart city systems are subject to constraints such as inflexible sensing architectures, high energy use, and high latency caused by cloud-based processing. To solve these problems, this paper proposes a reconfigurable metasurface-based Artificial Intelligence of Things (AIoT) architecture for smart cities. The proposed system incorporates programmable electromagnetic metasurface-based sensing, edge-level Artificial Intelligence, and AIoT gateways to implement ultra-sensitive sensing, low-latency analytics, and effective resource utilization. A hybrid algorithm combining metasurface physics with neural network-based learning improves the accuracy and flexibility of sensing. Experimental analysis with publicly available smart city data shows that the proposed framework attains sensing accuracy in the range of 92% to 97%, far surpassing traditional IoT sensors, whose accuracy ranges from 78% to 83%. Moreover, the suggested system shortens end-to-end latency to 36–45 ms, compared to 84–90 ms, and also reduces power usage. Improved sensing efficiency, defined as the ratio of power consumption to accuracy, is obtained under all test conditions. These findings validate that the proposed metasurface-powered AIoT framework offers a scalable, low-latency, and energy-efficient solution for next-generation smart city applications. Full article
(This article belongs to the Proceedings of The 6th International Electronic Conference on Applied Sciences)
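A toy calculation of the sensing-efficiency metric, defined in the abstract as the ratio of power consumption to accuracy. The power figures below are invented placeholders; only the accuracy ranges come from the abstract.

```python
def sensing_efficiency(power_mw: float, accuracy: float) -> float:
    """Power per unit accuracy: lower is better under this definition."""
    return power_mw / accuracy

# Accuracy midpoints from the abstract; power figures are hypothetical.
print(sensing_efficiency(power_mw=120.0, accuracy=0.945))  # proposed AIoT node
print(sensing_efficiency(power_mw=180.0, accuracy=0.805))  # conventional IoT sensor
```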
26 pages, 14884 KB  
Review
A Review on Forest Fire Detection Techniques: Past, Present, and Sustainable Future
by Alimul Haque Khan, Ali Newaz Bahar and Khan Wahid
Sensors 2026, 26(5), 1609; https://doi.org/10.3390/s26051609 - 4 Mar 2026
Abstract
Forest fires are a major concern due to their significant impact on the environment, economy, and wildlife habitats. Efficient early detection systems can significantly mitigate their devastating effects. This paper provides a comprehensive review of forest fire detection (FFD) techniques and traces their evolution from basic lookout-based methods to sophisticated remote sensing technologies, including recent Internet of Things (IoT)- and Unmanned Aerial Vehicle (UAV)-based sensor network systems. Historical methods, characterized primarily by human surveillance and basic electronic sensors, laid the foundation for modern techniques. Recently, there has been a noticeable shift toward ground-based sensors, automated camera systems, aerial surveillance using drones and aircraft, and satellite imaging. Moreover, the rise of Artificial Intelligence (AI), Machine Learning (ML), and the IoT introduces a new era of advanced detection capabilities. These detection systems are being actively deployed in wildfire-prone regions, where early alerts have proven critical in minimizing damage and aiding rapid response. All FFD techniques follow a common path of data collection, pre-processing, data compression, transmission, and post-processing, and providing sufficient power to complete these tasks is itself an important area of research. Recent research focuses on image compression techniques, data transmission, the application of ML and AI at edge nodes and servers, and the minimization of energy consumption, among other emerging directions. However, to build a sustainable FFD model, careful planning of sensor deployment is essential: sensors can be fixed at specific geographic locations, attached to UAVs, or deployed as a combination of both. Ensuring an adequate energy supply for both ground-based and UAV-based sensors is equally important, since replacing sensor batteries or recharging UAVs in remote areas is highly challenging, particularly in the absence of an operator. Hence, future FFD systems must prioritize not only detection accuracy but also long-term energy autonomy and strategic sensor placement. Integrating renewable energy sources, optimizing data processing, and ensuring minimal human intervention will be key to developing truly sustainable and scalable solutions. This review aims to guide researchers and developers in designing next-generation FFD systems aligned with practical field demands and environmental resilience. Full article
(This article belongs to the Section Environmental Sensing)