Search Results (213)

Search Parameters:
Keywords = cloud–edge collaborative

29 pages, 12889 KB  
Article
Development of a Multi-Robot System for Autonomous Inspection of Nuclear Waste Tank Pits
by Pengcheng Cao, Edward Kaleb Houck, Anthony D'Andrea, Robert Kinoshita, Kristan B. Egan, Porter J. Zohner and Yidong Xia
Appl. Sci. 2025, 15(17), 9307; https://doi.org/10.3390/app15179307 - 24 Aug 2025
Viewed by 702
Abstract
This paper introduces the overall design plan, development timeline, and preliminary progress of the Autonomous Pit Exploration System project. This project aims to develop an advanced multi-robot system for the efficient inspection of nuclear waste-storage tank pits. The project is structured into three phases: Phase 1 involves data collection and interface definition in collaboration with Hanford Site experts and university partners, focusing on tank riser geometry and hardware solutions. Phase 2 includes the selection of sensors and robot components, detailed mechanical design, and prototyping. Phase 3 integrates all components into a cohesive system managed by a master control package which also incorporates digital twin and surrogate models, and culminates in comprehensive testing and validation at a simulated tank pit at the Idaho National Laboratory. Additionally, the system’s communication design ensures coordinated operation through shared data, power, and control signals. For transportation and deployment, an electric vehicle (EV) is chosen to support the system for a full 10 h shift with better regulatory compliance for field deployment. A telescopic arm design is selected for its simple configuration and superior reach capability and controllability. Preliminary testing utilizes an educational robot to demonstrate the feasibility of splitting computational tasks between edge and cloud computers. Successful simultaneous localization and mapping (SLAM) tasks validate our distributed computing approach. More design considerations are also discussed, including radiation hardness assurance, SLAM performance, software transferability, and digital twinning strategies. Full article
(This article belongs to the Special Issue Mechatronic Systems Design and Optimization)
38 pages, 5163 KB  
Article
A Coordinated Adaptive Signal Control Method Based on Queue Evolution and Delay Modeling Approach
by Ruochen Hao, Yongjia Wang, Ziyu Wang, Lide Yang and Tuo Sun
Appl. Sci. 2025, 15(17), 9294; https://doi.org/10.3390/app15179294 - 24 Aug 2025
Viewed by 325
Abstract
Coordinated adaptive signal control is a proven strategy for improving traffic efficiency and minimizing vehicular delays. First, we develop a Queue Evolution and Delay Model (QEDM) that establishes the relationship between detector-measured queue lengths and model parameters. QEDM accurately characterizes residual queue dynamics (accumulation and dissipation), significantly enhancing delay estimation accuracy under oversaturated conditions. Secondly, we propose a novel intersection-level signal optimization method that addresses key practical challenges: (1) pedestrian stages, overlap phases; (2) coupling effects between signal cycle and queue length; and (3) stochastic vehicle arrivals in undersaturated conditions. Unlike conventional approaches, this method proactively shortens signal cycles to reduce queues while avoiding suboptimal solutions that artificially “dilute” delays by extending cycles. Thirdly, we introduce an adaptive coordination control framework that maintains arterial-level green-band progression while maximizing intersection-level adaptive optimization flexibility. To bridge theory and practice, we design a cloud–edge–terminal collaborative deployment architecture for scalable signal control implementation and validate the framework through a hardware-in-the-loop simulation platform. Case studies in real-world scenarios demonstrate that the proposed method outperforms existing benchmarks in delay estimation accuracy, average vehicle delay, and travel time in coordinated directions. Additionally, we analyze the influence of coordination constraint update intervals on system performance, providing actionable insights for adaptive control systems. Full article
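The residual-queue behavior this abstract describes, queues accumulating under oversaturation and dissipating afterward, can be sketched with a deterministic point-queue recursion. This is a toy stand-in for illustration only; QEDM itself is detector-driven and far more detailed, and all names and numbers below are invented:

```python
def queue_evolution(q0, arrivals, capacity):
    """Deterministic point-queue update: the residual queue grows when
    arrivals exceed discharge capacity and shrinks otherwise.
    Quantities are vehicles per cycle; a simplified stand-in for QEDM."""
    q = q0
    trace = []
    for a, c in zip(arrivals, capacity):
        q = max(0.0, q + a - c)   # residual queue at the end of each cycle
        trace.append(q)
    return trace

def cycle_delay(q_start, q_end, cycle_len):
    """Approximate delay accrued over one cycle as the trapezoidal area
    under the queue-length curve (vehicle-seconds)."""
    return 0.5 * (q_start + q_end) * cycle_len

# Oversaturated for three cycles (20 arrivals vs. capacity 15), then recovery.
trace = queue_evolution(0.0, [20, 20, 20, 10, 10], [15] * 5)
print(trace)  # → [5.0, 10.0, 15.0, 10.0, 5.0]
```

Summing `cycle_delay` over consecutive trace points gives the kind of queue-aware delay estimate that a fixed uniform-delay formula misses under oversaturation.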
17 pages, 1423 KB  
Article
Research on Endogenous Security Defense for Cloud-Edge Collaborative Industrial Control Systems Based on Luenberger Observer
by Lin Guan, Ci Tao and Ping Chen
Mathematics 2025, 13(17), 2703; https://doi.org/10.3390/math13172703 - 22 Aug 2025
Viewed by 269
Abstract
Industrial Control Systems (ICSs) are fundamental to critical infrastructure, yet they face increasing cybersecurity threats, particularly data integrity attacks like replay and data forgery attacks. Traditional IT-centric security measures are often inadequate for the Operational Technology (OT) environment due to stringent real-time and reliability requirements. This paper proposes an endogenous security defense mechanism based on the Luenberger observer and residual analysis. By embedding a mathematical model of the physical process into the control system, this approach enables real-time state estimation and anomaly detection. We model the ICS using a linear state-space representation and design a Luenberger observer to generate a residual signal, which is the difference between the actual sensor measurements and the observer’s predictions. Under normal conditions, this residual is minimal, but it deviates significantly during a replay attack. We formalize the system model, observer design, and attack detection algorithm. The effectiveness of the proposed method is validated through a simulation of an ICS under a replay attack. The results demonstrate that the residual-based approach can detect the attack promptly and effectively, providing a lightweight yet robust solution for enhancing ICS security. Full article
(This article belongs to the Special Issue Research and Application of Network and System Security)
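As a rough illustration of the residual idea in this abstract, here is a minimal scalar plant with a Luenberger observer under a simulated replay attack. The system matrices, observer gain, threshold, and attack timing are all invented for the sketch, not taken from the paper:

```python
# Scalar discrete-time plant x' = A*x + B*u, y = C*x, with a Luenberger
# observer xhat' = A*xhat + B*u + L*(y - C*xhat). All values are invented.
A, B, C = 0.9, 0.1, 1.0
L = 0.5  # observer gain; (A - L*C) = 0.4, so the estimation error decays

x, xhat = 1.0, 1.0   # start at steady state with a converged observer
residuals = []
for k in range(80):
    u = 1.0 if k < 40 else 2.0          # setpoint change during the attack
    y_true = C * x
    # From step 40 on, the attacker replays the stale steady-state reading.
    y_reported = y_true if k < 40 else 1.0
    r = y_reported - C * xhat            # residual checked against a threshold
    xhat = A * xhat + B * u + L * r      # observer update
    x = A * x + B * u                    # the true plant keeps moving
    residuals.append(abs(r))

detected = [k for k, r in enumerate(residuals) if r > 0.05]
print(min(detected))  # → 41 (one step after the replay begins)
```

The residual is exactly zero while measurements are genuine and jumps as soon as the replayed value diverges from what the embedded model predicts, which is the detection mechanism the abstract describes.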
45 pages, 2283 KB  
Review
Agricultural Image Processing: Challenges, Advances, and Future Trends
by Xuehua Song, Letian Yan, Sihan Liu, Tong Gao, Li Han, Xiaoming Jiang, Hua Jin and Yi Zhu
Appl. Sci. 2025, 15(16), 9206; https://doi.org/10.3390/app15169206 - 21 Aug 2025
Viewed by 369
Abstract
Agricultural image processing technology plays a critical role in enabling precise disease detection, accurate yield prediction, and various smart agriculture applications. However, its practical implementation faces key challenges, including environmental interference, data scarcity and imbalanced datasets, and the difficulty of deploying models on resource-constrained edge devices. This paper presents a systematic review of recent advances in addressing these challenges, with a focus on three core aspects: environmental robustness, data efficiency, and model deployment. The study identifies that attention mechanisms, Transformers, multi-scale feature fusion, and domain adaptation can enhance model robustness under complex conditions. Self-supervised learning, transfer learning, GAN-based data augmentation, SMOTE improvements, and focal-loss optimization effectively alleviate data limitations. Furthermore, model compression techniques such as pruning, quantization, and knowledge distillation facilitate efficient deployment. Future research should emphasize multi-modal fusion, causal reasoning, edge–cloud collaboration, and dedicated hardware acceleration. Integrating agricultural expertise with AI is essential for promoting large-scale adoption and achieving intelligent, sustainable agricultural systems. Full article
(This article belongs to the Special Issue Pattern Recognition Applications of Neural Networks and Deep Learning)
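One of the data-limitation remedies this review surveys, focal loss, is compact enough to sketch. This follows the standard binary form from Lin et al. (2017); the hyperparameter values are the common defaults, used here as assumptions rather than the review's recommendation:

```python
import math

def focal_loss(p, y, alpha=0.25, gamma=2.0):
    """Binary focal loss for a single prediction.
    p: predicted probability of the positive class; y: true label in {0, 1}.
    The (1 - p_t)**gamma factor down-weights easy examples so that rare,
    hard examples dominate the gradient under class imbalance."""
    p_t = p if y == 1 else 1.0 - p
    alpha_t = alpha if y == 1 else 1.0 - alpha
    return -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t)

# A confident correct prediction contributes far less than a confident miss:
easy = focal_loss(0.95, 1)   # well classified
hard = focal_loss(0.10, 1)   # badly misclassified
print(easy < hard)  # → True
```

With `gamma=0` and `alpha=0.5` the expression reduces to half the ordinary cross-entropy, which is a quick way to sanity-check an implementation.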
30 pages, 3477 KB  
Article
Dynamic Task Scheduling Based on Greedy and Deep Reinforcement Learning Algorithms for Cloud–Edge Collaboration in Smart Buildings
by Ping Yang and Jiangmin He
Electronics 2025, 14(16), 3327; https://doi.org/10.3390/electronics14163327 - 21 Aug 2025
Viewed by 438
Abstract
Driven by technologies such as the Internet of Things and artificial intelligence, smart buildings have developed rapidly, and the demand for processing massive amounts of data has risen sharply. Traditional cloud computing faces challenges such as high network latency and heavy bandwidth pressure; edge computing can effectively reduce latency but suffers from resource limitations and difficulties with cluster collaboration. Cloud–edge collaboration has therefore become an inevitable choice for meeting the real-time and reliability requirements of smart buildings. To address the shortcomings of existing task scheduling methods in the smart-building scenario, namely ignoring container compatibility constraints, the difficulty of balancing global optimization with real-time performance, and the difficulty of adapting to dynamic environments, this paper proposes a two-stage cloud–edge collaborative dynamic task scheduling mechanism. First, a task scheduling system model supporting container compatibility is constructed, aiming to minimize system latency and energy consumption while meeting the real-time requirements of tasks. Second, a hierarchical, progressive solution is designed for this scheduling problem: in the first stage, a Resource-Aware Cost-Driven Greedy algorithm (RACDG) enables edge nodes to quickly generate the initial task offloading decision; in the second stage, for the tasks marked for offloading in the initial decision, a Proximal Policy Optimization algorithm based on Action Masks (AMPPO) achieves global dynamic scheduling. Finally, simulation experiments comparing the proposed algorithm with classical algorithms show that it reduces system delay by 26–63.7%, reduces energy consumption by 21.7–66.9%, and maintains a task completion rate above 91.3% under high-load conditions, demonstrating good scheduling robustness and application potential and providing an effective solution for cloud–edge collaborative task scheduling in smart buildings. Full article
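The first-stage greedy idea, comparing a weighted time-and-energy cost of local versus edge execution for each task, can be sketched as below. The cost model, weights, and all numbers are illustrative assumptions, not the paper's RACDG:

```python
def greedy_offload(tasks, local_cpu, edge_cpu, w_time=0.5, w_energy=0.5):
    """Per-task greedy placement sketch: pick the option (local vs. edge)
    with the lower weighted time+energy cost, subject to the task deadline.
    Cost model and parameters are invented for illustration."""
    decisions = []
    for cycles, deadline, tx_time, tx_energy in tasks:
        t_local = cycles / local_cpu
        e_local = 1e-9 * cycles                 # toy energy model: 1 nJ/cycle
        t_edge = tx_time + cycles / edge_cpu    # uplink + remote execution
        e_edge = tx_energy                      # device only pays for the radio
        cost_local = w_time * t_local + w_energy * e_local
        cost_edge = w_time * t_edge + w_energy * e_edge
        # Offload only if it is cheaper AND the deadline still holds.
        if cost_edge < cost_local and t_edge <= deadline:
            decisions.append("edge")
        else:
            decisions.append("local")
    return decisions

# Each task: (CPU cycles, deadline s, uplink time s, uplink energy J).
tasks = [(2e9, 1.0, 0.05, 0.3),   # heavy task, cheap to ship
         (1e8, 1.0, 0.50, 0.3)]   # light task, expensive to ship
print(greedy_offload(tasks, local_cpu=1e9, edge_cpu=10e9))  # → ['edge', 'local']
```

The heavy task offloads because remote compute dominates its cost, while the light task stays local because its transmission overhead outweighs the saving, which is the intuition behind a resource-aware cost-driven first pass.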
18 pages, 1916 KB  
Article
Assessing Cross-Domain Threats in Cloud–Edge-Integrated Industrial Control Systems
by Lei Zhang, Yi Wang, Cheng Chang and Xingqiu Shen
Electronics 2025, 14(16), 3242; https://doi.org/10.3390/electronics14163242 - 15 Aug 2025
Viewed by 420
Abstract
As Industrial Control Systems (ICSs) increasingly adopt cloud–edge collaborative architectures, they face escalating risks from complex cross-domain cyber threats. To address this challenge, we propose a novel threat assessment framework specifically designed for cloud–edge-integrated ICSs. Our approach systematically identifies and evaluates security risks across cyber and physical boundaries by building a structured dataset of ICS assets, attack entry points, techniques, and impacts. We introduce a unique set of evaluation indicators spanning four key dimensions—system modules, attack paths, attack methods, and potential impacts—providing a holistic view of cyber threats. Through simulation experiments on a representative ICS scenario inspired by real-world attacks, we demonstrate the framework’s effectiveness in detecting vulnerabilities and quantifying security posture improvements. Our results underscore the framework’s practical utility in guiding targeted defense strategies and its potential to advance research on cloud–edge ICS security. This work not only fills gaps in the existing methodologies but also provides new insights and tools applicable to sectors such as smart grids, intelligent manufacturing, and critical infrastructure protection. Full article
(This article belongs to the Special Issue Knowledge Information Extraction Research)
29 pages, 1615 KB  
Review
Internet of Things Driven Digital Twin for Intelligent Manufacturing in Shipbuilding Workshops
by Caiping Liang, Xiang Li, Wenxu Niu and Yansong Zhang
Future Internet 2025, 17(8), 368; https://doi.org/10.3390/fi17080368 - 14 Aug 2025
Viewed by 510
Abstract
Intelligent manufacturing research has focused on digital twins (DTs) due to the growing integration of physical and cyber systems. This study explores the Internet of Things (IoT) as a cornerstone of DTs, showing its promise and limitations in the digital transformation of intelligent shipbuilding workshops. We analyze progress in IoT protocols, digital twin frameworks, and intelligent ship manufacturing, and propose a bidirectional digital twin system for shipbuilding workshops that uses the IoT to exchange data between physical and virtual workshops. A steel-cutting workshop is used to demonstrate the digital transformation of the production line, including data collection, transmission, storage, and simulation analysis. Critical barriers to DT deployment in shipbuilding environments are then systematically analyzed, including technical standard unification, communication security, real-time performance guarantees, cross-workshop collaboration mechanisms, and the deep integration of artificial intelligence. Adaptive solutions include hybrid edge–cloud computing architectures for latency-sensitive tasks and reinforcement-learning-based smart scheduling algorithms. The findings suggest that IoT-driven digital transformation can modernize shipbuilding workshops in new ways. Full article
20 pages, 706 KB  
Article
FedRP: Region-Specific Personalized Identification for Large-Scale IoT Systems
by Yuhan Jin, Bin Cao, Junfei Wang, Benkuan Zhou, Jiacheng Wang, Yingdong Liu, Fuwei Guo and Bo Xu
Symmetry 2025, 17(8), 1308; https://doi.org/10.3390/sym17081308 - 13 Aug 2025
Viewed by 339
Abstract
The widespread adoption of Internet of Things (IoT) technology has significantly expanded the scale at which devices are connected, posing new challenges to maintaining symmetry in network management. Traditional centralized identification architectures adopt a symmetric processing paradigm in which all device data are uniformly transmitted to the cloud for processing. However, this rigid symmetric structure fails to accommodate the asymmetric distribution typical of IoT edge devices. To address these challenges, this paper proposes an asymmetric identification framework based on cloud–edge collaboration, exploring a high-performance, resource-efficient, and privacy-preserving solution for IoT device identification. The proposed region-specific personalized algorithm (FedRP) introduces a region-specific, personalized identification approach grounded in federated learning principles. Firstly, FedRP leverages a decentralized processing framework to enhance data security by processing data locally. Secondly, it employs a personalized federated learning framework to optimize local models, thus improving identification accuracy and effectiveness. Finally, FedRP strategically separates the personalized parameters of transformer-based blocks from shared parameters and selectively transmits them, reducing the burden on network resources. Comprehensive comparative experiments demonstrate the efficacy of the proposed approach for large-scale IoT environments, which are characterized by numerous devices and complex network conditions. Full article
(This article belongs to the Section Computer)
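FedRP's separation of personalized from shared parameters can be illustrated with a simple parameter partition: only the shared bucket leaves the device, cutting communication while keeping personalization local. The key prefixes (`head.`, `norm.`) are hypothetical stand-ins for whichever transformer-block parameters the method keeps personal:

```python
def split_params(state_dict, personal_keys=("head.", "norm.")):
    """Partition model parameters into shared ones (transmitted to the
    server for federated averaging) and personalized ones (kept on-device).
    The prefix convention here is an invented example, not FedRP's rule."""
    shared, personal = {}, {}
    for name, tensor in state_dict.items():
        bucket = personal if name.startswith(personal_keys) else shared
        bucket[name] = tensor
    return shared, personal

params = {"embed.weight": [0.1], "block0.attn": [0.2],
          "norm.scale": [1.0], "head.weight": [0.3]}
shared, personal = split_params(params)
print(sorted(shared))    # → ['block0.attn', 'embed.weight']  (leaves the device)
print(sorted(personal))  # → ['head.weight', 'norm.scale']    (stays local)
```

In a real federated round the server would average only the `shared` dictionaries across clients, so upload size scales with the shared fraction rather than the full model.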
27 pages, 502 KB  
Article
A Blockchain-Based Secure Data Transaction and Privacy Preservation Scheme in IoT System
by Jing Wu, Zeteng Bian, Hongmin Gao and Yuzhe Wang
Sensors 2025, 25(15), 4854; https://doi.org/10.3390/s25154854 - 7 Aug 2025
Viewed by 386
Abstract
With the explosive growth of Internet of Things (IoT) devices, massive amounts of heterogeneous data are continuously generated. However, IoT data transactions and sharing face multiple challenges such as limited device resources, untrustworthy network environments, highly sensitive user privacy, and serious data silos. How to achieve fine-grained access control and privacy protection for massive numbers of devices while ensuring secure and reliable data circulation has become a key issue that urgently needs to be addressed in the IoT field. To address these challenges, this paper proposes a blockchain-based data transaction and privacy protection framework. First, the framework builds a multi-layer security architecture that integrates blockchain and IPFS and adapts to the "end–edge–cloud" collaborative characteristics of the IoT. Second, a data sharing mechanism is designed that balances access control with the interests of participants: it uses attribute-based encryption (ABE) to achieve dynamic, fine-grained access control for massive heterogeneous IoT devices, and it introduces a game-theory-driven dynamic pricing model to balance the interests of both data supply and demand. Finally, to meet the need for confidential analysis of IoT data, a secure computing scheme based on CKKS fully homomorphic encryption is proposed, which supports efficient statistical analysis of encrypted sensor data without leaking privacy. Security analysis and experimental results show that the scheme is secure under standard cryptographic assumptions and can effectively resist common attacks in the IoT environment. Prototype system testing verifies the functional completeness and performance feasibility of the scheme, providing a complete and effective technical solution to the challenges of data integrity, verifiable transactions, and fine-grained access control, while mitigating reliance on a trusted central authority in IoT data sharing. Full article
(This article belongs to the Special Issue Blockchain-Based Solutions to Secure IoT)
32 pages, 1435 KB  
Review
Smart Safety Helmets with Integrated Vision Systems for Industrial Infrastructure Inspection: A Comprehensive Review of VSLAM-Enabled Technologies
by Emmanuel A. Merchán-Cruz, Samuel Moveh, Oleksandr Pasha, Reinis Tocelovskis, Alexander Grakovski, Alexander Krainyukov, Nikita Ostrovenecs, Ivans Gercevs and Vladimirs Petrovs
Sensors 2025, 25(15), 4834; https://doi.org/10.3390/s25154834 - 6 Aug 2025
Viewed by 949
Abstract
Smart safety helmets equipped with vision systems are emerging as powerful tools for industrial infrastructure inspection. This paper presents a comprehensive state-of-the-art review of such VSLAM-enabled (Visual Simultaneous Localization and Mapping) helmets. We survey the evolution from basic helmet cameras to intelligent, sensor-fused inspection platforms, highlighting how modern helmets leverage real-time visual SLAM algorithms to map environments and assist inspectors. A systematic literature search was conducted targeting high-impact journals, patents, and industry reports. We classify helmet-integrated camera systems into monocular, stereo, and omnidirectional types and compare their capabilities for infrastructure inspection. We examine core VSLAM algorithms (feature-based, direct, hybrid, and deep-learning-enhanced) and discuss their adaptation to wearable platforms. Multi-sensor fusion approaches integrating inertial, LiDAR, and GNSS data are reviewed, along with edge/cloud processing architectures enabling real-time performance. This paper compiles numerous industrial use cases, from bridges and tunnels to plants and power facilities, demonstrating significant improvements in inspection efficiency, data quality, and worker safety. Key challenges are analyzed, including technical hurdles (battery life, processing limits, and harsh environments), human factors (ergonomics, training, and cognitive load), and regulatory issues (safety certification and data privacy). We also identify emerging trends, such as semantic SLAM, AI-driven defect recognition, hardware miniaturization, and collaborative multi-helmet systems. This review finds that VSLAM-equipped smart helmets offer a transformative approach to infrastructure inspection, enabling real-time mapping, augmented awareness, and safer workflows. We conclude by highlighting current research gaps, notably in standardizing systems and integrating with asset management, and provide recommendations for industry adoption and future research directions. Full article
21 pages, 1800 KB  
Article
GAPSO: Cloud-Edge-End Collaborative Task Offloading Based on Genetic Particle Swarm Optimization
by Wu Wen, Yibin Huang, Zhong Xiao, Lizhuang Tan and Peiying Zhang
Symmetry 2025, 17(8), 1225; https://doi.org/10.3390/sym17081225 - 3 Aug 2025
Viewed by 423
Abstract
In the 6G era, the proliferation of smart devices has led to explosive growth in data volume, and traditional cloud computing can no longer meet the demand for efficient processing of large amounts of data. Edge computing can avoid the transmission delay and the energy lost to multi-level forwarding in cloud computing by processing data close to its source. In this paper, we propose a cloud–edge–end collaborative task offloading strategy with task response time and execution energy consumption as the optimization targets under a limited-resource environment. Tasks generated by smart devices can be processed by three kinds of computing nodes, namely user devices, edge servers, and cloud servers, each constrained by bandwidth and computing resources. For the target optimization problem, a genetic particle swarm optimization algorithm spanning the three layers of computing nodes is designed. Task offloading is optimized by introducing (1) an opposition-based learning algorithm, (2) adaptive inertia weights, and (3) adjustable acceleration coefficients. All metaheuristic algorithms adopt a symmetric training method to ensure fairness and consistency in evaluation. Experimental simulations show that, compared with classic evolutionary algorithms, our algorithm reduces the objective function value by about 6–12% and offers higher convergence speed, accuracy, and stability. Full article
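Two of the ingredients named in this abstract, opposition-based initialization and an adaptive (linearly decreasing) inertia weight, can be shown in a bare-bones PSO. The coefficients are common textbook defaults, not GAPSO's tuned values, and the objective is a stand-in test function:

```python
import random
random.seed(0)

def pso_minimize(f, dim, lo, hi, n=20, iters=150, w0=0.9, w1=0.4):
    """Minimal PSO sketch with opposition-based initialization and a
    linearly decreasing inertia weight. Not the paper's GAPSO."""
    # Opposition-based init: for each random point x, also evaluate its
    # "opposite" lo + hi - x and keep whichever scores better.
    swarm = []
    for _ in range(n):
        x = [random.uniform(lo, hi) for _ in range(dim)]
        opp = [lo + hi - xi for xi in x]
        swarm.append(x if f(x) <= f(opp) else opp)
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in swarm]
    gbest = min(pbest, key=f)[:]
    for t in range(iters):
        w = w0 - (w0 - w1) * t / iters      # inertia decays: explore -> exploit
        for i, x in enumerate(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + 2.0 * r1 * (pbest[i][d] - x[d])
                             + 2.0 * r2 * (gbest[d] - x[d]))
                x[d] = min(hi, max(lo, x[d] + vel[i][d]))
            if f(x) < f(pbest[i]):
                pbest[i] = x[:]
        gbest = min(pbest, key=f)[:]
    return gbest

sphere = lambda x: sum(xi * xi for xi in x)
best = pso_minimize(sphere, dim=3, lo=-5.0, hi=5.0)  # converges near the origin
```

GAPSO additionally layers genetic operators and adjustable acceleration coefficients on top of this skeleton; the offloading objective would replace `sphere` with the weighted delay-plus-energy cost over candidate placements.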
23 pages, 3580 KB  
Article
Distributed Collaborative Data Processing Framework for Unmanned Platforms Based on Federated Edge Intelligence
by Siyang Liu, Nanliang Shan, Xianqiang Bao and Xinghua Xu
Sensors 2025, 25(15), 4752; https://doi.org/10.3390/s25154752 - 1 Aug 2025
Viewed by 487
Abstract
Unmanned platforms such as unmanned aerial vehicles, unmanned ground vehicles, and autonomous underwater vehicles often face data, device, and model heterogeneity when performing collaborative data processing tasks, and existing research does not address all three simultaneously. This study therefore designs an unmanned platform cluster architecture inspired by the cloud–edge–end model, which integrates federated learning for privacy protection, leverages the advantages of distributed model training, and utilizes edge computing's near-source data processing capabilities. The paper further proposes a federated edge intelligence method (DSIA-FEI) comprising two key components. First, building on traditional federated learning, a data sharing mechanism is introduced in which data extracted from edge-side platforms is placed into a shared public dataset; at the start of model training, random samples from this dataset are distributed to each unmanned platform to mitigate the impact of data distribution heterogeneity and class imbalance during collaborative data processing. Second, an intelligent model aggregation strategy based on similarity measurement and loss gradients is developed. This strategy maps heterogeneous model parameters into a unified space via hierarchical parameter alignment and evaluates, in real time, the similarity between the local and global models of edge devices along with the loss gradient, selecting the optimal models for global aggregation and reducing the influence of device and model heterogeneity on cooperative learning across unmanned platform swarms. Extensive validation on multiple datasets shows that DSIA-FEI reaches accuracies of 0.91, 0.91, 0.88, and 0.87 on the FEMNIST, FEAIR, EuroSAT, and RSSCN7 datasets, respectively, more than 10% higher than the baseline methods, while reducing the number of communication rounds by more than 40%, outperforming existing mainstream methods and verifying the effectiveness of the proposed approach. Full article
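The similarity-plus-loss-gradient selection idea can be sketched as a filter over candidate local models: keep only clients whose parameters point roughly the same way as the global model and whose loss is still improving. The cosine threshold and the scalar "loss drop" signal are simplifying assumptions, not DSIA-FEI's actual criterion:

```python
import math

def cosine(u, v):
    """Cosine similarity between two flattened parameter vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def select_for_aggregation(global_model, local_models, loss_drops, sim_min=0.5):
    """Return the client ids whose local model is similar enough to the
    global model AND whose loss decreased this round. A toy stand-in for
    the similarity + loss-gradient selection in DSIA-FEI."""
    chosen = []
    for cid, (params, drop) in enumerate(zip(local_models, loss_drops)):
        if cosine(params, global_model) >= sim_min and drop > 0:
            chosen.append(cid)
    return chosen

global_model = [1.0, 0.0, 1.0]
local_models = [[0.9, 0.1, 1.1],    # aligned, improving  -> selected
                [-1.0, 0.2, -0.9],  # points the wrong way -> rejected
                [1.2, -0.1, 0.8]]   # aligned, but loss rose -> rejected
loss_drops = [0.05, 0.10, -0.02]
print(select_for_aggregation(global_model, local_models, loss_drops))  # → [0]
```

A server would then average only the selected clients' parameters, so divergent or degrading local models cannot drag the global model off course.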
21 pages, 4738 KB  
Article
Research on Computation Offloading and Resource Allocation Strategy Based on MADDPG for Integrated Space–Air–Marine Network
by Haixiang Gao
Entropy 2025, 27(8), 803; https://doi.org/10.3390/e27080803 - 28 Jul 2025
Viewed by 502
Abstract
This paper investigates computation offloading and resource allocation in an integrated space–air–sea network based on unmanned aerial vehicles (UAVs) and low Earth orbit (LEO) satellites supporting Maritime Internet of Things (M-IoT) devices. In the complex, dynamic environment comprising M-IoT devices, UAVs, and LEO satellites, traditional optimization methods encounter significant limitations due to non-convexity and the combinatorial explosion of possible solutions. A multi-agent deep deterministic policy gradient (MADDPG)-based optimization algorithm is proposed to address these challenges. The algorithm is designed to minimize total system cost, balancing energy consumption and latency through partial task offloading within a cloud–edge–device collaborative mobile edge computing (MEC) system. A comprehensive system model is formulated as a partially observable Markov decision process (POMDP) that integrates association control, power control, computing resource allocation, and task distribution. Each M-IoT device and UAV acts as an intelligent agent, collaboratively learning optimal offloading strategies through the centralized-training, decentralized-execution framework inherent in MADDPG. Numerical simulations validate the effectiveness of the proposed approach, which converges rapidly, significantly outperforms baseline methods, and reduces the total system cost by 15–60%. Full article
(This article belongs to the Special Issue Space-Air-Ground-Sea Integrated Communication Networks)
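The partial-offloading objective, a weighted trade-off between latency and energy, can be illustrated with a one-task cost function swept over the offloading fraction. Every constant below is an invented example value, and the parallel local/remote execution assumption is ours, not necessarily the paper's exact model:

```python
def task_cost(latency_s, energy_j, w_latency=0.6, w_energy=0.4,
              latency_scale=1.0, energy_scale=10.0):
    """Scalarized cost of one task: normalized latency and energy,
    traded off by weights, as in a typical MEC objective."""
    return w_latency * latency_s / latency_scale + w_energy * energy_j / energy_scale

def partial_offload_cost(rho, cycles, local_cpu, edge_cpu, rate_bps, bits, p_tx):
    """Cost when a fraction rho of the task is offloaded. Local and remote
    parts run in parallel, so latency is the max of the two branches."""
    t_local = (1 - rho) * cycles / local_cpu
    t_tx = rho * bits / rate_bps
    t_edge = t_tx + rho * cycles / edge_cpu
    latency = max(t_local, t_edge)
    energy = 1e-9 * (1 - rho) * cycles + p_tx * t_tx   # compute J + radio J
    return task_cost(latency, energy)

# Sweep the offloading fraction on a grid; an RL agent learns this choice
# (and much more) instead of enumerating it.
best_rho = min((r / 10 for r in range(11)),
               key=lambda r: partial_offload_cost(
                   r, cycles=1e9, local_cpu=1e9, edge_cpu=8e9,
                   rate_bps=2e6, bits=1e6, p_tx=0.5))
print(best_rho)  # → 0.6
```

The interior optimum (neither fully local nor fully offloaded) is exactly the regime where partial offloading pays off and where a learned policy beats a binary offload/keep rule.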
17 pages, 1850 KB  
Article
Cloud–Edge Collaborative Model Adaptation Based on Deep Q-Network and Transfer Feature Extraction
by Jue Chen, Xin Cheng, Yanjie Jia and Shuai Tan
Appl. Sci. 2025, 15(15), 8335; https://doi.org/10.3390/app15158335 - 26 Jul 2025
Viewed by 462
Abstract
With the rapid development of smart devices and the Internet of Things (IoT), the explosive growth of data has placed increasingly higher demands on real-time processing and intelligent decision making. Cloud–edge collaborative computing has emerged as a mainstream architecture to address these challenges. However, in sky–ground integrated systems, the limited computing capacity of edge devices and the inconsistency between cloud-side fusion results and edge-side detection outputs significantly undermine the reliability of edge inference. To overcome these issues, this paper proposes a cloud–edge collaborative model adaptation framework that integrates deep reinforcement learning via Deep Q-Networks (DQN) with local feature transfer. The framework enables category-level dynamic decision making, allowing selective migration of classification head parameters to achieve on-demand adaptive optimization of the edge model and enhance consistency between cloud and edge results. Extensive experiments conducted on a large-scale multi-view remote sensing aircraft detection dataset demonstrate that the proposed method significantly improves cloud–edge consistency. The detection consistency rate reaches 90%, with some scenarios approaching 100%. Ablation studies further validate the necessity of the DQN-based decision strategy, which clearly outperforms static heuristics. In the model adaptation comparison, the proposed method improves the detection precision of the A321 category from 70.30% to 71.00% and the average precision (AP) from 53.66% to 53.71%. For the A330 category, the precision increases from 32.26% to 39.62%, indicating strong adaptability across different target types. This study offers a novel and effective solution for cloud–edge model adaptation under resource-constrained conditions, enhancing both the consistency of cloud–edge fusion and the robustness of edge-side intelligent inference. Full article
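The category-level migration decision described here can be sketched with a tabular Q-learning stand-in for the paper's DQN. All of it is an illustrative assumption: the hidden per-category "benefit" values, the bandit-style reward (observed consistency change), and the toy classification-head arrays have no counterpart in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n_cat = 4
# Hidden per-category benefit of migrating the cloud classification head:
# positive -> migration improves cloud-edge consistency, negative -> it hurts.
benefit = np.array([0.3, -0.1, 0.5, -0.2])

q = np.zeros((n_cat, 2))  # Q[c, a]: a=0 keep edge head, a=1 migrate cloud head
alpha, eps = 0.1, 0.2     # learning rate, exploration probability

for _ in range(5000):
    c = int(rng.integers(n_cat))
    a = int(rng.integers(2)) if rng.random() < eps else int(np.argmax(q[c]))
    # Reward: noisy observed consistency change when migrating, else 0
    r = benefit[c] + rng.normal(0.0, 0.05) if a == 1 else 0.0
    q[c, a] += alpha * (r - q[c, a])  # bandit-style update (single-step episodes)

policy = q.argmax(axis=1)  # 1 for categories where migration is worthwhile

# Apply the learned policy: overwrite the edge model's classification-head
# rows only for the categories selected for migration.
edge_head = rng.normal(size=(n_cat, 8))
cloud_head = rng.normal(size=(n_cat, 8))
edge_head[policy == 1] = cloud_head[policy == 1]
```

The point of the sketch is the selectivity: only categories whose estimated reward is positive get the cloud head, mirroring the paper's on-demand, per-category adaptation rather than a wholesale model swap.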

40 pages, 16352 KB  
Review
Surface Protection Technologies for Earthen Sites in the 21st Century: Hotspots, Evolution, and Future Trends in Digitalization, Intelligence, and Sustainability
by Yingzhi Xiao, Yi Chen, Yuhao Huang and Yu Yan
Coatings 2025, 15(7), 855; https://doi.org/10.3390/coatings15070855 - 20 Jul 2025
Viewed by 1063
Abstract
As vital material carriers of human civilization, earthen sites are experiencing continuous surface deterioration under the combined effects of weathering and anthropogenic damage. Traditional surface conservation techniques, due to their poor compatibility and limited reversibility, struggle to address the compound challenges of micro-scale degradation and macro-scale deformation. With the deep integration of digital twin technology, spatial information technologies, intelligent systems, and sustainable concepts, earthen site surface conservation technologies are transitioning from single-point applications to multidimensional integration. However, challenges remain in the insufficient systematization of technology integration and the absence of a comprehensive interdisciplinary theoretical framework. Drawing on the Web of Science and Scopus core databases, this study systematically reviews the technological evolution of surface conservation for earthen sites between 2000 and 2025. CiteSpace 6.2 R4 and VOSviewer 1.6 were used for bibliometric visualization analysis, combined with manual close reading of the key literature and GPT-assisted semantic mining (error rate < 5%) to identify core research themes and infer deeper trends.
The results reveal the following: (1) technological evolution follows a three-stage trajectory—from early point-based monitoring technologies, such as remote sensing (RS) and the Global Positioning System (GPS), to spatial modeling technologies, such as light detection and ranging (LiDAR) and geographic information systems (GIS), and, finally, to today’s integrated intelligent monitoring systems based on multi-source fusion; (2) the key surface technology system comprises GIS-based spatial data management, high-precision modeling via LiDAR, 3D reconstruction using oblique photogrammetry, and building information modeling (BIM) for structural protection, while cutting-edge areas focus on digital twin (DT) and the Internet of Things (IoT) for intelligent monitoring, augmented reality (AR) for immersive visualization, and blockchain technologies for digital authentication; (3) future research is expected to integrate big data and cloud computing to enable multidimensional prediction of surface deterioration, while virtual reality (VR) will overcome spatial–temporal limitations and push conservation paradigms toward automation, intelligence, and sustainability. This study, grounded in the technological evolution of surface protection for earthen sites, constructs a triadic framework of “intelligent monitoring–technological integration–collaborative application,” revealing the integration needs between DT and VR for surface technologies. It provides methodological support for addressing current technical bottlenecks and lays the foundation for dynamic surface protection, solution optimization, and interdisciplinary collaboration. Full article
