Article

Deep Neural Network-Based Flood Monitoring System Fusing RGB and LWIR Cameras for Embedded IoT Edge Devices

1 Department of Artificial Intelligence and Robotics, Sejong University, 209 Neungdong-ro, Gwangjin-gu, Seoul 05006, Republic of Korea
2 Department of Electronic Engineering, Korea National University of Transportation, 50 Daehak-ro, Chungju-si 27469, Republic of Korea
* Author to whom correspondence should be addressed.
Remote Sens. 2024, 16(13), 2358; https://doi.org/10.3390/rs16132358
Submission received: 9 April 2024 / Revised: 19 June 2024 / Accepted: 25 June 2024 / Published: 27 June 2024

Abstract

Floods are among the most common disasters, causing loss of life and enormous damage to private property and public infrastructure. Monitoring systems that detect and predict floods enable a quick response in the pre-disaster phase to prevent and mitigate flood risk and damage. Thus, this paper presents a deep neural network (DNN)-based real-time flood monitoring system for embedded Internet of Things (IoT) edge devices. The proposed system fuses long-wave infrared (LWIR) and RGB cameras to overcome a critical drawback of conventional RGB camera-based systems: severe performance deterioration at night. The system recognizes areas occupied by water using a DNN-based semantic segmentation network whose input is a combination of RGB and LWIR images. Flood warning levels are predicted from the water occupancy ratio calculated from the water segmentation result, and the warning information is delivered to authorized personnel via a mobile message service. For real-time edge computing, the heavy semantic segmentation network is simplified by removing unimportant channels while maintaining performance, using the network slimming technique. Experiments were conducted on a dataset acquired from a sensor module with RGB and LWIR cameras installed in a flood-prone area. The results revealed that the proposed system successfully performs water segmentation and correctly sends flood warning messages in both daytime and nighttime. Furthermore, all of the algorithms in this system were deployed on an IoT edge device with a Qualcomm QCS610 System on Chip (SoC) and operated in real time.
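The warning-level logic described in the abstract can be sketched as follows. The paper's actual thresholds and level names are not given here, so the cutoffs, labels, and function names below are illustrative assumptions only.

```python
import numpy as np

# Hypothetical (ratio threshold, warning level) pairs, highest first.
# The paper's real warning levels and cutoffs are not specified here.
WARNING_THRESHOLDS = [(0.7, "severe"), (0.4, "caution"), (0.1, "watch")]

def water_occupancy_ratio(mask: np.ndarray) -> float:
    """Fraction of pixels labeled as water in a binary segmentation mask."""
    return float(mask.sum()) / mask.size

def warning_level(mask: np.ndarray) -> str:
    """Map a water segmentation mask to a warning level via its occupancy ratio."""
    ratio = water_occupancy_ratio(mask)
    for threshold, level in WARNING_THRESHOLDS:
        if ratio >= threshold:
            return level
    return "normal"

# Example: a 4x4 mask in which 8 of 16 pixels are water (ratio 0.5).
mask = np.zeros((4, 4), dtype=np.uint8)
mask[:2, :] = 1
print(warning_level(mask))  # "caution" under the illustrative thresholds
```

In a deployed system the level returned here would trigger the mobile message service described in the abstract.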
Keywords: flood monitoring; semantic segmentation; network slimming; edge computing; surveillance camera; multimodal sensor
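The network slimming step mentioned in the abstract typically ranks channels by the magnitude of their batch-normalization scaling factors (gamma) and removes the smallest fraction. A minimal sketch of that channel-selection step, with illustrative gamma values (a full implementation would also rewire the adjacent convolution layers):

```python
import numpy as np

def slim_channel_mask(bn_gammas, prune_ratio=0.5):
    """Network-slimming-style channel selection.

    Ranks channels across all batch-norm layers by |gamma| and marks the
    smallest `prune_ratio` fraction for removal. Returns one boolean
    keep-mask per layer. Illustrative sketch only.
    """
    all_gammas = np.concatenate([np.abs(g) for g in bn_gammas])
    threshold = np.quantile(all_gammas, prune_ratio)
    return [np.abs(g) >= threshold for g in bn_gammas]

# Two hypothetical BN layers: small gammas mark unimportant channels.
gammas = [np.array([0.9, 0.01, 0.5]), np.array([0.02, 0.8, 0.03, 0.7])]
masks = slim_channel_mask(gammas, prune_ratio=0.5)
```

Using a single global threshold across layers, as above, lets the pruning budget concentrate on the layers with the most redundant channels.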

Share and Cite

MDPI and ACS Style

Lee, Y.J.; Hwang, J.Y.; Park, J.; Jung, H.G.; Suhr, J.K. Deep Neural Network-Based Flood Monitoring System Fusing RGB and LWIR Cameras for Embedded IoT Edge Devices. Remote Sens. 2024, 16, 2358. https://doi.org/10.3390/rs16132358


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
