This is an early access version; the complete PDF, HTML, and XML versions will be available soon.
Article

MAFNet: Multimodal Asymmetric Fusion Network for Radar Echo Extrapolation

by Yanle Pei 1,2, Qian Li 1,2,*, Yayi Wu 1,2, Xuan Peng 1,2, Shiqing Guo 1,2, Chengzhi Ye 2,3 and Tianying Wang 2,3

1 The College of Meteorology and Oceanography, National University of Defense Technology, Changsha 410005, China
2 The High Impact Weather Key Laboratory of China Meteorological Administration (CMA), Changsha 410005, China
3 The Institute of Meteorological Sciences of Hunan Province, Changsha 410118, China
* Author to whom correspondence should be addressed.
Remote Sens. 2024, 16(19), 3597; https://doi.org/10.3390/rs16193597
Submission received: 8 August 2024 / Revised: 17 September 2024 / Accepted: 24 September 2024 / Published: 26 September 2024
(This article belongs to the Special Issue Advanced AI Technology for Remote Sensing Analysis)

Abstract

Radar echo extrapolation (REE) is a crucial method for convective nowcasting, and current deep learning (DL)-based methods for REE have shown significant potential in severe weather forecasting tasks. Although existing DL-based REE methods use extensive historical radar data to learn the evolution patterns of echoes, they tend to suffer from low accuracy, because data from the radar modality alone struggle to adequately represent the state of weather systems. Inspired by multimodal learning and traditional numerical weather prediction (NWP) methods, we propose a Multimodal Asymmetric Fusion Network (MAFNet) for REE, which uses data from the radar modality to model echo evolution, and data from the satellite and ground observation modalities to model the background field of weather systems, collectively guiding echo extrapolation. In the MAFNet, we first extract overall convective features through a global shared encoder (GSE), followed by two branches, a local modality encoder (LME) and local correlation encoders (LCEs), that extract convective features from the radar, satellite, and ground observation modalities. We employ a multimodal asymmetric fusion module (MAFM) to fuse multimodal features at different scales and feature levels, enhancing radar echo extrapolation performance. Additionally, to address the differing temporal resolutions of the multimodal data, we design a time alignment module based on dynamic time warping (DTW), which aligns the multimodal feature sequences temporally. Experimental results demonstrate that, compared to state-of-the-art (SOTA) models, the MAFNet achieves average improvements of 1.86% in CSI and 3.18% in HSS on the MeteoNet dataset, and average improvements of 4.84% in CSI and 2.38% in HSS on the RAIN-F dataset.
Keywords: radar echo extrapolation (REE); convective nowcasting; multimodal data; asymmetric fusion
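The DTW-based time alignment described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the sequence names (`radar`, `satellite`), the feature dimensionality, and the Euclidean distance metric are illustrative assumptions; the idea shown is only the classic dynamic time warping recurrence used to match two feature sequences sampled at different temporal resolutions.

```python
# Minimal DTW alignment sketch (illustrative, not the MAFNet code).
import numpy as np

def dtw_align(a, b):
    """Return the optimal warping path between two feature sequences
    a and b (lists of feature vectors) under Euclidean distance."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    # Fill the accumulated-cost matrix with the standard DTW recurrence.
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(np.asarray(a[i - 1]) - np.asarray(b[j - 1]))
            cost[i, j] = d + min(cost[i - 1, j],      # step in a only
                                 cost[i, j - 1],      # step in b only
                                 cost[i - 1, j - 1])  # step in both
    # Backtrack from (n, m) to recover the alignment path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

# Example: a 6-step radar feature sequence aligned to a 3-step
# satellite feature sequence (different temporal resolutions).
radar = [[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]]
satellite = [[0.0], [2.5], [5.0]]
path = dtw_align(radar, satellite)
```

The returned path pairs each radar time step with its closest satellite time step, which is the kind of correspondence a DTW-based alignment module can use before fusing features across modalities.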

Share and Cite

MDPI and ACS Style

Pei, Y.; Li, Q.; Wu, Y.; Peng, X.; Guo, S.; Ye, C.; Wang, T. MAFNet: Multimodal Asymmetric Fusion Network for Radar Echo Extrapolation. Remote Sens. 2024, 16, 3597. https://doi.org/10.3390/rs16193597


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
