Article

PWNet: An Adaptive Weight Network for the Fusion of Panchromatic and Multispectral Images

School of Mathematics and Statistics, Xi’an Jiaotong University, Xi’an 710049, China
* Author to whom correspondence should be addressed.
Remote Sens. 2020, 12(17), 2804; https://doi.org/10.3390/rs12172804
Submission received: 31 July 2020 / Revised: 22 August 2020 / Accepted: 24 August 2020 / Published: 29 August 2020
(This article belongs to the Special Issue Multi-Sensor Systems and Data Fusion in Remote Sensing)

Abstract

Pansharpening is a typical image fusion problem that aims to produce a high-resolution multispectral (HRMS) image by integrating a high spatial resolution panchromatic (PAN) image with a low spatial resolution multispectral (MS) image. Prior arts have relied on either component substitution (CS)-based or multiresolution analysis (MRA)-based methods for this purpose. Although these methods are simple and easy to implement, they often suffer from spatial or spectral distortions and cannot fully exploit the spatial and/or spectral information contained in the PAN and MS images. Considering their complementary behavior and with the goal of combining their advantages, we propose a pansharpening weight network (PWNet) that adaptively averages the fusion results obtained by different methods. PWNet learns adaptive weight maps for different CS-based and MRA-based methods through an end-to-end trainable neural network (NN). As a result, the proposed PWNet inherits the data adaptability and flexibility of an NN while retaining the advantages of traditional methods. Extensive experiments on data sets acquired by three different kinds of satellites demonstrate the superiority of the proposed PWNet and its competitiveness with state-of-the-art methods.
Keywords: pansharpening; component substitution; multiresolution analysis; neural networks; adaptive weight
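To make the idea in the abstract concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of weighted fusion with learned per-pixel weight maps: a small convolutional network takes the upsampled MS image and the PAN image as input, predicts one weight map per candidate fusion result produced by traditional CS-based or MRA-based methods, and averages the candidates accordingly. The layer sizes, the softmax normalization of the weights, and the function names are assumptions made for clarity.

```python
import torch
import torch.nn as nn


class WeightNet(nn.Module):
    """Hypothetical weight-prediction network: one weight map per candidate fusion."""

    def __init__(self, num_methods: int, ms_bands: int = 4):
        super().__init__()
        # Input: upsampled MS bands concatenated with the PAN band along channels.
        in_ch = ms_bands + 1
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, num_methods, 3, padding=1),
        )

    def forward(self, ms_up: torch.Tensor, pan: torch.Tensor) -> torch.Tensor:
        # Predict per-pixel logits for each candidate method; softmax makes the
        # weights non-negative and sum to one at every spatial location.
        logits = self.body(torch.cat([ms_up, pan], dim=1))
        return torch.softmax(logits, dim=1)  # (B, num_methods, H, W)


def weighted_fusion(ms_up, pan, candidates, net):
    """Adaptively average candidate HRMS estimates with learned weight maps.

    candidates: list of HRMS estimates from traditional CS/MRA methods,
    each of shape (B, ms_bands, H, W).
    """
    weights = net(ms_up, pan)                    # (B, num_methods, H, W)
    stacked = torch.stack(candidates, dim=1)     # (B, num_methods, ms_bands, H, W)
    # Broadcast the weights over the spectral bands and sum over methods.
    return (weights.unsqueeze(2) * stacked).sum(dim=1)
```

In this reading, the network is trained end to end with a standard reconstruction loss against reference HRMS images, so the weight maps adapt to local image content rather than being fixed globally.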