Article
Peer-Review Record

An Unmixing-Based Multi-Attention GAN for Unsupervised Hyperspectral and Multispectral Image Fusion

Remote Sens. 2023, 15(4), 936; https://doi.org/10.3390/rs15040936
by Lijuan Su, Yuxiao Sui and Yan Yuan *
Submission received: 27 December 2022 / Revised: 6 February 2023 / Accepted: 6 February 2023 / Published: 8 February 2023
(This article belongs to the Special Issue Advances in Hyperspectral Remote Sensing: Methods and Applications)

Round 1

Reviewer 1 Report

The authors introduce an unsupervised GAN model based on an attention mechanism for spectral image fusion. The manuscript is of high quality and I recommend acceptance.

However, before publication, the following revisions should be made:

In Section 3, the 3D spectral images were unfolded, which means that some spatial information may be lost. The authors should discuss the pros and cons of this operation and mention other models, such as vision transformers, which extract spatial information directly without unfolding.
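For illustration, the unfolding operation this comment refers to can be sketched as below. This is a minimal NumPy sketch; the array names and toy sizes are invented for the example and are not taken from the paper under review:

```python
# Mode-3 (spectral) unfolding: an H x W x B hyperspectral cube is
# reshaped into an (H*W) x B matrix, so each row is one pixel's
# spectrum. The reshape is lossless as data (folding back recovers
# the cube exactly), but spatial adjacency between pixels is no
# longer explicit in the matrix -- the trade-off raised here.
import numpy as np

H, W, B = 4, 5, 3                      # toy cube: 4x5 pixels, 3 bands
cube = np.arange(H * W * B, dtype=float).reshape(H, W, B)

unfolded = cube.reshape(H * W, B)      # shape (20, 3): pixels as rows
assert np.array_equal(unfolded[1 * W + 2], cube[1, 2])  # row = pixel (1, 2)

refolded = unfolded.reshape(H, W, B)   # folding back is exact
assert np.array_equal(cube, refolded)
```

Models such as vision transformers instead operate on spatial patches of the cube, which is why they can exploit neighborhood structure that this row-wise layout discards.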

It would be better if the authors presented convergence curves for the discriminators (D-net 1 and 2).

I may have missed it, but I do not see the weight initialization and parameter update strategies described.
Author Response

Thank you for your comments. 

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 2 Report

This work introduces the use of a GAN architecture to perform hyperspectral and multispectral data fusion. The design of the GAN is interesting since it is based on unmixing and uses a new attention mechanism proposed by the authors for this task. The study shows that the proposed method achieves better results than other methods (even similar DL methods). In general, I find this a good-quality work that introduces some interesting ideas for solving the problem of HSI-MSI data fusion. I recommend publication of the article, but I have some minor comments that the authors may consider addressing:

- In Section 4.4, the joint loss function is introduced, where the L_5 loss makes use of a sparsity parameter (set to 0.001). The importance of this parameter is discussed later, and I wonder what its role is. Have you tried changing its value?

- In line 378 there is a reference to Equation 26, but that does not correspond to a loss function. Perhaps Equation 35 was meant?

- In Section 5.2.1, results from four datasets are presented in Subsection 1, but later Subsections 2 (attention mechanism) and 4 (ablation of the GAN) use only two datasets. This is not very important, but I find it curious that some studies use all the datasets and others only two, without a justification.

- Also in Section 5.2.1, for the nonnegative constraint function, I would suggest describing the Clamp function used. At least, I am not familiar with this function.

- The captions of Tables 5, 6, 7, and 8 include a closing bracket at the end of the text that should not be there.

- The last sentence of Section 6 does not seem right: "Ablation experiments Four open datasets were used for the comparison experiments, which demonstrate that the proposed method performs better overall."

- Finally, a general comment: the method performs better than other methods using the same input. However, the input used is somewhat ideal, in the sense that the MS data are derived from the HS data and the spatial resolution is degraded by a Gaussian filter. In a real-data case, one would want to combine data from different sensors, which may include calibration/processing errors from one sensor to the other. We cannot see how robust the method would be in such cases.
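The sparsity parameter asked about above typically weights an L1 penalty in a joint loss of this kind. The sketch below is illustrative only: the function and variable names are invented, and whether the paper's L_5 term takes exactly this form is an assumption; the paper only states the weight value 0.001.

```python
# Hedged sketch of a sparsity-regularized joint loss: an L1 penalty on
# the abundance maps, scaled by a small weight (0.001 in the paper).
# A larger weight drives more abundance entries toward zero; a weight
# of 0 removes the sparsity constraint entirely.
import numpy as np

def joint_loss(recon_err, abundances, sparsity_weight=0.001):
    # reconstruction term + weighted L1 sparsity term
    return recon_err + sparsity_weight * np.abs(abundances).sum()

A = np.array([[0.7, 0.3, 0.0],
              [0.1, 0.0, 0.9]])        # toy abundance maps
loss = joint_loss(recon_err=0.5, abundances=A)   # 0.5 + 0.001 * 2.0
```

Varying `sparsity_weight` over a few orders of magnitude (e.g. 0.0001 to 0.1) is the usual way to probe the role this comment asks about.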
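Regarding the Clamp function mentioned above: in deep-learning frameworks, "clamp" usually denotes elementwise clipping to a range (PyTorch's `torch.clamp`); with a lower bound of 0 it acts as a projection onto the nonnegative orthant, equivalent to ReLU. Whether the paper uses exactly this is an assumption; the NumPy sketch below mirrors the behavior with `np.clip`:

```python
# Clamp-style nonnegativity projection: entries below 0 are set to 0.
# np.clip(x, 0, None) mirrors torch.clamp(x, min=0), a common way to
# keep abundances or endmembers nonnegative after a gradient update.
import numpy as np

x = np.array([-0.5, 0.0, 0.3, -1.2, 2.0])
projected = np.clip(x, 0.0, None)      # [0.0, 0.0, 0.3, 0.0, 2.0]

assert (projected >= 0).all()
assert np.array_equal(projected, np.maximum(x, 0.0))  # same as ReLU
```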

Author Response

Dear reviewer:

We sincerely appreciate your valuable comments in the review process.

Please see the attachment.

Best wishes

Author Response File: Author Response.pdf
