Peer-Review Record

A Multi-Attention Autoencoder for Hyperspectral Unmixing Based on the Extended Linear Mixing Model

Remote Sens. 2023, 15(11), 2898; https://doi.org/10.3390/rs15112898
by Lijuan Su, Jun Liu, Yan Yuan * and Qiyue Chen
Reviewer 1: Anonymous
Reviewer 2:
Reviewer 3:
Submission received: 25 February 2023 / Revised: 19 May 2023 / Accepted: 31 May 2023 / Published: 2 June 2023
(This article belongs to the Special Issue Deep Learning for the Analysis of Multi-/Hyperspectral Images)

Round 1

Reviewer 1 Report

The manuscript (remotesensing-2275349) presents a multi-attention autoencoder for hyperspectral unmixing based on the extended linear mixing model. The contribution is the adoption of a multi-attention autoencoder with a novel sparse constraint for hyperspectral unmixing. The motivation is reasonable, and the performance of the proposed method is good. However, there are some issues the authors need to address before publication. The issues are listed as follows:

1. Please check the spelling throughout the manuscript. For example, line 404: "methcanisms", "desinged"; line 507: "rrepresentational".

2. Line 107: "LeakyReLu" should be written as "LeakyReLU".

3. Line 232: where is the "channel attention module"?

4. The authors use the LeakyReLU activation function in the encoder network. Why is the ReLU activation function used in the spectral attention module? Please explain the reason for this design choice.

 

Author Response

Dear reviewer,

We sincerely appreciate your valuable comments during the review process. We have revised the manuscript according to your comments. Please refer to the attachment and the revised manuscript.

Best wishes

Author Response File: Author Response.pdf

Reviewer 2 Report

In this manuscript, a multi-attention autoencoder network based on the extended LMM is proposed for hyperspectral unmixing with spectral variability. Ablation experiments are provided to verify the effectiveness of the proposed network. However, I have the following comments:

1. Here are some more recent works that could be considered when compiling the Introduction:

[a] https://ieeexplore.ieee.org/document/10035509

 

[b] https://ieeexplore.ieee.org/document/9439249

[c] https://ieeexplore.ieee.org/document/9775570

[d] https://www.mdpi.com/2072-4292/15/2/451

 

2. In Eq. (17), L_RE is defined using the spectral angle distance (SAD) instead of the Euclidean distance; please explain the advantage.

3. The proposed method introduces the SSA module and a spatial homogeneity constraint; please discuss the computational complexity and time consumption.

4. In the synthetic dataset experiments, different Gaussian noise levels (e.g., SNR of 10 dB and 30 dB) should also be analyzed.

5. All experimental results should report the mean and standard deviation over multiple independent runs.
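Regarding point 2 above: the key property of SAD is that it measures the angle between spectra rather than their magnitude difference, so it is invariant to per-pixel scaling (e.g., illumination variability), whereas the Euclidean distance is not. A minimal NumPy sketch illustrating this contrast (the function names here are illustrative, not from the manuscript):

```python
import numpy as np

def sad(x, x_hat, eps=1e-8):
    """Spectral angle distance between a spectrum and its reconstruction."""
    cos_angle = np.dot(x, x_hat) / (np.linalg.norm(x) * np.linalg.norm(x_hat) + eps)
    # Clip to guard against floating-point values slightly outside [-1, 1].
    return np.arccos(np.clip(cos_angle, -1.0, 1.0))

x = np.array([1.0, 2.0, 3.0])      # a toy pixel spectrum
x_scaled = 2.0 * x                 # same spectral shape, brighter pixel

angle = sad(x, x_scaled)                    # near 0: SAD ignores scaling
euclid = np.linalg.norm(x - x_scaled)       # large: Euclidean penalizes scaling
```

Here `angle` is essentially zero while `euclid` equals the norm of `x`, which is one common argument for using SAD as a reconstruction loss under spectral variability.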

Author Response

Dear reviewer,

We sincerely appreciate your valuable comments during the review process. We have revised the manuscript according to your comments. Please refer to the attachment and the revised manuscript.

Best wishes

Author Response File: Author Response.pdf

Reviewer 3 Report

The authors have proposed a multi-attention AE network (MAAENet) for hyperspectral unmixing. The manuscript is complete, and the authors attempt to demonstrate the advantages of the algorithm through experiments. However, there are some problems that need to be revised. The comments are as follows:

1. What is the computational complexity? What is the function of each module in the algorithm? Please add ablation experiments.

2. How well does the algorithm adapt to different numbers of training labels, especially small label sets? Please compare with state-of-the-art (SOTA) methods.

3. The references used in the paper are relatively old, so it is recommended to update them. In addition, some more methods regarding remote sensing using graph-based methods should be investigated in the Introduction, e.g., Semi-Supervised Locality Preserving Dense Graph Neural Network With ARMA Filters and Context-Aware Learning, Unsupervised Self-correlated Learning Smoothy Enhanced Locality Preserving Graph Convolution Embedding Clustering, Self-supervised Locality Preserving Low-pass Graph Convolutional Embedding, Multi-scale Receptive Fields: Graph Attention Neural Network, MultiReceptive Field: An Adaptive Path Aggregation Graph Neural Framework.

4. Some future research directions should be pointed out in the conclusion.

Author Response

Dear reviewer,

We sincerely appreciate your valuable comments during the review process. We have revised the manuscript according to your comments. Please refer to the attachment and the revised manuscript.

Best wishes

Author Response File: Author Response.pdf

Round 2

Reviewer 2 Report

All my comments have been addressed in the revised version. "GNN" should be "GCN" in the conclusions. The paper can be accepted for publication.

Reviewer 3 Report

No more comments. The paper can be accepted in its present form.
