Peer-Review Record

Enhancing Anomaly Detection for Cultural Heritage via Long Short-Term Memory with Attention Mechanism

Electronics 2024, 13(7), 1254; https://doi.org/10.3390/electronics13071254
by Yuhan Wu 1, Yabo Dong 1,*, Zeyang Shan 1, Xiyu Meng 1, Yang He 1, Ping Jia 2,* and Dongming Lu 1
Reviewer 1: Anonymous
Reviewer 2:
Submission received: 21 January 2024 / Revised: 7 March 2024 / Accepted: 16 March 2024 / Published: 28 March 2024

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

The paper proposes a hybrid model, LSTM-Attention, to enhance anomaly detection in cultural heritage preservation. By combining LSTM and attention mechanisms with environmental factors, the model aims to improve prediction accuracy. It introduces a novel threshold extraction method and demonstrates superior performance through extensive experiments.
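
For context on the architecture under review, below is a minimal sketch of an LSTM followed by a simple attention layer over its hidden states. The class name, layer sizes, and attention formulation are illustrative assumptions for this review, not the authors' implementation.

```python
import torch
import torch.nn as nn

class LSTMAttention(nn.Module):
    """Illustrative LSTM + attention forecaster (not the authors' code)."""

    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.score = nn.Linear(hidden, 1)           # attention score per time step
        self.head = nn.Linear(hidden, n_features)   # one-step-ahead prediction

    def forward(self, x):                           # x: (batch, time, n_features)
        h, _ = self.lstm(x)                         # h: (batch, time, hidden)
        w = torch.softmax(self.score(h), dim=1)     # attention weights over time
        context = (w * h).sum(dim=1)                # weighted sum of hidden states
        return self.head(context)                   # predicted next observation

# Usage: a batch of 8 windows, 48 time steps, 3 environmental channels.
model = LSTMAttention(n_features=3)
y_hat = model(torch.randn(8, 48, 3))                # -> shape (8, 3)
```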

My Comments:

1) How does the proposed LSTM-Attention model address the limitations of existing anomaly detection methods in cultural heritage preservation?

2) Can you elaborate on the specific environmental factors considered in the model and how they contribute to anomaly detection?

3) What criteria were used to select the types of anomalies (seasonal, trend, Shapelet, mixed) introduced in the study?

4) How does the proposed threshold extraction method based on extreme value theory and recurrence interval calculation improve upon existing approaches? (An illustrative sketch of one such peaks-over-threshold scheme follows this list.)

5) Can you provide insights into the ablation study conducted to assess the contribution of the attention mechanism?

6) How was the performance of the LSTM-Attention model compared to other baseline models (ARIMA, VAR, CNN, LSTM), and what metrics were used for evaluation?

7) What challenges were encountered in implementing the algorithm in practical engineering applications, and how were they addressed?

8) What considerations were taken into account when determining the time steps for anomaly detection, and how flexible is the model in adjusting these steps?

9) How does the proposed anomaly detection methodology accommodate factors like acid rain, chemicals, and atmospheric pollution, which were mentioned as potential limitations?

10) In future research, how do you plan to expand the exploration of other influencing factors and optimize the model's performance further?
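
Regarding question 4: for context, here is a minimal sketch of peaks-over-threshold threshold extraction in the spirit of extreme value theory, with a recurrence interval converted into a target exceedance probability. The function name, base quantile, and fitting choices are assumptions for illustration and may differ from the paper's exact procedure.

```python
import numpy as np
from scipy.stats import genpareto

def evt_threshold(residuals, base_q=0.95, recurrence=1000):
    """Anomaly threshold from a GPD fit to exceedances over a base quantile.

    `recurrence` is the target average number of observations between
    exceedances of the returned threshold (a recurrence-interval target).
    Illustrative only; not necessarily the authors' exact procedure.
    """
    u = np.quantile(residuals, base_q)            # initial high threshold
    excess = residuals[residuals > u] - u         # peaks over threshold
    shape, _, scale = genpareto.fit(excess, floc=0.0)
    p_target = 1.0 / recurrence                   # desired tail probability
    p_exceed_u = excess.size / residuals.size     # empirical P(X > u)
    # Invert the GPD tail: P(X > u + z) = p_exceed_u * (1 - F_GPD(z)).
    z = genpareto.ppf(1.0 - p_target / p_exceed_u, shape, loc=0.0, scale=scale)
    return u + z

rng = np.random.default_rng(0)
r = np.abs(rng.normal(size=10_000))               # stand-in prediction errors
print(evt_threshold(r))                           # roughly 3.3 for |N(0,1)| data
```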

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 2 Report

Comments and Suggestions for Authors

Summary of the paper: “Enhancing Anomaly Detection for Cultural Heritage via Long Short-Term Memory with Attention Mechanism”

 

The paper introduces a novel anomaly detection algorithm for ancient buildings based on an LSTM-Attention model that incorporates environmental factors. The study addresses the limitations of existing warning mechanisms by proposing a hybrid model that combines LSTM and attention mechanisms to detect anomalies in cultural heritage: the attention mechanism extracts temporal dependencies, while the LSTM captures global long-term patterns. The paper also introduces a novel threshold extraction method that reduces reliance on prior knowledge. The model outperforms previous methods, demonstrating the effectiveness and superiority of the proposed approach, and the algorithm has been deployed in practical engineering settings, providing valuable guidance for the preservation of ancient buildings worldwide. The paper includes a comprehensive discussion of the model architecture, parameter sensitivity, performance comparisons, and implications for future research. The findings underscore the practical applicability of the proposed method for anomaly detection in the conservation of cultural heritage. The authors suggest further work on incorporating additional influencing factors, improving model performance, and exploring other methods applicable to cultural heritage sites.

 

I congratulate all the authors on their excellent thinking and application of ML in this area. The way they carried out their research is excellent and commendable.

 

Questions to be answered.

 

In what way is the LSTM suitable for the attention model in your research?

Choosing an LSTM-Attention model is not universally beneficial; the specific task, data availability, and computational resources must all be considered before deciding. Is this explained in the article?

Combining LSTMs with attention mechanisms leads to a more complex model architecture, requiring more computational resources and potentially longer training times compared to simpler models.

In Section 2 (Materials and Methods), Subsection 2.1 does not define anything.

Figure 1: can you add markings to the image so that it is more easily visible?

What about the additional anomaly dataset? Please give a few details about it.

Line 277: what does “??” mean?

Line 466: replace the period (.) with “and”.

Lines 169, 172, 173, and 334: “Shapelet” should use a lowercase “s” (“shapelet”).

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Round 2

Reviewer 1 Report

Comments and Suggestions for Authors

 Accept in present form.
