Peer-Review Record

A Multi-Hop Graph Neural Network for Event Detection via a Stacked Module and a Feedback Network

Electronics 2023, 12(6), 1386; https://doi.org/10.3390/electronics12061386
by Liu Liu 1,2, Kun Ding 2,*, Ming Liu 2 and Shanshan Liu 2
Reviewer 1:
Reviewer 2: Anonymous
Reviewer 3: Anonymous
Submission received: 11 February 2023 / Revised: 8 March 2023 / Accepted: 13 March 2023 / Published: 14 March 2023

Round 1

Reviewer 1 Report

This work proposes a multi-hop graph neural network that aggregates different neighbors with different weights and then applies a feedback algorithm as a gate to filter the transmission of heterogeneous features. The method was applied to an event detection problem on the ACE 2005 dataset, and its F1, precision, and recall were compared with those of seven other event detection works.
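The mechanism summarized above — weighting each neighbor's contribution and then gating the aggregated features before they update a node — could be sketched roughly as follows. This is a minimal illustration with hypothetical function names, not the authors' actual FB-GCN implementation:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def aggregate(node_feats, neighbors, weights):
    """One hop of weighted aggregation over a node's neighbors."""
    dim = len(next(iter(node_feats.values())))
    agg = [0.0] * dim
    for n, w in zip(neighbors, weights):
        for i, v in enumerate(node_feats[n]):
            agg[i] += w * v
    return agg

def gated_update(h, agg):
    """Feedback-style gate: blend a node's own features with the
    aggregated neighbor features, filtering heterogeneous signals."""
    out = []
    for hv, av in zip(h, agg):
        g = sigmoid(hv + av)          # per-dimension scalar gate in (0, 1)
        out.append(g * hv + (1 - g) * av)
    return out

# Toy two-node graph: a candidate trigger word and one related entity.
feats = {"trigger": [1.0, 0.0], "entity": [0.0, 1.0]}
agg = aggregate(feats, ["entity"], [0.5])
updated = gated_update(feats["trigger"], agg)
```

The gate decides, dimension by dimension, how much of the neighbor signal is allowed through, which is the "filtering" role the reviewer describes.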

 

Points for the authors: 

 

The manuscript needs to be proofread before being considered for submission. It contains typos and misleading language due to commas separating the wrong clauses.

 

In the abstract: This sentence is vague: “However, these methods face the problems of over-smoothing and semantic feature destruction, when containing multiple GNN layers. For the reasons, this paper proposes an improved GNN for event detection, which mainly consists of stacked structure module and feedback network.” It is unclear whether your work proposes these combined methods in general or whether your contribution is limited to the case study at hand, event detection. Is this proposed idea novel? If so, you need to specify; otherwise, the contribution is not new and should be stated as such.

 

In line 42: cite this statement “BERT cannot effectively capture long-distance structured text features.”

 

In line 44 you started the paragraph with “For the above reasons,” but it is not clear which reasons you are referring to. Is it the comparison with traditional event detection, or the fact that BERT cannot effectively capture long-distance structured text features?

 

In line 54: GCN is first mentioned here and is not spelled out anywhere in the manuscript. What do you mean by GCN? If the authors mean Graph Convolutional Networks, then the idea needs to be introduced in the introduction.

 

At the end of the introduction, the main contribution of this manuscript is vague. Are the listed items simply the methods used in the research?

 

In line 88: what do you mean by “the two” in “The difference between the two”? Two studies?

 

In line 106: If Figure 1 shows your proposed framework, you must declare that this is your proposed FB-GCN architecture.

 

In line 115: be consistent in writing the abbreviation. Sometimes you call the pre-trained model BERT, and sometimes you use the lowercase “Bert”.

 

In line 133: do you mean spaCy library instead of spAcy? If not, please cite this tool.

 

In line 221: section 4.2 needs to be rewritten with more detail about the libraries used and less about the programming language. Can the experiment be repeated in any language, or must it be Python 3.8?

 

In Table 1, the values should be explained, and the reasoning behind choosing them should be given. Readers should know why those values were picked. Did you perform hyperparameter tuning, or are these just the default values of the libraries used?
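One way the authors could answer this question is by reporting a tuning procedure over the Table 1 values. A minimal grid-search sketch looks like the following; the hyperparameter names and values here are hypothetical placeholders, not the authors' actual settings, and the evaluation function stands in for training plus dev-set scoring:

```python
from itertools import product

# Hypothetical search space (placeholder names and values).
grid = {
    "learning_rate": [1e-3, 1e-4],
    "hidden_size":   [128, 256],
    "dropout":       [0.3, 0.5],
}

def evaluate(cfg):
    # Stand-in for "train the model, return dev-set F1"; a real study
    # would run the full training loop here. This toy score peaks at
    # learning_rate=1e-4, dropout=0.5.
    return -abs(cfg["learning_rate"] - 1e-4) - abs(cfg["dropout"] - 0.5)

best_cfg, best_score = None, float("-inf")
for values in product(*grid.values()):
    cfg = dict(zip(grid.keys(), values))
    score = evaluate(cfg)
    if score > best_score:
        best_cfg, best_score = cfg, score
```

Even a table stating "values chosen by grid search over these ranges on the dev set" would address the reviewer's concern.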

 

There are two tables numbered 2 in the manuscript. Change the second Table 2 to Table 3; line 280 refers to Table 3, but no Table 3 exists in the manuscript.

 

In Tables 2 and 3, bold the highest value in each column and mention this in the text. 

 

The conclusion section is relatively weak and short. Only a few elements exist in your conclusions, and you need to elaborate, as readers usually read the abstract and conclusion before deciding whether to read the manuscript. You need to restate your research topic and thesis, summarize the main points, state the significance of the results, and conclude your thoughts.

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 2 Report

Multi-hop Graph Neural Network for Event Detection via Stacked Module and Feedback Network

This paper is well-written but requires some major improvements, which are mentioned below:

1. The authors claim that the paper proposes an improved GNN for event detection (lines 12–13), but I have not seen detailed information about how they improved the existing GNN approach. Please discuss.

2. The main contributions of the paper should be stated in the Introduction section.

3. Please create a separate Related Work section covering recently published event detection approaches, such as traditional image processing/computer vision, machine/deep learning, and other well-known approaches.

4. Figure 1 should be cited in the text before it appears, not after the figure.

5. Is there any way to reduce the computational cost compared to other approaches? Please discuss.

6. More information is required about the method followed in the so-called subjective evaluation, i.e., the procedure and environment (the information provided to the subjects).

7. The authors did not mention any limitations of the proposed method; they should, along with future research directions on how to address those challenges.

8. Can the authors comment on the minimal requirements for the training dataset (e.g., the number of texts required)? Are you using any augmentation technique to artificially enlarge the training dataset?

9. Could the achieved results be caused by overfitting? Please discuss this in the experiment section.

10. Could you make a separate section for the dataset and give more detailed information using tables or visuals?

 

11. The conclusion needs to be rewritten to state the major findings of this article.

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 3 Report

This paper proposes an improved graph neural network (GNN) for event detection in text processing. The paper highlights the limitations of the previous state-of-the-art studies on event detection using GNNs, which are mainly focused on long-distance feature extraction, facing problems of oversmoothing and semantic feature destruction. The proposed model comprises a stacked structure module and a feedback network that address these limitations. The paper also describes the architecture of the model and how it incorporates semantic and structural information to recognize event triggers and classify event types. Overall, the paper presents a well-written and informative study that contributes to the existing literature on event detection. The proposed method is compared with state-of-the-art methods on the ACE 2005 dataset, achieving competitive results.

 

One of the strengths of the paper is the proposed multi-hop graph neural network, which effectively captures the long-distance interrelations between each candidate trigger word and its related entities or other triggers. The stacked GNNs and feedback network improve the representation of nodes by computing weights over their multi-hop neighbors, alleviating the semantic feature attenuation caused by increasing the number of GNN hidden layers. However, there are some weaknesses in the paper:

 

1. The paper lacks an in-depth analysis of the limitations of the proposed method. Although the proposed method achieves competitive results, it is unclear how it performs on other datasets or in different scenarios.

 

2. In general, although the paper is well-written, it lacks clarity in its presentation in some parts. Some sentences are poorly constructed, and the paper could benefit from a more organized structure. Some aspects should be reviewed:

 

2.1. In the affiliation of the authors, the country is missing.

 

2.2. All the acronyms should be defined before they are used (for example, you use acronym GCN without being defined).

 

2.3. Figures 1, 5 and 6 appear before they are mentioned in the text. All the figures and tables should be placed after they are mentioned in the text, not before.

 

2.4. In lines 54-62 of the introduction, after ";" you must use lowercase, not uppercase.

 

3. The paper lacks detailed information about the dataset and evaluation metrics used in the experiments, which makes it difficult to evaluate the results properly. You should better describe the dataset used and the evaluation metrics.

 

4. The related work section could be improved. It mainly covers the application of deep learning models to event detection, but it lacks a discussion of the limitations of previous studies and how the proposed method overcomes them.

 

5. In the experimental section, which GPU was used in the experiments? The paper lacks a detailed analysis of the computational efficiency of the proposed method. What were the training/test times of the proposed method and of the methods used for comparison?

 

6. In section 4.2, in Table 1, you present the hyperparameters used in your experiments. How did you select these parameters? Can you guarantee that these are the optimal parameters? Have you tried other parameters?

 

7. Finally, the conclusion could benefit from a more detailed and critical analysis of the experimental results. Specifically, the authors could discuss the limitations of their approach and the factors that may have affected the performance of their model, such as the size and diversity of the training data, the complexity of the event types, and the robustness of the model to noisy or ambiguous text. For example, the authors say that they “plan not to use syntax analysis tools to generate the graph structure of words for GNNs”; it would be better if they discussed in more detail how they plan to achieve this and what impact they expect this change to have.

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Round 2

Reviewer 1 Report

The authors skipped one of my original comments. In line 43, the authors claimed that “BERT cannot <effectively> capture long-distance structured text features.” You need to add a citation/reference to support this statement. I know one of the disadvantages of BERT (and many other transformer models) is that it handles a maximum of 512 tokens and trims anything beyond this length. Is this what you mean by “cannot effectively capture long-distance text”? Please explain this in the manuscript and find a reference to cite for this statement.

Author Response

Question: The authors skipped one of my original comments. In line 43, the authors claimed that “BERT cannot <effectively> capture long-distance structured text features.” You need to add a citation/reference to support this statement. I know one of the disadvantages of BERT (and many other transformer models) is that it handles a maximum of 512 tokens and trims anything beyond this length. Is this what you mean by “cannot effectively capture long-distance text”? Please explain this in the manuscript and find a reference to cite for this statement.

Answer: We are sorry for missing this question. As you said, the input length processed by BERT is usually limited to 512 tokens, which obviously limits the scope of its semantic analysis. GNN methods have no such limitation; moreover, they can use syntactic analysis to efficiently obtain the dependencies between words, whereas BERT must learn the relationships between words through its self-attention mechanism. We have added relevant references and further explanation in our manuscript.
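The limitation discussed in this exchange — a fixed encoder input window versus a graph whose syntax-derived edges can span arbitrary distances — can be illustrated with a toy truncation sketch. This is illustrative only: real BERT tokenizers operate on subword tokens and also reserve slots for special tokens such as [CLS] and [SEP].

```python
def truncate_tokens(tokens, max_len=512):
    """Mimic the hard input-length limit of BERT-style encoders:
    anything past max_len is simply dropped."""
    return tokens[:max_len]

# A trigger word and a related entity separated by more than 512 tokens:
doc = ["attack"] + ["filler"] * 600 + ["government"]
window = truncate_tokens(doc)

# The entity falls outside the 512-token window, so the encoder never
# sees it. A graph built from syntactic dependencies could still place
# an edge between "attack" and "government" regardless of distance.
```

This is the sense in which a dependency-graph-based GNN sidesteps the window limit that self-attention over a truncated input cannot recover.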

Reviewer 2 Report

The paper has been improved compared with the first submission.

Author Response

Dear reviewer:

Thank you for your comments concerning our manuscript entitled “Multi-hop Graph Neural Network for Event Detection via Stacked Module and Feedback Network”. Those comments are valuable and very helpful. We have read through the comments carefully and made corrections. Following the instructions in your letter, we have uploaded the revised manuscript.

We highly appreciate your time and consideration.

Sincerely,

Liu Liu.

Reviewer 3 Report

The authors have properly reviewed the manuscript, answering all questions and comments. Therefore, I am satisfied with this new version and believe it is acceptable for publication.

Author Response

Dear reviewer:

Thank you for your comments concerning our manuscript entitled “Multi-hop Graph Neural Network for Event Detection via Stacked Module and Feedback Network”. Those comments are valuable and very helpful. We have read through the comments carefully and made corrections. Following the instructions in your letter, we have uploaded the revised manuscript.

We highly appreciate your time and consideration.

Sincerely,

Liu Liu.
