Article
Peer-Review Record

A Multivariate Temporal Convolutional Attention Network for Time-Series Forecasting

Electronics 2022, 11(10), 1516; https://doi.org/10.3390/electronics11101516
by Renzhuo Wan 1,2,†, Chengde Tian 1,†, Wei Zhang 1, Wendi Deng 1 and Fan Yang 3,*
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Reviewer 3:
Submission received: 21 April 2022 / Revised: 29 April 2022 / Accepted: 29 April 2022 / Published: 10 May 2022
(This article belongs to the Topic Artificial Intelligence Models, Tools and Applications)

Round 1

Reviewer 1 Report

I found the method proposed in the article original and interesting. Overall, the article is well written and the results are clearly presented. I have just some minor issues, reported below. Since most of these issues are typos, I invite the authors to read the manuscript one more time before submitting the revised version.

  • The term 'multivariate' is repeated in the title. In essence, it is ok but it may sound not so good. I wonder whether one could sacrifice the first occurrence of the term.
  • In the abstract and also in the introduction there are several undefined acronyms: CNN, RNN, LSTM, GRU, ConvLSTM, TCN. Even though they are pretty popular, I would add their explanation to ease the readability.
  • lines 116-120: In this sentence there are some repetitions; please rephrase a bit.
  • Line 123: 'he' should be 'The'.
  • Line 124: 'And' at the beginning of the sentence seems a typo.
  • Line 223: I would put the comma after 'MTCAN' and not after 'in this section'.
  • Line 224: 'is' should be 'are' (referring to 'The three datasets').
  • Line 310: I would add 'the' before 'longer'.
  • Line 332: I suppose that 'Model: ' at the beginning of the sentence is a typo.
  • Lines 339: 'all' should be 'All'.
  • Line 346: the word 'dataset' is repeated too many times, please rephrase.

Author Response

Dear Reviewer,

The authors thank you for your pertinent comments, which we have considered carefully. You may find the revisions in the attached file.

Thank you.

Regards,

Fan

Author Response File: Author Response.pdf

Reviewer 2 Report

In the submitted manuscript "A multivariate temporal convolutional attention network for multivariate time series forecasting” the authors propose a multivariate time series forecasting model, the multivariate temporal convolutional attention network (MTCAN), which is based on a self-attention mechanism.

After a validation process performed on three different multivariate time series datasets, they claim that this approach outperforms state-of-the-art competitors (LSTM, GRU, ConvLSTM, and TCN) in terms of prediction accuracy and generalization.

 

Although the manuscript is well written, I suggest that the authors carefully re-read it in order to fix some minor problems, such as:

- “one of crucial” should be “one of the crucial”;

- “parallel computation” should be “parallel computations” (or “a parallel computation”);

- “is organized as:” should be “is organized as follows:”;

- “LSTM adopts cell state” should be “LSTM adopts a cell state”;

- “and then cooperate” should be “and then cooperates”;

- “memory of past information” should be “the memory of past information”;

- “asymmetric factor to the structure” should be “asymmetric factor in the structure”;

- “(RMSE), relative root error (RRSE) and empirical” should be “(RMSE), relative root error (RRSE), and empirical”;

- “model.The” should be “model. The”;

- … and so on.

 

The “Background” section should be renamed “Background and Related Work”, and it should discuss further works that are close to (i.e., that deal with time series forecasting tasks) or directly related to the considered research domain, so as to provide an overview to the readers.

 

The references used by the authors are largely not up to date, and they are not sufficient in number, especially considering that many of them are preprint versions. The authors should check whether there are more recent works among the cited literature and avoid citing preprint versions (e.g., arXiv), as these have not passed a peer-review process and their scientific contributions are therefore not certified in any way.

In addition, in line with my previous observation, the authors should add and discuss additional works very close or directly related to the domain under consideration, such as, by way of example:

(-) Huang, Bingqing, et al. "A Novel Model Based on DA-RNN Network and Skip Gated Recurrent Neural Network for Periodic Time Series Forecasting." Sustainability 14.1 (2021): 326.

(-) Jin, Xue-Bo, et al. "A variational Bayesian deep network with data self-screening layer for massive time-series data forecasting." Entropy 24.3 (2022): 335.

(-) Carta, Salvatore, et al. "A local feature engineering strategy to improve network anomaly detection." Future Internet 12.10 (2020): 177.

(-) Mathonsi, Thabang, and Terence L. van Zyl. "A statistics and deep learning hybrid method for multivariate time series forecasting and mortality modeling." Forecasting 4.1 (2021): 1-25.

(-) … And so on.

 

The formal approach appears reasonable, the related experimental results have been presented in a quite clear form to the readers, and the conclusions are convincing.

 

Concerning the “Conclusion” section, the authors should clearly summarize all the main steps of their manuscript, in order to offer the readers a brief but complete summary of the performed work.

In addition, in this section, the authors should better underline the possible advantages of the proposed approach, highlighting their scientific contribution with regard to both the literature and real-world applications.

Author Response

Dear Reviewer,

The authors thank you for your pertinent comments, which we have considered carefully. You may find the revisions in the attached file. A point-by-point response is also appended below.

Thank you.

Regards,

Fan

Author Response File: Author Response.pdf

Reviewer 3 Report

Excellent work, I’d like to thank the authors very much. The proposed approach is well-described and validated, and the manuscript is well-written as well. I only have a few minor comments to consider, please.

(1)

I recommend touching on the explainability of predictions that could be achieved thanks to the proposed architecture. This would be a very important aspect to highlight, since attention mechanisms can improve the explainability of models in general. It could also be investigated in detail as part of the future work.

 

(2)

The introduction/background should refer to more recent contributions that applied CNN-based architectures for multivariate time series modeling. For example:

https://doi.org/10.1109/ICHI48887.2020.9374393

https://ieeexplore.ieee.org/abstract/document/8970899

 

(3)

The Keras reference should be cited, please.

Chollet, F. (2015). Keras. Github Repository: https://github.com/fchollet/keras

 

Author Response

Dear Reviewer,

The authors thank you for your pertinent comments, which we have considered carefully. You may find the revisions in the attached file. A point-by-point response is also appended below.

Thank you.

Regards,

Fan

Author Response File: Author Response.pdf

This manuscript is a resubmission of an earlier submission. The following is a list of the peer review reports and author responses from that submission.

