Peer-Review Record

EffShuffNet: An Efficient Neural Architecture for Adopting a Multi-Model

Appl. Sci. 2023, 13(6), 3505; https://doi.org/10.3390/app13063505
by Jong-In Kim 1,†, Gwang-Hyun Yu 2,†, Jin Lee 2, Dang Thanh Vu 2, Jung-Hyun Kim 1, Hyun-Sun Park 1, Jin-Young Kim 2,* and Sung-Hoon Hong 2,*
Reviewer 1: Anonymous
Reviewer 2:
Reviewer 3: Anonymous
Reviewer 4: Anonymous
Submission received: 2 February 2023 / Revised: 6 March 2023 / Accepted: 7 March 2023 / Published: 9 March 2023
(This article belongs to the Section Computing and Artificial Intelligence)

Round 1

Reviewer 1 Report

The paper develops an efficient neural architecture, called EffShuff-Block, for adopting a multi-model approach. The EffShuff-Block model has a lightweight structure while maintaining high classification performance. The experimental results show that the EffShuff-Block model outperforms state-of-the-art architectures while achieving a smaller model size. Overall, this paper is well organized, and the relevant methods are comprehensively described. My comments and suggestions are given below.

 

(1) The abstract should be no more than about 200 words; more details can be found here: https://www.mdpi.com/journal/applsci/instructions.

 

(2) Section 2 introduces some lightweight neural networks, such as ShuffleNet and MobileNet. Applications of lightweight networks to image classification should be introduced as well. Here are some examples: doi:10.3390/electronics8111354; doi:10.1007/s00170-022-10335-8.

 

(3) Figure 8 presents the evolution of accuracy for the different models. The authors should clarify how many times these models were trained, as results from a single training run are not a fair basis for comparison (see the sketch following this list). This also applies to Figure 9.

 

(4) The experimental results indicate that EffShuff-Block is superior to existing networks. It is recommended that the authors conduct an in-depth comparison and analysis to explain why this block performs best, or clarify the advantages of the proposed model over the existing ones.
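To make comment (3) concrete, the following is a minimal sketch of the kind of aggregate reporting the comment asks for: retrain the model several times with different random seeds and report the mean and standard deviation of test accuracy. The accuracy values are illustrative placeholders, not results from the paper.

```python
import statistics

# Hypothetical test accuracies from retraining the same model with
# different random seeds (placeholder numbers, not from the manuscript).
accuracies = [0.912, 0.905, 0.918, 0.909, 0.914]

mean = statistics.mean(accuracies)
std = statistics.stdev(accuracies)  # sample standard deviation
print(f"accuracy: {mean:.3f} +/- {std:.3f} over {len(accuracies)} runs")
```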

Author Response

Thank you for the opportunity to revise our manuscript, "EffShuff-Block: An efficient neural architecture for adopting multi-model". We appreciate the careful review and constructive suggestions. We believe that the manuscript is substantially improved after making the suggested revisions. Following this letter are the editor and reviewer comments with our responses in italics, including how and where the text was modified. Changes made in the manuscript are marked using track changes. The revision has been developed in consultation with all coauthors, and each author has approved the final form of this revision. The agreement form signed by each author remains valid. Thank you for your consideration.  

 

 

Sincerely, 

JongIn Kim

Author Response File: Author Response.pdf

Reviewer 2 Report

Authors, please look into lines 114 and 116, where the reference numbers are not mentioned.

Also, in Table 1, the last two entries, which contain no data, need to be specified.

Author Response

Thank you for the opportunity to revise our manuscript, "EffShuff-Block: An efficient neural architecture for adopting multi-model". We appreciate the careful review and constructive suggestions. We believe that the manuscript is substantially improved after making the suggested revisions. Following this letter are the editor and reviewer comments with our responses in italics, including how and where the text was modified. Changes made in the manuscript are marked using track changes. The revision has been developed in consultation with all coauthors, and each author has approved the final form of this revision. The agreement form signed by each author remains valid. Thank you for your consideration.  

 

 

Sincerely, 

JongIn Kim

Author Response File: Author Response.pdf

Reviewer 3 Report

This manuscript proposes a lightweight CNN architecture named EffShuff-Block. Overall, the experimental results are promising. However, the authors should give a detailed explanation of the design logic of EffShuff-Block.

1. Figures 4 and 5. The authors should demonstrate the design logic of these two blocks. What are the advantages and disadvantages of ShuffleNet_v2, and why can EffShuff-Block resolve these problems?

2. Experimental results. To prove the effectiveness of the two blocks, please give the results of ablation experiments.

3. Figures 8 and 9. Please provide high-resolution figures with accurate labels.

Author Response

Thank you for the opportunity to revise our manuscript, "EffShuff-Block: An efficient neural architecture for adopting multi-model". We appreciate the careful review and constructive suggestions. We believe that the manuscript is substantially improved after making the suggested revisions. Following this letter are the editor and reviewer comments with our responses in italics, including how and where the text was modified. Changes made in the manuscript are marked using track changes. The revision has been developed in consultation with all coauthors, and each author has approved the final form of this revision. The agreement form signed by each author remains valid. Thank you for your consideration.  

 

 

Sincerely, 

JongIn Kim

Author Response File: Author Response.pdf

Reviewer 4 Report

This paper proposes a novel EffShuff-Block model based on a CNN architecture. The topic is interesting, and the paper is well-prepared. However, there are some minor typos/grammar errors. The language could be improved slightly. I've shared my suggestions below.

1) The authors must mention the novelty of the research in the introduction part.

2) Figure 4 should be updated. It's not easy to read.

3) Figure 8 should be updated. It's not easy to read.

4) Figure 9 should be updated. It's not easy to read.

5) The references are appropriate but inadequate. More recent references are needed in the introduction. Furthermore, Section 2 must be improved.

Author Response

Thank you for the opportunity to revise our manuscript, "EffShuff-Block: An efficient neural architecture for adopting multi-model". We appreciate the careful review and constructive suggestions. We believe that the manuscript is substantially improved after making the suggested revisions. Following this letter are the editor and reviewer comments with our responses in italics, including how and where the text was modified. Changes made in the manuscript are marked using track changes. The revision has been developed in consultation with all coauthors, and each author has approved the final form of this revision. The agreement form signed by each author remains valid. Thank you for your consideration.  

 

 

Sincerely, 

JongIn Kim

Author Response File: Author Response.pdf

Round 2

Reviewer 3 Report

Overall, the novelty of this manuscript is very limited, and the authors introduce too many concepts. I recommend the authors demonstrate their method more clearly and accurately.

1. The concepts in the manuscript are confusing. What are the differences and relationships among EffShuff-Block, EffShuff-Transitions, EffShuff-Transitions Block, EffShuff-Block model, EffShuff-Dense basic Block, EffShuff-Dense Transition Block, and EffShuff-Dense-Block?

2. P5, L209. "The EffShuff-Block model is a lightweight convolutional neural network that incorporates EffShuff-Block and EffShuff-Transition following the general stem section." Based on this description, I cannot understand the content of Figure 3. Please describe the proposed method and network more accurately.

Author Response

We thank the reviewer for the feedback, which resulted in substantial improvements to the manuscript in the first revision. Specifically, we have provided additional detail on the novelty of our approach in both the Introduction and Method sections.

While we acknowledge that our approach may not be entirely novel, we contend that the study's findings are significant. First, we introduced a modified architecture, EffShuffNet, that preserves the benefits of the Shuffle layer while significantly reducing the number of parameters. Second, we demonstrated that our modifications to dense connections can be integrated without significantly increasing the model's complexity, while simultaneously enhancing performance. The study includes thorough comparisons with state-of-the-art deep convolutional models, covering not only a multi-label dataset but also two publicly available fine-grained datasets, ensuring the validity of the comparison.
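For readers of this record, the "Shuffle layer" mentioned above refers to the channel-shuffle operation popularized by ShuffleNet, which mixes information across grouped convolutions without adding any learned parameters. The following is a minimal PyTorch sketch of that generic operation only; it is not the authors' EffShuff-Block implementation, which is not reproduced here.

```python
import torch

def channel_shuffle(x: torch.Tensor, groups: int) -> torch.Tensor:
    # ShuffleNet-style channel shuffle: reshape to (n, groups, c/groups, h, w),
    # swap the group and per-group channel axes, then flatten back. This
    # interleaves channels so grouped convolutions can exchange information.
    n, c, h, w = x.shape
    assert c % groups == 0, "channel count must be divisible by groups"
    x = x.view(n, groups, c // groups, h, w)
    x = x.transpose(1, 2).contiguous()
    return x.view(n, c, h, w)

# Example: 8 channels in 2 groups -> channels interleave as 0,4,1,5,2,6,3,7.
feat = torch.arange(8, dtype=torch.float32).view(1, 8, 1, 1)
print(channel_shuffle(feat, groups=2).flatten().tolist())
```

Because the shuffle itself is parameter-free, it is consistent with the response's claim that the benefits of the Shuffle layer can be preserved while the overall parameter count is reduced.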

We recognize the limitations of the research and intend to address them in future studies. We plan to conduct additional experiments using automatic hyper-parameter tuning methods, as proposed in a prior publication. Furthermore, we intend to deploy our model on edge devices, such as smartphones, or in online services for age and gender prediction, as well as other applications.

Author Response File: Author Response.pdf
