Article
Peer-Review Record

A Highly Robust Binary Neural Network Inference Accelerator Based on Binary Memristors

Electronics 2021, 10(21), 2600; https://doi.org/10.3390/electronics10212600
by Yiyang Zhao, Yongjia Wang, Ruibo Wang, Yuan Rong and Xianyang Jiang *
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Reviewer 3: Anonymous
Submission received: 14 September 2021 / Revised: 13 October 2021 / Accepted: 14 October 2021 / Published: 25 October 2021
(This article belongs to the Section Microelectronics)

Round 1

Reviewer 1 Report

This research proposes a powerful BNN inference accelerator, including a binary sigmoid activation function.
The authors claim that the proposed memristor structure and CCVS circuit solve the problem of mapping positive and negative weights onto the memristor array.
In addition, they argue that the memristor structure eliminates the sneak-current effect under the minimum-conductance state.
1. The authors need a good review of the background.
2. The references provided must be applicable and sufficient.
3. The title is applicable and appropriate. 
4. Experimental setup environment needs a more extensive explanation.
5. What is the future work? How can other researchers in the field expand this work?
6. The sections do not seem smoothly connected. Perhaps the authors could reread the paper thoroughly and improve its flow and coherence.

Author Response

Response to Reviewer 1 Comments

 

Dear Editors and Reviewers:

Thank you for your comments concerning our manuscript. We have studied the comments carefully and made corrections which we hope meet with your approval. Our responses to your comments are as follows:

 

Point 1: The authors need a good review of the background.

 

Response 1: We have added several new references in the latest version of the article: reports on newly fabricated binary-memristor-based neuromorphic integrated circuits (ICs) in the Introduction, and articles on lab-made memristor devices in Section 2.2. We believe this literature will help readers understand state-of-the-art MBNNs and the value of our work.

 

Point 2: The references provided must be applicable and sufficient.

 

Response 2: We have added 14 references in the latest version to enrich the background of the article. We think the 14 new references and the original 37 references are all appropriate and helpful to the readers of Electronics.

 

Point 3: The title is applicable and appropriate. 

 

Response 3: Thanks for your comment.

 

Point 4: Experimental setup environment needs a more extensive explanation.

 

Response 4: A new section named “Experiment Environment and Configuration” has been added in the latest version, and the experimental environment is elaborated in this section.

 

Point 5: What is the future work? How can other researchers in the field expand this work?

 

Response 5:

Lately, binary-memristor-based neuromorphic integrated circuits (ICs) are gradually becoming a trend in in-memory computing, but the state-of-the-art fabricated memristor-based chips mainly adopt a differential-pair structure. A highly robust structure has important implications for industry. Other researchers could design memristor-based chips based on this work by perfecting the peripheral circuits.

The MBNN designed and implemented in this work has a fully connected structure. As future work, we aim to explore other memristive neural networks with more complex structures, such as convolutional neural networks (CNNs) and recurrent neural networks (RNNs), which will require further optimization of the function and timing of the peripheral control circuit.

 

Point 6: The sections do not seem smoothly connected. Perhaps the authors could reread the paper thoroughly and improve its flow and coherence.

 

Response 6: Thanks for your advice. In the latest version, we follow this writing logic: in Section 1, we introduce the background of the article, from the proposal of the memristor to the state-of-the-art fabricated memristor-based chips, and then discuss the current state of research on different memristor-based circuit structures. In Section 2, we introduce our optimized BNN algorithm and then present how the proposed MBNN is developed from a memristor model. In Section 3, we present the results and analysis. The experiment environment and configuration are detailed in Section 4, followed by the conclusion. We hope this arrangement of sections improves the flow and coherence.

 

We have tried our best to improve the manuscript and have made changes accordingly. We earnestly appreciate your work.

Once again, thank you very much for your comments and suggestions.

Author Response File: Author Response.pdf

Reviewer 2 Report

In the manuscript "A Highly Robust BNN Inference Accelerator Based on Binary Memristors", the authors propose a novel MBNN with a two-column reference structure that effectively solves the problem of the sneak-current effect, and they validate it by Cadence circuit simulation.

Even though the manuscript is well organized and well written, there are still a few concerns from the reviewer.

 

  1. When an acronym is given for the first time, its definition should be provided. In the conclusion part, "the authors proposed a novel MBNN with two-column reference structure" should be corrected to "the authors proposed a novel MBNN with a two-column reference structure". Please check other parts to correct grammar errors.
  2. The "Introduction" part is well-written. The author stated, "In recent years, deep learning algorithms such as deep neural networks (DNNs) have achieved great success in a wide range of artificial intelligence (AI) applications, including, but not limited to, image recognition, natural language processing, and pattern recognition." This part could be enriched by added other applications of DNN, such as Cerebral Microbleeds Detection System (electronics10182208), Indoor Two-Dimensional Localization (electronics10172166), Renewable Energy (j.enconman.2020.112964), Potato Leaf Disease Recognition (electronics10172064).
  3. In equation (6), switch the order of 0 and 1 to make it consistent with equation (2). This kind of function is similar to the definition of the basis function in multivariate adaptive regression splines (MARS) (j.eswa.2021.114565).
  4. Please revise figure (10). The structure of the neural network should be connected, refer to the structure figures in 10.1007/978-3-319-24574-4_28, j.enconman.2020.112964, 10.1109/3DV.2016.79.
  5. In the "Experimental results and analysis" part, please provide more information about the environment setting information, and how the dataset is divided (the ratio of training and testing).
  6. In the "Experimental results and analysis" part, the authors presented many results in figures, please also provide the results related to the figures in tables.
  7. In the "Experimental results and analysis" part, please also provide the computational time information.
  8. There are other data mining methods, such as MARS (j.eswa.2021.114565, j.apenergy.2019.04.084) and SVM (electronics10172083, j.enconman.2019.06.082), that the authors may implement by using a method similar to the one proposed in this manuscript.

 

Hopefully, these comments will help to improve the manuscript.

 

Overall, the research conducted is original and interesting, and the writing is organized and clear.

Author Response

Response to Reviewer 2 Comments

 

Dear Editors and Reviewers:

Thank you for your comments concerning our manuscript. We have studied the comments carefully and made corrections which we hope meet with your approval. Our responses to your comments are as follows:

 

Point 1: When an acronym is given for the first time, its definition should be provided. In the conclusion part, "the authors proposed a novel MBNN with two-column reference structure" should be corrected to "the authors proposed a novel MBNN with a two-column reference structure". Please check other parts to correct grammar errors.

 

Response 1: We are sorry for the grammar mistakes, and we have revised them in the latest version. A section “Abbreviations” has been added to make it easier for readers to find the definitions of acronyms.

 

Point 2: The "Introduction" part is well-written. The author stated, "In recent years, deep learning algorithms such as deep neural networks (DNNs) have achieved great success in a wide range of artificial intelligence (AI) applications, including, but not limited to, image recognition, natural language processing, and pattern recognition." This part could be enriched by adding other applications of DNNs, such as Cerebral Microbleeds Detection System (electronics10182208), Indoor Two-Dimensional Localization (electronics10172166), Renewable Energy (j.enconman.2020.112964), Potato Leaf Disease Recognition (electronics10172064).

 

Response 2: Thank you for your advice; these references have been added in the latest version.

 

Point 3: In equation (6), switch the order of 0 and 1 to make it consistent with equation (2).

 

Response 3: Thank you for your correction, and the order has been modified.
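For illustration only, a piecewise binary activation written with a consistent ordering of the cases might take the following generic form (an assumed sketch; the actual contents of equations (2) and (6) are those given in the manuscript, not reproduced here):

f(x) =
\begin{cases}
1, & x \geq 0, \\
0, & x < 0.
\end{cases}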

 

Point 4: Please revise figure (10). The structure of the neural network should be connected.

 

Response 4: Yes, the BNN in this article has a fully connected topology, so the neural network should be drawn as connected. We have redrawn Figure 10 in the latest version.

 

Point 5: In the "Experimental results and analysis" part, please provide more information about the environment setting information, and how the dataset is divided (the ratio of training and testing).

 

Response 5: The environment setting information is described in detail in the newly added section “Experiment Environment and Configuration”. How the dataset is generated and divided is explained in detail in Section 2.1 of the latest version.

 

Point 6: In the "Experimental results and analysis" part, the authors presented many results in figures, please also provide the results related to the figures in tables.

 

Response 6: Yes, the results related to the figures are now also provided in tables in the latest version.

 

Point 7: In the "Experimental results and analysis" part, please also provide the computational time information.

 

Response 7: The computational time information is tabulated in Section “Experiment Environment and Configuration” of the latest edition.

 

Point 8: There are other data mining methods, such as MARS and SVM, that the authors may implement by using a method similar to the one proposed in this manuscript.

 

Response 8: Thank you for your advice. We think these methods may not be suitable for this article, but we may use them in future work.

 

We have tried our best to improve the manuscript and have made changes accordingly. We earnestly appreciate your work.

Once again, thank you very much for your comments and suggestions.

Author Response File: Author Response.pdf

Reviewer 3 Report

The authors Zhao, Wang, Wang, Rong, and Jiang have submitted a manuscript entitled "A Highly Robust BNN Inference Accelerator Based on Binary Memristors" to the journal Electronics.

 

The introduction provides sufficient background and includes all relevant references, although the available experimental data on memristors and memristor networks could be included (I suggest adding a paragraph with related references). The research design is appropriate. The methods are described, but the description can be improved (see comments below). The results are clearly presented (apart from the results in Figure 2). The discussion of the data and the conclusions are adequately supported by the results (please see the comments below for integration).

English language and style: a minor spell check is required.

I do not detect plagiarism and I do not detect inappropriate citations.

In general, I do not see any ethical issues along the manuscript.

In terms of originality, significance of content, quality of presentation, scientific soundness, and interest to the readers, I think that the manuscript deserves publication in the journal Electronics. For this reason, I recommend that the editorial board accept the manuscript in the present form.

 

 

I have some comments:

1) The title "A Highly Robust BNN Inference Accelerator Based on Binary Memristors" includes an acronym. In my opinion, in order to have a clearer title for a wide readership, I would suggest skipping acronyms and using "binary neural network" instead of "BNN".

 

2) I do not understand the data in Figure 2. The authors state "the hidden layer has more than 1000 neurons, the recognition accuracy will nearly saturate, meanwhile, recognition accuracy difference between networks with one and three hidden layers is only about 1%, a slight degradation". It is not clear which calculations (or experimental data?) the three plots are based on. If they use a model, which model? I suggest the authors be clearer in this part.

 

3) Section 3 is called "Experimental results and analysis". However, the authors focus on simulations, and the simulations are not connected to experimental data on real devices. I suggest calling the section "Results and Analysis".

 

4) The literature on experimental memristors is becoming significantly vast. How can the results of the authors corroborate existing experimental data on memristors and memristor networks? I think that a link to the available experimental data would be absolutely beneficial for the wide community that reads the journal Electronics.

 

5) The methods could be better described. As far as I have understood, the authors use Verilog-AMS as the design language and Cadence Spectre for simulations. I am not very familiar with this software, and it could be that most of the Electronics readership is not either. Perhaps the authors can explain a bit better (with a dedicated section?) the motivation for the use of these tools. Moreover, are there any particular algorithms that have been used that the authors want to make available to help the reproducibility of the simulations?

 

6) The dimensions should always be separated by a space from the values.

Author Response

Response to Reviewer 3 Comments

 

Dear Editors and Reviewers:

Thank you for your comments concerning our manuscript. We have studied the comments carefully and made corrections which we hope meet with your approval. Our responses to your comments are as follows:

 

Point 1: The title "A Highly Robust BNN Inference Accelerator Based on Binary Memristors" includes an acronym. In my opinion, in order to have a clearer title for a wide readership, I would suggest skipping acronyms and using "binary neural network" instead of "BNN".


 

Response 1: The title of the article has been changed to "A Highly Robust Binary Neural Network Inference Accelerator Based on Binary Memristors".

 

Point 2: I do not understand the data in Figure 2. The authors state "the hidden layer has more than 1000 neurons, the recognition accuracy will nearly saturate, meanwhile, recognition accuracy difference between networks with one and three hidden layers is only about 1%, a slight degradation". It is not clear which calculations (or experimental data?) the three plots are based on. If they use a model, which model? I suggest the authors be clearer in this part.

 

Response 2: The network topology of the BNN in this work is a kind of multi-layer perceptron (MLP): all layers in the BNN are fully connected layers, where all the inputs from one layer are connected to every unit of the next layer. Thank you for this comment; because there are other neural networks that use binary weights, such as binary convolutional neural networks, this clarification should help clear up the reader's confusion.
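For readers less familiar with this topology, the following is a minimal NumPy sketch of a fully connected, binary-weight MLP forward pass of the kind described above. The layer sizes, the sign-based weight binarization, and the hard-threshold activation are illustrative assumptions, not the authors' actual training or inference code.

import numpy as np

def binarize(w):
    # Map real-valued weights to {-1, +1} via the sign function
    # (an illustrative choice; the paper's own binarization rule may differ).
    return np.where(w >= 0, 1.0, -1.0)

def binary_step(x):
    # Hard-threshold activation producing {0, 1} outputs.
    return (x >= 0).astype(np.float64)

def bnn_mlp_forward(x, weights):
    # Fully connected forward pass: every output of one layer feeds
    # every unit of the next layer, with binarized weights.
    a = x
    for w in weights:
        a = binary_step(a @ binarize(w))
    return a

# Toy example: a 784-input MLP with one hidden layer of 1000 units and 10 outputs.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((784, 1000)), rng.standard_normal((1000, 10))]
x = rng.standard_normal((1, 784))
print(bnn_mlp_forward(x, weights).shape)  # (1, 10)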

 

Point 3: Section 3 is called "Experimental results and analysis". However, the authors focus on simulations, and the simulations are not connected to experimental data on real devices. I suggest calling the section "Results and Analysis".


 

Response 3: The title of this section has been corrected to "Results and Analysis".

 

Point 4: The literature on experimental memristors is becoming significantly vast. How can the results of the authors corroborate existing experimental data on memristors and memristor networks? I think that a link to the available experimental data would be absolutely beneficial for the wide community that reads the journal Electronics.

 

Response 4: We have added several new references in the latest version of the article: reports on newly fabricated binary-memristor-based neuromorphic integrated circuits (ICs) in the Introduction, and articles on lab-made memristor devices in Section 2.2. We believe this literature will help readers understand the state of the art of MBNNs.

 

Point 5: The methods could be better described. As far as I have understood, the authors use Verilog-AMS as the design language and Cadence Spectre for simulations. I am not very familiar with this software, and it could be that most of the Electronics readership is not either. Perhaps the authors can explain a bit better (with a dedicated section?) the motivation for the use of these tools. Moreover, are there any particular algorithms that have been used that the authors want to make available to help the reproducibility of the simulations?


 

Response 5: Yes, your understanding is correct: Verilog-AMS is a design language and Cadence Spectre is a simulation tool. We have added an introduction to Verilog-AMS and Cadence Virtuoso Spectre at the beginning of Section 3.1. With the help of Verilog-AMS and Cadence Virtuoso Spectre, circuit simulation can be conducted at a low and detailed level, which is the key to probing and addressing the problem of sneak paths. We are sorting out the code and may put it on GitHub in the future, and we would be happy to help interested readers reproduce the simulations.
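As a conceptual aid only (not the Verilog-AMS/Spectre setup used in the paper), the sketch below shows the ideal read operation of a binary-memristor crossbar as a vector-matrix multiplication. The conductance values are assumptions chosen for illustration, and sneak-path currents are deliberately not modeled: capturing them requires solving the full resistive network at SPICE level, which is precisely what the Spectre simulations are for.

import numpy as np

# Illustrative conductance values for a binary memristor (assumptions,
# not the device parameters used in the manuscript).
G_ON = 1e-3   # high-conductance (low-resistance) state, in siemens
G_OFF = 1e-6  # low-conductance (high-resistance) state, in siemens

def crossbar_column_currents(v_in, binary_weights):
    # Ideal crossbar read: each column current is the dot product of the
    # input voltage vector with that column's conductances (Kirchhoff's
    # current law), ignoring wire resistance and sneak paths.
    G = np.where(binary_weights == 1, G_ON, G_OFF)
    return v_in @ G  # one current per column, in amperes

v_in = np.array([0.2, 0.0, 0.2])         # applied row voltages (V)
w = np.array([[1, 0], [0, 1], [1, 1]])   # binary weight pattern on the array
print(crossbar_column_currents(v_in, w))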

 

Point 6: The dimensions should always be separated by a space from the values.

 

Response 6: We are sorry for these formatting mistakes, and we have revised them in the latest version.

 

We have tried our best to improve the manuscript and have made changes accordingly. We earnestly appreciate your work.

Once again, thank you very much for your comments and suggestions.

Author Response File: Author Response.pdf

Round 2

Reviewer 1 Report

The authors claim that the proposed two-column reference scheme reduces the number of memristors and the latency to refresh the memristor array by nearly 50%.
Conclusions are concise and well supported.
The bibliography is adequate in breadth and depth of coverage.

- It would be better if the authors could discuss the pros and cons of the proposed idea in detail.
- Please resize and increase the quality of Figures 1, 5, 6, 11, and 13.
- Please resize and increase the quality of Tables 2, 3, 4, 5, 6, and 7.
- What are the limitations of this paper?
- It would be better if the authors could highlight the practical implementation of this paper for readers.

Author Response

Response to Reviewer 1 Comments

 

Dear Editors and Reviewers:

Thank you for your comments concerning our manuscript. These comments are all valuable and very helpful for revising and improving our paper, and they provide important guidance for our research. We have studied the comments carefully and made corrections which we hope meet with your approval. Our responses to your comments are as follows:

 

Point 1: It would be better if the authors could discuss the pros and cons of the proposed idea in detail.

 

Response 1: In terms of the training method, the proposed MBNN cannot perform weight updating after each iteration in hardware. However, optimizing the training algorithm is more convenient for the proposed design, because there is no need to redesign the inference circuit. Although BNNs show worse performance than analog networks on complex and heavy tasks because of their low-precision weights, the proposed method of mapping weights is of great significance for both binary and analog memristive neural networks. These points are detailed in the new section “Discussion”. Besides, the comparison with the traditional structure is explained in detail in the section “Conclusion”.

 

Point 2: Please resize and increase the quality of figures 1, 5, 6, 11 and 13.

 

Response 2: Yes, we have rearranged the figures in the latest version to make them fit the article better.

 

Point 3: Please resize and increase the quality of Tables 2,3,4,5,6, and 7.

 

Response 3: Yes, we have redrawn the tables in the latest version by reducing the font size.

 

Point 4: What are the limitations of this paper?

 

Response 4: One of the major limitations of our work is that we do not include a training peripheral circuit in the MBNN implementation, which means that an ex-situ training method is adopted in this work. However, compared with the in-situ training method, it has unique advantages. This part is detailed in the new section “Discussion”.

 

Point 5: It would be better if the authors could highlight the practical implementation of this paper for readers?

 

Response 5: Nanoscale edge-computing systems are candidate applications for the proposed MBNN. Meanwhile, the proposed method of mapping weights is of great significance for the future hardware implementation of both binary and analog memristive neural networks. This part is detailed in the new section “Discussion”.

 

We have tried our best to improve the manuscript and have made changes accordingly. We earnestly appreciate your work.

Once again, thank you very much for your comments and suggestions.

 

 

Author Response File: Author Response.pdf

Reviewer 2 Report

The authors have addressed my concerns.

Author Response

Response to Reviewer 2 Comments

 

Dear Editors and Reviewers:

Thank you for your comments concerning our manuscript. Those comments were all valuable and very helpful for revising and improving our paper, and they provide important guidance for our research.

We earnestly appreciate your work.

Once again, thank you very much for your comments and suggestions.

Author Response File: Author Response.pdf

Reviewer 3 Report

The authors have provided a revised version of the manuscript "A Highly Robust Binary Neural Network Inference Accelerator Based on Binary Memristors" for the journal Electronics.

In my opinion, the manuscript is improved with respect to the original submission. The authors have performed several revisions and integrations. 

The introduction provides sufficient background and includes all relevant references. The research design is appropriate. The methods are adequately described. The results are clearly presented. Discussion of data and conclusions are adequately supported by the results.

English language and style: a minor spell check is required.

I do not detect plagiarism and I do not detect inappropriate citations.

In general, I do not see any ethical issues along the manuscript.

In terms of originality, significance of content, quality of presentation, scientific soundness, and interest to the readers, I think that the manuscript deserves publication in the journal Electronics. For this reason, I recommend that the editorial board accept the manuscript in the present form.

Author Response

Dear Editors and Reviewers:

Thank you for your comments concerning our manuscript. Those comments were all valuable and very helpful for revising and improving our paper, and they provide important guidance for our research.

We earnestly appreciate your work.

Once again, thank you very much for your comments and suggestions.

Author Response File: Author Response.pdf
