Article
Peer-Review Record

55 nm CMOS Mixed-Signal Neuromorphic Circuits for Constructing Energy-Efficient Reconfigurable SNNs

Electronics 2023, 12(19), 4147; https://doi.org/10.3390/electronics12194147
by Jiale Quan 1,2,3, Zhen Liu 1,2,3, Bo Li 1,3, Chuanbin Zeng 1,3 and Jiajun Luo 1,3,*
Submission received: 22 August 2023 / Revised: 2 October 2023 / Accepted: 3 October 2023 / Published: 5 October 2023
(This article belongs to the Section Microelectronics)

Round 1

Reviewer 1 Report

This paper proposes 55 nm CMOS mixed-signal neuromorphic circuits for constructing energy-efficient reconfigurable SNNs. The paper is well written, but in my opinion it lacks a thorough description of the state of the art and a clear explanation of the novelty of this work. In particular, the authors should compare their linearity and resolution results with the literature (current references from 2022 and 2023) in order to demonstrate their contribution.

Page 2, Figure 1: why do the authors introduce Vth", AW, and Vwth? Vth", AW, and Vwth are not shown in the schematic and are not used afterward.

Page 2, Figure 1: please define Vth", AW, and Vwth. Also, how did the authors choose the aspect ratios of all transistors? Please give more details or a reference.


Are VTH and Vth the same?

Page 10, line 329: the chip measurement setup is shown in Figure 7(c), and the measurement system consists of a Xilinx Artix-7 FPGA. Which model?

Page 10, line 328: although the description of the coarse measurement is clear, the description of the SNN lacks basic principles.

Page 16, line 492: the authors say that their results are "compared" with two other works. Are they better or worse (current references)?

Please check all acronyms, equations, and figures; for example, Figure 13 is missing the label on the x-axis.

There are minor issues with the English.

Author Response

Please see the attachment

Author Response File: Author Response.pdf

Reviewer 2 Report

This work proposes a design solution for a reconfigurable mixed-signal SNN implemented with a set of digital/analog neuromorphic circuits. The circuit is fabricated in 55 nm technology and evaluated through measurements of LIF neuron behavior, the SDSP learning algorithm, a Pavlov associative learning experiment, and a binary classification task. The measurement results demonstrate the cost-effectiveness and energy efficiency of the design. The topic itself is interesting and helpful for neuromorphic circuit/module design, but several technical issues are not fully addressed by the current manuscript, so the conclusions are not fully supported by the evidence provided. The reviewer therefore suggests a major revision before considering the manuscript for publication.

Here are several critical points:

1. In most neural networks, concurrent data processing is key to throughput and training time. What is the concurrent training capability of the design proposed here? Is it limited to a single neuron?

2. In Table 1, multiple works are compared in terms of supply voltage, chip area, and energy per spike. It is not clear whether these comparisons are valid; could the training frequency of those works also be provided?

3. The design is only evaluated at 30 Hz-1 kHz. It is not clear whether such an operating frequency is sufficient for real future SNN applications.

4. The measurements seem to cover only circuit-level data. Could higher-level data be shown, such as accuracy versus training epoch, which is key for SNNs?

Author Response

Please see the attachment

Author Response File: Author Response.pdf

Round 2

Reviewer 2 Report

Thanks to the authors for addressing the technical issues in the previous version. The reviewer has no further questions and would therefore suggest accepting the manuscript as is.

Author Response

Please see the attachment

Author Response File: Author Response.pdf
