Article
Peer-Review Record

Differential Evolution with Linear Bias Reduction in Parameter Adaptation

Algorithms 2020, 13(11), 283; https://doi.org/10.3390/a13110283
by Vladimir Stanovov *,†, Shakhnaz Akhmedova and Eugene Semenkin
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Submission received: 19 October 2020 / Revised: 6 November 2020 / Accepted: 6 November 2020 / Published: 9 November 2020
(This article belongs to the Section Evolutionary Algorithms and Machine Learning)

Round 1

Reviewer 1 Report

The problem is interesting, and the contribution aims to allow more exploration at the beginning of the search and faster convergence to the best located optimum at the end. While the manuscript is well written and the experiments are well validated with significance tests, some issues the authors may elaborate on for completeness are:

1. How does the LBR technique generalize to other evolutionary and swarm intelligence algorithms? How can one go from validation on DE to extrapolating that this scheme will work well in other CI algorithms? 

2. How do the results extend to multi-objective problems and unconstrained functions? If such experimentation is beyond the scope of this work, elaborated general discussions on the integration of this approach in such problems will offer some insight for other researchers trying to do reproducible research.

Please make these minor changes and resubmit.

 

Author Response

Thank you for your important notes.

  1. The LBR could be used for any EA or SI algorithm with numerical parameters, for example, a success-history-based genetic algorithm. This is included in the discussion section.
  2. To the best of our knowledge, there have been only a few studies on success-history-based parameter adaptation for multi-objective algorithms. However, since LBR only requires the current and total computational resource to tune parameters, it could be implemented for multi-objective optimization as well. This is included in the discussion section.

Reviewer 2 Report

This paper proposes a linear bias reduction (LBR) strategy for parameter adaptation, which expands the parameter range in the early stage and narrows it in the later stage, such that exploration and exploitation can be enhanced in the early and later stages, respectively.
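The mechanism summarized above can be illustrated with a minimal sketch: a parameter (e.g. the scaling factor F in DE) is sampled from a range whose width shrinks linearly with the fraction of the computational budget already spent. Note that the function name, the base width, and the sampling rule below are illustrative assumptions, not the authors' exact LBR formulation.

```python
import random

def sample_parameter(mean, base_width, evals_used, evals_total):
    """Sample a parameter value from a range that narrows linearly
    over the run.

    Illustrative sketch only: early in the run (evals_used small) the
    range around `mean` is wide, encouraging exploration; late in the
    run it collapses toward `mean`, encouraging exploitation.
    """
    progress = evals_used / evals_total      # fraction of budget spent
    width = base_width * (1.0 - progress)    # linearly narrowing range
    lo, hi = mean - width, mean + width
    # Clamp to [0, 1], a typical domain for DE parameters such as F or CR.
    return min(1.0, max(0.0, random.uniform(lo, hi)))
```

At the very start of the run the call draws from the full range around the mean; once the budget is exhausted, the range width is zero and the mean itself is returned.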

The proposed LBR strategy is well illustrated and presented; its effect has been experimentally verified by applying it to various differential evolution variants on the CEC 2017 and CEC 2020 benchmark problems.

The paper is overall well written, with a few minor suggestions:

  1. Line 37, Page 2, it says "... the third section proposes the Linear Bias Reduction approach, the fourth section contains the experimental results and comparison ...", while they are actually Sec 2.3 and Sec. 3, respectively.
  2. It would be better if Tables 2-5 and 7-10 can be reorganized, such that, for example, the results of L-SHADE and L-SHADE-LBR can be presented side by side, and thus easily compared.

 

Author Response

Thank you for your important notes.

  1. Corrected.
  2. Presenting L-SHADE and L-SHADE-LBR, jSO and jSO-LBR, and DISH and DISH-LBR side by side would expand the CEC 2017 results from 4 pages of tables to 6, since only two algorithms would fit in one group of columns; otherwise, the results of either jSO and jSO-LBR or DISH and DISH-LBR would still be disconnected. We believe the current table organization is compact and sufficient, because the statistical tests are provided; however, if you insist on reorganizing the tables, please raise this issue again.