Article
Peer-Review Record

FastUAV-NET: A Multi-UAV Detection Algorithm for Embedded Platforms

Electronics 2021, 10(6), 724; https://doi.org/10.3390/electronics10060724
by Amir Yavariabdi 1,*, Huseyin Kusetogullari 2,3, Turgay Celik 4,5 and Hasan Cicek 1
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Reviewer 3:
Reviewer 4:
Submission received: 9 February 2021 / Revised: 3 March 2021 / Accepted: 6 March 2021 / Published: 19 March 2021

Round 1

Reviewer 1 Report

This paper presents a novel real-time deep learning-based framework for detecting and tracking Unmanned Aerial Vehicles (UAVs) in video streams.

In the Introduction, state-of-the-art object detection and tracking algorithms, together with the advantages and disadvantages of the classification algorithms used, are worked out well. The disadvantages of the YOLOv2 and YOLOv3-tiny detection strategies compared to the proposed FastUAV-NET method are also clearly identified.

In Section 2, the YOLOv3-tiny detection method, with its advantages and disadvantages, is described in detail, in particular the improvements that had to be implemented in order to use the method for the proposed application.

In Section 3, the proposed tracking framework, with its two sub-steps of improved YOLOv3-tiny multi-UAV detection and the sKCF tracker, is presented as the new solution.

In Section 4, results and discussion, the experimental setup and the results on the video datasets used are shown. The results are described in detail, presented meaningfully as pictures, diagrams, and tables, and compared with state-of-the-art results.

In the Conclusion, the added value of the proposed solution is demonstrated on the basis of the results obtained. However, an outlook could be added as to how the framework could be further improved in the future.


Author Response

Thank you for your feedback and comments, which help to improve the quality of the manuscript. Please find the response-to-reviewer document attached.

Author Response File: Author Response.pdf

Reviewer 2 Report

Dear authors,

The paper is well structured and flows well. Also, it is very well written. The Introduction and Background sections provide simple and elegant explanations of the real-time deep neural network framework used to detect and track multiple UAVs in airborne videos. The authors synthesize very clearly the background needed to understand the contribution of the paper. Moreover, the structure and operation of the proposed framework are very well explained.

I believe that the contribution could be strengthened if a better explanation were provided for the presented formulas.

Author Response

Thank you for your feedback and comments, which help to improve the quality of the manuscript. Please find the response-to-reviewer document attached.

Author Response File: Author Response.pdf

Reviewer 3 Report

The authors proposed an innovative method for detecting and tracking Unmanned Aerial Vehicles (UAVs) in video streams. The paper overall is interesting for readers and should be published in this journal. I have a few suggestions:

- A new section could be added to highlight the technical contributions of the paper, so that readers can follow the storyline better.

- The authors are encouraged to share the code and/or data so that readers can reproduce the results or improve the proposed model.

Author Response

Thank you for your feedback and comments, which help to improve the quality of the manuscript. Please find the response-to-reviewer document attached.

Author Response File: Author Response.pdf

Reviewer 4 Report

The authors proposed a real-time deep learning-based framework for detecting and tracking Unmanned Aerial Vehicles (UAVs) in video streams. For detection, the Darknet-19 architecture of YOLOv3-tiny is widened based on the Inception module. For tracking UAVs, the authors use a scalable Kernel Correlation Filter (sKCF).
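For readers unfamiliar with Inception-style widening, the sketch below illustrates the general idea: parallel convolution branches with different receptive fields whose outputs are concatenated channel-wise, widening a layer without deepening the network. This is a minimal PyTorch sketch of the generic technique only; the module name, branch layout, and channel counts are illustrative assumptions, not the authors' actual FastUAV-NET implementation.

```python
import torch
import torch.nn as nn


class InceptionWidened(nn.Module):
    """Illustrative Inception-style block (hypothetical, not the paper's code):
    parallel 1x1, 3x3, and 5x5 convolution branches, concatenated channel-wise."""

    def __init__(self, in_ch: int, branch_ch: int):
        super().__init__()
        self.b1 = nn.Conv2d(in_ch, branch_ch, kernel_size=1)
        self.b3 = nn.Sequential(
            nn.Conv2d(in_ch, branch_ch, kernel_size=1),            # 1x1 reduction
            nn.Conv2d(branch_ch, branch_ch, kernel_size=3, padding=1),
        )
        self.b5 = nn.Sequential(
            nn.Conv2d(in_ch, branch_ch, kernel_size=1),            # 1x1 reduction
            nn.Conv2d(branch_ch, branch_ch, kernel_size=5, padding=2),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Concatenate branch outputs along the channel dimension;
        # padding keeps all spatial sizes equal so concatenation is valid.
        return torch.cat([self.b1(x), self.b3(x), self.b5(x)], dim=1)


block = InceptionWidened(in_ch=64, branch_ch=32)
out = block(torch.randn(1, 64, 52, 52))
print(out.shape)  # 3 branches of 32 channels each -> 96 output channels
```

Each branch sees the same input, so the block triples the channel width (here 64 in, 96 out) while preserving spatial resolution.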

1)    The authors consider a real-time deep learning-based framework for detection. Please explain in more detail a real-time deep learning-based framework in Figure 2. 
2)    Please introduce and explain clearly the architecture of the Darknet-19 neural network. 
3)    In the UAV pursuit problem, the vital task is to detect and track a target or leader UAV using a tracker or follower UAV. For example, if one UAV wants to detect and track the leader UAV, how are the leader UAV and the UAVs following the leader classified?
4)    Please plot the error or a KPI to compare your proposal against the Darknet-19 architecture of YOLOv3-tiny.


Author Response

Thank you for your feedback and comments, which help to improve the quality of the manuscript. Please find the response-to-reviewer document attached.

Author Response File: Author Response.pdf

Round 2

Reviewer 4 Report

Thank you for your response.

According to your reply, which states that "Currently, the scope of this paper is to detect and track a target or leader UAV(s) using a tracker or follower UAV", and in my opinion, the title should be changed, e.g., to "A UAV Detection and Tracking Algorithm for Embedded Platforms", because the simulation results only show detection of a single UAV.

Author Response

Review Comments:

1) According to your reply, which states that "Currently, the scope of this paper is to detect and track a target or leader UAV(s) using a tracker or follower UAV", and in my opinion, the title should be changed, e.g., to "A UAV Detection and Tracking Algorithm for Embedded Platforms", because the simulation results only show detection of a single UAV.

Answer: Thank you for your comment to improve the quality of the manuscript. We have updated the title of the paper by removing "tracking" from it. Many thanks.

Author Response File: Author Response.pdf
