Article
Peer-Review Record

Multi-Camera Networks for Coverage Control of Drones

by Sunan Huang *, Rodney Swee Huat Teo and William Wai Lun Leong
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Submission received: 27 January 2022 / Revised: 25 February 2022 / Accepted: 25 February 2022 / Published: 3 March 2022
(This article belongs to the Special Issue Unconventional Drone-Based Surveying)

Round 1

Reviewer 1 Report

An interesting article is presented for review, which deals with the development of an algorithm for optimal coverage of the observed area by a group of UAVs with downward-facing cameras on controlled gimbals. An original solution is proposed, the effectiveness of which is demonstrated by simulation.

However, the reviewer has several questions and comments:

  1. Line 93 - it is not clearly shown where the perpendicularity of e_1 and ov_3 follows from.
  2. The quality of Figure 2 needs to be improved. Figure 2 is poorly comparable with Figure 1, which makes it difficult to understand the above calculations when defining the vectors e.
  3. Model (14) assumes independent control for each of the coordinates, but for UAVs, the control loops are usually interconnected.
  4. In the algorithm in Fig. 3, in theory, there is not enough feedback from the UAV about its control parameters. It is useful to provide the diagram in Fig. 3 with the designations of variables and parameters that are transferred between blocks at the steps of the algorithm for better perception and understanding.
  5. What kind of stability is discussed in Theorem 3.1.1? Stability only for the angle of view?
  6. Fig. 5(a) and the rest - it is required to explain what each curve represents.
  7. In Fig. 5-7, the ellipse apparently represents the density function, but it rather corresponds to the case e^(−(x/20)^2−((y−20)/40)^2), and not e^(−((x−20)/20)^2−((y−40)/40)^2). Verification and clarification required.
  8. Technical notes:

Line 15 - the abbreviation MUAV is not introduced;

Line 26 etc. – “etal” must be written as “et al.”;

Line 34 - the abbreviation FOV is not introduced;

Line 166 - Replace "Section II" with "Section 2";

Line 232 - u_gi is not defined;

Author Response

Response to Reviewer 1:

1. Line 93 - it is not clearly shown where the perpendicularity of e_1 and ov_3 follows from.

e_1 is the normal vector of one side face of the pyramid, while ov_3 lies within that face. Thus, e_1 is perpendicular to ov_3.
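As a minimal numerical sketch of this argument (with purely illustrative pyramid coordinates, not values from the paper): the normal of a side face, computed as the cross product of two edges spanning that face, has zero dot product with any vector lying in the face, including ov_3.

```python
import numpy as np

# Illustrative FOV-pyramid geometry (hypothetical values, not from the paper):
# apex o at the camera center, v2 and v3 two base corners spanning one side face.
o = np.array([0.0, 0.0, 0.0])
v2 = np.array([1.0, -1.0, -2.0])
v3 = np.array([1.0,  1.0, -2.0])

# e1: normal of the side face spanned by the edges ov2 and ov3.
e1 = np.cross(v2 - o, v3 - o)

# ov3 lies within that face, so it is perpendicular to the face normal e1.
print(np.dot(e1, v3 - o))  # 0.0
```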

2. The quality of Figure 2 needs to be improved. Figure 2 is poorly comparable with Figure 1, which makes it difficult to understand the above calculations when defining the vectors e.

Figure 2 is a side view of Figure 1. The point o is the same in both figures, as is the focal length f. We have improved Figure 2 in the revised version.

3. Model (14) assumes independent control for each of the coordinates, but for UAVs, the control loops are usually interconnected.

In the revised version, we state this point clearly: "For each UAV, the dynamical model is independent. For neighboring UAVs, the wireless communication network is assumed to be connected during the mission."

4. In the algorithm in Fig. 3, in theory, there is not enough feedback from the UAV about its control parameters. It is useful to provide the diagram in Fig. 3 with the designations of variables and parameters that are transferred between blocks at the steps of the algorithm for better perception and understanding.

We have re-plotted Figure 3 and added more variable labels.

5. What kind of stability is discussed in Theorem 3.1.1? Stability only for the angle of view?

Stability ensures that 1) the coverage is bounded when applying the coverage control with the view-angle law, and 2) the system converges to the desired view angle. A remark has been added in the revised version.

6. Fig. 5(a) and the rest - it is required to explain what each curve represents.

We have added labels in Figure 5a.

7. In Fig. 5-7, the ellipse apparently represents the density function, but it rather corresponds to the case e^(−(x/20)^2−((y−20)/40)^2), and not e^(−((x−20)/20)^2−((y−40)/40)^2). Verification and clarification required.

Since we cannot show the full density function in the figures, Figs. 5-7 show the rough coverage range of the density function.
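For reference, a small sketch of the density function as stated in the paper (assuming the corrected form with the missing parenthesis closed): the expression peaks at (20, 40), and its e^(−1) level set is the ellipse with semi-axes 20 and 40, which is the ellipse the plots should correspond to.

```python
import numpy as np

# Density function as stated in the paper (assumed corrected form):
# phi(x, y) = exp(-((x - 20)/20)^2 - ((y - 40)/40)^2)
def phi(x, y):
    return np.exp(-((x - 20) / 20) ** 2 - ((y - 40) / 40) ** 2)

# Peak at the center (20, 40); the e^-1 level set is the ellipse
# ((x - 20)/20)^2 + ((y - 40)/40)^2 = 1, i.e. semi-axes 20 and 40.
print(phi(20, 40))                          # 1.0
print(np.isclose(phi(40, 40), np.exp(-1)))  # True (on the ellipse along x)
```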

8. Technical notes: Line 15 - the abbreviation MUAV is not introduced;

Line 26 etc. – “etal” must be written as “et al.”;

Line 34 - the abbreviation FOV is not introduced;

Line 166 - Replace "Section II" with "Section 2";

Line 232 - u_gi is not defined;

We have corrected these errors.

Reviewer 2 Report

The article is interesting from a scientific point of view and for readers who are interested in Multiple Unmanned Multirotor systems. As the authors of the paper point out, “coverage control is attracting research interest in MUMs”. The authors' original contribution is to propose a distributed coverage control approach for MUMs with downward-facing cameras. An existing coverage control algorithm has been extended to incorporate a new sensor model, which is downward facing and allows pan-tilt-zoom (PTZ). The main assumptions of the wireless camera network are described in the paper. Simulation tests of the proposed coverage controls are given in the section “Simulation Studies”.

Comments:

  • In line 60 - Please explain: on what basis do the authors guarantee complete exclusion of collisions? Later in the article, a value of about 85% is mentioned.
  • In line 212 - Did the research cover the situation of how to detect invisible and obscured areas and how to conduct a flight in such cases to optimize its effectiveness?
  • Please add a section “Discussion” to your paper. In line with the journal's guidelines in the discussion section: “Authors should discuss the results and how they can be interpreted in perspective of previous studies and of the working hypotheses. The findings and their implications should be discussed in the broadest context possible and limitations of the work highlighted. Future research directions may also be mentioned. This section may be combined with Results.”
  • How is the problem of recognizability of the studied phenomenon in images from various drones solved? Images are acquired from different heights and distances, with different resolutions, and vary in terms of distortion. Assuming that the drone-camera system, treated as a unit in the experiment, is a dynamic system moving in space, the displacement vector of the system should be taken into account when determining the FOV range of individual cameras.
  • It should be noted that deformation of reality may arise in the acquired image, resulting from variation of the tilt angle and rotation of the camera (movement on the gimbal). Optimizing the position of the camera in terms of maximizing the area covered by monitoring may have a negative impact on the recognition of the studied phenomena in the acquired image.
  • Does the research show how the optimal arrangement of cameras on many drones is synchronized in time? Could this be added to your paper?
  • Please correct the references according to the guidelines “And add the note to the text together with the citation: … recently reported by Díaz et al. [10]”. Applies to entries in lines 48, 49, 50, 51, 55 and further in the text of the paper.
  • Please add the name of the section "References" in line 412.
  • Please correct the list of cited authors with regard to self-citations: of the 23 literature items, 7 include publications by one of the co-authors (over 30%).

The article may be accepted for publication after minor revision (corrections to minor methodological errors and text editing).


Author Response

Response to Reviewer 2:

1. In line 60 - Please explain: on what basis do the authors guarantee complete exclusion of collisions? Later in the article, a value of about 85% is mentioned.

  • The 85% is the coverage ratio, which is user-defined. Usually, users can set it to >85%.
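As a sketch of how such a user-defined threshold could be checked (a hypothetical helper for illustration, not code from the paper):

```python
# Hypothetical local termination check: each UAV compares the achieved
# coverage ratio against the user-defined target (e.g. 0.85 for 85%).
TARGET_RATIO = 0.85

def coverage_reached(covered_area: float, total_area: float,
                     target: float = TARGET_RATIO) -> bool:
    """Return True once the covered fraction of the mission area meets the target."""
    return covered_area / total_area >= target

print(coverage_reached(90.0, 100.0))  # True  (0.90 >= 0.85)
print(coverage_reached(80.0, 100.0))  # False (0.80 <  0.85)
```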

2. In line 212 - Did the research cover the situation of how to detect invisible and obscured areas and how to conduct a flight in such cases to optimize its effectiveness?

In this paper, we do not handle this situation. Another of our papers discusses this issue; see our latest paper: S. Huang, W. L. Leong, R. S. H. Teo, "3D Multi-Camera Coverage Control of Unmanned Aerial Multirotors", 2021 International Conference on Unmanned Aircraft Systems (ICUAS), pp. 877-884, 2021.

3. Please add a section “Discussion” to your paper. In line with the journal's guidelines in the discussion section: “Authors should discuss the results and how they can be interpreted in perspective of previous studies and of the working hypotheses. The findings and their implications should be discussed in the broadest context possible and limitations of the work highlighted. Future research directions may also be mentioned. This section may be combined with Results.”

In the revised version, we have combined this section with the conclusion section.

4. How is the problem of recognizability of the studied phenomenon in images from various drones solved? Images are acquired from different heights and distances, with different resolutions, and vary in terms of distortion. Assuming that the drone-camera system, treated as a unit in the experiment, is a dynamic system moving in space, the displacement vector of the system should be taken into account when determining the FOV range of individual cameras.

Thank you for your question. In the revised version, we discuss this point. In fact, your question concerns a 3D coverage issue, for which we have given a preliminary result (see the answer to point 2).

5. It should be noted that deformation of reality may arise in the acquired image, resulting from variation of the tilt angle and rotation of the camera (movement on the gimbal). Optimizing the position of the camera in terms of maximizing the area covered by monitoring may have a negative impact on the recognition of the studied phenomena in the acquired image.

Yes, we agree with your comments. The deformation will affect the coverage. In the future, we will consider some constraints in the optimal coverage such that the image quality of the covered area can be ensured.

6. Does the research show how the optimal arrangement of cameras on many drones is synchronized in time? Could this be added to your paper?

The optimal coverage is checked by each UAV. When the user-defined coverage ratio is reached, the optimal coverage is achieved. In practical situations, there may be some delays due to communication latencies.

7. Please correct the references according to the guidelines “And add the note to the text together with the citation: … recently reported by Díaz et al. [10]”. Applies to entries in lines 48, 49, 50, 51, 55 and further in the text of the paper.

We have revised these parts.

8. Please add the name of the section "References" in line 412.

In the revised version, we have added it.

9. Please correct the list of cited authors with regard to self-citations: of the 23 literature items, 7 include publications by one of the co-authors (over 30%).

In the revised version, two of our papers have been removed from the reference list.

