Peer-Review Record

Optimal Multi-Sensor Obstacle Detection System for Small Fixed-Wing UAVs

Modelling 2024, 5(1), 16-36; https://doi.org/10.3390/modelling5010002
by Marta Portugal and André C. Marta *
Submission received: 12 October 2023 / Revised: 8 December 2023 / Accepted: 12 December 2023 / Published: 20 December 2023

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

The paper mentions the optimization of sensor configurations for UAVs, but the primary contributions are not explicitly stated in the abstract. Is this optimization approach a novel contribution in the realm of UAV sensor configurations? It would be beneficial to clearly highlight this in the abstract.

Methodology:

The authors introduced a "scenario generation" algorithm, but it lacks a detailed description. To enhance reproducibility, it would be prudent to provide more explicit steps or an algorithmic flowchart, as sketched below.
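
For illustration, a minimal sketch of the kind of explicit steps being requested; the function, its parameters, and the circular-obstacle representation are hypothetical assumptions, not the authors' algorithm:

    # Hypothetical sketch only: random 2D scenarios with circular obstacles.
    # generate_scenario, its parameters, and the (x, y, radius) representation
    # are illustrative assumptions, not taken from the paper.
    import random

    def generate_scenario(n_obstacles, area=500.0, seed=None):
        """Return a list of (x, y, radius) obstacles inside an area x area square."""
        rng = random.Random(seed)
        return [
            (rng.uniform(0.0, area), rng.uniform(0.0, area), rng.uniform(5.0, 30.0))
            for _ in range(n_obstacles)
        ]

    scenario = generate_scenario(n_obstacles=8, seed=42)  # reproducible test case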

Why was the Genetic Algorithm chosen as the optimization technique? Were other contemporary optimization techniques considered? The rationale behind this selection needs to be more transparent.

Results and Discussion:

When discussing the performance of different sensors, it would be helpful to provide a table or graphical representation detailing the performance of ultrasonic sensors, laser rangefinders, LIDAR, and RADAR across various scenarios.

The paper refers to certain optimized configurations, as illustrated in Figure 12, but it lacks a comprehensive explanation as to why these configurations are deemed optimal. Additional data or experiments might be necessary to substantiate these conclusions.

Conclusion:

While the conclusion acknowledges LIDAR as an effective sensor, has consideration been given to its stability, cost, and other potential challenges in real-world scenarios? It would be apt to discuss these practical aspects in the conclusion section.

References:

The paper cites numerous external sources, but it does not explicitly state the specific contributions of each reference within the text. It would be valuable to pinpoint the key content of these references in the main text.

In summary, while the subject and methodology of the paper hold value, there's a need for further refinement and detail to ensure the quality and accuracy of the article.

For literature, the authors can consider referring to:

"

Tahir, A., Böling, J., Haghbayan, M. H., Toivonen, H. T., & Plosila, J. (2019). Swarms of unmanned aerial vehicles—A survey. Journal of Industrial Information Integration, 16, 100106.

Yasin, J. N., Mohamed, S. A., Haghbayan, M. H., Heikkonen, J., Tenhunen, H., & Plosila, J. (2020). Unmanned aerial vehicles (UAVs): Collision avoidance systems and approaches. IEEE Access, 8, 105139-105155.

Lu, Y., Xue, Z., Xia, G. S., & Zhang, L. (2018). A survey on vision-based UAV navigation. Geo-Spatial Information Science, 21(1), 21-32.

Jiang, S., Tarabalka, Y., Yao, W., Hong, Z., & Feng, G. (2023). Space-to-speed architecture supporting acceleration on VHR image processing. ISPRS Journal of Photogrammetry and Remote Sensing, 198, 30-44.

Sareh, P., Chermprayong, P., Emmanuelli, M., Nadeem, H., & Kovac, M. (2018). Rotorigami: A rotary origami protective system for robotic rotorcraft. Science Robotics, 3(22), eaah5228.

Wang, C., Wang, J., Wang, J., & Zhang, X. (2020). Deep-reinforcement-learning-based autonomous UAV navigation with sparse rewards. IEEE Internet of Things Journal, 7(7), 6180-6190.

Fraga-Lamas, P., Ramos, L., Mondéjar-Guerra, V., & Fernández-Caramés, T. M. (2019). A review on IoT deep learning UAV systems for autonomous obstacle detection and collision avoidance. Remote Sensing, 11(18), 2144.

Xie, R., Meng, Z., Wang, L., Li, H., Wang, K., & Wu, Z. (2021). Unmanned aerial vehicle path planning algorithm based on deep reinforcement learning in large-scale and dynamic environments. IEEE Access, 9, 24884-24900.

Zhao, C., Fu, C., Dolan, J. M., & Wang, J. (2021). L-shape fitting-based vehicle pose estimation and tracking using 3D LiDAR. IEEE Transactions on Intelligent Vehicles, 6(4), 787-798.

García-Fernández, M., López, Y. Á., Arboleya, A., González-Valdés, B., Rodríguez-Vaqueiro, Y., Gómez, M. E. D. C., & Andrés, F. L. H. (2017). Antenna diagnostics and characterization using unmanned aerial vehicles. IEEE Access, 5, 23563-23575.

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 2 Report

Comments and Suggestions for Authors

1. In the legend of Figure 6, what does the red line refer to?

2. The optimization problem in Equation (9) is overly simplistic; I do not think the contribution of this paper is significant given this formulation. In addition, using a GA is not a good idea. This problem can be formulated as a quadratic program, which can be solved with mathematical optimization solvers and libraries such as Gurobi and CPLEX, so the global optimal solution can very likely be obtained. There is no need to use metaheuristic algorithms to approximate the optimal solution.
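
For illustration, a minimal sketch of such a quadratic-programming formulation using the CVXPY library; the matrices Q, c, A, and b are random stand-ins, since the actual data of Equation (9) are not reproduced here:

    # Illustrative only: a generic convex QP solved to global optimality.
    # Q, c, A, b are random stand-ins for the data of Equation (9).
    import numpy as np
    import cvxpy as cp

    n = 4                                   # number of decision variables (assumed)
    rng = np.random.default_rng(0)
    M = rng.standard_normal((n, n))
    Q = M.T @ M + np.eye(n)                 # symmetric positive definite => convex QP
    c = rng.standard_normal(n)
    A = rng.standard_normal((2, n))
    b = np.ones(2)

    x = cp.Variable(n)
    problem = cp.Problem(cp.Minimize(0.5 * cp.quad_form(x, Q) + c @ x), [A @ x <= b])
    problem.solve()                         # a convex solver returns the global optimum
    print(x.value, problem.value)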

Comments on the Quality of English Language

Moderate editing of the English language is required.

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 3 Report

Comments and Suggestions for Authors

The article and the research topic are interesting. Still, at the beginning of the article, there is no explanation for the following questions: 1) why did the authors limit themselves to fixed-wing UAVs when detecting obstacles? 2) why is the problem solved only in 2D? 3) why is the entire calculation performed offline? In the introduction, the authors should better describe the concept of their solution, comment on these questions, and give an opinion, even if the proposed solution has problems, clarifying them appropriately. As it stands, the reader only learns about these limitations in the last chapters, which is not good. Throughout the article, I repeatedly noted missing information regarding the critical criterion for obstacle detection, namely what detection time must be met. It would be ideal to explain the entire concept of the solution at the very beginning, with an explanation of the individual limitations and how they could be resolved later.

In the 1st row - what impact will restricting the simulation to this UAV have on the study's results? Will they also apply to other types of UAVs, or will it be possible to modify the input parameters in simulations of different UAVs?

In the 32nd-34th rows – are you also taking legislative restrictions into account (direct visibility, maximum distance from the pilot, etc.)? What the system allows is one thing; the legislative restrictions are another.

In the 35th row, please first define the term fixed-wing UAV.

In the 37th row – does detection take place during the flight/simulation, i.e., with real-time evaluation and a critical detection time taken into account? Or does the entire calculation take place offline?

In the 49th row – Sense and Avoid System – this system is mentioned in several places in the article. Is this the name of a specific design or solution? Did you use it in the experiment?

In the 52nd row - on what basis did you choose these sensors? Did you not consider a camera, for example?

In the 3rd figure - why do you not include the third dimension? The UAV has a certain height, and the obstacle's edge will be partially vertical.

In the 86th row - why only the middle of the obstacle? I think the obstacle's edge should be evaluated as a collision.

In the 105th row, it would be good to supplement the diagram with markings of the beta angles to make the problem easier to understand.

The text between formulas (1b) and (2) – a diagram describing the variables would also suit this text.

In the 114th-116th rows, the lidar output is in polar coordinates, i.e., angles and ranges.

In the 117th row, please specify the numerical value for fair distances.

In the 120th row - please clarify. Either you are considering a 3D lidar such as a Velodyne or similar (I am afraid the one listed in Table 1 is not 3D), or you want to use a 2D lidar and tilt it with a mount - that would be problematic from the point of view of data synchronisation, and especially in terms of calculation speed and the construction of 3D point clouds.

In the 122nd-125th rows - in the introduction of the article, where the problem is divided into four parts, it will be necessary to explain in more detail what you do after detecting an obstacle. If I understand correctly, you try to see the entire obstacle before recalculating the trajectory. Is that feasible given the acceptable system delay in the face of a potential collision?

- connecting the first and last detected points can only work for regular (symmetrical) obstacles where most of the obstacle is visible. This will not be a frequent situation in outdoor conditions.

In the 4th figure – the legend is missing the UAV symbol and the measured lidar beams, and the symbol for the detected point is incorrect.

-          Please explain the difference between the obstacle model and the obstacle.

In the 128th-133rd rows, please clarify how the trajectory is recalculated when an obstacle is detected. What is the criterion for the distance of the trajectory from the obstacle? If you consider the obstacle to have a certain radius, do you enlarge that radius and design a new trajectory that does not intrude on it? This concept should be explained.

In the 145th row – "output from radar are polar" – the output from the lidar is also in polar coordinates, and with a known angle you can convert the outputs even for laser rangefinders.
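
For reference, the conversion in question is elementary; a minimal sketch, assuming a range r in metres and a bearing theta in radians in the sensor frame (mounting offsets ignored):

    # Minimal polar-to-Cartesian conversion for a single (range, bearing) return.
    import math

    def polar_to_cartesian(r, theta):
        """Map a range/bearing measurement to x, y in the sensor frame."""
        return r * math.cos(theta), r * math.sin(theta)

    x, y = polar_to_cartesian(42.0, math.radians(30.0))  # example return at 30 degrees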

In the 146th-147th rows – could you please explain the sentence "Due to its straightforward implementation, ..." in more detail?

Formula 7 – stating only a covariance matrix says very little about the KF. You could at least state what the subject of the estimate is.
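
For context, a generic constant-velocity Kalman filter sketch is given below (an assumed example, not the authors' filter); here the subject of the estimate is the state [position, velocity], the kind of information that should be stated alongside the covariance in Formula 7:

    # A generic constant-velocity Kalman filter (illustrative, not the paper's):
    # the subject of the estimate here is the state [position, velocity].
    import numpy as np

    dt = 0.1                                 # sample time (assumed)
    F = np.array([[1.0, dt], [0.0, 1.0]])    # state transition model
    H = np.array([[1.0, 0.0]])               # only position is measured
    Q = 0.01 * np.eye(2)                     # process noise covariance
    R = np.array([[0.5]])                    # measurement noise covariance

    def kf_step(x, P, z):
        """One predict/update cycle; P is the estimate covariance (cf. Formula 7)."""
        x, P = F @ x, F @ P @ F.T + Q        # predict state and covariance
        S = H @ P @ H.T + R                  # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
        x = x + K @ (z - H @ x)              # update state with measurement z
        P = (np.eye(2) - K @ H) @ P          # update covariance
        return x, P

    x, P = kf_step(np.zeros(2), np.eye(2), np.array([1.2]))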

In the 155th row – synchronisation and the mutual transformation of measurements into one coordinate system are a much bigger problem when combining several sensors.

In the 168th row - "optimal Sensing System" - it is necessary to describe how you generated the individual scenarios and to specify the conditions that must be met, at a minimum the critical time for detecting the obstacle and how the obstacle is to be modelled, since these drive the replanning of the route. Next, explain why you created the individual scenarios and with what parameters. Have you completed your program? If there are problems with obstacle detection, can it be used for other UAVs as well?

In the 171st row, you also address moving obstacles in the study. How do you recalculate the trajectory when an obstacle is moving?

In the 175th row - you could at least describe the principle of the scenario generation instead of only referring to the dissertation.

In the 5th figure - it would be good to describe the picture;

- the colour of the line is lost in the picture; choose colours with higher contrast, or thicker data lines and larger markers;

- the legend sits outside the picture, so it looks like another calculated point.

In the 191st row - please explain the term safety radius.

In the 6th figure – add the symbol for the UAV to the legend; the sign for the obstacle is a circle. Why does the designed trajectory pass through the obstacle?

In the 201st and 202nd rows – I think limiting the solution to 2D is a problem. How can you solve obstacle avoidance only in 2D? Do you consider only obstacles with perfectly vertical walls? This must be justified.

In the 205th row - why did you choose this particular UAV model? What limitations does this choice impose on the results of your work? Can the user enter UAV parameters when generating scenarios?

In the 2nd table - please explain the parameters in the second and third columns.

Please explain the terms safety radius and collision radius in the 208th and 209th rows.

In the 7th figure - should the metric function f(beta) be written in the description of the Y axis?

In the 214th row – "range of rangefinders 100 m": at a speed of 15 m/s, the UAV covers this distance in about 6.7 seconds, which is the time available to detect the obstacle and recalculate the route. A similar evaluation would also be suitable for the other sensors.

In the 217th and 218th rows - I do not understand what happens once the UAV takes off: you cannot change the position/orientation of the sensors then, or will there be a servo for this purpose?

In the 226th-228th rows - more critical are the parameters of the on-board computing platform on which the detection itself will run; or how is this intended? Detection will have to run in real time, and the parameters of the specific platform used on the UAV should probably be chosen during the simulation.

In the 236th row – a sensor range of 6 m is very small for detecting obstacles, given the speed of the UAV.

In the 8th figure - is the sensor range drawn in proportion to the dimensions of the UAV? The manufacturer provides the basic dimensions of the UAV;

- the legend is incomplete;

- move the angle labels onto a white background for better readability.

In the 252nd row - is the entire calculation performed offline? This will have to be explained in the article's introduction: why this approach, and above all, what is the significance of detecting obstacles, for example moving ones, if the calculation is done offline?

In the 289th row - do you also evaluate false positive obstacle detections?

In the 290th row - please specify the sensors' required angular and range accuracy. Is this a significant parameter, or is the measurement speed more important?

In the 304th row - what is the significance of changing the orientation of the lidar if we are talking about a solution in the horizontal plane? That is just a matter of azimuth processing; the obstacle will remain visible in the measurements even after the lidar is turned. A problem can only arise if a narrow FOV is set because of the measurement speed.

In the 312th row - you did not consider using a camera (monocular/stereo). There are small, powerful cameras with low power consumption and an autofocus function. The problem may be the processing time, but that will probably also be the case with radar and lidar.

In the 342nd row - you could plot the most suitable configuration on the detailed diagram of the UAV that the manufacturer offers on its site, together with the power supply and the locations of the sensors and the evaluation platform.

In the 364th row - as part of future work, it will mainly be necessary to address the limitation of the work to a 2D solution and to a calculation that runs offline, i.e., the optimal route proposal. I may have misunderstood, but could you please clarify?

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Round 2

Reviewer 2 Report

Comments and Suggestions for Authors

Thanks for the authors' response; most of my comments have been addressed. However, there are still some issues:

Although the authors consider the GA very suitable for this problem, I still think it is necessary to compare the optimization result with the global optimum. There is no real-time requirement for this problem, so there is no need to sacrifice accuracy to reduce computation time. How close to optimal is the result? That is the most crucial question.
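
A minimal sketch of the requested check: exhaustively enumerate a discretized configuration space to obtain the exact optimum, then report the GA's optimality gap. The objective, the candidate grid, and the GA result below are hypothetical placeholders, not the paper's:

    # Hypothetical sketch: brute-force global optimum vs. a GA result.
    import itertools

    sensors = ["lidar", "radar", "rangefinder"]    # candidate sensor types (assumed)
    angles = range(0, 360, 15)                     # candidate mounting azimuths, deg

    def evaluate(config):
        """Placeholder objective; the paper's true metric would go here."""
        sensor, angle = config
        return -abs(angle - 90) + (10 if sensor == "lidar" else 0)

    global_best = max(itertools.product(sensors, angles), key=evaluate)
    ga_best = ("radar", 75)                        # stand-in for the GA's answer
    gap = evaluate(global_best) - evaluate(ga_best)  # optimality gap to report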

Comments on the Quality of English Language

Moderate English revision is needed.

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Round 3

Reviewer 2 Report

Comments and Suggestions for Authors

No further questions
