Peer-Review Record

Research on Arc Sag Measurement Methods for Transmission Lines Based on Deep Learning and Photogrammetry Technology

Remote Sens. 2023, 15(10), 2533; https://doi.org/10.3390/rs15102533
by Jiang Song 1,2, Jianguo Qian 2, Zhengjun Liu 1, Yang Jiao 3, Jiahui Zhou 4, Yongrong Li 1,*, Yiming Chen 1, Jie Guo 3 and Zhiqiang Wang 3
Submission received: 23 February 2023 / Revised: 6 May 2023 / Accepted: 9 May 2023 / Published: 11 May 2023

Round 1

Reviewer 1 Report

The paper describes a method for measuring arc sag using drones. I think the method is theoretically applicable, but the article is missing some key information.

Here are my comments:

The article gives the formulas for the calibration but does not indicate how the camera calibration was actually performed. Was a calibration plate used, or was the calibration done during the bundle adjustment as part of the real project computation? From what distance was the calibration performed?

The drone used for imaging is a Phantom 4 RTK. From what altitude were the images or video taken? Please also add the flight path and other parameters, such as ground sample distance, camera settings, etc.

The article mentions the use of a bundle adjustment calculation (Fig. 7) but does not elaborate on how it was carried out. Please add this information.

In Table 5, errors of around 0.5 m are given, which is very large indeed. These figures give me the impression that the measurement is wrong. How were these values obtained? Please describe the calculation procedure in detail.

Author Response

Thank you very much for your comment. Please see the attachment for my modification instructions.

Author Response File: Author Response.docx

Reviewer 2 Report

This paper proposes an automatic spacer bar segmentation algorithm to extract the spacer bars and calculate their center coordinates from UAV inspection video data. The contributions of this paper are not well organized. If the main contribution is the CM-Mask-RCNN, it is too weak, as it merely combines existing works. There is no flowchart of the whole workflow, which leads to a poor reading experience. More specifically, here are some comments:

(1) Section 4.2 is too weak. How are the corresponding points determined automatically from the stereo images? What is the accuracy when only the segmentation results are used?

(2) The title of Table 5 is unclear. What is the true value: the coordinates or the length? Also, the evaluation criteria are not defined.

(3) The overall workflow is missing, and the structure of the method section is very poor; each section presents only its own workflow.

(4) Section 4.3: how is the XOY plane determined from the points (xi, yi, zi)?

(5) Experiments: please show a 3D visualization of the fitted transmission line.

(6) The title is not suitable and does not convey the novelty.

Author Response

Thank you very much for your comment. Please see the attachment for my modification instructions.

Author Response File: Author Response.docx

Round 2

Reviewer 1 Report

Please add to the article a justification for the large values in Table 7, similar to what was stated in the cover letter (Response 4, red color).

Author Response

Thanks for this suggestion. We have added relevant content on error source analysis in the article based on your suggestion. Please refer to the revised draft for detailed information.

Author Response File: Author Response.docx

Reviewer 2 Report

The authors have done good work.

Author Response

Thank you for your review.
