Article
Peer-Review Record

AMDNet: A Modern UAV RGB Remote-Sensing Tree Species Image Segmentation Model Based on Dual-Attention Residual and Structure Re-Parameterization

Forests 2023, 14(3), 549; https://doi.org/10.3390/f14030549
by Haozhe Huang 1,†, Feiyi Li 1,†, Pengcheng Fan 2, Mingwei Chen 1, Xiao Yang 1, Ming Lu 1, Xiling Sheng 1,3, Haibo Pu 1,3,* and Peng Zhu 2,*
Reviewer 1:
Reviewer 2:
Submission received: 18 January 2023 / Revised: 24 February 2023 / Accepted: 27 February 2023 / Published: 10 March 2023
(This article belongs to the Section Forest Inventory, Modeling and Remote Sensing)

Round 1

Reviewer 1 Report

The manuscript entitled "AMDNet: Modern UAV RGB Remote Sensing Tree Species Image Segmentation Model Based on Dual Attention Residual and Structure Re-parameterization" investigates tree species image segmentation from UAV data using a deep learning model, AMDNet. The manuscript is in good shape, but the dual residual attention module is nothing new, as it is widely applied in the deep learning domain. The authors should focus on what they have done and what they have found, or otherwise on the improvement gained by adding the dual residual attention module. The language of the manuscript needs careful editing. I recommend major revision, as there are some issues (see the specific comments below) that need to be handled before final acceptance.

Specific comments:

1. Line 57. A UAV mounted with a variety of sensors has become a promising platform for collecting high-spatial-resolution images. Please add the necessary references here; I recommend the following:

1) Estimating canopy-scale chlorophyll content in apple orchards using a 3D radiative transfer model and UAV multispectral imagery

2) Machine Learning-Based Approaches for Predicting SPAD Values of Maize Using Multi-Spectral Images

3) The use of unmanned aerial vehicles (UAVs) for remote sensing and mapping

2. Line 62. The references cited here are all unrelated to medicine. This should be revised.

3. Lines 81 and 83. Use a uniform format for author names across all references: either full names or family name et al.

4. Line 156. Since the altitude ranges from 900 to 3522 m, the relative altitude between the UAV and the ground varies, which will cause large differences in spatial resolution and further influence the accuracy of the results. Please clarify this point.

5. Line 204. Please revise this sentence.

6. Improve the quality of all figures.

7. In Section 2, it is necessary to state whether the UAV imagery was collected while hovering or along a route. If along a route, a map of the UAV imagery of the study region, ideally accompanied by an elevation map of the study area, should be included in Figure 1. If hovering, the time range of the image collections, as well as the radiometric correction method employed, should be provided.

8. The effects of canopy gaps and reflections were not properly considered in this study; these regions are classified as trees or roads in Figure 8.

9. The tree species depicted in Figure 8 deviate from the description in Table 1, and the distribution of each tree species is also missing. It is also suggested to include an examination of the reasons for this distribution.

10. In Table 7, the differences in accuracy among EncNet, SegFormer, and AMDNet* are quite small.

 

Author Response

Dear Reviewers:

On behalf of my co-authors, I thank you very much for the opportunity to revise our manuscript. We sincerely appreciate your constructive comments and suggestions on our manuscript entitled "AMDNet: Modern UAV RGB Remote Sensing Tree Species Image Segmentation Model Based on Dual Attention Residual and Structure Re-parameterization" (ID: forests-2199343).

The comments have been very helpful for improving the quality of our manuscript. We have done our best to revise the manuscript accordingly; revisions addressing the comments are marked in purple, and English-language revisions are marked in red. In the attached document, we respond to the reviewers' questions one by one.

Again, we express our great appreciation to you and the reviewers for the comments on our paper. We look forward to your feedback.

 

 

Thank you and best regards.

 

Yours sincerely,

 

Haibo Pu

Professor, Internet of Things Engineering

Corresponding author:

Name: Haibo Pu

E-mail: [email protected]

Author Response File: Author Response.docx

Reviewer 2 Report

This paper proposes a dual attention residual network (AMDNet) and a model re-parameterization approach to improve tree species segmentation and classification from UAV data. Compared with current models such as UNet, the authors' model performs better in speed and accuracy and is easier to deploy, providing a new approach to tree species classification from remote sensing imagery in practical applications.

In the results section, the authors present information that fits better in, and is necessary for, the materials and methods section. Although the materials and methods section is impressively detailed, it lacks information on research validation, which is instead given in the results section.

The results section could be shortened and focused on the research results rather than on the methods applied or on discussion, as there are proper sections for that information. Reorganizing this section would make it more focused, more interesting, and easier for readers to reproduce.

Lines 162 to 170: there are some typing errors in the species names, which should also be set in italics.

Fig. 1: the quality of images b) to f) should be improved, and the figure caption should show the correct scientific names of the species.

Line 185: there is a typing error here that also occurs in other parts of the paper (e.g., "…ma-nipulations…"; see also lines 279 and 298).

Table 1: use only the scientific names of the species.

Line 329: the citation is incorrect ("In 2022, Liu et al").

Lines 468 to 478: here in the results, the authors present information that fits better in, and is necessary for, the materials and methods section.

Line 618: this is the discussion section, not the conclusion.

Lines 619 and 620: the authors should remove the template guidance text.

Author Response

Dear Reviewers:

On behalf of my co-authors, I thank you very much for the opportunity to revise our manuscript. We sincerely appreciate your constructive comments and suggestions on our manuscript entitled "AMDNet: Modern UAV RGB Remote Sensing Tree Species Image Segmentation Model Based on Dual Attention Residual and Structure Re-parameterization" (ID: forests-2199343).

The comments have been very helpful for improving the quality of our manuscript. We have done our best to revise the manuscript accordingly; revisions addressing the comments are marked in purple, and English-language revisions are marked in red. In the attached document, we respond to the reviewers' questions one by one.

Again, we express our great appreciation to you and the reviewers for the comments on our paper. We look forward to your feedback.

 

 

Thank you and best regards.

 

Yours sincerely,

 

Haibo Pu

Professor, Internet of Things Engineering

Corresponding author:

Name: Haibo Pu

E-mail: [email protected]

Author Response File: Author Response.docx

Round 2

Reviewer 1 Report

The manuscript was much improved as suggested. I would recommend the acceptance of this manuscript.

Author Response

Dear Reviewers:

On behalf of my co-authors, I thank you very much for your approval of our paper. We sincerely appreciate your constructive comments and suggestions on our manuscript entitled "AMDNet: Modern UAV RGB Remote Sensing Tree Species Image Segmentation Model Based on Dual Attention Residual and Structure Re-parameterization" (ID: forests-2199343).

Thank you and best regards.

Yours sincerely,
