Article
Peer-Review Record

Retrieval of Fractional Vegetation Cover from Remote Sensing Image of Unmanned Aerial Vehicle Based on Mixed Pixel Decomposition Method

by Mengmeng Du 1,*, Minzan Li 2, Noboru Noguchi 3, Jiangtao Ji 1 and Mengchao (George) Ye 4
Reviewer 1:
Reviewer 2: Anonymous
Submission received: 6 December 2022 / Revised: 26 December 2022 / Accepted: 5 January 2023 / Published: 7 January 2023

Round 1

Reviewer 1 Report

Thank you for submitting your paper. The objective of the presented study is to improve the inversion accuracy of wheat plant density by using mixed pixel decomposition to determine fractional vegetation cover. The article's aim is of interest to the scientific community. However, a few issues must be resolved before the manuscript can be considered for publication. Details follow:

1. Please use a more specific term instead of "remotely sensed" for captured images.

2. Please define all abbreviations at their first use in the text, such as "DNs".

3. The abstract includes too many results; please reconsider it.

4. Please describe the numbers "0.118, 0.055". Are these digital numbers? If so, please use the decimal range for averaged values (0-255).

5. Please use detailed figure legends, including units and percentages. Please expand the figure captions to give detailed information about each chart.

6. It is not easy to compare the results from the tables. Please use normalized data for the results.

7. The authors claim "The altitude of drone flight was set to 80 m above ground level to acquire aerial images with a high spatial resolution of about 2.5 cm". The image in Figure 2 has a lower resolution than expected. Please describe whether any external noise was introduced by the imaging device.

8. In Table 4, please express the spectral bands as ranges of wavelengths in nm/µm.

9. In Figure 4, the legends are blurry.

10. In general, the charts are blurry; please check the quality of the figures.

11. References are adequate.

12. Please compare your results with recent studies in this field. It is important to expand the comparison by including state-of-the-art studies.

Author Response

Dear reviewer, thank you for your valuable comments. The paper has been revised according to your suggestions.

1. Please use a more specific term instead of "remotely sensed" for captured images.

Corrected. The term "remote sensing image" is now used instead of the previous "remotely sensed image".


2. Please define all abbreviations at their first use in the text, such as "DNs".

Corrected. It should be "reflectance", not DN, as the raw drone image was calibrated using two reference panels, as described in Lines 155-157.

3. The abstract includes too many results; please reconsider it.

Corrected. The contents of the abstract were rearranged, and the "comparative study" part was deleted.

4. Please describe the numbers "0.118, 0.055". Are these digital numbers? If so, please use the decimal range for averaged values (0-255).

Corrected. We apologize for the misleading description. These numbers are not DNs of the raw image but reflectance values calibrated using two reference panels. We therefore corrected the header of Table 4 by replacing "DN" with "reflectance".
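The two-panel calibration the authors describe corresponds to the standard empirical line method. A minimal sketch of such a DN-to-reflectance conversion is shown below; the panel DNs and reflectance values are hypothetical placeholders, not values from the paper:

```python
import numpy as np

# Empirical line calibration: map raw digital numbers (DNs) to reflectance
# using two reference panels of known reflectance (values are hypothetical).
panel_dn = np.array([30.0, 200.0])          # mean DN over dark / bright panels
panel_reflectance = np.array([0.05, 0.60])  # lab-measured panel reflectance

# Fit a linear model: reflectance = gain * DN + offset, through the two panels.
gain, offset = np.polyfit(panel_dn, panel_reflectance, 1)

def dn_to_reflectance(dn):
    """Convert a DN array for one spectral band into reflectance."""
    return gain * np.asarray(dn, dtype=float) + offset

band = np.array([[25, 90], [150, 210]])     # toy 2x2 band image
print(dn_to_reflectance(band))
```

With only two panels the fit is exact through both calibration points; with more panels the same `np.polyfit` call would return a least-squares line.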

5. Please use detailed figure legends, including units and percentages. Please expand the figure captions to give detailed information about each chart.

The horizontal axis of the chart is FVC (fractional vegetation cover), which is dimensionless and refers to the fraction of the land surface covered by green foliage in the two-dimensional plane, as described in Lines 78-79.

The vertical axis is the wheat plant density, with units of plants/m².

The caption was expanded as “Wheat plant density inversion models based on FVC values calculated by using different methods.”

The note was also complemented with "y1, y2, and y3 indicate the predicted wheat plant density from FVCMPD, FVCSVM, and FVCIT, respectively."
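The quantities described above can be sketched in a few lines of code; the binary vegetation mask and the regression coefficients below are made-up placeholders for illustration, not the paper's fitted model:

```python
import numpy as np

# FVC is the fraction of pixels classified as green vegetation (dimensionless).
# The mask below is a hypothetical binary classification (1 = vegetation).
veg_mask = np.array([[1, 0, 1],
                     [1, 1, 0],
                     [0, 1, 1]])
fvc = veg_mask.mean()        # fraction of vegetated pixels

# A plant-density inversion model, assumed linear here for illustration:
# density = a * FVC + b (coefficients a, b are hypothetical).
a, b = 350.0, 20.0
plant_density = a * fvc + b  # plants/m^2
print(f"FVC = {fvc:.3f}, predicted density = {plant_density:.1f} plants/m^2")
```

The same pattern applies to each of the three models (y1, y2, y3): only the FVC input changes depending on which segmentation method produced the mask.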

6. It is not easy to compare the results from the tables. Please use normalized data for the results.

To make the contents more straightforward, residual plots were added as Figure 8.

7. The authors claim "The altitude of drone flight was set to 80 m above ground level to acquire aerial images with a high spatial resolution of about 2.5 cm". The image in Figure 2 has a lower resolution than expected. Please describe whether any external noise was introduced by the imaging device.

As a matter of fact, the spatial resolution of the remote sensing image is indeed about 2.5 cm, and all images are consistent in this respect. No alteration or external noise affected the spatial resolution of the remote sensing image in this study. The authors suspect the apparent blur results from mosaicking artifacts that become visible when the image is zoomed in too far for visual interpretation.

8. In Table 4, please express the spectral bands as ranges of wavelengths in nm/µm.

Color images from consumer-level cameras include three channels: blue, green, and red. In most cases, however, the specific spectral range of each channel is not disclosed by the manufacturer. In this study, we used a DJI Mini drone with a CMOS imaging sensor to capture color images of the wheat field. The image was processed in the ArcGIS environment to obtain three individual band images (blue, green, and red), but the spectral range information is still unavailable. Therefore, like many other agricultural remote sensing studies, the authors only denoted the band names without spectral ranges.

9. In Figure 4, the legends are blurry.

Corrected.

10. In general, the charts are blurry; please check the quality of the figures.

Corrected.

11. References are adequate.

 

12. Please compare your results with recent studies in this field. It is important to expand the comparison by including state-of-the-art studies.

Corrected. Lines 397-417 were added.


Reviewer 2 Report

1. The authors are suggested to add a color figure for Figure 1.

2. It is suggested to mention the spectral bands (RGB/MS), flying height, and GSD in Table 1.

3. Limitations of the study need to be discussed.

4. The Discussion can be slightly more elaborate. Also, some lines of the present Discussion should be in the Introduction/literature review. For example, lines 331-354 are more suitable in the later part of the Introduction section. This will improve the flow.

5. In many places the authors use the term "drone"; it is suggested to choose terminology as per the paper "RPV, UAV, UAS, RPAS ... or just drone?" (https://doi.org/10.1111/phor.12244).

6. The literature review can be more extensive.

7. Some sentences need restructuring:
a. Lines 88-89: "Pixels that contain one kind of ground object are endmembers, while mixed pixels include two or more kinds of ground object".

b. Line 109: "very few literatures" - in this sense the uncountable "literature" is generally used. Try to rephrase the sentence to communicate the meaning properly.

c. In many places sentences start with the word "And", which should be avoided.

8. The authors can provide a table summarizing MPD models applied to UAV images in the literature review.

9. For the comparative study the authors used only one machine learning algorithm, the SVM method, but there are many machine learning algorithms in the literature that could also be used for comparison.

 

Author Response

Dear reviewer, thank you for your valuable comments. The paper has been revised according to your suggestions.

1. The authors are suggested to add a color figure for Figure 1.

Figure 1 is indeed a color figure. The limited figure size may make the green pixels hard to distinguish, but it is the original remote sensing image from the drone.

2. It is suggested to mention the spectral bands (RGB/MS), flying height, and GSD in Table 1.

Corrected. The spectral bands and GSD were added in Table 1.

3. Limitations of the study need to be discussed.

Corrected. Limitations of the study were added in Lines 434-437.

4. The Discussion can be slightly more elaborate. Also, some lines of the present Discussion should be in the Introduction/literature review. For example, lines 331-354 are more suitable in the later part of the Introduction section. This will improve the flow.

Corrected. Lines 331-354 were moved to the Introduction section. More literature was added to the Discussion section in Lines 397-417.

5. In many places the authors use the term "drone"; it is suggested to choose terminology as per the paper "RPV, UAV, UAS, RPAS ... or just drone?"

Corrected. The authors used the term "remote sensing image of drone" for the inversion of wheat plant density, and replaced it elsewhere with the widely used term "UAV remote sensing image" when quoting other references.

6. The literature review can be more extensive.

Corrected. More literature review was added to the Introduction and Discussion sections.

7. Some sentences need restructuring:
a. Lines 88-89: "Pixels that contain one kind of ground object are endmembers, while mixed pixels include two or more kinds of ground object".
b. Line 109: "very few literatures" - try to rephrase the sentence to communicate the meaning properly.
c. In many places sentences start with the word "And", which should be avoided.

Corrected.

a. Changed to “Endmembers only contain one object, while mixed pixels include two or more kinds of objects”

b. This sentence was deleted.

c. The conjunction word was modified according to the context.

8. The authors can provide a table summarizing MPD models applied to UAV images in the literature review.

A literature review on MPD models applied to UAV images was added as Lines 113-135. However, because the models, methods, and study objects vary widely across studies, the authors could not construct a suitable table comparing all of this literature.

9. For the comparative study the authors used only one machine learning algorithm, the SVM method, but there are many machine learning algorithms in the literature that could also be used for comparison.

There are numerous models that can be used to segment images, but image segmentation is not the main topic of this study. In this study, the authors introduced and evaluated the mixed pixel decomposition method for estimating wheat plant density. SVM is a widely used machine learning model, so the authors used it as a comparison with the proposed method.
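As a sketch of how an SVM baseline of this kind is typically set up with scikit-learn (the per-pixel reflectance samples and labels below are made up for illustration; they are not the paper's training data):

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical training samples: per-pixel (R, G, B) reflectance features
# labeled 1 = vegetation, 0 = background.
X_train = np.array([
    [0.05, 0.30, 0.08],   # green wheat pixels
    [0.06, 0.28, 0.07],
    [0.25, 0.22, 0.18],   # bare-soil pixels
    [0.30, 0.26, 0.22],
])
y_train = np.array([1, 1, 0, 0])

# Train an RBF-kernel SVM classifier on the labeled pixels.
clf = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)

# Classify a toy set of pixels, then compute FVC as the vegetated fraction.
pixels = np.array([[0.05, 0.31, 0.09],
                   [0.28, 0.24, 0.20]])
labels = clf.predict(pixels)
fvc_svm = labels.mean()
print(labels, fvc_svm)
```

Any other classifier (random forest, k-means thresholding, etc.) could be dropped into the same pipeline, which is why the choice of comparison model is orthogonal to the MPD method itself.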

 

Round 2

Reviewer 1 Report

The revised version of the manuscript is improved. A few minor changes are required. Please adjust the legends and captions of Fig. 3, which are not clear to read. Please add units to the figure axes.

In Fig. 5, the letters are in different font styles.

As a suggestion, please use a part of the whole map to show a comparison between the actual image and the processed image.


Author Response

Dear reviewer,

Thanks again for your valuable comments.

Both Figure 3 and Figure 5 were renewed.

A comparison of the enlarged wheat plant density map with the actual field image was also added as Figure 9.

Best regards.

 

Reviewer 2 Report

Still, in many places, the term "drone" is used. In the previous review report, I suggested that the authors go through the following paper for terminology:

RPV, UAV, UAS, RPAS ... or just drone? (https://doi.org/10.1111/phor.12244)


Author Response

Dear reviewer,

The authors apologize for overlooking the term "drone" in many places.

This issue was resolved by replacing the term "drone" with "UAV" throughout.

Thanks again.
