Article
Peer-Review Record

Automated Extraction of Forest Burn Severity Based on Light and Small UAV Visible Remote Sensing Images

Forests 2022, 13(10), 1665; https://doi.org/10.3390/f13101665
by Jiangxia Ye 1, Zhongyao Cui 1, Fengjun Zhao 2,* and Qianfei Liu 3,*
Reviewer 1:
Reviewer 2:
Submission received: 31 August 2022 / Revised: 30 September 2022 / Accepted: 6 October 2022 / Published: 10 October 2022
(This article belongs to the Section Natural Hazards and Risk Management)

Round 1

Reviewer 1 Report

The manuscript uses a decision tree method to map the damaged forest area from UAV images. It has some serious problems that need to be overcome before publication.

1. In the title, ‘Forest Damage Information’ has a broad meaning; the authors should make it clearer.

2. The abstract is confusing. For example, the expression ‘coarse spatial resolution of satellite image mapping’ is wrong in this manuscript; some satellite products obviously have very high spatial resolution and could be used for the task discussed in this study.

3. L54: ‘remote sensing indices’ should be ‘spectral indices’.

4. L102: what is the meaning of ‘However, little research has been done on recognizing vegetation damage levels and computer segmentation’?

5. L133: the eagle-eye map is not necessary.

6. Figures 3 and 4: what is the meaning of ‘count’?

7. Equation 2: what does the symbol XG mean? A similar problem exists in Equations 4, 5, 6, and 7 and in Table 3.

8. How were the thresholds in Table 3 determined?

9. The legend in Figure 6 should be rebuilt.

10. Lines 383-391: the font needs to be unified.

11. The authors should compare the results with some other methods, such as Random Forest, SVM, etc.

12. There are also some grammatical problems.

Author Response


  - The manuscript "Automated Extraction of Forest Damage Information from Fire Based on Small UAV Visible Remote Sensing Images" identifies the damage degree of forest trees after fire using an object-oriented method. The results show that the different types of forest damage, namely unburned, dead, damaged, and burned, can be recognized by the method mentioned above.

However, I have some comments to discuss with the authors:

Title
- In the title, ‘Forest Damage Information’ has a broad meaning; the authors should make it clearer.

Response: Thank you for your comments and suggestion. Forest damage information indeed has a broad meaning, but here we refer specifically to the level of forest damage from fire based on tree burn severity. According to China’s forestry standards [1], forest damage or burn severity from fire is divided into unburned, dead, damaged, and burnt. To clarify the topic, we changed “Forest Damage Information” to “Forest Burn Severity.” Meanwhile, we added one citation describing the classification criteria for the degree of tree damage from fire in the national forestry standards [1], which can be seen on page 4, Line 157. Additionally, we revised the term throughout the paper to make it consistent.

Abstract
- The abstract is confusing. For example, the expression ‘coarse spatial resolution of satellite image mapping’ is wrong in this manuscript; some satellite products obviously have very high spatial resolution and could be used for the task discussed in this study.

Response: We are sorry for expressing this without careful consideration and have revised it. We have reorganized the paper, especially the abstract, to avoid the reviewer's confusion. The objective of this paper is to explore whether light and small visible-band UAVs can replace the laborious manual field survey of forest burn severity required by the national forestry industrial standard, based on remote sensing indices and pattern recognition. We intended to express that UAV imagery has a higher resolution than satellite imagery, so that forest burn severity can be extracted more accurately. To avoid confusion, we deleted "coarse spatial resolution of satellite image mapping" and instead emphasized the "limitation of temporal resolution of satellite imagery and poor objectivity of visual interpretations", which can be seen on page 1, Line 13.

As to the discussion section, we revised it and added four new citations following the suggestion to make it more logical. We discussed the shortcomings of low- and medium-resolution satellite remote sensing images in forest fire research [2-3] and the advantages of high-resolution satellite remote sensing images, which nevertheless have their own disadvantages, such as high cost and difficulty of timely data acquisition [4-5]; this can be seen on page 14, Line 447.

Introduction
- L54: change "remote sensing indices" to "spectral indices".

Response: Thank you for your suggestion; we have corrected it.

- L102: what is the meaning of ‘However, little research has been done on recognizing vegetation damage levels and computer segmentation’?

Response: We are sorry for presenting our idea unclearly. Much previous research has focused on extracting burned forest areas, but few studies have emphasized the detailed forest damage degree within the fire area as needed for forest fire assessment. Moreover, the existing forest burn severity studies often extract light, medium, and heavy classes [6-7] rather than the substantive grades corresponding to the national forestry industrial fire assessment standards, which are more practical for forest management.

As to the method, there are many studies on forest fire investigation and assessment using pixel-based spectral classification, such as Maximum Likelihood (MLH), Spectral Angle Mapper (SAM), and the Normalized Difference Vegetation Index (NDVI) threshold method [8-10]. However, little work has focused on identifying forest burn severity with an object-oriented method, which has the advantage of integrating multiple image patterns such as spectral, texture, and spatial relationship features to assess damage at the object level rather than at the pixel level, where the salt-and-pepper phenomenon occurs. We have modified this, which can be seen on page 3, Line 102. A minimal sketch of the contrast is given below.
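For illustration only, the following sketch contrasts a pixel-based rule with an object-based rule on a visible-band image. It is not the manuscript's eCognition workflow: the file path "uav_rgb.tif", the SLIC superpixel segmentation, and the EXG threshold of 20 are all placeholder assumptions.

```python
# Minimal sketch: pixel-based vs object-based labelling of an RGB UAV image.
# Assumptions: "uav_rgb.tif" is a placeholder path; thresholds are illustrative only.
import numpy as np
from skimage import io, segmentation

img = io.imread("uav_rgb.tif").astype(float)          # H x W x 3 (R, G, B)
r, g, b = img[..., 0], img[..., 1], img[..., 2]

# Pixel-based rule: label each pixel independently (prone to salt-and-pepper noise).
exg_pixel = 2 * g - r - b
pixel_label = (exg_pixel > 20).astype(int)            # 1 = vegetated, 0 = other

# Object-based rule: segment first, then label whole objects from their mean features.
segments = segmentation.slic(img / 255.0, n_segments=500, compactness=10,
                             start_label=1, channel_axis=-1)
object_label = np.zeros_like(segments)
for seg_id in np.unique(segments):
    mask = segments == seg_id
    object_label[mask] = int(exg_pixel[mask].mean() > 20)   # one decision per object

print("pixel-level vegetated fraction :", pixel_label.mean())
print("object-level vegetated fraction:", object_label.mean())
```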

- L133: the eagle-eye map is not necessary.

Response: Thank you for your comments and suggestion. We have redrawn the figure without the eagle-eye map.

- Figures 3 and 4: what is the meaning of ‘count’?

Response: Figure 3 shows the average spectral and brightness response of the various land cover types. The y-axis represents the brightness and spectral value of a specific type of ground object in a single band of the UAV image.

Figure 4 shows the R, G, and B pixel distributions. The y-axis ('count') represents the number of pixels at a specific pixel value in the UAV image for the different types of objects.
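As a brief illustration of how these 'count' statistics arise (the image path and the empty class masks below are placeholder assumptions, not the manuscript's training samples): the per-class mean band values correspond to the curves in Figure 3, and the per-band histograms correspond to the pixel counts in Figure 4.

```python
# Sketch: per-class mean band values (Figure 3 style) and per-band pixel counts
# (Figure 4 style). "uav_rgb.tif" and the class masks are illustrative placeholders.
import numpy as np
from skimage import io

img = io.imread("uav_rgb.tif").astype(np.uint8)         # H x W x 3, 8-bit digital numbers
class_masks = {
    "unburned": np.zeros(img.shape[:2], dtype=bool),    # placeholder masks; in practice
    "burnt":    np.zeros(img.shape[:2], dtype=bool),    # they come from sample polygons
}

for name, mask in class_masks.items():
    if mask.any():
        means = img[mask].mean(axis=0)                  # mean R, G, B for this class
        print(f"{name}: mean R/G/B = {means}")

# Pixel-count histogram per band: "count" = number of pixels at each value 0..255.
for band, label in zip(range(3), "RGB"):
    counts, _ = np.histogram(img[..., band], bins=256, range=(0, 256))
    print(label, "band: peak DN =", counts.argmax(), "with", counts.max(), "pixels")
```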

- Equation 2: what does the symbol XG mean? A similar problem exists in Equations 4, 5, 6, and 7 and in Table 3.

Response: We are sorry that our mistake confused the audience. We have corrected XG to EXG, the excess green index proposed by previous researchers, which can be seen on page 8.

Equations 2, 4, and 5 are constructed from the characteristics of all ground objects and the differences in their spectral responses shown in Figure 3. To highlight the characteristics of local objects, customized spectral features are proposed: Equation 4 is used to extract burnt forest, while Equation 5 is used to identify damaged forest. We have marked these revisions in the manuscript.

Equations 6 and 7 are used to distinguish objects with significant color differences among unburned forest, completely burnt forest, and reservoirs, based on the spectral characteristics F and N proposed by previous researchers, which can be seen on page 9, Line 256, reference [39]. We have marked these revisions. In Table 3, R, G, and B represent the red, green, and blue bands of the UAV image, respectively; A, C, N, and F represent customized spectral features; H and I are the hue and intensity of the HIS transformation, representing the object's color and the brightness of that color, respectively; EXG is the excess green index; VDVI is the visible-band difference vegetation index; Length and Width are shape features; Standard deviation G is the standard deviation of the green band among the spectral features; Brightness is a spectral feature; and GLCM Contrast, GLCM StdDev, GLCM Mean, GLCM Dissimilarity, GLCM Entropy, and GLCM Homogeneity are the contrast, standard deviation, mean, dissimilarity, entropy, and homogeneity of the Gray Level Co-occurrence Matrix, respectively. These definitions have been added to the manuscript.
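To make a few of these symbols concrete, here is a hedged sketch of how EXG, VDVI, and some of the GLCM texture measures listed in Table 3 could be computed from the visible bands; the random band arrays are placeholders, and the exact band scaling and object delineation follow the manuscript rather than this sketch.

```python
# Sketch of selected Table 3 features from visible bands: EXG, VDVI, and GLCM
# statistics. The random band arrays stand in for real per-object image data.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
R, G, B = (rng.integers(0, 256, (64, 64)).astype(float) for _ in range(3))

exg = 2 * G - R - B                                   # excess green index
vdvi = (2 * G - R - B) / (2 * G + R + B + 1e-6)       # visible-band difference vegetation index

# GLCM texture on the green band, quantised to 8-bit grey levels.
green8 = G.astype(np.uint8)
glcm = graycomatrix(green8, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)
features = {
    "GLCM Contrast":      graycoprops(glcm, "contrast")[0, 0],
    "GLCM Dissimilarity": graycoprops(glcm, "dissimilarity")[0, 0],
    "GLCM Homogeneity":   graycoprops(glcm, "homogeneity")[0, 0],
    "Mean EXG":           exg.mean(),
    "Mean VDVI":          vdvi.mean(),
}
print(features)
```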

 

Results

- How were the thresholds in Table 3 determined?

Response: Determining the threshold values at the optimal segmentation scale depends heavily on the empirical knowledge of the operator, who must experiment repeatedly, determine the segmentation parameters, and establish feature rules, drawing on the literature, in order to segment the targets accurately. We first adopted the Estimation of Scale Parameter (ESP) tool, from which peak points with noticeable changes were obtained, and then determined the optimal segmentation scale parameters through many experiments.
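As a rough illustration of how such empirically tuned thresholds become a rule set, the sketch below applies simple feature rules per object. It is a simplified stand-in for the eCognition decision rules: the feature names, threshold values, and class ordering are invented placeholders, not the values in Table 3.

```python
# Sketch of a rule-based classifier over per-object features, mimicking how
# empirically chosen thresholds (as in Table 3) are applied. All thresholds here
# are invented placeholders, not the values used in the manuscript.
def classify_object(features: dict) -> str:
    exg = features["EXG"]
    brightness = features["Brightness"]
    glcm_contrast = features["GLCM Contrast"]

    if exg > 25:                         # strongly green objects
        return "unburned"
    if brightness < 60:                  # very dark, charred canopy
        return "burnt"
    if glcm_contrast > 15:               # rough texture, scorched but standing
        return "damaged"
    return "dead"

print(classify_object({"EXG": 30, "Brightness": 120, "GLCM Contrast": 5}))   # unburned
print(classify_object({"EXG": 5,  "Brightness": 40,  "GLCM Contrast": 8}))   # burnt
```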

- The legend in Figure 6 should be rebuilt.

Response: Thank you for your comments and suggestion. We have changed it.

- Lines 383-391: the font needs to be unified.

Response: We are sorry for that and have unified the font, which can be seen on page 12, Lines 383-391.

- The authors should compare the results with some other methods, such as Random Forest, SVM, etc.

Response: Thank you for your comments and suggestion. We reorganized the paper and added the Support Vector Machine (SVM) method to strengthen the results and discussion. We added three citations expounding the support vector machine model [11-13], which can be seen on page 7, Line 217. We also applied the Support Vector Machine to map forest burn severity, which can be seen on page 11, Line 350. Additionally, we revised the accuracy section involved, which can be seen on page 12, Line 351, and added an accuracy evaluation table for the Support Vector Machine results, which can be seen on page 13, Line 393.
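For readers who wish to reproduce this kind of comparison, a minimal sketch of fitting a multi-class SVM on per-object feature vectors with scikit-learn is given below. The synthetic feature matrix and labels are placeholders; the manuscript's actual training samples, feature set, and kernel settings should be substituted.

```python
# Minimal sketch: multi-class SVM baseline over per-object features.
# The synthetic data below stands in for the manuscript's real training samples.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)
X = rng.normal(size=(400, 6))                     # e.g. EXG, VDVI, H, I, GLCM statistics
y = rng.integers(0, 4, size=400)                  # 0=unburned, 1=dead, 2=damaged, 3=burnt

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=10.0, gamma="scale")    # multi-class handled internally (one-vs-one)
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te),
                            target_names=["unburned", "dead", "damaged", "burnt"]))
```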

- There are also some grammatical problems.

Response: We have thoroughly checked the grammar of the whole paper.

Reviewer 2 Report

The article under review is interesting and is devoted to studying the consequences of forest fires using UAVs. Despite the high relevance and methodological rigor of the material, there are a number of questions about the article. Only one fire, in a specific forest area with a certain composition of forest species, is used for the analysis. This raises doubts about the possibility of extrapolating the results obtained by the authors to other areas. To what extent are the forest communities represented in the burnt area typical, and what is the area of their distribution? In addition, the authors nowhere indicate the nature of the forest fire. Judging by the photo in Table 1, it appears to be a ground fire. In that case, the crowns are practically undamaged and reforestation proceeds differently than after a crown fire. This aspect also reduces the possibility of extrapolating the results obtained by the authors.

Author Response

Reviewer 2

- The article under review is interesting and is devoted to studying the consequences of forest fires using UAVs. Despite the high relevance and methodological rigor of the material, there are a number of questions about the article. Only one fire, in a specific forest area with a certain composition of forest species, is used for the analysis. This raises doubts about the possibility of extrapolating the results obtained by the authors to other areas. To what extent are the forest communities represented in the burnt area typical, and what is the area of their distribution? In addition, the authors nowhere indicate the nature of the forest fire. Judging by the photo in Table 1, it appears to be a ground fire. In that case, the crowns are practically undamaged and reforestation proceeds differently than after a crown fire. This aspect also reduces the possibility of extrapolating the results obtained by the authors.

Response: Thank you for the reminder. The paper aims to describe a method for extracting forest burn severity from UAV image characteristics to replace the current field survey prescribed by the national forestry inventory standards. The forest in our study area represents a typical coniferous and broad-leaved forest of southwest China. We only stress three dominant forest species: Pinus yunnanensis, Pinus armandi, and Cyclobalanopsis glaucoides Schotky. There are also other species with a small proportion in the fire area, such as Alnus nepalensis, Quercus variabilis, and other shrubs. Additionally, the forest has been in continuous succession and shows a relatively complex composition. We therefore selected this forest fire scenario considering data availability and representativeness, as other related research did [10], which took Pinus densiflora forest burn severity mapping as its case study (see page 18). However, whether the forest species affect the method and results deserves further attention in future work.

According to the statistical survey, the burned forest area is approximately 386.7 hm², and the study area of 55.24 hm² is located at the border of the damaged area and includes all tree burn severity types defined in the forestry standards [1]. Therefore, the study area is representative of the entire forest fire area. We have modified this, which can be seen on page 3, Line 123.

We also reviewed the official document on this fire and confirmed that it was a crown fire. After the fire was suppressed, we could not immediately go to the site for UAV data acquisition and a ground survey due to local forest fire management policy and weather conditions. However, the UAV photography and field investigation of post-fire forest damage levels were completed before the rainy season, avoiding confusion resulting from vegetation succession. We have described the nature of the forest fire in the article, which can be seen on page 3, Line 120.

 

    Note: In addition to the responses to the comments above, we have also improved the whole manuscript to make it more logical and consistent.

  1. State Forestry Administration. Survey method for the causes of forest fire and the damage of forest resources. LY/T 1846-2009. 2009-06-18.
  2. Bisquert, M.; Caselles, E.; Sánchez, J. M.; Caselles, V. Application of artificial neural networks and logistic regression to the prediction of forest fire danger in Galicia using MODIS data. Int J Wildland Fire 2012, 21, 1025-1029.
  3. Yankovich, K. S.; Yankovich, E. P.; Baranovskiy, N. V. Classification of Vegetation to Estimate Forest Fire Danger Using Landsat 8 Images: Case study. Math Probl Eng 2019, 4, 1-14.
  4. Song, Q.; Hu, Q.; Zhou, Q.; Hovis, C.; Xiang, M.; Tang, H.; Wu, W. In-Season Crop Mapping with GF-1/WFV Data by Combining Object-Based Image Analysis and Random Forest. Remote Sens 2017, 9, 1184-1203.
  5. Lv, Z.; Liu, T.; Wan, Y.; Benediktsson, J. A.; Zhang, X. Post-Processing Approach for Refining Raw Land Cover Change Detection of Very High-Resolution Remote Sensing Images. Remote Sens 2018, 10, 472-491.
  6. Llorens, R.; Sobrino, J. A.; Fernández, C.; Fernández-Alonso, J. M.; Vega, J. A. A methodology to estimate forest fires burned areas and burn severity degrees using Sentinel-2 data. Application to the October 2017 fires in the Iberian Peninsula. Int J Appl Earth Obs 2021, 95, 102243.
  7. Zheng, Z.; Zeng, Y.; Li, S.; Huang, W. Mapping burn severity of forest fires in small sample size scenarios. Forests 2018, 9, 608.
  8. Woo, H.; Acuna, M.; Madurapperuma, B.; Jung, G.; Woo, C.; Park, J. Application of Maximum Likelihood and Spectral Angle Mapping Classification Techniques to Evaluate Forest Fire Severity from UAV Multi-spectral Images in South Korea. Sensor Mater 2021, 33, 3745-3760.
  9. Zidane, I. E.; Lhissou, R.; Ismaili, M.; Manyari, Y.; Mabrouki, M. Characterization of Fire Severity in the Moroccan Rif Using Landsat-8 and Sentinel-2 Satellite Images. IJASEIT 2021, 11, 71-83.
  10. Shin, J.-i.; Seo, W.-w.; Kim, T.; Park, J.; Woo, C.-s. Using UAV Multispectral Images for Classification of Forest Burn Severity—A Case Study of the 2019 Gangneung Forest Fire. Forests 2019, 10, 1025.
  11. Hsu, C. W.; Lin, C. J. A comparison of methods for multiclass support vector machines. IEEE T Neural Networ 2002, 13, 415-425.
  12. Chaudhuri, A.; Kajal, D.; Chatterjee, D. A Comparative Study of Kernels for the Multi-class Support Vector Machine. 2008 Fourth International Conference on Natural Computation, Jinan, China, 18-20 October 2008.
  13. Polat, K.; Gunes, S. A novel hybrid intelligent method based on C4.5 decision tree classifier and one-against-all approach for multi-class classification problems. Expert Syst Appl 2009, 36, 1587-1592.

 

Round 2

Reviewer 1 Report

I have no comments on this version.

Reviewer 2 Report

Article may be published
