Article

Development of a Novel Burned-Area Subpixel Mapping (BASM) Workflow for Fire Scar Detection at Subpixel Level

1 College of Forestry, Central South University of Forestry and Technology, Changsha 410004, China
2 Department of Geological Engineering, Montana Technological University, Butte, MT 59701, USA
* Author to whom correspondence should be addressed.
Remote Sens. 2022, 14(15), 3546; https://doi.org/10.3390/rs14153546
Submission received: 18 June 2022 / Revised: 11 July 2022 / Accepted: 22 July 2022 / Published: 24 July 2022
(This article belongs to the Section Forest Remote Sensing)

Abstract
The accurate detection of burned forest area is essential for post-fire management and assessment and for quantifying carbon budgets, so it is imperative to map burned areas accurately. Currently, there are few burned-area products around the world, and researchers have mapped burned areas directly at the pixel level, where a pixel is usually a mixture of burned area and other land cover types. To improve burned area mapping, we proposed a Burned Area Subpixel Mapping (BASM) workflow that maps burned areas at the subpixel level, and we applied it to Sentinel 2 data sets. In this study, the true fire-scar information was provided by the Department of Emergency Management of Hunan Province, China. To validate the accuracy of the BASM workflow for detecting burned areas at the subpixel level, we applied the workflow to the Sentinel 2 image data and compared the detected burned area at the subpixel level with in situ measurements at fifteen fire-scar reference sites located in Hunan Province, China. Results show that the proposed method successfully generated burned area maps at the subpixel level. The BASM-Feature Extraction Rule Based (BASM-FERB) method, in particular, minimized misclassification and noise effects more effectively than the BASM-Random Forest (BASM-RF), BASM-Backpropagation Neural Net (BASM-BPNN), BASM-Support Vector Machine (BASM-SVM), and BASM-notra methods. We conducted a comparison study among BASM-FERB, BASM-RF, BASM-BPNN, BASM-SVM, and BASM-notra using five accuracy evaluation indices, i.e., overall accuracy (OA), user's accuracy (UA), producer's accuracy (PA), intersection over union (IoU), and Kappa coefficient (Kappa). BASM-FERB's average OA, UA, IoU, and Kappa are 98.11%, 81.72%, 74.32%, and 83.98%, respectively, better than those of BASM-RF, BASM-BPNN, BASM-SVM, and BASM-notra, although the average PAs of BASM-RF (89.97%) and BASM-notra (91.36%) are higher than that of BASM-FERB (89.52%). We conclude that the newly proposed BASM workflow can map burned areas at the subpixel level, providing greater accuracy in regard to the burned area for post-forest fire management and assessment.

1. Introduction

Forest fires represent one of the biggest threats to plant ecological systems and resources [1], and they are partially responsible for the increase of greenhouse gases in the atmosphere [2,3]. Around 350 million hectares of land are affected by forest fires annually due to natural or human causes [4,5,6]. Forest fires happen very frequently: a total of 3.52 million hectares of forests were burned in China from 2001 to 2015 according to the National Forest Fire Prevention Plan (2016–2025) [7]. Mapping the burned area accurately and in a timely manner is essential for post-forest fire management and assessment [8], for quantifying carbon budgets [9], and for analyzing the relationship between vegetation and climate [9]. Traditionally, burned area was mapped by artificial ground surveys [10] and field sketches [11]. These methods were very inefficient, and the resulting data were unreliable when analyzed at large scales [12].
Satellite remote sensing can acquire images continuously, even over remote areas, and has the advantages of wide coverage, timeliness, and low cost [13]. Burned area mapping based on satellite images has long been an active research topic [14,15,16,17]. Table 1 summarizes eleven widely used burned-area products.
However, the highest spatial resolution of the burned-area products is 30 m, as derived from Landsat data and shown in Table 1. Research on burned-area mapping with high spatial resolution satellite data, such as Sentinel 2 of 10 m spatial resolution, has been actively pursued [6,8,15,29,30]. Stroppiana et al. [15] mapped burned areas based on the Sentinel 2 data using an automatic machine learning (ML) algorithm from highly reliable fire points. Florath and Keller [14] detected burned areas using a generic approach based on ML and the Sentinel 2 data. Zhang et al. [6] presented a Siamese self-attention (SSA) classification strategy for burned area mapping using the Sentinel-1 & 2 data. Pinto et al. [8] applied a Burned Areas Neural Network (BA-NET) based method to map burned area using Sentinel-2 and Visible Infrared Imaging Radiometer Suite (VIIRS) data.
For any type of image data, pixel mixing is a common phenomenon [31,32,33]. A mixed pixel may contain an unknown composition of land cover types [34], affecting the simulation of radiative characteristics and the inversion of land surface parameters from remote sensing data [33]. Pixel mixing has been a barrier to the application of remote sensing to burned area mapping. To overcome this barrier, a series of unmixing methods have been proposed, such as fully constrained least squares (FCLS) [35], maximum margin criterion and derivative weights (MDWSU) [36], minimum volume transform (MVT) [37], vertex component analysis (VCA) [38], the simplex growing algorithm (SGA) [39], minimum volume constrained nonnegative matrix factorization (MVC-NMF) [40], the successive projection algorithm (SPA) [41], etc. He et al. [42] showed that spectral unmixing methods have advantages over remote sensing vegetation indices due to their ability to decompose a mixed pixel into several fractional abundances. Nascimento and Dias [38] performed a series of experiments using simulated and real data sets and found that the VCA algorithm performs better than the pixel purity index (PPI) method and better than or similarly to the N-FINDR algorithms [43]. Miao and Qi [40] applied the MVC-NMF method to extract unsupervised endmembers from highly mixed image data. Zhang et al. [41] presented the SPA, which can provide a general guideline to constrain the total number of endmembers. Kumar et al. [44] estimated global land cover fractions from satellite data based on unconstrained least squares (UCLS), fully constrained least squares (FCLS), modified fully constrained least squares (MFCLS), simplex projection (SP), sparse unmixing via variable splitting and augmented Lagrangian (SUnSAL), SUnSAL and total variation (SUnSAL TV), and collaborative SUnSAL (CL SUnSAL), and their results indicated that FCLS outperformed the other techniques. Thus, in this paper, we used FCLS for spectral unmixing of burned area in mixed pixels.
However, spectral unmixing methods only solve the problem of endmember types and abundances; they cannot locate endmembers within a mixed pixel. To solve this problem, Atkinson [45,46] proposed the subpixel concept for the precise spatial location of endmembers in a mixed pixel based on unmixing analysis [47]. On the basis of Atkinson's study, many algorithms have been developed [48], such as the spatial attraction model (SAM), the double-calculated spatial attraction model (DSAM) [32], subpixel learning algorithms [44], the subpixel edge detection method [49], etc. These subpixel techniques have found many applications [50,51,52,53]. For instance, the random forests and spatial attraction model (RFSAM) was applied to remote sensing images to improve the accuracy of subpixel mapping of wetland flooding [54]. Li et al. [55] predicted a land cover map accurately based on the spatio-temporal subpixel land cover mapping (STSPM) method. Ling et al. [56] monitored variations of reservoir surface water area accurately and in a timely manner from daily MODIS images by exploiting information at the subpixel scale. Deng and Zhu [57] successfully applied the Continuous Subpixel Monitoring (CSM) method to map urban impervious surface area (ISA%) at the subpixel scale and characterize its dynamics in Broome County, New York.
However, there are very few applications of these subpixel algorithms for burned area mapping [58]. In order to accurately map burned area, the exact location of the burned area in a mixed pixel is important for post-forest fire management and assessment, quantifying carbon budgets, etc. A good subpixel algorithm can help to map and locate burned area with subpixel accuracy, especially from low-spatial-resolution satellite images. However, current burned area mapping algorithms based on satellite remote sensing data can locate burned area only at the pixel level. The objective of this study is to develop a new algorithm to improve burned area mapping at the subpixel level. To this end, we proposed a burned area subpixel mapping (BASM) workflow. The first step is a spectral unmixing analysis using the fully constrained least squares (FCLS) method. The second step is to process the unmixing analysis results with the modified-pixel swapping algorithm (MPSA). The last step is post-classification after MPSA. We then applied the proposed BASM workflow to the Sentinel 2 data sets. The ground-truth fire-scar information provided by the Department of Emergency Management of Hunan Province, China was used as reference data for assessment.

2. Study Area

Hunan Province, China, is located along the middle reaches of the Yangtze River in central China, within the area enclosed by 108°47′–114°15′E longitude and 24°38′–30°08′N latitude (Figure 1). Over the course of a year, its temperature typically varies from 7.78 °C to 33.33 °C. The rainiest month is June, with an average rainfall of 177.8 mm; the driest is December, with an average rainfall of 35.56 mm. The relatively warm and humid weather results in lush vegetation and an evergreen landscape. Hunan Province has a subtropical, evergreen, broad-leaved forest with a forest coverage rate of 59.96%. It has a land area of about 211,829 km2, of which about 108,472 km2 (51.21%) is mountainous. The province is surrounded by mountains on the east, south, and west sides, with a lake basin plain in the north and undulating terrain in the central section, forming an asymmetrical horseshoe-shaped landform opening to the northeast. The east and south sides are mostly above 1000 m in altitude, the west side is 500–2099 m above sea level, the central section is below 500 m, and most of the north side is below 50 m. Forest and wetland ecosystems are the two major natural ecosystems. Of the roughly 200 globally significant ecological zones, Hunan Province contains two: the subtropical, evergreen, broad-leaved forest ecological zones in the Wuling-Xuefeng Mountains and the Nanling-Luoxiao Mountains, regarded as among the most valuable ecological regions at this latitude. Due to the high forest coverage rate and large mountainous area in Hunan Province, forest fire is a great potential hazard. There are many obstacles for post-forest fire management; for example, accurate burned area mapping has to rely on manual surveys. To improve this situation, it is imperative to map burned area using automated or semi-automated algorithms, such as remote sensing-based detection and mapping methods.

3. Methodology

First, we obtained the locations of the 15 fire-scar sites from the ground-truth fire-scar information provided by the Department of Emergency Management of Hunan Province, then downloaded and processed the pre- and post-forest fire Sentinel 2 data sets. Second, burned area fraction was derived from the four selected Multispectral Instrument (MSI)/Sentinel 2 bands listed in Table 2 using the FCLS method [44,59,60]. Third, taking the burned area fraction as input to the Modified Pixel Swapping Algorithm (MPSA), implemented in C# on the Visual Studio 2012 platform, we obtained the burned area at the subpixel level as output. Fourth, after a post-classification clump and sieve procedure on the output, the burned area at the subpixel level was mapped. Finally, a comparison analysis and accuracy assessment of the burned area mapped at the subpixel level was performed using high-spatial-resolution GaoFen (GF) series data sets. The flowchart for mapping burned area at the subpixel level is shown in Figure 2. The methodological steps are explained in detail in the following subsections.
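As a reading aid, the sketch below outlines the workflow as pseudocode-style Python. It is a minimal, hypothetical skeleton: the function names (fcls_unmix_scene, mpsa, clump_sieve, assess_accuracy) are placeholders for the ENVI tools and the C# program described above, not an actual implementation of them.

```python
# Hypothetical skeleton of the BASM workflow; each helper stands in for
# a tool described in the text (FCLS in ENVI, the C# MPSA program, the
# ENVI Clump/Sieve post-classification, and the accuracy assessment).

def basm_workflow(sentinel2_scene, endmember_library, gf_reference):
    # Step 1: linear spectral unmixing -> per-pixel burned-area fraction
    fractions = fcls_unmix_scene(sentinel2_scene, endmember_library)

    # Step 2: subpixel allocation and swapping (5 x 5 subpixels per pixel,
    # i.e., 10 m pixels -> 2 m subpixels)
    subpixel_map = mpsa(fractions, scale=5)

    # Step 3: post-classification denoising (clump/sieve)
    burned_map = clump_sieve(subpixel_map)

    # Accuracy assessment against the high-resolution GF reference
    return burned_map, assess_accuracy(burned_map, gf_reference)
```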

3.1. Satellite Data and Preprocessing

3.1.1. Satellite Data

Sentinel 2 is an Earth observation mission of the European Space Agency to support environmental services and natural disaster management [61,62]. The Sentinel 2 Multispectral Instrument (MSI) data were used in this study, sampling 13 spectral bands: four bands at 10 m, six bands at 20 m, and three bands at 60 m spatial resolution over the visible, near infrared (NIR), and short wave infrared (SWIR) ranges (https://sentinels.copernicus.eu/web/sentinel/user-guides/sentinel-2-msi/overview, last accessed on 3 April 2022) [62,63]. This makes it useful for land monitoring studies at local, regional, and global scales.
The Sentinel 2 data were downloaded from the United States Geological Survey (USGS, https://earthexplorer.usgs.gov/, last accessed on 1 March 2022) as a Level 1C product, which represents orthorectified, top-of-atmosphere (TOA) reflectance. The GF series data sets are an important component of the China High-Resolution Earth Observation System (CHEOS) project sponsored by the China National Space Administration (CNSA) [64]. We obtained the GF series data sets from the China Centre for Resources Satellite Data and Application (http://www.cresda.com/CN/, last accessed on 2 February 2022) [65]. The GF 1 satellite, launched on 26 April 2013, is the first in China's series of high-resolution Earth observation satellites [66]. It is equipped with one four-camera push-broom multispectral system that obtains an 800 km wide image and one two-camera panchromatic/multispectral system (PMS), allowing a temporal resolution of four days [67]. Each camera in the four-camera push-broom multispectral system is a wide field of view (WFV) camera with a resolution of 16 m; the system is therefore also called a wide field of view system (WFVS). The GF 1 PMS has 5 spectral bands: one panchromatic band (band 1) of 2 m spatial resolution and 4 multispectral bands (bands 2–5) of 8 m resolution, as shown in Table 2. The GF 1B, GF 1C, and GF 1D satellites, launched on 31 March 2018, work in coordination with GF 1 to observe the Earth's surface and carry the same sensor systems as GF 1. The PMS on GF 2 also has 5 spectral bands but with higher spatial resolution, i.e., 1 m resolution for the panchromatic band (band 1) and 4 m resolution for the multispectral bands (bands 2–5), as shown in Table 2. The Sentinel 2 data set and the GF 1, GF 1B, GF 1C, GF 1D, and GF 2 satellite data sets (abbr. GF series data sets) were re-projected to a common Universal Transverse Mercator (UTM) projection with WGS84 as the datum so that all data sets could be overlaid within the same coordinate system. The Sentinel 2 data set has medium spatial resolution and was used as input to our workflow for mapping burned area at the subpixel level, while the GF series data sets have higher spatial resolution and were used as reference data for the accuracy assessment. The Sentinel 2 satellite data were processed to bottom-of-atmosphere reflectance after radiometric calibration and atmospheric correction using the 'Sen2Cor' algorithm, v2.8 (downloadable from http://step.esa.int/main/snap-supported-plugins/sen2cor/sen2cor_v2-8/, last accessed on 3 April 2022) [68,69,70]. Table 2 lists the specifications of the Sentinel 2 bands (bands 2–4 and 8) selected for this study.
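As an illustration of the re-projection step, the sketch below uses the open-source rasterio library (our choice; the paper does not state which tool was used), with a hypothetical file name and UTM zone 49N, which covers most of Hunan Province.

```python
import rasterio
from rasterio.warp import calculate_default_transform, reproject, Resampling

# Re-project one scene to UTM zone 49N / WGS84 (EPSG:32649); the input
# and output file names are hypothetical.
with rasterio.open("gf1_scene.tif") as src:
    dst_crs = "EPSG:32649"
    transform, width, height = calculate_default_transform(
        src.crs, dst_crs, src.width, src.height, *src.bounds)
    profile = src.profile.copy()
    profile.update(crs=dst_crs, transform=transform, width=width, height=height)
    with rasterio.open("gf1_scene_utm.tif", "w", **profile) as dst:
        for band in range(1, src.count + 1):  # re-project band by band
            reproject(source=rasterio.band(src, band),
                      destination=rasterio.band(dst, band),
                      resampling=Resampling.nearest)
```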

3.1.2. Data Preprocessing

To make full use of the spectral information of the selected bands, we carried out two further processing steps on the Sentinel 2 data sets:
(1) Geo-referencing, based on the Georeferencing tool in ArcMap 10.4 (ESRI) software: ten control points were added to determine the correspondence between the Sentinel 2 data sets and the GF series data sets, using a first-order polynomial (affine) transformation [71,72,73,74].
(2) Calculation of the Normalized Difference Vegetation Index (NDVI), based on the Band Math tool in the Environment for Visualizing Images (ENVI) software (version 5.3, Exelis, Herndon, VA, USA) using Equation (1) [75,76]:

$$\mathrm{NDVI} = \frac{\mathrm{NIR} - \mathrm{Red}}{\mathrm{NIR} + \mathrm{Red}} \qquad (1)$$

where NIR and Red represent band 8 and band 4 of Sentinel 2, respectively (a minimal code sketch follows this list).
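For readers reproducing this step outside ENVI, a minimal NumPy equivalent is sketched below; the array names are hypothetical, and the small epsilon guarding against division by zero is our addition.

```python
import numpy as np

def ndvi(nir, red, eps=1e-10):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel.
    nir and red are reflectance arrays (Sentinel 2 bands 8 and 4);
    eps avoids division by zero over no-data pixels."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)
```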
To take full advantage of all spectral information, we took four steps to process the GF series data sets (steps 1, 3, and 4 were performed in ENVI 5.3, and step 2 in ArcMap 10.4):
(1) Geometric orthorectification [77,78], based on the RPC Orthorectification Batch tool using the Advanced Land-Observing Satellite (ALOS) digital elevation model (DEM) of 12.5 m spatial resolution [79,80].
(2) Geo-referencing, based on the Georeferencing tool: ten control points were added to determine the correspondence between the panchromatic band and the multispectral bands, using a first-order polynomial (affine) transformation [71,72,73,74].
(3) Image fusion, based on the Pan Sharpening Batch tool, to fuse the panchromatic and multispectral bands and obtain images of 2 m resolution [77,81].
(4) Resampling: the GF 2 images, at 1 m spatial resolution after image fusion, were resampled to 2 m resolution using the cubic convolution interpolation method [82,83,84] (a minimal sketch of this step follows this list).
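The sketch below shows one way to perform this cubic resampling with the open-source rasterio library; this is our illustration under assumed file names, not the ENVI procedure itself.

```python
import rasterio
from rasterio.enums import Resampling

# Resample a fused 1 m GF 2 image to 2 m using cubic convolution;
# the file name is hypothetical.
with rasterio.open("gf2_fused_1m.tif") as src:
    factor = src.res[0] / 2.0  # e.g., 1 m -> 2 m gives factor 0.5
    data = src.read(
        out_shape=(src.count,
                   int(src.height * factor),
                   int(src.width * factor)),
        resampling=Resampling.cubic,
    )
```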

3.2. True Fire-Scar Information

Tabulated fire information collected (TFIC), provided by the Department of Emergency Management of Hunan Province, was used as the ground-truth fire-scar information for this study. Every forest fire event record contained information about location, weather, starting and extinguishing times, burned area, causes, etc. The Department of Emergency Management of Hunan Province confirmed and reported each forest fire in detail when it occurred (as shown in Figure 3). Field measurements of the burned area were conducted by well-trained personnel with Global Navigation Satellite System (GNSS) survey equipment and an Unmanned Aerial Vehicle (UAV); the platform was a Phantom 4 Pro V2.0, and the sensor was a 1-inch CMOS with 20 million effective pixels. The same personnel manually created in situ vector data of the burned area [85]. We transformed the vector data, registered to the GF series data sets, into regions of interest (ROIs) to validate the proposed workflow. From the TFIC between 2018 and 2021, the fifteen fire-scar sites that caused the most environmental damage were selected as reference data for ground-truth fire-scar information. Table 3 shows information concerning the 15 fire sites in Hengdong County (1HYHD), Guiyang County (2CZGY), Lanshan County (3YZLS), Hongjiang County (4HHHJ), Fenghuang County (5XXFH), Guidong County (6CZGD), Ningxiang County (7CSNX), Guiyang County (8CZGY), Beihu County (9CZBH), Liling County (10ZZLL), Sangzhi County (11ZJJSZ), Guidong County (12CZGD), Taojiang County (13YYTJ), Taojiang County (14YYTJ), and Taojiang County (15YYTJ).

3.3. BASM Approach for Fire Scar Detection

The proposed Burned Area Subpixel Mapping (BASM) workflow consists of three major steps: (1) spectral unmixing using the FCLS Spectral Unmixing tool in ENVI 5.3 software, chosen because FCLS outperformed other techniques with the highest classification accuracy and least execution time [44]; (2) obtaining burned area spatial distribution information within mixed pixels using the modified pixel swapping algorithm (MPSA); and (3) post-classification after MPSA to denoise the data [86], using the Clump tool in ENVI 5.3 software with a Dilate Kernel Value of 3 and an Erode Kernel Value of 3. Since burned area mapping at the subpixel level has rarely been investigated, we used the BASM to detect the spatial distribution of burned area within mixed pixels. The burned area abundance after FCLS was processed with the MPSA to obtain the final results. Based on the Sentinel 2 satellite data, we divided each pixel into 5 × 5 subpixels so that each subpixel has a spatial resolution of 2 m. The final result, burned areas with geographic locations at the subpixel level, shows the specific spatial distribution of burned areas within each mixed pixel.

3.3.1. Fully Constrained Least Squares (FCLS)

To estimate the abundance of burned area in a mixed pixel in the Sentinel 2 images, we used the well-performing FCLS method [35,44,87] for linear spectral unmixing analysis. Two constraints are imposed on the analysis: (1) the endmember abundances sum to one, $\sum_{j=1}^{p} \alpha_j = 1$, and (2) the abundances are non-negative, $\alpha_j \geq 0$ (1 ≤ j ≤ p). We manually extracted the spectra of three endmembers (bare land, forest, and burned area) from a Sentinel 2 image and saved them as a spectral library (.sli format) to serve as input for FCLS.
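A compact numerical sketch of FCLS is given below. It uses the classic trick of appending a heavily weighted sum-to-one row to a non-negative least squares solve; this is our approximation of the FCLS idea, not the ENVI implementation.

```python
import numpy as np
from scipy.optimize import nnls

def fcls_unmix(pixel, endmembers, delta=1e3):
    """Estimate fractional abundances for one pixel under the FCLS
    constraints (sum-to-one, non-negativity).

    pixel: (bands,) reflectance vector.
    endmembers: (bands, p) matrix, one column per endmember
                (here p = 3: bare land, forest, burned area).
    delta: weight of the appended sum-to-one row; larger values
           enforce the constraint more strictly.
    """
    # Append a weighted row of ones so that NNLS also drives
    # sum(abundances) toward 1 (non-negativity is built into NNLS).
    E = np.vstack([endmembers, delta * np.ones((1, endmembers.shape[1]))])
    y = np.append(pixel, delta)
    abundances, _ = nnls(E, y)
    return abundances
```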

3.3.2. Modified Pixel Swapping Algorithm (MPSA)

Atkinson [47] proposed the pixel swapping algorithm (PSA), designed to process an image of land cover proportions in K = 2 classes. In this study, the PSA was applied to the spectral unmixing results from FCLS. If 10 × 10 (=100) subpixels are to be mapped within each mixed pixel, a land cover class (burned or unburned) with a fraction of 57 percent means that 57 subpixels are allocated to that class, and these subpixels are initially located randomly within the mixed pixel. Once allocated, only the spatial arrangement of the subpixels, not the actual attribute values, can vary; the number of subpixels allocated to the class within each mixed pixel remains fixed. The PSA comprises three basic steps, but we improved the algorithm into the Modified Pixel Swapping Algorithm (MPSA), which has the four steps detailed below. Firstly, in satellite images of 10 m spatial resolution, we treated burned area as a continuous entity. Within the abundance results produced by the FCLS algorithm, some pixels appear as false burned area due to noise [60,88,89]. To prevent misclassification and noise effects in the FCLS results from propagating to the final subpixel-level burned area, we modified the PSA by adding an object-based approach for classifying burned area, filtered out the false burned area caused by noise, compared the results with those from three pixel-based classifiers, and identified the most suitable denoising method for this study. We chose a well-known object-based approach and the three pixel-based classifiers embedded in the ENVI 5.3 software. Specifically, we used four models well known for their good performance: feature extraction rule based (FERB) [90,91], random forest (RF) [92,93], backpropagation neural net (BPNN) [94,95], and support vector machine (SVM) [96,97]. We manually selected 30% of the clipped image as training samples for RF, BPNN, and SVM, and used the rest as test samples. We took the result of FERB, RF, BPNN, or SVM, respectively, together with the result of FCLS, as the input of the BASM method. This step of traversing the pixel values of FERB, RF, BPNN, and SVM reduced the misclassification and noise effects in the burned area mapped at the subpixel level. The parameter setup of the four models is shown in Table 4.
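The initial allocation step can be sketched as follows; this is our minimal illustration of the random seeding described above (here with the paper's 5 × 5 subdivision), not the authors' C# code.

```python
import numpy as np

def allocate_subpixels(fraction, scale=5, rng=None):
    """Randomly seed class-1 (burned) subpixels within one mixed pixel.

    fraction: burned-area abundance in [0, 1] from FCLS.
    scale: subpixels per pixel side (5 x 5 here, so a 10 m Sentinel 2
           pixel yields 2 m subpixels).
    The count of burned subpixels is fixed by the fraction; only their
    arrangement changes during the later swapping passes.
    """
    rng = rng or np.random.default_rng()
    n = int(round(fraction * scale * scale))
    grid = np.zeros(scale * scale, dtype=np.uint8)
    grid[rng.choice(scale * scale, size=n, replace=False)] = 1
    return grid.reshape(scale, scale)
```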
Secondly, for every subpixel $p_i$, its attractiveness $A(p_i^c)$ in the c-th class is predicted as a distance-weighted function of its $j = 1, 2, \ldots, J$ neighbors:

$$A(p_i^c) = \sum_{j=1}^{J} \lambda_{ij}\, z(x_j^c) \qquad (2)$$

where $z(x_j^c)$ is the value of the j-th subpixel belonging to the c-th class (c = 1 or 2), and $\lambda_{ij}$ is a distance-dependent weight given by the following equation:

$$\lambda_{ij} = \exp\!\left(-\frac{h_{ij}}{a}\right) \qquad (3)$$

where $h_{ij}$ is the distance between the two subpixels $p_i$ and $p_j$, and $a$ is the non-linear parameter of the exponential model, chosen to be 3 in this study.
Thirdly, once the attractiveness of each subpixel has been calculated based on the current arrangement of subpixel classes, the algorithm ranks the values on a pixel-by-pixel basis. For each pixel, the least attractive subpixel currently allocated to a "1" (i.e., a "1" surrounded mainly by "0"s) is stored (as shown in Equation (4)):

$$\mathrm{candidate}_A = \left\{ x_i^c : A(p_i^c) = \min(A) \;\middle|\; z(x_i^c) = 1 \right\} \qquad (4)$$

The most attractive subpixel currently allocated to a "0" (i.e., a "0" surrounded mainly by "1"s) is also stored (as shown in Equation (5)):

$$\mathrm{candidate}_B = \left\{ x_j^c : A(p_j^c) = \max(A) \;\middle|\; z(x_j^c) = 0 \right\} \qquad (5)$$

Lastly, the classes of the subpixels are swapped as follows: if the attractiveness of the least attractive subpixel is less than that of the most attractive subpixel, the classes of the two subpixels are swapped to enhance the spatial correlation of the subpixels (as shown in Equation (6)):

$$z(x_i^c) = 0, \quad z(x_j^c) = 1 \qquad \text{if } A_i < A_j \qquad (6)$$
Otherwise no swapping is made.
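To make the swapping step concrete, the sketch below implements the plain attractiveness-and-swap pass of Equations (2)–(6) on a binary subpixel grid. It is our simplified illustration (edges are handled by wrap-around for brevity); the paper's MPSA additionally traverses a classifier result to suppress noise.

```python
import numpy as np

def attractiveness(grid, a=3.0, radius=2):
    """A(p_i) for every subpixel: distance-weighted sum of neighbor
    labels with weights exp(-h_ij / a) over a square neighborhood.
    np.roll wraps at the edges, which is acceptable for a small demo."""
    A = np.zeros(grid.shape, dtype=float)
    for di in range(-radius, radius + 1):
        for dj in range(-radius, radius + 1):
            if di == 0 and dj == 0:
                continue
            h = np.hypot(di, dj)
            A += np.exp(-h / a) * np.roll(grid, (di, dj), axis=(0, 1))
    return A

def swap_once(block, A_block):
    """Swap the least attractive '1' with the most attractive '0' inside
    one mixed pixel if A_i < A_j (Equations (4)-(6)); class counts stay fixed."""
    ones = np.argwhere(block == 1)
    zeros = np.argwhere(block == 0)
    if len(ones) == 0 or len(zeros) == 0:
        return block  # pure pixel: nothing to swap
    i = tuple(ones[np.argmin(A_block[tuple(ones.T)])])
    j = tuple(zeros[np.argmax(A_block[tuple(zeros.T)])])
    if A_block[i] < A_block[j]:
        block[i], block[j] = 0, 1
    return block
```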
For convenience in presenting the results of the proposed BASM traversing the four models, we name the BASM with FERB as BASM-FERB, with RF as BASM-RF, with BPNN as BASM-BPNN, and with SVM as BASM-SVM, respectively; the BASM without traversing any of the four models is named BASM-notra. To demonstrate how BASM-FERB, BASM-RF, BASM-BPNN, and BASM-SVM minimize misclassification and noise effects, we compared them with BASM-notra. After comparing and analyzing the accuracy assessments of BASM-FERB, BASM-RF, BASM-BPNN, BASM-SVM, and BASM-notra, the method with the highest accuracy was identified.

3.4. Accuracy Assessment

To evaluate the performance of the BASM workflow, the fifteen fire-scar sites with TFIC were selected as references to assess the accuracy of burned area mapping at the subpixel level. For each fire-scar site in a Sentinel 2 image, we used the burned area vector data extracted from the GF series data sets as the reference image due to their high spatial resolution. Five widely used quality indexes, i.e., overall accuracy (OA) [98,99], user's accuracy (UA) [100,101], producer's accuracy (PA) [102,103], intersection over union (IoU) [104,105], and Kappa coefficient (Kappa) [106,107], were calculated for evaluation. OA represents the proportion of correctly predicted pixels among the total. UA represents the proportion of true burned area within the predicted burned area. PA represents the proportion of the true burned area that was correctly predicted. IoU represents the overlap between the predicted and true burned area. Kappa represents the agreement between the predictions and the ground truth. They are calculated from the confusion matrix (as shown in Table 5) as follows.
The five quality indexes are given by the following equations:

$$OA = \frac{TP + TN}{TP + FP + FN + TN}$$

$$UA = \frac{TP}{TP + FP}$$

$$PA = \frac{TP}{TP + FN}$$

$$IoU = \frac{TP}{TP + FP + FN}$$

$$Kappa = \frac{OA - P_e}{1 - P_e}, \quad \text{where } P_e = \frac{(TP + FP)(TP + FN) + (FN + TN)(FP + TN)}{(TP + FP + TN + FN)^2}$$
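These formulas translate directly into code; a minimal sketch from raw confusion-matrix counts follows (our illustration).

```python
def accuracy_metrics(tp, fp, fn, tn):
    """Compute OA, UA, PA, IoU, and Kappa from confusion-matrix counts
    (TP, FP, FN, TN), following the equations above."""
    total = tp + fp + fn + tn
    oa = (tp + tn) / total
    ua = tp / (tp + fp)          # user's accuracy (precision)
    pa = tp / (tp + fn)          # producer's accuracy (recall)
    iou = tp / (tp + fp + fn)
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / total ** 2
    kappa = (oa - pe) / (1 - pe)
    return {"OA": oa, "UA": ua, "PA": pa, "IoU": iou, "Kappa": kappa}
```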

4. Results

4.1. Reduction of Misclassification Due to Noise

To address the misclassification and noise problem in mapping burned areas at the subpixel level, we analyzed misclassification and noise effects for BASM-notra, BASM-FERB, BASM-RF, BASM-BPNN, and BASM-SVM. We chose eleven fire-scar sites in Hunan Province, China from Table 3 as examples: 2CZGY, 3YZLS, 4HHHJ, 5XXFH, 6CZGD, 9CZBH, 10ZZLL, 12CZGD, 13YYTJ, 14YYTJ, and 15YYTJ.
As shown in Figure 4, the burned area mapped at the subpixel level based on BASM-FERB (panel C), BASM-RF (panel D), BASM-BPNN (panel E), and BASM-SVM (panel F) presents less misclassification and fewer noise effects than that based on BASM-notra (panel B). The results indicate that the proposed BASM combined with any of the four models (classifiers) is effective in minimizing misclassification and noise effects, and that BASM-FERB is generally more effective in this regard than BASM-RF, BASM-BPNN, and BASM-SVM.

4.2. Pixel and Subpixel Mapping of Burned Area

To illustrate the advantage of the proposed BASM for mapping burned area at the subpixel level compared with the traditional object-based method (FERB) and pixel-based methods (RF, BPNN, and SVM), we tested it on the fifteen fire-scar sites using Sentinel 2 satellite images. We compared the results from BASM-FERB with FERB, BASM-RF with RF, BASM-BPNN with BPNN, and BASM-SVM with SVM, respectively. We take two fire-scar sites in Hunan Province, China as examples: the first is located in Ningxiang County with a 53.06 ha burned area (7CSNX), and the second is located in Guiyang County with a 50.14 ha burned area (8CZGY).
Results for the two examples are shown in Figure 5, in which we compare the burned area mapped at the subpixel level with that mapped at the pixel level. Comparing panel B with panel G, panel C with panel H, panel D with panel I, and panel E with panel J, we can see that FERB, RF, BPNN, and SVM can roughly delineate the burned area (panels B–E). However, the boundaries of the burned area mapped at the pixel level are rough and visually poor, in contrast to the boundaries of the burned area mapped at the subpixel level (panels G–J), which are clearer and show more spatial detail. The results in panels B–E and G–J of Figure 5 also indicate that the burned area was not fully classified. Comparing panels B–E, the burned area mapped at the pixel level using FERB (panel B) is more similar to the reference map (panel A) than the others (panels C–E). Comparing panels G–J, the burned area mapped at the subpixel level using BASM-FERB (panel G) is more similar to the reference map (panel F) than the others (panels H–J).

4.3. Accuracy Assessment of the BASM Approach

To further assess the performance of the proposed BASM method, we evaluated it quantitatively at the fifteen fire-scar sites using the Sentinel 2 satellite images. As in Section 4.2, we take two fire-scar sites as examples: Hengdong County (1HYHD), with a 1244.42 ha burned area, and Sangzhi County (11ZJJSZ), with a 34.72 ha burned area.
The results for the Hengdong site are shown in Figure 6a. In panels E–H, BASM-FERB, BASM-RF, BASM-BPNN, and BASM-SVM all demonstrate the ability to extract and classify the burned area. Furthermore, as shown by the green circle in panels E–H, BASM-FERB minimizes misclassification in the subpixel-level burned area map better than BASM-RF, BASM-BPNN, and BASM-SVM, making the map more accurate. As shown in Table 6, BASM-FERB outperformed BASM-RF, BASM-BPNN, and BASM-SVM, with OA, UA, IoU, and Kappa of 98.56%, 92.70%, 87.56%, and 92.56%, respectively, but its PA of 94.04% was slightly lower than those of BASM-RF, BASM-BPNN, and BASM-SVM.
The results for the Sangzhi site (11ZJJSZ) are shown in Figure 6b. BASM-FERB, BASM-RF, BASM-BPNN, and BASM-SVM all showed good ability in deriving the burned area from Sentinel 2, and BASM-FERB again minimized misclassification in the subpixel-level burned area map most effectively. As shown in Table 6, BASM-FERB performed better than BASM-RF, BASM-BPNN, and BASM-SVM, with OA, UA, IoU, and Kappa of 98.81%, 83.73%, 80.28%, and 88.44%, respectively; however, in the area encircled in green in panels E–H, BASM-RF, BASM-BPNN, and BASM-SVM could derive burned area from the Sentinel 2 image while BASM-FERB could not.
As shown in Table 6, thirteen out of the fifteen fire-scar sites demonstrated that BASM-FERB outperformed BASM-RF, BASM-BPNN, and BASM-SVM. Furthermore, the average OAs of BASM-FERB, BASM-RF, BASM-BPNN, BASM-SVM, and BASM-notra are 98.11%, 97.44%, 97.01%, 96.82%, and 77.13% respectively. The average UAs of BASM-FERB, BASM-RF, BASM-BPNN, BASM-SVM, and BASM-notra are 81.72%, 68.07%, 64.92%, 61.25%, and 18.72% respectively. The average IoUs of BASM-FERB, BASM-RF, BASM-BPNN, BASM-SVM, and BASM-notra are 74.32%, 63.43%, 59.63%, 56.98%, and 18.35% respectively. The average Kappas of BASM-FERB, BASM-RF, BASM-BPNN, BASM-SVM, and BASM-notra are 83.98%, 75.31%, 72.23%, 69.82%, and 22.99% respectively. The average PAs of BASM-FERB, BASM-RF, BASM-BPNN, BASM-SVM, and BASM-notra are 89.52%, 89.97%, 86.92%, 87.68%, and 91.36% respectively.

5. Discussion

At present, research on burned area mapping is mainly at the pixel scale [6,8,29,58,108]. Based on the moderate-spatial-resolution Sentinel 2 data sets, we introduced the subpixel concept into burned area mapping, and the results achieved relatively high accuracy. In addition, we modified the pixel swapping algorithm by innovatively traversing four well-performing classifiers, which reduced the misclassification and noise effects in the burned area mapped at the subpixel level. Traditionally, burned area mapping was done by artificial ground surveys and field sketches [12], which makes large areas difficult to cover and the data difficult to collect and spatialize. Burned area mapping using satellite image data has become a subject of extensive research over the past decades [6,8], and research on using remote sensing techniques to generate global burned-area products started in the late 1980s [109]. The advent of remote sensing techniques, either space-borne [15] or airborne [110], gives researchers an alternative or even better way to map burned area.
Reducing noise and misclassification in remote sensing images is a classical problem that has attracted much research interest [111]. There are many methods for noise reduction and classification improvement, including second-generation wavelets [112], rank approximation [113], asymmetrical Gaussian function fitting [114], double logistic function fitting [114], and so on. In the current study, the burned area abundance obtained using FCLS is hindered by ubiquitous noise; we therefore modified the pixel swapping algorithm by traversing the four well-performing classifiers (models), so that the resulting burned area mapped at the subpixel level performs well in terms of noise reduction and classification.
Based on satellite remote sensing image data and techniques, many studies have used machine learning algorithms for burned area mapping [15,17,29,75,115]. Xulu et al. [29] adopted a Random Forest classifier to separate burned from unburned area using Sentinel 2 data. Ramo et al. [17] explored random forest, support vector machine, neural networks, and a decision tree algorithm (C5.0) for classifying burned area at the global scale based on MODIS data. In addition, researchers have used deep learning algorithms [8,30,108,116] to classify burned area from images collected by different sensors. For example, Seydi et al. [30] proposed a novel framework based on a deep Siamese morphological neural network (DSMNN-Net) for burned area mapping, and Zhang et al. [6] presented a Siamese self-attention classification strategy for multi-sensor burned area mapping.
However, whether using machine learning or deep learning algorithms on remote sensing data, most current research performs burned area mapping at the pixel level and very rarely at the subpixel level. In this paper, we proposed the BASM workflow and evaluated it at fifteen fire-scar sites; the results indicated that using BASM-FERB, BASM-RF, BASM-BPNN, and BASM-SVM to map burned area at the subpixel level provides finer resolution than FERB, RF, BPNN, and SVM at the pixel level. The efficiency of BASM-FERB, BASM-RF, BASM-BPNN, and BASM-SVM was roughly the same, each taking less than 30 min from start to output. In addition, the burned area mapped at the subpixel level using BASM-FERB agreed very well with the reference data.
Furthermore, burned area mapping at the pixel level can cause large errors; the ability to specify the distribution of endmember composition within a mixed pixel can improve this situation [46]. Ruescas et al. [117] estimated the burnt land percentage at the subpixel level using Advanced Very High Resolution Radiometer (AVHRR) data with a mean error of 6.5%. In this paper, we selected endmembers manually, which was not as efficient as automatic or semiautomatic procedures and was subject to operator experience; automatic or semiautomatic endmember selection procedures, such as the Pixel Purity Index (PPI) and Sequential Maximum Angle Convex Cone (SMACC) algorithms in ENVI, should be considered in future work. Furthermore, we derived burned area abundance from the Sentinel 2 data and then output the specific spatial distribution of burned area within mixed pixels. It is widely known that, in mountainous areas, burned area mapping based on unmixing models is confounded by shadows. For example, as shown in Figure 4, the misclassification and noise effects, including any red area beyond the yellow line, may be caused by shadows. To minimize misclassification and effects confounded by shadows, we modified the PSA by traversing the result of each of the four approaches and then filtered out the false burned area caused by noise, such as shadows. In this study, we proposed the BASM workflow for mapping burned area at the subpixel level and compared BASM-FERB, BASM-RF, BASM-BPNN, BASM-SVM, and BASM-notra with each other. Except that BASM-FERB's average PA (89.52%) is slightly lower than those of BASM-RF (89.97%) and BASM-notra (91.36%), BASM-FERB's OA, UA, IoU, and Kappa are significantly higher than those of the other four methods. Deriving burned area at the subpixel level from remote sensing data is of great significance for post-forest fire management and assessment, and the BASM-FERB method was successfully applied to the fifteen fire-scar sites in Hunan Province, China.

6. Conclusions

We proposed an approach named BASM to map burned area at the subpixel level using Sentinel 2 image data. The proposed method contains three main steps: (1) spectral unmixing using FCLS, (2) derivation of burned area at the subpixel level using MPSA, and (3) post-classification after MPSA using Clump. The BASM was applied to fifteen fire-scar sites located in Hunan Province, China. Compared with the traditional methods (FERB, RF, BPNN, and SVM) for burned area mapping at the pixel level, BASM-FERB, BASM-RF, BASM-BPNN, and BASM-SVM generated burned area maps at the subpixel level. In terms of minimizing misclassification and noise effects, the comparison among BASM-notra, BASM-FERB, BASM-RF, BASM-BPNN, and BASM-SVM allowed us to select the method that reduces noise most effectively. Results showed that BASM-FERB has the best mapping accuracy of burned area at the subpixel level, as reflected in four accuracy evaluation indices, i.e., OA, UA, IoU, and Kappa: its average OA (98.11%), average UA (81.72%), average IoU (74.32%), and average Kappa (83.98%) are the highest. Even though the average PAs of BASM-RF (89.97%) and BASM-notra (91.36%) are higher than that of BASM-FERB (89.52%), the difference is very small. We conclude that: (1) the BASM method developed in this study demonstrated better performance in mapping burned area at the subpixel level compared to conventional methods at the pixel level; (2) the BASM in combination with any of the four well-performing classifiers reduces misclassification and noise effectively compared with the results obtained by BASM alone; and (3) BASM-FERB generates more accurate burned area maps at the subpixel level than BASM-RF, BASM-BPNN, BASM-SVM, and BASM-notra. This new technique for mapping burned area at the subpixel level can enhance post-fire management and assessment and help quantify carbon budgets. We expect that the BASM method will also perform well in vegetation mapping, snow mapping, and other applications. The results in this paper provide a reference for the refined utilization of satellite remote sensing data sets with moderate spatial resolution.

Author Contributions

Conceptualization, methodology, validation, formal analysis, writing—review, & editing, H.X.; conceptualization, supervision, project administration, funding acquisition, G.Z.; writing, assessment, review, & editing, Z.Z.; writing, quality control, review, & editing, X.Z.; project administration and funding acquisition, J.Z.; project administration and funding acquisition, C.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Science and Technology Innovation Platform and Talent Plan Project of Hunan Province under Grant 2017TP1022, the National Natural Science Foundation of China under Grant 42074016, the Hunan Youth Fund Project under Grant 2022JJ40879, the Emergency Management Science and Technology Project of Hunan Province under Grant 2020YJ007, and the Science and Technology Planning Project of Hunan Province, China under Grant 2016SK2025.

Data Availability Statement

We thank the China Centre for Resources Satellite Data and Application for providing the GF series data sets. We thank the Department of Emergency Management of Hunan Province and the Hunan Aerial Forest Fire Protection Station for providing ground-truth fire-scar information and pictures. We thank the United States Geological Survey (USGS) for providing the Sentinel 2 satellite data sets.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Matin, M.A.; Chitale, V.S.; Murthy, M.S.R.; Uddin, K.; Bajracharya, B.; Pradhan, S. Understanding forest fire patterns and risk in Nepal using remote sensing, geographic information system and historical fire data. Int. J. Wildland Fire 2017, 26, 276–286. [Google Scholar] [CrossRef] [Green Version]
  2. Fernandez-Carrillo, A.; McCaw, L.; Tanase, M.A. Estimating prescribed fire impacts and post-fire tree survival in eucalyptus forests of Western Australia with L-band SAR data. Remote Sens. Environ. 2019, 224, 133–144. [Google Scholar] [CrossRef]
  3. Sannigrahi, S.; Pilla, F.; Basu, B.; Basu, A.S.; Sarkar, K.; Chakraborti, S.; Joshi, P.K.; Zhang, Q.; Wang, Y.; Bhatt, S.; et al. Examining the effects of forest fire on terrestrial carbon emission and ecosystem production in India using remote sensing approaches. Sci. Total Environ. 2020, 725, 138331. [Google Scholar] [CrossRef] [PubMed]
  4. Schroeder, W.; Oliva, P.; Giglio, L.; Quayle, B.; Lorenz, E.; Morelli, F. Active fire detection using Landsat-8/OLI data. Remote Sens. Environ. 2016, 185, 210–220. [Google Scholar] [CrossRef] [Green Version]
  5. Syphard, A.D.; Keeley, J.E. Location, timing and extent of wildfire vary by cause of ignition. Int. J. Wildland Fire 2015, 24, 37–47. [Google Scholar] [CrossRef] [Green Version]
  6. Zhang, Q.; Ge, L.; Zhang, R.; Metternicht, G.I.; Du, Z.; Kuang, J.; Xu, M. Deep-learning-based burned area mapping using the synergy of Sentinel-1&2 data. Remote Sens. Environ. 2021, 264, 112575. [Google Scholar] [CrossRef]
  7. National Forest Fire Prevention Plan (2016–2025). 2016. Available online: https://leap.unep.org/countries/cn/national-legislation/national-forest-fire-prevention-plan-2016-2025 (accessed on 4 April 2022).
  8. Pinto, M.; Trigo, R.; Trigo, I.; DaCamara, C. A Practical Method for High-Resolution Burned Area Monitoring Using Sentinel-2 and VIIRS. Remote Sens. 2021, 13, 1608. [Google Scholar] [CrossRef]
  9. Daldegan, G.A.; Roberts, D.A.; Ribeiro, F.D.F. Spectral mixture analysis in Google Earth Engine to model and delineate fire scars over a large extent and a long time-series in a rainforest-savanna transition zone. Remote Sens. Environ. 2019, 232. [Google Scholar] [CrossRef]
  10. Mouillot, F.; Field, C.B. Fire history and the global carbon budget: A 1ox 1o fire history reconstruction for the 20th century. Glob. Chang. Biol. 2005, 11, 398–420. [Google Scholar] [CrossRef]
  11. Ba, R.; Song, W.; Li, X.; Xie, Z.; Lo, S. Integration of Multiple Spectral Indices and a Neural Network for Burned Area Mapping Based on MODIS Data. Remote Sens. 2019, 11, 326. [Google Scholar] [CrossRef] [Green Version]
  12. Chuvieco, E.; Mouillot, F.; van der Werf, G.R.; San Miguel, J.; Tanase, M.; Koutsias, N.; García, M.; Yebra, M.; Padilla, M.; Gitas, I.; et al. Historical background and current developments for mapping burned area from satellite Earth observation. Remote Sens. Environ. 2019, 225, 45–64. [Google Scholar] [CrossRef]
  13. Xie, Z.; Song, W.; Ba, R.; Li, X.; Xia, L. A Spatiotemporal Contextual Model for Forest Fire Detection Using Himawari-8 Satellite Data. Remote Sens. 2018, 10, 1992. [Google Scholar] [CrossRef] [Green Version]
  14. Florath, J.; Keller, S. Supervised Machine Learning Approaches on Multispectral Remote Sensing Data for a Combined Detec-tion of Fire and Burned Area. Remote Sens. 2022, 14, 657. [Google Scholar] [CrossRef]
  15. Stroppiana, D.; Bordogna, G.; Sali, M.; Boschetti, M.; Sona, G.; Brivio, P.A. A Fully Automatic, Interpretable and Adaptive Machine Learning Approach to Map Burned Area from Remote Sensing. ISPRS Int. J. Geo-Information 2021, 10, 546. [Google Scholar] [CrossRef]
  16. Santos, F.L.; Libonati, R.; Peres, L.F.; Pereira, A.A.; Narcizo, L.C.; Rodrigues, J.A.; Oom, D.; Pereira, J.M.C.; Schroeder, W.; Setzer, A.W. Assessing VIIRS capabilities to improve burned area mapping over the Brazilian Cerrado. Int. J. Remote Sens. 2020, 41, 8300–8327. [Google Scholar] [CrossRef]
  17. Ramo, R.; García, M.; Rodríguez, D.; Chuvieco, E. A data mining approach for global burned area mapping. Int. J. Appl. Earth Obs. Geoinf. ITC J. 2018, 73, 39–51. [Google Scholar] [CrossRef]
  18. Valencia, G.; Anaya, J.; Velásquez, E.A.; Ramo, R.; Caro-Lopera, F. About Validation-Comparison of Burned Area Products. Remote Sens. 2020, 12, 3972. [Google Scholar] [CrossRef]
  19. Moreno-Ruiz, J.A.; García-Lázaro, J.R.; Arbelo, M.; Cantón-Garbín, M. MODIS Sensor Capability to Burned Area Mapping—Assessment of Performance and Improvements Provided by the Latest Standard Products in Boreal Regions. Sensors 2020, 20, 5423. [Google Scholar] [CrossRef]
  20. Otón, G.; Lizundia-Loiola, J.; Pettinari, M.L.; Chuvieco, E. Development of a consistent global long-term burned area product (1982–2018) based on AVHRR-LTDR data. Int. J. Appl. Earth Obs. Geoinf. ITC J. 2021, 103, 102473. [Google Scholar] [CrossRef]
  21. Zhang, S.; Zhao, H.; Wu, Z.; Tan, L. Comparing the Ability of Burned Area Products to Detect Crop Residue Burning in China. Remote Sens. 2022, 14, 693. [Google Scholar] [CrossRef]
  22. Humber, M.; Boschetti, L.; Giglio, L.; Justice, C.O. Spatial and temporal intercomparison of four global burned area products. Int. J. Digit. Earth 2018, 12, 460–484. [Google Scholar] [CrossRef]
  23. Artés, T.; Oom, D.; De Rigo, D.; Durrant, T.H.; Maianti, P.; Libertà, G.; San-Miguel-Ayanz, J. A global wildfire dataset for the analysis of fire regimes and fire behaviour. Sci. Data 2019, 6, 1–11. [Google Scholar] [CrossRef]
  24. Long, T.; Zhang, Z.; He, G.; Jiao, W.; Tang, C.; Wu, B.; Zhang, X.; Wang, G.; Yin, R. 30 m Resolution Global Annual Burned Area Mapping Based on Landsat Images and Google Earth Engine. Remote Sens. 2019, 11, 489. [Google Scholar] [CrossRef] [Green Version]
  25. Hawbaker, T.J.; Vanderhoof, M.K.; Schmidt, G.L.; Beal, Y.-J.; Picotte, J.J.; Takacs, J.D.; Falgout, J.T.; Dwyer, J.L. The Landsat Burned Area algorithm and products for the conterminous United States. Remote Sens. Environ. 2020, 244, 111801. [Google Scholar] [CrossRef]
  26. Pessôa, A.; Anderson, L.; Carvalho, N.; Campanharo, W.; Junior, C.; Rosan, T.; Reis, J.; Pereira, F.; Assis, M.; Jacon, A.; et al. Intercomparison of Burned Area Products and Its Implication for Carbon Emission Estimations in the Amazon. Remote Sens. 2020, 12, 3864. [Google Scholar] [CrossRef]
  27. Diniz, C.G.; Souza, A.A.D.A.; Santos, D.C.; Dias, M.C.; da Luz, N.C.; de Moraes, D.R.V.; Maia, J.S.A.; Gomes, A.R.; Narvaes, I.D.S.; Valeriano, D.M.; et al. DETER-B: The New Amazon Near Real-Time Deforestation Detection System. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2015, 8, 3619–3628. [Google Scholar] [CrossRef]
  28. Andela, N.; Morton, D.C.; Giglio, L.; Paugam, R.; Chen, Y.; Hantson, S.; van der Werf, G.R.; Randerson, J.T. The Global Fire Atlas of individual fire size, duration, speed and direction. Earth Syst. Sci. Data 2019, 11, 529–552. [Google Scholar] [CrossRef] [Green Version]
  29. Xulu, S.; Mbatha, N.; Peerbhay, K. Burned Area Mapping over the Southern Cape Forestry Region, South Africa Using Sentinel Data within GEE Cloud Platform. ISPRS Int. J. Geo-Information 2021, 10, 511. [Google Scholar] [CrossRef]
  30. Seydi, S.T.; Hasanlou, M.; Chanussot, J. DSMNN-Net: A Deep Siamese Morphological Neural Network Model for Burned Area Mapping Using Multispectral Sentinel-2 and Hyperspectral PRISMA Images. Remote Sens. 2021, 13, 5138. [Google Scholar] [CrossRef]
  31. Wang, Q.; Zhang, C.; Atkinson, P.M. Sub-pixel mapping with point constraints. Remote Sens. Environ. 2020, 244, 111817. [Google Scholar] [CrossRef]
  32. Wu, S.; Ren, J.; Chen, Z.; Jin, W.; Liu, X.; Li, H.; Pan, H.; Guo, W. Influence of reconstruction scale, spatial resolution and pixel spatial relationships on the sub-pixel mapping accuracy of a double-calculated spatial attraction model. Remote Sens. Environ. 2018, 210, 345–361. [Google Scholar] [CrossRef]
  33. Yu, W.; Li, J.; Liu, Q.; Zeng, Y.; Zhao, J.; Xu, B.; Yin, G. Global Land Cover Heterogeneity Characteristics at Moderate Resolution for Mixed Pixel Modeling and Inversion. Remote Sens. 2018, 10, 856. [Google Scholar] [CrossRef] [Green Version]
  34. Chen, X.; Wang, D.; Chen, J.; Wang, C.; Shen, M. The mixed pixel effect in land surface phenology: A simulation study. Remote Sens. Environ. 2018, 211, 338–344. [Google Scholar] [CrossRef]
  35. Daniel, C.; Heinz, C.-I.C. Fully Constrained Least Squares Linear Spectral Mixture Analysis Method for Material Quantification in Hyperspectral Imagery. IEEE Trans. Geosci. Remote Sens. 2001, 39, 529–545. [Google Scholar]
  36. Shao, Y.; Lan, J. A Spectral Unmixing Method by Maximum Margin Criterion and Derivative Weights to Address Spectral Variability in Hyperspectral Imagery. Remote Sens. 2019, 11, 1045. [Google Scholar] [CrossRef] [Green Version]
  37. Craig, M. Minimum-volume transforms for remotely sensed data. IEEE Trans. Geosci. Remote Sens. 1994, 32, 542–552. [Google Scholar] [CrossRef]
  38. Nascimento, J.; Dias, J. Vertex component analysis: A fast algorithm to unmix hyperspectral data. IEEE Trans. Geosci. Remote Sens. 2005, 43, 898–910. [Google Scholar] [CrossRef] [Green Version]
  39. Chang, C.-I.; Wu, C.-C.; Liu, W.; Ouyang, Y.-C. A New Growing Method for Simplex-Based Endmember Extraction Algorithm. IEEE Trans. Geosci. Remote Sens. 2006, 44, 2804–2819. [Google Scholar] [CrossRef]
  40. Miao, L.; Qi, H. Endmember Extraction from Highly Mixed Data Using Minimum Volume Constrained Nonnegative Matrix Factorization. IEEE Trans. Geosci. Remote Sens. 2007, 45, 765–777. [Google Scholar] [CrossRef]
  41. Zhang, J.; Rivard, B.; Rogge, D.M. The Successive Projection Algorithm (SPA), an Algorithm with a Spatial Constraint for the Automatic Search of Endmembers in Hyperspectral Data. Sensors 2008, 8, 1321–1342. [Google Scholar] [CrossRef] [Green Version]
  42. He, Y.; Yang, J.; Guo, X. Green Vegetation Cover Dynamics in a Heterogeneous Grassland: Spectral Unmixing of Landsat Time Series from 1999 to 2014. Remote Sens. 2020, 12, 3826. [Google Scholar] [CrossRef]
  43. Winter, M.E. N-FINDR: An Algorithm for Fast Autonomous Spectral End_Member Determination in Hyperspectral Data. In Imaging Spectrom V; International Society for Optics and Photonics: Bellingham, WA, USA, 1999; Volume 3753, pp. 266–275. [Google Scholar] [CrossRef]
  44. Kumar, U.; Ganguly, S.; Nemani, R.R.; Raja, K.S.; Milesi, C.; Sinha, R.; Michaelis, A.; Votava, P.; Hashimoto, H.; Li, S.; et al. Exploring Subpixel Learning Algorithms for Estimating Global Land Cover Fractions from Satellite Data Using High Performance Computing. Remote Sens. 2017, 9, 1105. [Google Scholar] [CrossRef] [Green Version]
  45. Atkinson, P.M. Mapping sub-pixel boundaries from remotely sensed images. In Innovations in GIS; Kemp, Z., Ed.; Taylor and Francis: London, UK, 1997; Volume 4, pp. 166–180. [Google Scholar]
  46. Atkinson, P.M.; Cutler, M.E.J.; Lewis, H. Mapping sub-pixel proportional land cover with AVHRR imagery. Int. J. Remote Sens. 1997, 18, 917–935. [Google Scholar] [CrossRef]
  47. Atkinson, P.M. Sub-pixel Target Mapping from Soft-classified, Remotely Sensed Imagery. Photogramm. Eng. Remote Sens. 2005, 71, 839–846. [Google Scholar] [CrossRef] [Green Version]
  48. Wang, Q.; Atkinson, P.M. The effect of the point spread function on sub-pixel mapping. Remote Sens. Environ. 2017, 193, 127–137. [Google Scholar] [CrossRef] [Green Version]
  49. Wang, Y.; Chen, Q.; Ding, M.; Li, J. High Precision Dimensional Measurement with Convolutional Neural Network and Bi-Directional Long Short-Term Memory (LSTM). Sensors 2019, 19, 5302. [Google Scholar] [CrossRef] [Green Version]
  50. Hu, C. A novel ocean color index to detect floating algae in the global oceans. Remote Sens. Environ. 2009, 113, 2118–2129. [Google Scholar] [CrossRef]
  51. Salomonson, V.V.; Appel, I. Estimating fractional snow cover from MODIS using the normalized difference snow index. Remote Sens. Environ. 2004, 89, 351–360. [Google Scholar] [CrossRef]
  52. Salomonson, V.; Appel, I. Development of the Aqua MODIS NDSI fractional snow cover algorithm and validation results. IEEE Trans. Geosci. Remote Sens. 2006, 44, 1747–1756. [Google Scholar] [CrossRef]
  53. He, Y.; Chen, G.; Potter, C.; Meentemeyer, R.K. Integrating multi-sensor remote sensing and species distribution modeling to map the spread of emerging forest disease and tree mortality. Remote Sens. Environ. 2019, 231, 111238. [Google Scholar] [CrossRef]
  54. Li, L.; Chen, Y.; Xu, T.; Shi, K.; Liu, R.; Huang, C.; Lu, B.; Meng, L. Remote Sensing of Wetland Flooding at a Sub-Pixel Scale Based on Random Forests and Spatial Attraction Models. Remote Sens. 2019, 11, 1231. [Google Scholar] [CrossRef] [Green Version]
  55. Li, X.; Chen, R.; Foody, G.M.; Wang, L.; Yang, X.; Du, Y.; Ling, F. Spatio-Temporal Sub-Pixel Land Cover Mapping of Remote Sensing Imagery Using Spatial Distribution Information from Same-Class Pixels. Remote Sens. 2020, 12, 503. [Google Scholar] [CrossRef] [Green Version]
  56. Ling, F.; Li, X.; Foody, G.M.; Boyd, D.; Ge, Y.; Li, X.; Du, Y. Monitoring surface water area variations of reservoirs using daily MODIS images by exploring sub-pixel information. ISPRS J. Photogramm. Remote Sens. 2020, 168, 141–152. [Google Scholar] [CrossRef]
  57. Deng, C.; Zhu, Z. Continuous subpixel monitoring of urban impervious surface using Landsat time series. Remote Sens. Environ. 2018, 238, 110929. [Google Scholar] [CrossRef]
  58. Ling, F.; Du, Y.; Zhang, Y.; Li, X.; Xiao, F. Burned-Area Mapping at the Subpixel Scale with MODIS Images. IEEE Geosci. Remote Sens. Lett. 2015, 12, 1963–1967. [Google Scholar] [CrossRef]
  59. Msellmi, B.; Picone, D.; Rabah, Z.; Mura, M.; Farah, I. Sub-Pixel Mapping Model Based on Total Variation Regularization and Learned Spatial Dictionary. Remote Sens. 2021, 13, 190. [Google Scholar] [CrossRef]
  60. Zhang, C.; Ma, L.; Chen, J.; Rao, Y.; Zhou, Y.; Chen, X. Assessing the impact of endmember variability on linear Spectral Mixture Analysis (LSMA): A theoretical and simulation analysis. Remote Sens. Environ. 2019, 235, 111471. [Google Scholar] [CrossRef]
  61. Drusch, M.; Del Bello, U.; Carlier, S.; Colin, O.; Fernandez, V.; Gascon, F.; Hoersch, B.; Isola, C.; Laberinti, P.; Martimort, P.; et al. Sentinel-2: ESA’s Optical High-Resolution Mission for GMES Operational Services. Remote Sens. Environ. 2012, 120, 25–36. [Google Scholar] [CrossRef]
  62. Roteta, E.; Bastarrika, A.; Padilla, M.; Storm, T.; Chuvieco, E. Development of a Sentinel-2 burned area algorithm: Generation of a small fire database for sub-Saharan Africa. Remote Sens. Environ. 2019, 222, 1–17. [Google Scholar] [CrossRef]
  63. Yan, L.; Roy, D.P.; Li, Z.; Zhang, H.K.; Huang, H. Sentinel-2A multi-temporal misregistration characterization and an orbit-based sub-pixel registration methodology. Remote Sens. Environ. 2018, 215, 495–506. [Google Scholar] [CrossRef]
  64. Tong, X.-Y.; Xia, G.-S.; Lu, Q.; Shen, H.; Li, S.; You, S.; Zhang, L. Land-cover classification with high-resolution remote sensing images using transferable deep models. Remote Sens. Environ. 2020, 237, 111322. [Google Scholar] [CrossRef] [Green Version]
  65. Shi, Y.; Wang, Z.; Liu, L.; Li, C.; Peng, D.; Xiao, P. Improving Estimation of Woody Aboveground Biomass of Sparse Mixed Forest over Dryland Ecosystem by Combining Landsat-8, GaoFen-2, and UAV Imagery. Remote Sens. 2021, 13, 4859. [Google Scholar] [CrossRef]
  66. Li, J.; Chen, X.; Tian, L.; Huang, J.; Feng, L. Improved capabilities of the Chinese high-resolution remote sensing satellite GF-1 for monitoring suspended particulate matter (SPM) in inland waters: Radiometric and spatial considerations. ISPRS J. Photogramm. Remote Sens. 2015, 106, 145–156. [Google Scholar] [CrossRef]
  67. Lu, C.L.; Wang, R.; Yin, H. GF-1 Satellite Remote Sensing Characters. Spacecr. Recovery Remote Sens. 2014, 35, 67–73. (In Chinese) [Google Scholar]
  68. Morresi, D.; Marzano, R.; Lingua, E.; Motta, R.; Garbarino, M. Mapping burn severity in the western Italian Alps through phenologically coherent reflectance composites derived from Sentinel-2 imagery. Remote Sens. Environ. 2021, 269, 112800. [Google Scholar] [CrossRef]
  69. Louis, J.; Debaecker, V.; Pflug, B.; Main-Knorn, M.; Bieniarz, J.; Mueller-Wilm, U.; Cadau, E.; Gascon, F. Sentinel-2 Sen2cor: L2A Processor for Users. 2016. Available online: https://elib.dlr.de/107381/1/LPS2016_sm10_3louis.pdf (accessed on 19 April 2022).
  70. Ramo, R.; Roteta, E.; Bistinas, I.; Van Wees, D.; Bastarrika, A.; Chuvieco, E.; Van der Werf, G.R. African burned area and fire carbon emissions are strongly impacted by small fires undetected by coarse resolution satellite data. Proc. Natl. Acad. Sci. USA 2021, 118, e2011160118. [Google Scholar] [CrossRef]
  71. Rasmy, L.; Sebari, I.; Ettarid, M. Automatic Sub-Pixel Co-Registration of Remote Sensing Images Using Phase Correlation and Harris Detector. Remote Sens. 2021, 13, 2314. [Google Scholar] [CrossRef]
  72. Parker, B.M.; Lewis, T.; Srivastava, S.K. Estimation and evaluation of multi-decadal fire severity patterns using Landsat sensors. Remote Sens. Environ. 2015, 170, 340–349. [Google Scholar] [CrossRef]
  73. Hall, R.J.; Freeburn, J.T.; De Groot, W.J.; Pritchard, J.M.; Lynham, T.J.; Landry, R. Remote sensing of burn severity: Experience from western Canada boreal fires. Int. J. Wildland Fire 2008, 17, 476–489. [Google Scholar] [CrossRef]
  74. Papaloukas, C.; Fotiadis, D.I.; Liavas, A.P.; Likas, A.; Michalis, L.K. A knowledge-based technique for automated detection of ischaemic episodes in long duration electrocardiograms. Med. Biol. Eng. Comput. 2001, 39, 105–112. [Google Scholar] [CrossRef]
  75. Vanderhoof, M.K.; Hawbaker, T.J.; Teske, C.; Ku, A.; Noble, J.; Picotte, J. Mapping Wetland Burned Area from Sentinel-2 across the Southeastern United States and Its Contributions Relative to Landsat-8 (2016–2019). Fire 2021, 4, 52. [Google Scholar] [CrossRef]
  76. Yue, J.; Tian, Q.; Dong, X.; Xu, N. Using broadband crop residue angle index to estimate the fractional cover of vegetation, crop residue, and bare soil in cropland systems. Remote Sens. Environ. 2019, 237, 111538. [Google Scholar] [CrossRef]
  77. Zhang, X.; Cheng, B.; Chen, J.; Liang, C. High-Resolution Boundary Refined Convolutional Neural Network for Automatic Agricultural Greenhouses Extraction from GaoFen-2 Satellite Imageries. Remote Sens. 2021, 13, 4237. [Google Scholar] [CrossRef]
  78. Tong, X.; Liu, S.; Weng, Q. Bias-corrected rational polynomial coefficients for high accuracy geo-positioning of QuickBird stereo imagery. ISPRS J. Photogramm. Remote Sens. 2010, 65, 218–226. [Google Scholar] [CrossRef]
  79. Rabby, Y.W.; Ishtiaque, A.; Rahman, M.S. Evaluating the Effects of Digital Elevation Models in Landslide Susceptibility Mapping in Rangamati District, Bangladesh. Remote Sens. 2020, 12, 2718. [Google Scholar] [CrossRef]
  80. Shawky, M.; Moussa, A.; Hassan, Q.K.; El-Sheimy, N. Pixel-Based Geometric Assessment of Channel Networks/Orders Derived from Global Spaceborne Digital Elevation Models. Remote Sens. 2019, 11, 235. [Google Scholar] [CrossRef] [Green Version]
  81. Vivone, G.; Alparone, L.; Chanussot, J.; Mura, M.D.; Garzelli, A.; Licciardi, G.A.; Restaino, R.; Wald, L. A Critical Comparison Among Pansharpening Algorithms. IEEE Trans. Geosci. Remote Sens. 2014, 53, 2565–2586. [Google Scholar] [CrossRef]
  82. Wu, Y.; Wang, N.; Li, Z.; Chen, A.; Guo, Z.; Qie, Y. The effect of thermal radiation from surrounding terrain on glacier surface temperatures retrieved from remote sensing data: A case study from Qiyi Glacier, China. Remote Sens. Environ. 2019, 231, 111267. [Google Scholar] [CrossRef]
  83. Guo, L.; Shi, T.; Linderman, M.; Chen, Y.; Zhang, H.; Fu, P. Exploring the Influence of Spatial Resolution on the Digital Mapping of Soil Organic Carbon by Airborne Hyperspectral VNIR Imaging. Remote Sens. 2019, 11, 1032. [Google Scholar] [CrossRef] [Green Version]
  84. Awada, H.; Ciraolo, G.; Maltese, A.; Provenzano, G.; Hidalgo, M.A.M.; Còrcoles, J.I. Assessing the performance of a large-scale irrigation system by estimations of actual evapotranspiration obtained by Landsat satellite images resampled with cubic convolution. Int. J. Appl. Earth Obs. Geoinf. 2019, 75, 96–105. [Google Scholar] [CrossRef]
  85. Mo, Y.; Yang, X.; Tang, H.; Li, Z. Smoke Detection from Himawari-8 Satellite Data over Kalimantan Island Using Multilayer Perceptrons. Remote Sens. 2021, 13, 3721. [Google Scholar] [CrossRef]
  86. Banerjee, K.; Krishnan, P.; Mridha, N. Application of thermal imaging of wheat crop canopy to estimate leaf area index under different moisture stress conditions. Biosyst. Eng. 2018, 166, 13–27. [Google Scholar] [CrossRef]
  87. Heinz, D.; Chang, C.-I.; Althouse, M.L.G. Fully Constrained Least-Squares Based Linear Unmixing. In Proceedings of the IEEE 1999 International Geoscience and Remote Sensing Symposium. IGARSS’99 (Cat. No.99CH36293), Hamburg, Germany, 28 June–2 July 1999; pp. 1401–1403. [Google Scholar]
  88. Heylen, R.; Burazerovic, D.; Scheunders, P. Fully Constrained Least Squares Spectral Unmixing by Simplex Projection. IEEE Trans. Geosci. Remote Sens. 2011, 49, 4112–4122. [Google Scholar] [CrossRef]
  89. Plaza, A.; Martinez, P.; Perez, R.; Plaza, J. A Quantitative and Comparative Analysis of Endmember Extraction Algorithms from Hyperspectral Data. IEEE Trans. Geosci. Remote Sens. 2004, 42, 650–663. [Google Scholar] [CrossRef]
  90. Hamedianfar, A.; Shafri, H.Z.M.; Mansor, S.; Ahmad, N. Improving detailed rule-based feature extraction of urban areas from WorldView-2 image and lidar data. Int. J. Remote Sens. 2014, 35, 1876–1899. [Google Scholar] [CrossRef]
  91. Jahjah, M.; Ulivieri, C. Automatic archaeological feature extraction from satellite VHR images. Acta Astronaut. 2010, 66, 1302–1310. [Google Scholar] [CrossRef]
  92. Mansaray, L.R.; Yang, L.; Kabba, V.T.; Kanu, A.S.; Huang, J.; Wang, F. Optimising rice mapping in cloud-prone environments by combining quad-source optical with Sentinel-1A microwave satellite imagery. GISci. Remote Sens. 2019, 56, 1333–1354. [Google Scholar] [CrossRef]
  93. Fetai, B.; Oštir, K.; Kosmatin Fras, M.; Lisec, A. Extraction of Visible Boundaries for Cadastral Mapping Based on UAV Imagery. Remote Sens. 2019, 11, 1510. [Google Scholar] [CrossRef] [Green Version]
  94. Huang, G.; Yan, B.; Mou, Z.; Wu, K.; Lv, X. Surrogate Model for Torsional Behavior of Bundle Conductors and its Application. IEEE Trans. Power Deliv. 2021, 37, 67–75. [Google Scholar] [CrossRef]
  95. Bisoyi, N.; Gupta, H.; Padhy, N.P.; Chakrapani, G.J. Prediction of daily sediment discharge using a back propagation neural network training algorithm: A case study of the Narmada River, India. Int. J. Sediment Res. 2018, 34, 125–135. [Google Scholar] [CrossRef]
  96. Nedaie, A.; Najafi, A.A. Support vector machine with Dirichlet feature mapping. Neural Netw. 2018, 98, 87–101. [Google Scholar] [CrossRef]
  97. Hamilton, D.; Brothers, K.; McCall, C.; Gautier, B.; Shea, T. Mapping Forest Burn Extent from Hyperspatial Imagery Using Machine Learning. Remote Sens. 2021, 13, 3843. [Google Scholar] [CrossRef]
  98. Zhan, P.; Zhu, W.; Li, N. An automated rice mapping method based on flooding signals in synthetic aperture radar time series. Remote Sens. Environ. 2020, 252, 112112. [Google Scholar] [CrossRef]
  99. Sebald, J.; Senf, C.; Seidl, R. Human or natural? Landscape context improves the attribution of forest disturbances mapped from Landsat in Central Europe. Remote Sens. Environ. 2021, 262, 112502. [Google Scholar] [CrossRef]
  100. Nelson, M.D.; Garner, J.D.; Tavernia, B.G.; Stehman, S.V.; Riemann, R.I.; Lister, A.J.; Perry, C.H. Assessing map accuracy from a suite of site-specific, non-site specific, and spatial distribution approaches. Remote Sens. Environ. 2021, 260. [Google Scholar] [CrossRef]
  101. Ji, C.; Bachmann, M.; Esch, T.; Feilhauer, H.; Heiden, U.; Heldens, W.; Hueni, A.; Lakes, T.; Metz-Marconcini, A.; Schroedter-Homscheidt, M.; et al. Solar photovoltaic module detection using laboratory and airborne imaging spectroscopy data. Remote Sens. Environ. 2021, 266, 112692. [Google Scholar] [CrossRef]
  102. Watanabe, M.; Koyama, C.N.; Hayashi, M.; Nagatani, I.; Tadono, T.; Shimada, M. Refined algorithm for forest early warning system with ALOS-2/PALSAR-2 ScanSAR data in tropical forest regions. Remote Sens. Environ. 2021, 265, 112643. [Google Scholar] [CrossRef]
  103. Foody, G.M. Impacts of ignorance on the accuracy of image classification and thematic mapping. Remote Sens. Environ. 2021, 259, 112367. [Google Scholar] [CrossRef]
  104. Wu, X.; Shi, Z.; Zou, Z. A geographic information-driven method and a new large scale dataset for remote sensing cloud/snow detection. ISPRS J. Photogramm. Remote Sens. 2021, 174, 87–104. [Google Scholar] [CrossRef]
  105. Hao, Z.; Lin, L.; Post, C.J.; Mikhailova, E.A.; Li, M.; Chen, Y.; Yu, K.; Liu, J. Automated tree-crown and height detection in a young forest plantation using mask region-based convolutional neural network (Mask R-CNN). ISPRS J. Photogramm. Remote Sens. 2021, 178, 112–123. [Google Scholar] [CrossRef]
  106. Jiang, Z.; Zhang, J.; Ma, Y.; Mao, X. Hyperspectral Remote Sensing Detection of Marine Oil Spills Using an Adaptive Long-Term Moment Estimation Optimizer. Remote Sens. 2021, 14, 157. [Google Scholar] [CrossRef]
  107. Bhattarai, R.; Rahimzadeh-Bajgiran, P.; Weiskittel, A.; Meneghini, A.; MacLean, D.A. Spruce budworm tree host species distribution and abundance mapping using multi-temporal Sentinel-1 and Sentinel-2 satellite imagery. ISPRS J. Photogramm. Remote Sens. 2020, 172, 28–40. [Google Scholar] [CrossRef]
  108. Hu, X.; Ban, Y.; Nascetti, A. Uni-Temporal Multispectral Imagery for Burned Area Mapping with Deep Learning. Remote Sens. 2021, 13, 1509. [Google Scholar] [CrossRef]
  109. Mouillot, F.; Schultz, M.G.; Yue, C.; Cadule, P.; Tansey, K.; Ciais, P.; Chuvieco, E. Ten years of global burned area products from spaceborne remote sensing—A review: Analysis of user needs and recommendations for future developments. Int. J. Appl. Earth Obs. Geoinf. 2014, 26, 64–79. [Google Scholar] [CrossRef] [Green Version]
  110. Mangeon, S.; Field, R.; Fromm, M.; McHugh, C.; Voulgarakis, A. Satellite versus ground-based estimates of burned area: A comparison between MODIS based burned area and fire agency reports over North America in 2007. Anthr. Rev. 2015, 3, 76–92. [Google Scholar] [CrossRef] [Green Version]
  111. Chang, Y.; Yan, L.; Fang, H.; Liu, H. Simultaneous Destriping and Denoising for Remote Sensing Images With Unidirectional Total Variation and Sparse Representation. IEEE Geosci. Remote Sens. Lett. 2013, 11, 1051–1055. [Google Scholar] [CrossRef]
  112. Ebadi, L.; Shafri, H.Z.M.; Mansor, S.B.; Ashurov, R. A review of applying second-generation wavelets for noise removal from remote sensing data. Environ. Earth Sci. 2013, 70, 2679–2690. [Google Scholar] [CrossRef] [Green Version]
  113. Ha, C.; Kim, W.; Jeong, J. Remote sensing image enhancement based on singular value decomposition. Opt. Eng. 2013, 52, 083101. [Google Scholar] [CrossRef]
  114. Hird, J.N.; McDermid, G.J. Noise reduction of NDVI time series: An empirical comparison of selected techniques. Remote Sens. Environ. 2009, 113, 248–258. [Google Scholar] [CrossRef]
  115. Gajardo, J.; Mora, M.; Valdés-Nicolao, G.; Carrasco-Benavides, M. Burned Area Classification Based on Extreme Learning Machine and Sentinel-2 Images. Appl. Sci. 2021, 12, 9. [Google Scholar] [CrossRef]
  116. Knopp, L.; Wieland, M.; Rättich, M.; Martinis, S. A Deep Learning Approach for Burned Area Segmentation with Sentinel-2 Data. Remote Sens. 2020, 12, 2422. [Google Scholar] [CrossRef]
  117. Ruescas, A.B.; Sobrino, J.A.; Julien, Y.; Jiménez-Muñoz, J.C.; Sòria, G.; Hidalgo, V.; Atitar, M.; Franch, B.; Cuenca, J.; Mattar, C. Mapping sub-pixel burnt percentage using AVHRR data. Application to the Alcalaten area in Spain. Int. J. Remote Sens. 2010, 31, 5315–5330. [Google Scholar] [CrossRef]
Figure 1. Study site with locations of the fifteen fire-scar sites in Hunan Province, China.
Figure 2. Flowchart for the analysis performed in this study.
Figure 3. Field measurement confirmation and detailed reporting by well-trained personnel. Panels (A,B) show the fire-scar sites. Panel (B) shows trained personnel conducting field surveys with a UAV (circled in red) and other equipment. Panel (C) shows the manually created vector data of the burned area.
Figure 4. Comparison of BASM-notra with BASM-FERB, BASM-RF, BASM-BPNN, and BASM-SVM in minimizing misclassification and noise effects for the Sentinel 2 images at eleven fire-scar sites. The yellow lines in panels (A–F) delineate the fire-scar site vector data. Panels (B–F) show the burned area mapped at the subpixel level, where red represents burned area; any red beyond the yellow line indicates misclassification or noise. Panel (A) shows the fire-scar site based on the GF series data sets; panels (B–F) show the results of BASM-notra, BASM-FERB, BASM-RF, BASM-BPNN, and BASM-SVM, respectively.
Figure 5. Comparison of the burned area mapped at the subpixel level with that mapped at the pixel level using different algorithms at fire-scar sites 7CSNX (a) and 8CZGY (b). The yellow polygons in panels (A–J) delineate the fire-scar site vector data, and panels (B–J) are zoomed views of the red circle in panel (A). Panel (A) shows the fire-scar site based on the GF series data sets. Panels (B–E) show the burned area mapped at the pixel level using FERB, RF, BPNN, and SVM, respectively; red represents burned area, and any red beyond the yellow polygon indicates misclassification. Panel (F) is a zoomed view of the red circle in panel (A) based on the GF series data sets. Panels (G–J) show the burned area mapped at the subpixel level using BASM-FERB, BASM-RF, BASM-BPNN, and BASM-SVM, respectively, with the same color convention.
Figure 6. The burned area mapped at the subpixel level using different algorithms in the Sentinel-2 images over fire-scar sites 1HYHD (a) and 11ZJJSZ (b). The yellow polygons in panels (A–H) delineate the fire-scar site vector data, and panels (B–H) are zoomed views of the red circle in panel (A). Panel (A) shows the fire-scar site derived from the GF series data sets. Panels (B,C) show the pre-fire and post-fire sites based on the Sentinel-2 images, and panel (D) shows the post-fire site based on the GF series data sets. Panels (E–H) show the burned area mapped at the subpixel level using BASM-FERB, BASM-RF, BASM-BPNN, and BASM-SVM, respectively, where red represents burned area and purple represents background.
Table 1. Overview of eleven widely used burned-area products.

| No. | Burned-Area Product | Developer | Sensor | Spatial Resolution | Reference |
|-----|---------------------|-----------|--------|--------------------|-----------|
| 1 | Fire CCI 5.0 | ESA | MODIS | 250 m | [18] |
| 2 | Fire CCI 5.1 | ESA | MODIS | 250 m | [19] |
| 3 | FireCCILT10 | ESA | AVHRR | 0.25° | [20] |
| 4 | Copernicus burnt area | European Commission | PROBA-V | 300 m | [21] |
| 5 | MCD64A1 c6 | NASA | MODIS | 500 m | [22] |
| 6 | GWIS | JRC | MODIS | 500 m | [23] |
| 7 | GABAM | IRSDE/CAS | Landsat 8 OLI | 30 m | [24] |
| 8 | Landsat Burned Area | NASA | Landsat TM, Landsat ETM+, Landsat OLI | 30 m | [25] |
| 9 | TREES | TREES-INPE | MODIS | 250 m | [26] |
| 10 | DETER B | INPE | AWiFS | 64 m | [27] |
| 11 | Global Fire Atlas | NASA | MODIS | 500 m | [28] |

Note: No. = number. ESA = European Space Agency. MODIS = Moderate-resolution Imaging Spectroradiometer. AVHRR = Advanced Very High Resolution Radiometer. PROBA = Project for On-Board Autonomy. MCD64A1 = MODIS Direct Broadcast Monthly Burned Area Product Collection 6. NASA = National Aeronautics and Space Administration. GWIS = Global Wildfire Information System. JRC = Joint Research Centre. GABAM = Global Annual Burned Area Mapping. IRSDE/CAS = Institute of Remote Sensing and Digital Earth, Chinese Academy of Sciences. OLI = Operational Land Imager. TM = Thematic Mapper. ETM = Enhanced Thematic Mapper. TREES = Tropical Ecosystems and Environmental Sciences lab. INPE = National Institute for Space Research. DETER = near real-time deforestation detection. AWiFS = Advanced Wide-Field Sensor.
Table 2. Selected bands of the Sentinel 2, GF 1, GF 1B, GF 1C, GF 1D, and GF 2 satellite imagery data sets used in the current study.

| Satellite | Sensor | Band Number | Band Width (μm) | Spatial Resolution (m) |
|-----------|--------|-------------|-----------------|------------------------|
| Sentinel 2 | MSI | 2 | 0.46–0.52 | 10 |
| | | 3 | 0.54–0.58 | 10 |
| | | 4 | 0.65–0.68 | 10 |
| | | 8 | 0.79–0.90 | 10 |
| GF 1, GF 1B, GF 1C, GF 1D | PMS | 1 | 0.45–0.90 | 2 |
| | | 2 | 0.45–0.52 | 8 |
| | | 3 | 0.52–0.59 | 8 |
| | | 4 | 0.63–0.69 | 8 |
| | | 5 | 0.77–0.89 | 8 |
| GF 2 | PMS | 1 | 0.45–0.90 | 1 |
| | | 2 | 0.45–0.52 | 4 |
| | | 3 | 0.52–0.59 | 4 |
| | | 4 | 0.63–0.69 | 4 |
| | | 5 | 0.77–0.89 | 4 |
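To make the band selection concrete, the short sketch below reads the four selected 10 m Sentinel-2 bands and computes the normalized-difference custom band that the FERB rule set uses (Table 4). This is a minimal illustration, not code from the paper: the GeoTIFF path, the band ordering inside it, and the (NIR - Red)/(NIR + Red) convention for the normalized difference are all assumptions.

```python
import rasterio  # assumes the Sentinel-2 scene was exported as a multiband GeoTIFF

# Hypothetical file and band order, mirroring Table 2: B2 (blue), B3 (green),
# B4 (red), and B8 (NIR), all at 10 m spatial resolution.
BAND_INDEX = {"blue": 1, "green": 2, "red": 3, "nir": 4}

with rasterio.open("S2_post_fire_10m.tif") as src:
    red = src.read(BAND_INDEX["red"]).astype("float32")
    nir = src.read(BAND_INDEX["nir"]).astype("float32")

# Normalized difference with Band 1: Red and Band 2: NIR, as configured for
# FERB in Table 4 (assuming the (b2 - b1) / (b2 + b1) convention).
nd = (nir - red) / (nir + red + 1e-10)  # epsilon guards against division by zero
```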
Table 3. Fire information at the fifteen selected sites.

| No. | Date of the Fire Event | Longitude | Latitude | Extent (ha) |
|-----|------------------------|-----------|----------|-------------|
| 1HYHD | 24 September 2019 | 113°3′ | 27°8′ | 1244.42 |
| 2CZGY | 16 December 2019 | 112°53′ | 25°54′ | 166.82 |
| 3YZLS | 1 October 2019 | 112°8′ | 25°32′ | 95.32 |
| 4HHHJ | 6 October 2018 | 109°53′ | 27°15′ | 87.18 |
| 5XXFH | 6 April 2019 | 109°40′ | 28°2′ | 78.23 |
| 6CZGD | 4 January 2021 | 113°54′ | 25°58′ | 66.61 |
| 7CSNX | 19 March 2020 | 112°0′ | 28°11′ | 53.06 |
| 8CZGY | 31 October 2019 | 112°41′ | 25°29′ | 50.14 |
| 9CZBH | 6 February 2019 | 112°52′ | 25°42′ | 39.18 |
| 10ZZLL | 16 April 2020 | 113°27′ | 27°25′ | 37.99 |
| 11ZJJSZ | 19 March 2020 | 110°0′ | 29°25′ | 34.72 |
| 12CZGD | 19 January 2021 | 113°48′ | 25°49′ | 21.81 |
| 13YYTJ | 9 April 2020 | 111°59′ | 28°16′ | 21.62 |
| 14YYTJ | 9 April 2020 | 112°0′ | 28°16′ | 18.41 |
| 15YYTJ | 14 March 2020 | 111°59′ | 28°14′ | 8.70 |
Table 4. Parameter setup of FERB, RF, BPNN, and SVM.

| Model | Parameter Setup | Training Percent |
|-------|-----------------|------------------|
| FERB | (1) Custom Bands (Normalized Difference): Band 1: Red, Band 2: NIR; (2) Segment Settings (Algorithm: Edge, Scale Level: 50.0), Merge Settings (Algorithm: Full Lambda Schedule, Merge Level: 91.8); (3) Rule Attributes: Type: Spectral, Name: Spectral Mean | - |
| RF | (1) Number of Trees: 100; (2) Number of Features: Square Root; (3) Impurity Function: Gini Coefficient; (4) Min Node Samples: 1; (5) Min Impurity: 0 | 30% |
| BPNN | (1) Training Threshold Contribution: 0.9; (2) Training Rate: 0.2; (3) Training Momentum: 0.9; (4) Training RMS Exit Criteria: 0.1; (5) Number of Hidden Layers: 1; (6) Number of Training Iterations: 100 | 30% |
| SVM | (1) Kernel Type: Polynomial; (2) Degree of Kernel Polynomial: 2; (3) Bias in Kernel Function: 1; (4) Penalty Parameter: 100; (5) Pyramid Levels: 0 | 30% |
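For readers who want to approximate the RF and SVM settings of Table 4 outside ENVI, a scikit-learn sketch is given below. The mapping from ENVI parameter names to scikit-learn arguments is our assumption (e.g., Min Impurity to min_impurity_decrease), X_train and y_train stand in for the 30% training sample, and the pyramid-levels setting has no scikit-learn counterpart.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

# Random forest per Table 4: 100 trees, square root of the feature count per
# split, Gini impurity, minimum node samples 1, minimum impurity decrease 0.
rf = RandomForestClassifier(
    n_estimators=100,
    max_features="sqrt",
    criterion="gini",
    min_samples_leaf=1,
    min_impurity_decrease=0.0,
)

# SVM per Table 4: polynomial kernel of degree 2, kernel bias 1, and
# penalty parameter C = 100.
svm = SVC(kernel="poly", degree=2, coef0=1.0, C=100.0)

# X_train (pixels x bands) and y_train (labels) are placeholders for the
# 30% training sample listed in Table 4:
# rf.fit(X_train, y_train)
# svm.fit(X_train, y_train)
```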
Table 5. Confusion matrix of ground truth and prediction.

| Prediction \ Ground Truth | Burned Area | Background |
|---------------------------|-------------|------------|
| Burned area | True Positive (TP) | False Positive (FP) |
| Background | False Negative (FN) | True Negative (TN) |
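The five evaluation indices reported in Table 6 follow directly from these four counts: OA = (TP + TN)/N, UA = TP/(TP + FP), PA = TP/(TP + FN), IoU = TP/(TP + FP + FN), and Kappa = (OA - pe)/(1 - pe), where pe is the expected chance agreement. A minimal Python sketch of these standard definitions (ours, not code from the paper):

```python
def accuracy_indices(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Compute OA, UA, PA, IoU, and Kappa from a 2 x 2 confusion matrix."""
    n = tp + fp + fn + tn
    oa = (tp + tn) / n                 # overall accuracy
    ua = tp / (tp + fp)                # user's accuracy (precision)
    pa = tp / (tp + fn)                # producer's accuracy (recall)
    iou = tp / (tp + fp + fn)          # intersection over union
    # expected agreement by chance, per Cohen's Kappa
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / (n * n)
    kappa = (oa - pe) / (1 - pe)
    return {"OA": oa, "UA": ua, "PA": pa, "IoU": iou, "Kappa": kappa}
```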
Table 6. A performance comparison among BASM-FERB, BASM-RF, BASM-BPNN, BASM-SVM, and BASM-notra using the Sentinel-2 images (scale = 5). Counts cross-tabulate the prediction (rows) against the ground truth (columns).

| No. | Algorithm | Prediction | Burned Area | Background | OA | UA | PA | IoU | Kappa |
|-----|-----------|------------|-------------|------------|----|----|----|-----|-------|
| 1HYHD | BASM-FERB | Burned area | 2,026,688 | 159,559 | 98.56% | 92.70% | 94.04% | 87.56% | 92.56% |
| | | Background | 128,375 | 17,724,410 | | | | | |
| | BASM-RF | Burned area | 2,082,757 | 288,361 | 98.20% | 87.84% | 96.64% | 85.24% | 91.02% |
| | | Background | 72,306 | 17,595,608 | | | | | |
| | BASM-BPNN | Burned area | 2,073,524 | 387,511 | 97.66% | 84.25% | 96.22% | 81.55% | 88.52% |
| | | Background | 81,539 | 17,496,458 | | | | | |
| | BASM-SVM | Burned area | 2,086,017 | 486,179 | 97.23% | 81.10% | 96.80% | 78.98% | 86.70% |
| | | Background | 69,046 | 17,397,790 | | | | | |
| | BASM-notra | Burned area | 2,098,326 | 3,151,184 | 83.99% | 39.97% | 97.37% | 39.54% | 48.88% |
| | | Background | 56,737 | 14,732,785 | | | | | |
| 2CZGY | BASM-FERB | Burned area | 223,195 | 44,697 | 98.08% | 83.32% | 85.82% | 73.24% | 83.53% |
| | | Background | 36,866 | 3,939,935 | | | | | |
| | BASM-RF | Burned area | 232,539 | 75,721 | 97.57% | 75.44% | 89.42% | 69.25% | 80.54% |
| | | Background | 27,522 | 3,908,911 | | | | | |
| | BASM-BPNN | Burned area | 225,850 | 70,289 | 97.54% | 76.26% | 86.85% | 68.37% | 79.90% |
| | | Background | 34,211 | 3,914,343 | | | | | |
| | BASM-SVM | Burned area | 230,479 | 92,204 | 97.13% | 71.43% | 88.62% | 65.43% | 77.58% |
| | | Background | 29,582 | 3,892,428 | | | | | |
| | BASM-notra | Burned area | 241,068 | 1,343,544 | 67.90% | 15.21% | 92.70% | 15.03% | 17.45% |
| | | Background | 18,993 | 2,641,088 | | | | | |
| 3YZLS | BASM-FERB | Burned area | 200,224 | 57,503 | 98.99% | 77.69% | 88.52% | 70.58% | 82.23% |
| | | Background | 25,973 | 7,965,280 | | | | | |
| | BASM-RF | Burned area | 205,704 | 145,159 | 97.99% | 58.63% | 90.94% | 55.39% | 70.30% |
| | | Background | 20,493 | 7,877,624 | | | | | |
| | BASM-BPNN | Burned area | 205,320 | 192,130 | 97.42% | 51.66% | 90.77% | 49.08% | 64.61% |
| | | Background | 20,877 | 7,830,653 | | | | | |
| | BASM-SVM | Burned area | 204,996 | 208,250 | 97.22% | 49.61% | 90.63% | 47.19% | 62.80% |
| | | Background | 21,201 | 7,814,533 | | | | | |
| | BASM-notra | Burned area | 205,681 | 1,036,790 | 87.18% | 16.55% | 90.93% | 16.29% | 24.51% |
| | | Background | 20,516 | 6,985,993 | | | | | |
| 4HHHJ | BASM-FERB | Burned area | 184,576 | 26,509 | 96.26% | 87.44% | 86.30% | 76.78% | 84.68% |
| | | Background | 29,306 | 1,250,957 | | | | | |
| | BASM-RF | Burned area | 184,338 | 12,439 | 97.18% | 93.68% | 86.19% | 81.45% | 88.15% |
| | | Background | 29,544 | 1,265,027 | | | | | |
| | BASM-BPNN | Burned area | 183,598 | 20,268 | 96.61% | 90.06% | 85.84% | 78.41% | 85.93% |
| | | Background | 30,284 | 1,257,198 | | | | | |
| | BASM-SVM | Burned area | 185,900 | 18,787 | 96.86% | 90.82% | 86.92% | 79.90% | 87.00% |
| | | Background | 27,982 | 1,258,679 | | | | | |
| | BASM-notra | Burned area | 187,594 | 272,474 | 79.97% | 40.78% | 87.71% | 38.57% | 44.88% |
| | | Background | 26,288 | 1,004,992 | | | | | |
| 5XXFH | BASM-FERB | Burned area | 128,313 | 50,831 | 92.08% | 71.63% | 95.19% | 69.13% | 76.82% |
| | | Background | 6478 | 537,928 | | | | | |
| | BASM-RF | Burned area | 128,592 | 55,431 | 91.48% | 69.88% | 95.40% | 67.60% | 75.37% |
| | | Background | 6199 | 533,328 | | | | | |
| | BASM-BPNN | Burned area | 128,266 | 59,456 | 90.88% | 68.33% | 95.16% | 66.03% | 73.88% |
| | | Background | 6525 | 529,303 | | | | | |
| | BASM-SVM | Burned area | 128,461 | 57,418 | 91.19% | 69.11% | 95.30% | 66.83% | 74.64% |
| | | Background | 6330 | 531,341 | | | | | |
| | BASM-notra | Burned area | 129,416 | 147,370 | 78.89% | 46.76% | 96.01% | 45.87% | 50.48% |
| | | Background | 5375 | 441,389 | | | | | |
| 6CZGD | BASM-FERB | Burned area | 113,575 | 61,535 | 98.78% | 64.86% | 83.72% | 57.60% | 72.48% |
| | | Background | 22,081 | 6,641,201 | | | | | |
| | BASM-RF | Burned area | 108,901 | 149,392 | 97.42% | 42.16% | 80.28% | 38.20% | 54.09% |
| | | Background | 26,755 | 6,553,344 | | | | | |
| | BASM-BPNN | Burned area | 86,921 | 100,463 | 97.82% | 46.39% | 64.07% | 36.81% | 52.73% |
| | | Background | 48,735 | 6,602,273 | | | | | |
| | BASM-SVM | Burned area | 90,620 | 169,078 | 96.87% | 34.89% | 66.80% | 29.74% | 44.39% |
| | | Background | 45,036 | 6,533,658 | | | | | |
| | BASM-notra | Burned area | 117,067 | 2,076,082 | 69.37% | 5.34% | 86.30% | 5.29% | 6.56% |
| | | Background | 18,589 | 4,626,654 | | | | | |
| 7CSNX | BASM-FERB | Burned area | 105,881 | 9985 | 99.28% | 91.38% | 89.65% | 82.66% | 90.13% |
| | | Background | 12,225 | 2,968,290 | | | | | |
| | BASM-RF | Burned area | 105,414 | 24,109 | 98.81% | 81.39% | 89.25% | 74.12% | 84.52% |
| | | Background | 12,692 | 2,954,166 | | | | | |
| | BASM-BPNN | Burned area | 101,541 | 39,733 | 98.18% | 71.88% | 85.97% | 64.33% | 77.35% |
| | | Background | 16,565 | 2,938,542 | | | | | |
| | BASM-SVM | Burned area | 104,017 | 60,100 | 97.60% | 63.38% | 88.07% | 58.37% | 72.49% |
| | | Background | 14,089 | 2,918,175 | | | | | |
| | BASM-notra | Burned area | 106,945 | 776,466 | 74.56% | 12.11% | 90.55% | 11.95% | 15.68% |
| | | Background | 11,161 | 2,201,809 | | | | | |
| 8CZGY | BASM-FERB | Burned area | 109,602 | 18,969 | 99.78% | 85.25% | 91.69% | 79.14% | 88.24% |
| | | Background | 9927 | 12,827,564 | | | | | |
| | BASM-RF | Burned area | 106,355 | 70,008 | 99.36% | 60.30% | 88.98% | 56.11% | 71.58% |
| | | Background | 13,174 | 12,776,525 | | | | | |
| | BASM-BPNN | Burned area | 91,376 | 101,001 | 99.00% | 47.50% | 76.45% | 41.43% | 58.12% |
| | | Background | 28,153 | 12,745,532 | | | | | |
| | BASM-SVM | Burned area | 92,605 | 148,864 | 98.64% | 38.35% | 77.47% | 34.50% | 50.70% |
| | | Background | 26,924 | 12,697,669 | | | | | |
| | BASM-notra | Burned area | 110,832 | 3,116,582 | 75.90% | 3.43% | 92.72% | 3.42% | 4.93% |
| | | Background | 8697 | 9,729,951 | | | | | |
| 9CZBH | BASM-FERB | Burned area | 79,646 | 6466 | 99.71% | 92.49% | 84.23% | 78.84% | 88.02% |
| | | Background | 14,909 | 7,253,485 | | | | | |
| | BASM-RF | Burned area | 81,599 | 132,326 | 98.02% | 38.14% | 86.30% | 35.97% | 52.05% |
| | | Background | 12,956 | 7,127,625 | | | | | |
| | BASM-BPNN | Burned area | 80,102 | 33,232 | 99.35% | 70.68% | 84.71% | 62.68% | 76.74% |
| | | Background | 14,453 | 7,226,719 | | | | | |
| | BASM-SVM | Burned area | 80,723 | 69,095 | 98.87% | 53.88% | 85.37% | 49.33% | 65.52% |
| | | Background | 13,832 | 7,190,856 | | | | | |
| | BASM-notra | Burned area | 81,637 | 1,361,640 | 81.31% | 5.66% | 86.34% | 5.61% | 8.41% |
| | | Background | 12,918 | 5,898,311 | | | | | |
| 10ZZLL | BASM-FERB | Burned area | 79,636 | 21,504 | 98.54% | 78.74% | 96.34% | 76.45% | 85.89% |
| | | Background | 3027 | 1,575,007 | | | | | |
| | BASM-RF | Burned area | 80,329 | 31,448 | 97.99% | 71.87% | 97.18% | 70.40% | 81.58% |
| | | Background | 2334 | 1,565,063 | | | | | |
| | BASM-BPNN | Burned area | 79,095 | 43,596 | 97.19% | 64.47% | 95.68% | 62.65% | 75.60% |
| | | Background | 3568 | 1,552,915 | | | | | |
| | BASM-SVM | Burned area | 79,949 | 35,429 | 97.73% | 69.29% | 96.72% | 67.70% | 79.57% |
| | | Background | 2714 | 1,561,082 | | | | | |
| | BASM-notra | Burned area | 80,789 | 415,268 | 75.16% | 16.29% | 97.73% | 16.22% | 21.28% |
| | | Background | 1874 | 1,181,243 | | | | | |
| 11ZJJSZ | BASM-FERB | Burned area | 74,445 | 14,469 | 98.81% | 83.73% | 95.12% | 80.28% | 88.44% |
| | | Background | 3816 | 1,444,226 | | | | | |
| | BASM-RF | Burned area | 76,165 | 33,209 | 97.70% | 69.64% | 97.32% | 68.33% | 80.00% |
| | | Background | 2096 | 1,425,486 | | | | | |
| | BASM-BPNN | Burned area | 75,969 | 28,422 | 98.00% | 72.77% | 97.07% | 71.21% | 82.15% |
| | | Background | 2292 | 1,430,273 | | | | | |
| | BASM-SVM | Burned area | 76,088 | 42,822 | 97.07% | 63.99% | 97.22% | 62.84% | 75.69% |
| | | Background | 2173 | 1,415,873 | | | | | |
| | BASM-notra | Burned area | 76,134 | 358,382 | 76.54% | 17.52% | 97.28% | 17.44% | 23.05% |
| | | Background | 2127 | 1,100,313 | | | | | |
| 12CZGD | BASM-FERB | Burned area | 51,663 | 16,858 | 98.00% | 75.40% | 94.67% | 72.33% | 82.89% |
| | | Background | 2909 | 915,928 | | | | | |
| | BASM-RF | Burned area | 51,378 | 11,115 | 98.55% | 82.21% | 94.15% | 78.22% | 87.01% |
| | | Background | 3194 | 921,671 | | | | | |
| | BASM-BPNN | Burned area | 51,823 | 21,254 | 97.57% | 70.92% | 94.96% | 68.34% | 79.93% |
| | | Background | 2749 | 911,532 | | | | | |
| | BASM-SVM | Burned area | 51,871 | 23,054 | 97.39% | 69.23% | 95.05% | 66.82% | 78.75% |
| | | Background | 2701 | 909,732 | | | | | |
| | BASM-notra | Burned area | 52,156 | 248,396 | 74.60% | 17.35% | 95.57% | 17.22% | 22.08% |
| | | Background | 2416 | 684,390 | | | | | |
| 13YYTJ | BASM-FERB | Burned area | 56,955 | 7470 | 99.10% | 88.41% | 88.87% | 79.59% | 88.17% |
| | | Background | 7133 | 1,553,830 | | | | | |
| | BASM-RF | Burned area | 57,249 | 29,067 | 97.79% | 66.32% | 89.33% | 61.46% | 75.00% |
| | | Background | 6839 | 1,532,233 | | | | | |
| | BASM-BPNN | Burned area | 54,470 | 46,672 | 96.54% | 53.85% | 84.99% | 49.18% | 64.20% |
| | | Background | 9618 | 1,514,628 | | | | | |
| | BASM-SVM | Burned area | 54,566 | 41,569 | 96.86% | 56.76% | 85.14% | 51.64% | 66.53% |
| | | Background | 9522 | 1,519,731 | | | | | |
| | BASM-notra | Burned area | 57,299 | 364,515 | 77.16% | 13.58% | 89.41% | 13.37% | 17.97% |
| | | Background | 6789 | 1,196,785 | | | | | |
| 14YYTJ | BASM-FERB | Burned area | 37,597 | 4881 | 98.42% | 88.51% | 76.81% | 69.84% | 81.42% |
| | | Background | 11,354 | 975,321 | | | | | |
| | BASM-RF | Burned area | 37,398 | 20,004 | 96.93% | 65.15% | 76.40% | 54.24% | 68.72% |
| | | Background | 11,553 | 960,198 | | | | | |
| | BASM-BPNN | Burned area | 37,517 | 35,993 | 95.39% | 51.04% | 76.64% | 44.17% | 58.93% |
| | | Background | 11,434 | 944,209 | | | | | |
| | BASM-SVM | Burned area | 36,738 | 28,961 | 96.00% | 55.92% | 75.05% | 47.15% | 62.02% |
| | | Background | 12,213 | 951,241 | | | | | |
| | BASM-notra | Burned area | 37,882 | 223,392 | 77.22% | 14.50% | 77.39% | 13.91% | 17.84% |
| | | Background | 11,069 | 756,810 | | | | | |
| 15YYTJ | BASM-FERB | Burned area | 17,275 | 9606 | 97.30% | 64.26% | 91.84% | 60.80% | 74.24% |
| | | Background | 1534 | 384,615 | | | | | |
| | BASM-RF | Burned area | 17,248 | 12,262 | 96.65% | 58.45% | 91.70% | 55.51% | 69.71% |
| | | Background | 1561 | 381,959 | | | | | |
| | BASM-BPNN | Burned area | 16,632 | 14,289 | 96.01% | 53.79% | 88.43% | 50.25% | 64.90% |
| | | Background | 2177 | 379,932 | | | | | |
| | BASM-SVM | Burned area | 16,938 | 16,300 | 95.60% | 50.96% | 90.05% | 48.24% | 62.93% |
| | | Background | 1871 | 377,921 | | | | | |
| | BASM-notra | Burned area | 17,383 | 92,802 | 77.19% | 15.78% | 92.42% | 15.57% | 20.79% |
| | | Background | 1426 | 301,419 | | | | | |
| Average | BASM-FERB | | | | 98.11% | 81.72% | 89.52% | 74.32% | 83.98% |
| | BASM-RF | | | | 97.44% | 68.07% | 89.97% | 63.43% | 75.31% |
| | BASM-BPNN | | | | 97.01% | 64.92% | 86.92% | 59.63% | 72.23% |
| | BASM-SVM | | | | 96.82% | 61.25% | 87.68% | 56.98% | 69.82% |
| | BASM-notra | | | | 77.13% | 18.72% | 91.36% | 18.35% | 22.99% |
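As a spot check, feeding the 1HYHD BASM-FERB counts from the first row of Table 6 into the accuracy_indices sketch given after Table 5 reproduces the published values:

```python
idx = accuracy_indices(tp=2_026_688, fp=159_559, fn=128_375, tn=17_724_410)
print({name: f"{value:.2%}" for name, value in idx.items()})
# {'OA': '98.56%', 'UA': '92.70%', 'PA': '94.04%', 'IoU': '87.56%', 'Kappa': '92.56%'}
```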
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
