## *2.4. Spectral Variability*

VHR images show the required details of the mangrove ecosystem. Therefore, the Worldview-2 image was used to select the eight targeted land cover classes: (1) closed canopy mangrove, (2) open canopy mangrove, (3) individual mangrove trees, (4) mudflats, (5) aerial roots, (6) tidal zone, (7) shallow water, and (8) deep water. To better separate these classes and distinguish their spectral signatures, reflectance values of the target classes (based on 100 sample points) were extracted from the Worldview-2 image bands, guided by the field survey and image interpretation. The boxplots in Figure 3 show that the closed canopy mangrove and open canopy mangrove classes are clearly distinguished by the blue and yellow bands, and that the aerial roots are clearly distinguished from the mudflats in the green, yellow and red bands. Figure 4 shows the reflectance values of the eight land cover classes for the Sentinel-2 bands.
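The separability assessment described above can be sketched as follows. The band labels follow the Worldview-2 bands listed in Figure 3, but the class names are abbreviated and the reflectance values are randomly simulated stand-ins for the sampled pixel values, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Eight target land cover classes and seven Worldview-2 bands (cf. Figure 3).
classes = ["closed canopy", "open canopy", "individual trees", "mudflats",
           "aerial roots", "tidal zone", "shallow water", "deep water"]
bands = ["B2", "B3", "B4", "B5", "B6", "B7", "B8"]

# Simulated reflectance samples: {class: array of shape (100, n_bands)},
# standing in for the 100 sample points extracted from the image.
samples = {c: rng.uniform(0.02, 0.45, size=(100, len(bands))) for c in classes}

# Per-class, per-band summary used to judge separability (cf. the boxplots):
# median and interquartile range of reflectance in each band.
def band_summary(values):
    q1, med, q3 = np.percentile(values, [25, 50, 75], axis=0)
    return {"median": med, "iqr": q3 - q1}

stats = {c: band_summary(v) for c, v in samples.items()}
print(stats["mudflats"]["median"].shape)  # one median per band
```

Two classes are well separated in a band when their boxplots (median plus interquartile range) do not overlap there, which is the visual criterion applied in Figure 3.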

**Figure 3.** Reflectance values of the eight land cover classes for each Worldview-2 band: (**B2**) (Blue: 450–510 nm), (**B3**) (Green: 510–580 nm), (**B4**) (Yellow: 585–625 nm), (**B5**) (Red: 630–690 nm), (**B6**) (Red edge: 705–745 nm), (**B7**) (Near-infrared 1: 770–895 nm), and (**B8**) (Near-infrared 2: 860–1040 nm). The letters A to H denote the land cover classes, namely closed canopy mangrove, open canopy mangrove, individual mangrove trees, mudflats, aerial roots, tidal zone, shallow water, and deep water, respectively.

**Figure 4.** Reflectance values of the eight land cover classes for Sentinel-2: (**B2**) (Blue band 490 nm), (**B3**) (Green band 560 nm), (**B4**) (Red band 665 nm), (**B5**) (Vegetation Red Edge band 705 nm), (**B6**) (Vegetation Red Edge band 740 nm), (**B7**) (Vegetation Red Edge band 783 nm), (**B8**) (Near-infrared band 842 nm), and (**B8A**) (Vegetation Red Edge band 865 nm). The letters A to H denote the land cover classes, namely closed canopy mangrove, open canopy mangrove, individual mangrove trees, mudflats, aerial roots, tidal zone, shallow water, and deep water, respectively.


**Table 2.** Sensor specifications of the Worldview-2 and Sentinel-2 imagery.

## *2.5. Reference Data*

The sampling of reference data used Object-Based Image Analysis (OBIA), which is based on segmentation [34,47]. The multi-resolution segmentation algorithm from the eCognition 9.2 software (Trimble Inc., Munich, Germany) [48] was used, which classifies homogeneous image objects using the attributes of image objects rather than those of individual pixels, or through a hierarchical object-oriented approach using a knowledge base. In the present study, a series of scale, shape and compactness parameters (from low to high) was tested to control the size of the segments. To generate reliable reference samples, information from the Normalized Difference Vegetation Index (NDVI) layer and the Moran index derived from the Worldview-2 bands was additionally included in the image segmentation. In previous studies, NDVI has been successfully applied to display and quantify mangrove forest changes [12,49,50]. NDVI values were computed as:

$$\text{NDVI} = \frac{\text{NIR} - \text{Red}}{\text{NIR} + \text{Red}} \tag{1}$$

where NIR is band 8 and Red is band 5 of the Worldview-2 image.
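A minimal per-pixel implementation of Equation (1), applied here to small synthetic reflectance arrays; the epsilon guarding against division by zero is an implementation detail added for the sketch, not part of the original formula:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Tiny synthetic example: vegetated pixels (top row) have high NIR and low
# red reflectance, so their NDVI is close to 1; bare pixels sit near 0.
nir = np.array([[0.45, 0.40], [0.10, 0.08]])
red = np.array([[0.05, 0.06], [0.09, 0.07]])
print(np.round(ndvi(nir, red), 2))
```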

The Moran index measures the correlation between the attribute value at each location in a study area and the mean of the values at neighboring locations, and it has been applied in almost all studies dealing with spatial autocorrelation (for a review, see [51]). Here, it evaluates how homogeneous a target image object is relative to the objects surrounding it: if targets are attracted to (or repelled from) each other, the observations are dependent [52]. Like a correlation coefficient, its value ranges from −1 to 1 [53], and it provides quantitative clustering information, measuring the degree of spatial autocorrelation at each particular location [54]; this information was used to select homogeneous regions.

Information and photos from the field observations, as well as a visual interpretation of the Worldview-2 images, were used to develop the rule sets that select segments for each class as reference data (ground truth or OBIA training samples). In addition to spectral features (mean and standard deviation of the blue, yellow and red edge bands and NDVI), geometric features such as shape and extent were used. The total number of variables selected was based on visual inspection of the reflectance values of the eight classes, and the feature selection process was completed with the eCognition feature optimization tool using the 100-point datasets.
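The text does not specify how the Moran index was implemented; as a sketch, global Moran's I with a rook-contiguity (4-neighbour) weight matrix on a raster grid can be written as follows, using two synthetic patterns to illustrate the −1 to 1 range:

```python
import numpy as np

def morans_i(grid):
    """Global Moran's I for a 2-D array with rook (4-neighbour) contiguity.
    Values near +1 indicate clustering, near -1 dispersion, near 0 randomness."""
    x = grid.astype(np.float64)
    z = x - x.mean()
    denom = (z ** 2).sum()
    num = 0.0
    w_sum = 0.0                       # sum of all weights W (one per directed pair)
    rows, cols = x.shape
    for i in range(rows):
        for j in range(cols):
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols:
                    num += z[i, j] * z[ni, nj]
                    w_sum += 1.0
    n = x.size
    return (n / w_sum) * (num / denom)

# Clustered pattern (left half low, right half high) -> strongly positive I.
clustered = np.array([[0, 0, 1, 1],
                      [0, 0, 1, 1],
                      [0, 0, 1, 1]])
# Checkerboard -> strongly negative I (every neighbour has the opposite value).
checker = np.indices((4, 4)).sum(axis=0) % 2
print(round(morans_i(clustered), 2), round(morans_i(checker), 2))
```

In the OBIA setting of this study the "neighbours" are adjacent image objects rather than raster cells, but the statistic is the same.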

## *2.6. Upscaling by Reference Data*

After the generation of the reference data, RF was used to classify the Worldview-2 and Sentinel-2 images. In this step, the Sentinel-2 imagery was first mapped over the same extent as the Worldview-2 image using 70% of the reference data. The accuracy of the resulting RF map was then checked, and the reference data were used to map the mangrove classes over a larger extent. A layer stack was created from the NDVI, blue, green, red and near-infrared bands, with the Sentinel-2 data (10 m spatial resolution) serving as input for the RF classification. RF was performed using the *Ranger* package in the R statistical software [55]. Figure 5 shows the main steps of the classification approach applied in this study.
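The study performed RF with the *Ranger* package in R; as an illustrative stand-in, the same 70% train / 30% check workflow can be sketched with scikit-learn's `RandomForestClassifier`. The five-feature layout mirrors the layer stack described above (NDVI, blue, green, red, NIR), but the data are synthetic and the class separability is artificial:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Hypothetical per-pixel feature stack: NDVI, blue, green, red, NIR
# (cf. the layer stack in the text); labels are the eight classes (0..7).
n_pixels, n_features, n_classes = 800, 5, 8
X = rng.normal(size=(n_pixels, n_features))
y = rng.integers(0, n_classes, size=n_pixels)
X += y[:, None] * 0.5  # make the classes weakly separable for the demo

# 70% of the reference data for training, the rest held out for
# checking the accuracy of the classified map (as in the text).
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=0.7, random_state=0, stratify=y)

rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X_train, y_train)
print(round(accuracy_score(y_test, rf.predict(X_test)), 2))
```

Once validated on the Worldview-2 extent, the fitted model is applied to the full Sentinel-2 scene pixel by pixel, which is what "upscaling by reference data" amounts to here.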

**Figure 5.** Flow chart of the upscaling approach for mapping land cover in mangrove ecosystems.
