2.1. Study Area
The study area is the Rhodopi Mountain Range National Park (henceforth RMRNP) in northern Greece (Figure 1), which occupies approximately 175,000 ha with an altitudinal range of 50–2022 m a.s.l. It is located in the central-western massif of the Rhodopi mountain range and forms a natural border between Greece and Bulgaria. The Rhodopi Mountain Range is shared between the two countries and occupies a total area of 1,473,500 ha, of which approximately 82% lies in Bulgaria and 18% in Greece.
The ecological significance of the ecosystems in RMRNP has resulted in the designation of larger or smaller areas into various national and international protection regimes. In particular, seven areas of the RMRNP have been included in the Natura 2000 network of protected areas (two of them as Special Protected Areas and five as Special Areas of Conservation), two areas have been declared as Preserved Natural Monuments, seven areas as Wildlife Reserves, and three areas have been characterized by the European Council as Biogenetic Stocks.
The climatic conditions represent a transitional zone between the sub-Mediterranean and central European climates, with a wet continental character. The mean annual temperature is 10.3 °C and the average annual precipitation is 875.3 mm [29]. RMRNP hosts a rich variety of the ecosystems of the Balkan Peninsula; almost 60% of European species can be found here, which is the main reason RMRNP is considered one of the most important regions for European nature conservation. The area contains 26 different habitat types and all the vegetation zones of Europe, from the euro-Mediterranean zone of evergreen broadleaves to the zone of cold-resistant conifers and sub-alpine meadows. Its main land cover types are coniferous stands of black pine (
Pinus nigra J.F. Arnold), Scots pine (
P. sylvestris L.), Norway spruce (
Picea abies L.), broadleaved stands of beech (
Fagus sylvatica L.), silver birch (
Betula pendula Roth), oaks (
Quercus sp.), evergreen broadleaves, open areas maintained by anthropogenic activities (especially grazing), and a small proportion of agricultural land. The RMRNP is also rich in wildlife, accommodating species such as the brown bear (Ursus arctos), grey wolf (Canis lupus), red deer (Cervus elaphus), roe deer (Capreolus capreolus), Balkan chamois (Rupicapra rupicapra ssp. balcanica), wild boar (Sus scrofa), capercaillie (Tetrao urogallus), hazel grouse (Bonasa bonasia), golden eagle (Aquila chrysaetos) and others [30].
2.3. Remote Sensing Data Analysis
Two techniques were employed to monitor woody vegetation cover changes in RMRNP: the Tasseled Cap Transformation (TCT) [31] and a set of Vegetation Indices (VIs). The TCT is an empirically derived linear transformation of the spectral bands into three components: brightness, greenness and wetness. Brightness is usually associated with bare or sparsely vegetated land, greenness with green vegetation, and wetness with moisture, water and other moist features [32].
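As an illustration, the TCT is a weighted sum of the six reflective Landsat bands. The sketch below uses the reflectance-factor coefficients of Crist (1985) for Landsat TM; the exact coefficients applied in the study depend on the sensor and calibration level used:

```python
import numpy as np

# Tasseled Cap coefficients for Landsat TM reflectance (Crist, 1985).
# Rows: brightness, greenness, wetness; columns: TM bands 1-5 and 7.
TCT_COEFFS = np.array([
    [0.2043, 0.4158, 0.5524, 0.5741, 0.3124, 0.2303],      # brightness
    [-0.1603, -0.2819, -0.4934, 0.7940, -0.0002, -0.1446], # greenness
    [0.0315, 0.2021, 0.3102, 0.1594, -0.6806, -0.6109],    # wetness
])

def tasseled_cap(bands):
    """bands: array of shape (6, rows, cols) with surface reflectance.

    Returns an array of shape (3, rows, cols) holding the brightness,
    greenness and wetness components for every pixel."""
    return np.tensordot(TCT_COEFFS, bands, axes=([1], [0]))
```

A vegetated pixel (high NIR reflectance, band 4) should yield a positive greenness value, while bare soil scores highest on brightness.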
Recording and monitoring vegetation changes using Vegetation Indices is a relatively fast process that allows a good understanding of changes in space and time. In the current study, five VIs were used, selected for their documented capability to capture vegetation variation in an image. The specific properties of each index are described in the following paragraphs. The five indices were the Normalized Difference Vegetation Index (NDVI), Soil Adjusted Vegetation Index (SAVI), Enhanced Vegetation Index 2 (EVI2), Normalized Difference Water Index (NDWI) and Bare Soil Index (BSI; Table 3).
NDVI combines the two spectral regions in which green vegetation shows the strongest contrast, the red and near-infrared (NIR) [33]. Because of this property, NDVI is considered an appropriate index for studying vegetation patterns across temporal or spatial scales, and it has been used extensively in such studies [26,38,39,40,41]. SAVI is based on the original NDVI formula and is a transformation technique that minimizes the effect of soil brightness. It has been found to give better results than NDVI in cases of high variation in soil color and moisture [
34]. EVI was developed to optimize the vegetation signal with improved sensitivity in high biomass regions and improved vegetation monitoring through a decoupling of the canopy background signal and a reduction in atmosphere influences [
35]. NDWI [
36] is an index derived from the Near-Infrared (NIR) and Short Wave Infrared (SWIR) channels. The SWIR reflects variation in both the vegetation water content and the spongy mesophyll structure in vegetation canopies, while the NIR reflectance is affected by leaf internal structure and leaf dry matter content but not by water content. The combination of the NIR with the SWIR removes variations induced by leaf internal structure and leaf dry matter content, improving the accuracy in retrieving the vegetation water content [
42]. Bare Soil Index (BSI) is a numerical indicator that combines blue, red, near infrared and short wave infrared spectral bands to capture soil variations. These spectral bands are used in a normalized manner. The short wave infrared and the red spectral bands are used to quantify the soil mineral composition, while the blue and the near infrared spectral bands are used to enhance the presence of vegetation. The original BSI [
43] used SWIR1 reflectance, but recent research by Diek et al. [
37] suggests that the SWIR2 band may be more appropriate for calculating BSI. For this reason, the SWIR2 band was used to calculate the BSI in this study.
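For reference, the standard formulations of the five indices can be sketched as below (the band combinations follow the descriptions above; the exact formulas and band assignments used in the study are those of Table 3):

```python
import numpy as np

def ndvi(nir, red):
    # Normalized difference of the NIR and red bands.
    return (nir - red) / (nir + red)

def savi(nir, red, L=0.5):
    # L is the soil-brightness correction factor (0.5 is the common default).
    return (1 + L) * (nir - red) / (nir + red + L)

def evi2(nir, red):
    # Two-band EVI, which needs no blue band.
    return 2.5 * (nir - red) / (nir + 2.4 * red + 1)

def ndwi(nir, swir1):
    # NIR-SWIR formulation sensitive to vegetation water content.
    return (nir - swir1) / (nir + swir1)

def bsi(blue, red, nir, swir2):
    # Normalized combination of soil-sensitive (SWIR2, red) and
    # vegetation-sensitive (NIR, blue) bands; SWIR2 follows Diek et al. [37].
    return ((swir2 + red) - (nir + blue)) / ((swir2 + red) + (nir + blue))
```

All functions accept scalars or numpy arrays, so they can be applied directly to whole reflectance images.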
All the above indices and transformations were calculated for the entire set of Landsat images. In order to reduce the effect of interannual variation caused by subtle differences in vegetation phenology and the sun's geometry, all indices were stretched to a range of −1 to 1 using Equation (1). The equation was applied to all Vegetation Indices calculated for the study area and for all time slots (105 times in total):
VIrescaled = 2 × (VI − VImin)/(VImax − VImin) − 1(1)
where: VIrescaled = The stretched value of the respective Vegetation Index (NDVI, SAVI, EVI2, NDWI, BSI);
VI = The value of the Vegetation Index as calculated using the respective formula;
VImin = The lowest value of the Vegetation Index across the image;
VImax = The highest value of the Vegetation Index across the image.
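A minimal sketch of this per-image min–max stretch, assuming the target range of −1 to 1 described above:

```python
import numpy as np

def rescale_index(vi):
    """Stretch a vegetation-index image to the range -1..1 using its
    own minimum and maximum values (Equation (1))."""
    vi = np.asarray(vi, dtype=float)
    vi_min, vi_max = vi.min(), vi.max()
    return 2 * (vi - vi_min) / (vi_max - vi_min) - 1
```

Because the minimum and maximum are taken per image, the stretch normalizes each date independently, which is what suppresses the interannual phenological and illumination differences.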
One thousand points were randomly located across the entire study area using the Sampling Design Tool, available as a plugin for ArcMap (v.10.8, Esri Inc., Redlands, CA, USA), and these formed the basis for determining vegetation dynamics across the time span of the study. The vegetation index values of the pixel corresponding to each point were then assigned as attributes to the point. Linear regression was employed to test for significant relationships between the tested vegetation indices and time. Furthermore, the Mann–Kendall test was used to test for significant trends in the vegetation indices over time [44].
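A minimal numpy implementation of the Mann–Kendall trend statistic (the basic formulation without tie correction, sufficient as a sketch; a full implementation with tie handling would be needed for series containing repeated values):

```python
import numpy as np

def mann_kendall(series):
    """Return the Mann-Kendall S statistic and the standardized Z score
    for a 1-D time series (no tie correction)."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum(np.sign(x[j] - x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

A |Z| value above 1.96 indicates a significant monotonic trend at the 0.05 level, which is the usual decision rule for this test.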
Once the pattern of vegetation dynamics had been established, Object-Based Image Analysis (OBIA) was applied to the images of the first (1984) and last (2019) year of the study in order to quantify the vegetation changes. For each time slot an additional Landsat image, acquired during the winter season, was employed. Four classes were identified in 2019 (open areas, broadleaved forests, coniferous forests and water bodies). In 1984, three classes were identified (the water bodies class was excluded), because the water body present in the area in 2019 resulted from the construction of two dams during the study period.
The OBIA, which was conducted using the software eCognition v.10.2 [45], has been found to be an effective approach for capturing the land cover variation of Mediterranean mountain regions [46,47,48]. The first step of OBIA is the generation of objects consisting of spectrally similar neighboring pixels using an appropriate scale parameter. The objects are then classified, using training samples in a preset classifier or through crisp and fuzzy rules. The classification is performed based on the image spectral information and ancillary data, which for this study were three layers representing elevation, eastness and northness, all generated from the ASTER Global Digital Elevation Model (GDEM). The scale parameter was set at 100, after testing scale parameters between 10 and 150 (at intervals of 10) and visually assessing the segmentation results for over- and under-segmentation. The color criterion was set at 0.7 and the shape criterion at 0.5. Various classifiers embedded in eCognition, including Classification Trees (CART), Support Vector Machines (SVM) and Random Forests (RF), were tested for their effectiveness in identifying the selected classes. Each classifier's performance was assessed by its efficiency in reproducing the training set [46]. Among the classifiers tested, SVM was found to perform best.
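The core object-based step, computing per-object mean features from a segmentation and classifying each object, can be sketched in plain numpy. This is an illustration only: eCognition's internals are proprietary, and a simple nearest-centroid rule stands in here for the SVM actually used:

```python
import numpy as np

def object_mean_features(segments, bands):
    """segments: (rows, cols) integer object labels 0..n-1 from segmentation.
    bands: (k, rows, cols) spectral and ancillary layers.

    Returns an (n, k) array with the mean feature vector of each object."""
    labels = segments.ravel()
    n = labels.max() + 1
    counts = np.bincount(labels, minlength=n)
    # Per-object sum of each layer, divided by pixel count -> per-object mean.
    return np.stack(
        [np.bincount(labels, weights=b.ravel(), minlength=n) / counts
         for b in bands], axis=1)

def classify_objects(feats, train_feats, train_classes):
    """Assign each object the class of the nearest training-class centroid
    (a simple stand-in for the SVM classifier used in the study)."""
    classes = np.unique(train_classes)
    centroids = np.stack([train_feats[train_classes == c].mean(axis=0)
                          for c in classes])
    dists = np.linalg.norm(feats[:, None, :] - centroids[None, :, :], axis=2)
    return classes[np.argmin(dists, axis=1)]
```

In practice the feature vectors would combine the Landsat bands with the elevation, eastness and northness layers described above.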
For training the classifier and testing the result for the year 2019, we employed 200 sampling points, located randomly and sampled in May–August 2019. The vegetation composition at these 200 points was estimated in circular plots with a radius of 17.9 m, corresponding to a plot area of approximately 0.1 ha. Each plot was delimited using a TruPulse 200R laser rangefinder. The sampling protocol included a visual estimation of canopy cover and of the relative abundance of species using the nine-point Braun-Blanquet scale. Of the 200 collected plots, 150 were used for training and 50 for testing the results. The same training and test data were used for the 1984 classification after removing the plots that had changed within this period. The assessment of the degree of change was based on a change index calculated using selective Principal Components Analysis (PCA), which was applied to the two summer Landsat images of 1984 and 2019 [49,50]. According to this approach, a set of comparable layers is selected from a pair of images, in this case the six spectral bands of the Landsat images. A selective PCA is performed for each pair of comparable layers to obtain two principal components in the two-dimensional feature space, where the first (PC1) expresses the information common to both images and the second (PC2) expresses changes between the two images or noise. The change intensity increases towards the margins of PC2. To convert the PC2 values into change index values and to separate actual change (increasing towards the margins of PC2) from noise (central part of PC2), the absolute difference values of a bidirectional normalized sum function are calculated [51]. This fuzzy membership function assigns values of around 0 to pixels that occupy the central part of PC2, with values increasing towards 1 as a pixel's location approaches the margins of PC2; the closer the value is to 1, the higher the possibility of change [51]. The approach has been successfully employed in Mediterranean environments for monitoring post-fire vegetation recovery [52]. In this study, a threshold of 0.7 was used, above which a pixel was considered to have changed. This resulted in 112 points for training and 36 for testing the classification result of 1984.
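The selective PCA change index for a single band pair can be sketched as follows. Here a simple min–max normalization of |PC2| plays the role of the fuzzy membership function, which may differ in detail from the bidirectional function of [51]:

```python
import numpy as np

def selective_pca_change(band_t1, band_t2):
    """Change index in [0, 1] from one pair of comparable layers.

    Pixels lying on the no-change axis (PC1) score near 0; pixels far
    from it along PC2 score near 1."""
    # Each pixel becomes a point in the 2-D feature space (t1, t2).
    X = np.column_stack([band_t1.ravel(), band_t2.ravel()]).astype(float)
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    # eigh returns eigenvalues in ascending order: the eigenvector of the
    # smallest eigenvalue spans PC2 (change/noise), the largest spans PC1.
    eigvals, eigvecs = np.linalg.eigh(cov)
    pc2 = Xc @ eigvecs[:, 0]
    # Fuzzy membership: rescale |PC2| to [0, 1].
    a = np.abs(pc2)
    membership = (a - a.min()) / (a.max() - a.min() + 1e-12)
    return membership.reshape(band_t1.shape)
```

Applying a threshold (0.7 in this study) to the returned membership image yields the binary change mask used to screen the 1984 training plots.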
Based on the two mapping products of the two time slots, a transition matrix was constructed in order to quantify the change and identify its pattern. Furthermore, two landscape diversity indices, Shannon's Diversity and Evenness, were calculated in order to assess whether the observed changes had resulted in increased homogeneity or heterogeneity. The calculation was performed using the Patch Analyst extension for ArcMap [53]. Finally, a multitemporal spectral profile of the main transitions was constructed based on the NDVI values. The flowchart adopted for this study is shown in Figure 2.
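The transition matrix and the two diversity indices can be sketched as follows; Shannon's Diversity is computed here from class pixel proportions, which approximates but may differ slightly from Patch Analyst's patch-based calculation:

```python
import numpy as np

def transition_matrix(map_t1, map_t2, n_classes):
    """Cross-tabulate two co-registered class maps: entry (i, j) is the
    number of pixels of class i at t1 that became class j at t2."""
    idx = map_t1.ravel() * n_classes + map_t2.ravel()
    return np.bincount(idx, minlength=n_classes ** 2).reshape(n_classes, n_classes)

def shannon_diversity(class_map):
    """Shannon's Diversity H' = -sum(p * ln p) and Evenness = H'/ln(k),
    where p are class proportions and k the number of classes present."""
    counts = np.bincount(np.asarray(class_map).ravel())
    p = counts[counts > 0] / counts.sum()
    h = -np.sum(p * np.log(p))
    evenness = h / np.log(len(p)) if len(p) > 1 else 0.0
    return h, evenness
```

The diagonal of the transition matrix holds the stable area of each class, while the off-diagonal entries quantify the transitions (e.g., open areas to broadleaved forest) between 1984 and 2019.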