1. Introduction
The Arctic and Antarctic regions are rapidly warming due to climate change, driving significant ecological and environmental change [
1,
2,
3,
4,
5,
6,
7]. Arctic vegetation is composed of boreal forests, shrubs, and tundra ecosystems, which include low-growing plants such as dwarf shrubs, mosses, and lichens. In contrast, Antarctic vegetation is limited to mosses, lichens, algae, and two endemic vascular plant species—
Deschampsia antarctica and
Colobanthus quitensis—in its sparse, ice-free regions [
2,
3,
4,
5,
6,
7,
8,
9]. Despite differences in biodiversity, Arctic and Antarctic vegetation share adaptations to polar conditions, such as extreme-cold tolerance, dormancy, and rapid growth during short summers, that enable survival amid extreme weather, low sunlight, and nutrient scarcity [
7,
10,
11]. However, polar vegetation is especially sensitive to a variety of environmental changes, including shifts in temperature, precipitation, and ice dynamics associated with climate change [
11,
12,
13,
14]. Vegetation responses to these changes are varied, with some regions experiencing greening trends, such as the tenfold increase in Antarctic vegetation cover between 1986 and 2021, while others exhibit browning events driven by extreme weather, permafrost thaw, and herbivory [
11,
12,
13,
14].
Improved monitoring is essential for informed conservation planning. In Antarctica, only 1.5% of ice-free areas, which are critical for vegetation, are formally protected under the Antarctic Specially Protected Areas (ASPAs) framework [
3,
8,
15]. These protected areas often fail to represent the full spectrum of biodiversity, leaving many ecosystems and species unprotected [
15]. The effectiveness of these conservation efforts is further undermined by climate change, invasive species, and human activities such as tourism and research, which can introduce non-native species and disrupt fragile ecosystems [
2,
3,
8,
15]. Challenges also exist in the Arctic, where governance is complicated by the involvement of multiple nations, differing regulatory frameworks, and competing interests in land use [
3]. Furthermore, research efforts are often geographically biased, focusing disproportionately on accessible areas—such as Alaska, Scandinavia, and the Antarctic Peninsula—while vast regions of the Russian Arctic and continental Antarctica remain under-represented [
11,
13].
Advances in remote sensing technologies—including optical, Synthetic Aperture Radar (SAR), thermal, and light detection and ranging (LiDAR) sensors—offer unprecedented opportunities to monitor polar vegetation and understand how these ecosystems are adapting. The increasing number of Earth observation satellites, advancements in uncrewed aerial vehicle (UAV) technology, and the growing availability of more sophisticated payloads and sensors have significantly enhanced monitoring capabilities [
16,
Satellite data and UAVs now enable detailed monitoring of vegetation distribution, health, and phenology, allowing researchers to understand the spatial and temporal dynamics of vegetation change and to gain insights into regions that were previously very difficult or expensive to study [
4,
8,
18]. Additionally, with recent advances in machine learning (ML) and sensor fusion techniques, scientists can combine data collected across different sensors and times to improve monitoring capabilities over traditional remote sensing techniques [
19,
20], as illustrated in
Figure 1.
Sensor fusion is increasingly recognised as an important advancement in remote sensing, offering enhanced spatial, spectral, and temporal insights [
21,
22,
23]. However, despite its potential, sensor fusion in polar environments remains largely confined to methodological demonstrations and has not been implemented at a large scale. In particular, the integration of UAV and satellite data has so far been explored mainly in proof-of-concept studies rather than through widespread application, limiting the ability to fully leverage complementary sensor capabilities. This review examines the methods and challenges associated with sensor fusion and highlights the need for its practical adoption in polar vegetation research.
Through remote sensing, researchers can provide critical insights to help identify areas most vulnerable to climate change, aiding the establishment of more representative protected area networks [
2,
8]. Linking scientific data with policy frameworks—such as the Antarctic Treaty System, the Scientific Committee on Antarctic Research, the Arctic Monitoring and Assessment Programme, and the Arctic Council—is crucial for dynamic and adaptive management of these ecosystems [
15,
24]. With advanced monitoring technologies providing timely and accurate information, we can inform conservation policy, ensuring the preservation of polar biodiversity for future generations [
2,
25].
2. Methods
This review synthesises methodological advancements in remote sensing for monitoring sparse, low-stature vegetation in polar environments, with particular emphasis on sensor fusion and ML. The focus is on studies that improve spatial, spectral, or temporal monitoring capabilities across Arctic and Antarctic ecosystems.
A structured literature search was conducted using Web of Science, Scopus, and Google Scholar, targeting studies published primarily after 2012. Earlier studies were included only when they represented foundational contributions—such as introducing long-standing spectral thresholding techniques—or offered essential methodological or historical context.
Search queries combined the following terms using Boolean operators:
Studies were included based on their relevance to sparse, low-stature vegetation mapping in polar regions. Priority was given to those employing novel classification approaches, innovative sensor fusion strategies, or emerging technologies. While normalised difference vegetation index (NDVI) thresholding remains a common technique, studies using it were included only if they represented foundational work or incorporated recent methodological refinements.
Research on boreal forests was excluded due to ecological and structural differences in vegetation, unless such studies also addressed low-stature or sparse vegetation in a way relevant to polar ecosystems, particularly through transferable sensor fusion or classification methods.
Peer-reviewed journal articles formed the core of this review. High-quality grey literature, including conference papers, was included when it provided critical insights not available elsewhere.
3. Results
This section is structured into four main parts, each with specific subsections addressing key aspects of polar vegetation remote sensing.
Section 3.1: Examines the limitations in remote sensing for sparse polar vegetation.
Section 3.2: Introduces recent technological advances for overcoming key limitations.
Section 3.3: Discusses core methods for remote sensing of low-stature polar vegetation.
– Section 3.3.1: Overview of vegetation indices used in polar ecosystems.
– Section 3.3.2: Summary of machine learning methods for sparse polar vegetation.
Section 3.4: Examines remote sensing techniques across platforms and resolutions.
– Section 3.4.1: Satellite and UAV platforms used in polar vegetation monitoring.
– Section 3.4.2: Coarse-resolution satellite methods for polar vegetation studies.
– Section 3.4.3: Medium-resolution satellite methods for polar vegetation studies.
– Section 3.4.4: SAR methods for structural classification and all-weather imaging.
– Section 3.4.5: High-resolution satellite imagery for polar vegetation studies.
– Section 3.4.6: UAV methods for fine-scale mapping of polar vegetation.
– Section 3.4.7: Multi-sensor fusion techniques for enhanced monitoring.
3.1. Limitations of Remote Sensing for Polar Vegetation Monitoring
Remote sensing in polar regions presents distinct challenges due to extreme environmental conditions, sparse vegetation cover, and highly heterogeneous surfaces [
26,
27]. Snow, ice, rugged terrain, and frequent cloud cover restrict both data availability and interpretability, imposing fundamental limitations on sensor performance across spatial, spectral, and temporal scales. These issues are especially pronounced in low-biomass systems dominated by mosses, lichens, and microbial mats, where vegetation signals are subtle and highly variable [
28].
Traditional coarse-resolution optical satellites are prone to sub-pixel mixing, in which vegetation, soil, rock, and snow all occupy a single pixel. This blending obscures spectral signatures, reduces classification accuracy, and introduces ambiguity in vegetation mapping [
29]. As illustrated in
Figure 2, the patchy, fragmented distribution of polar vegetation frequently falls below the detection threshold of coarse sensors.
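The sub-pixel mixing problem can be made concrete with a linear mixing model, the standard first-order description of how a coarse pixel blends the reflectance of several cover types. The sketch below uses invented endmember spectra and fractions purely for illustration; it is not drawn from any of the cited studies.

```python
import numpy as np

# Illustrative (invented) endmember spectra for three cover types,
# sampled at four bands: green, red, NIR, SWIR.
endmembers = np.array([
    [0.08, 0.05, 0.45, 0.25],   # vegetation
    [0.22, 0.25, 0.28, 0.27],   # rock/soil
    [0.70, 0.80, 0.75, 0.10],   # snow
])

def mix(fractions, E=endmembers):
    """Linear mixing: pixel reflectance is a fraction-weighted sum of endmembers."""
    return fractions @ E

def unmix(pixel, E=endmembers):
    """Recover fractional abundances by least squares (unconstrained sketch)."""
    fractions, *_ = np.linalg.lstsq(E.T, pixel, rcond=None)
    return fractions

# A coarse pixel that is 10% vegetation, 60% rock, and 30% snow yields a
# blended spectrum in which the weak vegetation signal is easily lost:
pixel = mix(np.array([0.10, 0.60, 0.30]))
```

Recovery of the fractions is only possible when the endmember spectra are known and sufficiently distinct; operational unmixing typically adds sum-to-one and non-negativity constraints.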
Optical sensors are further limited by persistent cloud cover and seasonal snow, which obscure surface features during critical phenological windows [
30,
31]. Long-term monitoring is further complicated by inconsistent revisit intervals, mission discontinuities, and cross-platform calibration difficulties, leading to gaps in temporal continuity and intersensor comparability [
22,
32]. Due to orbital constraints, polar-orbiting satellites such as Sentinel-2 and Landsat are unable to image regions near the geographic poles beyond approximately ±82° latitude, excluding up to 21% of the ice-free regions in Antarctica [
33].
Although SAR offers the advantage of all-weather, day-and-night imaging, its signals are highly sensitive to surface roughness, dielectric properties, and freeze–thaw cycles. In cryptogam-dominated systems, SAR backscatter often reflects soil or moisture dynamics more strongly than vegetation structure, making interpretation complex [
31,
34].
Methodologically, the prevalence of simple vegetation indices (VIs), such as the NDVI, imposes further limitations. These indices often oversimplify spectral and structural variability and are poorly suited to cryptogams, which exhibit weak near-infrared reflectance and are frequently under-detected [
35].
These challenges underscore the need for improved spatial and spectral resolution, weather-independent sensing, and analytical frameworks that can accommodate environmental heterogeneity. The following section examines how recent advances in UAV and satellite technologies, ML, and data fusion are overcoming these barriers and expanding the capability to monitor polar vegetation at ecologically relevant scales.
3.2. Technological Advances and Emerging Solutions
Recent technological and methodological innovations are directly addressing the constraints outlined above, marking a shift from coarse, threshold-based approaches to integrated, high-resolution monitoring workflows. Key developments in UAV platforms, satellite systems, ML, and sensor fusion have collectively expanded the capacity to detect, classify, and track polar vegetation across challenging conditions.
UAV-based remote sensing has evolved substantially over the last decade, now supporting multi-sensor payloads that include multispectral, hyperspectral, thermal, LiDAR, and RGB sensors [
17,
36,
37,
38]. These configurations overcome the spatial limitations of satellite platforms and are especially effective in mapping the fragmented, fine-scale mosaics characteristic of cryptogam-dominated systems [
9,
39,
40]. By capturing both spectral and structural data simultaneously, modern UAVs address the issues of sub-pixel mixing and spectral ambiguity noted in
Section 3.1. Their deployability in inaccessible terrain also helps overcome ground-validation challenges, although operational constraints remain, such as limited battery life and adverse weather.
In parallel, the proliferation of high-resolution Earth observation satellites has filled critical spatial and temporal gaps. Platforms such as Sentinel-2, Landsat 8, and WorldView now provide sub-10 m resolution imagery with increasingly frequent revisit cycles, mitigating issues of temporal discontinuity and coarse granularity [
20,
41,
42]. The availability of red-edge and Shortwave Infrared (SWIR) bands has enabled the development of more tailored VIs suited to low-biomass systems, offering alternatives to the NDVI that better detect cryptogams and mosses [
43,
44,
45]. While spectral confusion remains a challenge, the improved spectral fidelity of these sensors enhances discrimination between vegetated and non-vegetated surfaces.
The adoption of ML and deep learning has significantly advanced classification accuracy and vegetation trait estimation. Algorithms such as Random Forest (RF), Support Vector Machines (SVMs), and convolutional neural networks (CNNs) accommodate high-dimensional, noisy data, making them well suited to the heterogeneous landscapes of polar regions [
37,
46,
47]. These models outperform simple thresholding approaches by leveraging spectral, structural, and contextual information, directly addressing the limitations of the traditional indices highlighted earlier. Moreover, interpretable ML models—such as RF and Extreme Gradient Boosting (XGBoost)—offer insights into feature importance, aiding both ecological understanding and model validation.
Among the most impactful advancements is the rise of multi-sensor data fusion. By combining UAV, optical, SAR, and topographic datasets, fusion frameworks resolve many of the limitations associated with single-sensor approaches. For instance, SAR’s all-weather imaging complements cloud-prone optical data, while UAV-derived high-resolution observations support the upscaling of vegetation metrics to regional or continental scales [
36,
48,
49]. These approaches enhance class separability in spectrally ambiguous settings, bridge spatial and temporal discontinuities, and enable the estimation of vegetation attributes that are otherwise difficult to detect.
Together, these innovations mark a methodological turning point. By integrating high-resolution sensors, adaptive analytics, and multi-source data, remote sensing in polar regions is transitioning from constrained, discontinuous observations to ecologically meaningful, spatially scalable monitoring. The sections that follow examine how these technologies are operationalised across major application domains, including land cover mapping, biomass estimation, and vegetation health assessment.
3.3. Foundational Methods for Polar Vegetation Remote Sensing
This section introduces the core analytical methods that shape remote sensing of sparse vegetation in polar ecosystems. It focuses on VIs and ML approaches—including supervised and unsupervised classification, regression models, and deep learning—enabling accurate vegetation detection, trait estimation, and ecological mapping across Arctic and Antarctic landscapes. These methods support key applications discussed in later sections.
3.3.1. Vegetation Indices
Derived from reflectance values across spectral bands, VIs are central to mapping vegetation presence, health, and productivity in sparse and fragmented Arctic and Antarctic landscapes [
36,
50,
51]. They are especially useful in environments where biomass is low, species diversity is limited, and vegetation is intermixed with snow, ice, and bare ground [
33,
47,
51].
A broad range of indices has been applied across RGB, multispectral, and hyperspectral imagery. RGB-based indices—such as the GRVI, RGBVI, and ExG—are frequently used in UAV imagery to enhance spectral contrast in cryptogam-rich environments [
34,
37,
52]. Multispectral indices—like the NDVI, SAVI, and NDRE—are applied to satellite and UAV data to capture chlorophyll activity and photosynthetic vigour [
20,
53,
54]. In hyperspectral datasets, narrowband indices, such as the NDLI and tailored combinations, improve discrimination among mosses, lichens, and algae [
47,
54].
The NDVI remains the most widely used index in polar studies, supporting greening trend analysis, vegetation cover estimation, and time-series monitoring across the Advanced Very High Resolution Radiometer (AVHRR), Moderate-Resolution Imaging Spectroradiometer (MODIS), Landsat, and Sentinel-2 platforms [
42,
55,
56]. However, the NDVI’s limitations are well documented: the NDVI saturates in dense canopies, is sensitive to soil background, and under-represents cryptogamic species with low near-infrared (NIR) reflectance [
34,
57]. To address this, polar-specific indices have been developed. For instance, the NDLI exploits the spectral divergence between green and red bands to distinguish lichens and mosses, while tailored variants of the NDVI incorporating red-edge and NIR-2 bands (e.g., NDVI-2 and NDVI-4) improve detection in mixed snow–rock–vegetation environments [
47,
58]. The NDMI and water-sensitive indices have further improved mapping of moisture-dependent vegetation types [
41].
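All of the indices discussed above share the same normalised-difference form computed from pairs of spectral bands. The following Python sketch implements that generic form; the band pairings follow the standard definitions, while the reflectance values in the example are illustrative only, and the Sentinel-2 band roles are stated as assumptions.

```python
import numpy as np

def normalised_difference(a, b, eps=1e-9):
    """Generic normalised difference: (a - b) / (a + b)."""
    a = np.asarray(a, dtype="float64")
    b = np.asarray(b, dtype="float64")
    return (a - b) / (a + b + eps)

# Assumed Sentinel-2 band roles: B4 = red, B5 = red edge, B8 = NIR, B11 = SWIR.
def ndvi(nir, red):
    return normalised_difference(nir, red)       # greenness / photosynthetic vigour

def ndre(nir, red_edge):
    return normalised_difference(nir, red_edge)  # chlorophyll-sensitive variant

def ndmi(nir, swir):
    return normalised_difference(nir, swir)      # moisture-sensitive variant

# Example: a moss-covered pixel (high NIR relative to red) vs. bare rock.
moss_ndvi = ndvi(0.45, 0.08)
rock_ndvi = ndvi(0.28, 0.25)
```

For cryptogams with weak NIR reflectance, the moss and rock values above converge, which is precisely the NDVI limitation noted in the text.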
VIs are also widely used as input features for ML classifiers and regression models, enhancing vegetation classification, biomass estimation, and trait prediction (
Section 3.4). Despite their simplicity, VIs remain foundational for remote sensing workflows in polar research, particularly when combined with structural or contextual variables.
3.3.2. Machine Learning Techniques
ML has become integral to polar vegetation monitoring, supporting a wide range of applications—from land cover classification to trait estimation—across diverse sensing modalities. These approaches offer a flexible framework for modelling nonlinear, high-dimensional relationships under challenging environmental conditions, including snow cover, cloud interference, and spectral ambiguity from cryptogams and mixed pixels [
54,
58].
Supervised classification remains the foundation of ML-based polar monitoring. Random Forest, Support Vector Machines, and Extreme Gradient Boosting are widely used due to their robustness with limited training data and ability to handle high-dimensional feature sets. RF classifiers routinely achieve overall accuracies above 85% and F1-scores exceeding 0.80 in high-resolution optical and SAR–optical fusion contexts [
44,
46,
51,
59]. XGBoost and SVM similarly perform well in fragmented vegetation settings, particularly when combined with object-based image analysis [
58,
60]. Models like RF and XGBoost also provide measures of feature importance, making them more interpretable than deep learning models and useful for identifying key ecological drivers. Additionally, these algorithms are typically faster to train and easier to implement, though they still require high-quality labelled data and may struggle with transferability across heterogeneous polar landscapes [
54,
61].
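A minimal sketch of this supervised workflow, using scikit-learn's RandomForestClassifier on entirely synthetic pixel features, illustrates both classification and the feature-importance output mentioned above. The feature set and class structure are invented for demonstration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Synthetic per-pixel features: [NDVI, NDRE, SAR backscatter (dB)].
# Two hypothetical classes: 0 = bare ground, 1 = moss/lichen cover.
n = 300
bare = np.column_stack([rng.normal(0.05, 0.03, n),
                        rng.normal(0.02, 0.03, n),
                        rng.normal(-14.0, 1.5, n)])
veg = np.column_stack([rng.normal(0.35, 0.05, n),
                       rng.normal(0.25, 0.05, n),
                       rng.normal(-10.0, 1.5, n)])
X = np.vstack([bare, veg])
y = np.array([0] * n + [1] * n)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Feature importances indicate which inputs drive the classification,
# supporting the interpretability advantage noted in the text.
importances = dict(zip(["NDVI", "NDRE", "sigma0"], clf.feature_importances_))
```

In real polar applications the training labels would come from field plots or expert-digitised polygons, and accuracy would be reported on held-out validation data rather than the training set.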
Deep learning models, such as convolutional neural networks and U-Net architectures, have shown further improvements, particularly for semantic segmentation and trait estimation in UAV and very high resolution (VHR) imagery. For example, 1D-CNNs applied to Arctic hyperspectral data achieved Kappa values above 0.98 [
37], and U-Net-based models demonstrated F1-score gains of over 20% when combined with semi-supervised pre-labelling or ensemble strategies [
46,
47]. However, deep learning models require large, diverse labelled datasets and are especially sensitive to over-fitting, particularly when trained on small or noisy datasets [
49,
54]. Their computational demands are also significantly higher than traditional ML models, frequently requiring GPU-based processing and long training times [
46,
49]. Training deep architectures, such as CNNs or semi-supervised learning (SSL) frameworks, can be orders of magnitude slower, and their performance often suffers from limited spatial transferability unless extensive, representative datasets are available [
37,
46,
49].
Input feature selection and preprocessing significantly affect model performance across ML methods. VIs, hyperspectral derivatives, and texture metrics improve class separability in shadowed or heterogeneous terrain [
49,
62]. Integrating SAR coherence, backscatter, and digital elevation model-derived variables (e.g., Height Above Nearest Drainage [HAND] and slope) has enhanced generalisation across seasons and regions; for example, coherence metrics improved tundra classification accuracy to 84% in an RF model [
61]. In Antarctica, fusion of UAV thermal, RGB, and multispectral data allowed for the health classification of mosses across phenological stages with high seasonal consistency [
26].
Object-based image analysis (OBIA) outperforms pixel-based approaches for high-resolution data by leveraging geometric and contextual information. OBIA combined with RF or SVM has achieved 5–10% higher classification accuracy compared to pixel-based models in VHR imagery of cryptogam-dominated landscapes [
20,
58,
While OBIA relies on explicit segmentation, deep learning models like U-Net learn spatial aggregation implicitly, offering a hybrid pathway when training data are sufficient [
46].
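The core idea of OBIA, classifying image objects rather than individual pixels, can be sketched in a few lines. The toy NDVI image and segmentation below are invented; operational workflows derive segments from algorithms such as SLIC or multiresolution segmentation, and classify objects with RF or SVM rather than a simple threshold rule.

```python
import numpy as np

# Toy 4x4 NDVI image and a precomputed segmentation; segment ids label
# contiguous image objects (as produced by a segmentation algorithm).
ndvi = np.array([[0.40, 0.42, 0.05, 0.04],
                 [0.38, 0.41, 0.06, 0.05],
                 [0.07, 0.06, 0.33, 0.35],
                 [0.05, 0.04, 0.36, 0.34]])
segments = np.array([[1, 1, 2, 2],
                     [1, 1, 2, 2],
                     [3, 3, 4, 4],
                     [3, 3, 4, 4]])

def classify_objects(image, segments, threshold=0.2):
    """Object-based rule: label a whole segment by its mean feature value."""
    out = np.zeros_like(segments)
    for seg_id in np.unique(segments):
        mask = segments == seg_id
        label = 1 if image[mask].mean() > threshold else 0  # 1 = vegetated
        out[mask] = label
    return out

object_map = classify_objects(ndvi, segments)
```

Because each object is labelled as a unit, isolated noisy pixels inside a segment cannot flip their own class, which is one source of OBIA's accuracy gain in VHR imagery.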
Regression-based ML methods—including Support Vector Regression (SVR), Partial Least Squares Regression (PLSR), and neural networks—are increasingly applied to estimate continuous vegetation traits, such as biomass, chlorophyll, and moisture content. Relating field data to remote sensing inputs remains the most widely used fusion strategy in polar vegetation studies (
Figure 3b), supporting regression and classification across satellite, UAV, and SAR modalities. For instance, SVR applied to UAV hyperspectral data has been used to predict moss chlorophyll in Antarctica, while deep learning models estimated lichen abundance from fused WorldView-2 imagery with mean absolute errors as low as 7% [
44,
45]. However, trait model performance often declines when transferred to coarser satellite data due to resolution and spectral limitations.
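As an illustration of trait regression, the sketch below fits scikit-learn's SVR to synthetic reflectance features and invented chlorophyll values; it mirrors the shape of the workflow, not the data, parameters, or accuracies of the studies cited above.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic training data: two index features per plot (e.g., NDRE, NDVI)
# and an invented chlorophyll measurement that depends mostly on the first.
X = rng.uniform(0.1, 0.5, size=(80, 2))
chlorophyll = 20.0 + 60.0 * X[:, 0] + rng.normal(0.0, 1.0, size=80)

# Support Vector Regression with an RBF kernel, as used for trait estimation.
model = SVR(kernel="rbf", C=100.0, epsilon=0.5).fit(X, chlorophyll)
r2 = model.score(X, chlorophyll)
```

In practice the targets come from destructive or spectrophotometric field measurements, and transferring such a model to coarser satellite bands degrades performance, as noted above.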
Unsupervised and semi-supervised approaches can be valuable tools in data-scarce polar regions. Techniques such as k-means clustering, fuzzy c-means, and Mapcurves-based relabelling have supported label generation and stratified classification. For example, unsupervised pre-labelling increased the F1-score from 0.67 to 0.81 in Antarctic mapping [
19,
60,
64]. Semi-supervised deep learning, leveraging predictions from traditional classifiers, has further improved performance where ground truth is limited.
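The unsupervised pre-labelling strategy can be sketched as a two-step procedure: cluster the unlabelled pixels, then map clusters to ecological classes using a handful of field-verified points. All feature values and class names below are synthetic stand-ins.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Unlabelled pixel features (e.g., NDVI, NDMI) from two latent cover types.
vegetated = rng.normal([0.35, 0.20], 0.03, size=(200, 2))
bare = rng.normal([0.05, 0.02], 0.03, size=(200, 2))
X = np.vstack([vegetated, bare])

# Step 1: unsupervised clustering of all pixels.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# Step 2: relabel clusters using a few field-verified pixels, turning
# cluster ids into class pre-labels for training a supervised model.
field_pixels = np.array([[0.36, 0.21], [0.04, 0.01]])
field_classes = ["moss", "bare"]
cluster_to_class = {int(km.predict([p])[0]): c
                    for p, c in zip(field_pixels, field_classes)}
pre_labels = [cluster_to_class[int(c)] for c in km.labels_]
```

The resulting pre-labels can then seed a supervised or semi-supervised classifier, which is the mechanism behind the F1-score gains reported above.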
While no single ML approach universally outperforms others, consistent patterns have emerged. Supervised models—particularly RF, SVM, and XGBoost—remain effective across most remote sensing modalities. Deep learning offers superior segmentation and trait prediction when training data are abundant. OBIA and feature engineering (e.g., spectral derivatives, texture, and SAR coherence) substantially boost accuracy in high-resolution and multi-sensor settings. Specific applications and detailed accuracies are discussed further in
Section 3.4.
3.4. Remote Sensing Applications for Polar Vegetation Monitoring
Building on the foundational methods outlined above, this section analyses how various remote sensing platforms have been applied to vegetation monitoring in polar regions. Applications are grouped by sensor type—coarse-, medium-, and high-resolution optical, SAR, UAV, and multi-sensor fusion—and evaluated based on spatial scale, classification accuracy, and ecological relevance.
3.4.1. Sensors and Platforms
The following tables summarise the key satellite (
Table 1) and UAV (
Table 2) platforms used in polar vegetation monitoring, along with their technical specifications. These platforms form the technological foundation for the remote sensing methods discussed in subsequent sections. Each category—ranging from coarse-resolution optical satellites to UAV-mounted hyperspectral sensors—offers distinct trade-offs in spatial, spectral, and temporal resolution, which directly influence their effectiveness for detecting, classifying, and monitoring polar vegetation. These differences shape the design, scalability, and ecological relevance of various classification, trend detection, and trait mapping approaches.
These platform specifications, both UAV and satellite, provide the foundation for the trade-off comparisons summarised in
Table 3, which connects sensor characteristics to application domains, methods, and classification accuracy.
3.4.2. Coarse-Resolution Optical Methods: AVHRR and MODIS
Spatial, spectral, and temporal resolution fundamentally shape the effectiveness of remote sensing in detecting, classifying, and monitoring polar vegetation. In polar landscapes, resolving fine-scale features—such as moss beds, lichen mats, or shrubs—is critical for generating ecologically meaningful outputs. Historically, coarse-resolution optical sensors like AVHRR (1.1 km) and MODIS (232–500 m)—as shown in
Table 1—have been foundational to broad-scale vegetation monitoring due to their long-term archives, near-daily revisit times, and extensive spatial coverage [
51,
55]. MODIS, for instance, offers 36 spectral bands and 1–2 day global coverage, making it well suited for long-term trend analysis despite its limited spatial granularity [
30,
66].
A key example of these trade-offs is the Circumpolar Arctic Vegetation Map (CAVM), which used 1 km AVHRR NDVI composites to delineate major tundra vegetation zones across the Arctic [
65,
74]. This pioneering effort demonstrated the feasibility of continent-scale vegetation mapping but also underscored the limitations of coarse spatial resolution and simple NDVI thresholding. Relying on broad physiognomic categories and expert interpretation, the CAVM reduced over 400 plant communities into just 15 functional classes and lacked a formal spatial accuracy assessment. While effective for broad classification, the approach struggled with mixed pixels, geographic variability in reflectance, and spectral confusion, particularly in cryptogam-dominated areas where the NDVI is known to perform poorly.
Recent applications of AVHRR and MODIS data, building on this foundational work, have shown continued relevance for broad-scale phenological monitoring and vegetation trend detection. Karlsen et al. [
30] reported a moderate correlation between MODIS NDVI-derived onset of the growing season and field-observed flowering of
Salix polaris in Svalbard, though spectral ambiguity in moss-rich systems remains a known challenge. Addressing another core limitation, Liu and Cheng [
56] developed a time-series reconstruction method (TSR–PT) to mitigate cloud-related data gaps—common in high-latitude optical remote sensing—by modelling phenological continuity across years. This reduced the RMSE from 0.2369–0.2483 (traditional methods) to just 0.0089, even with over 70% of the data missing. Nonetheless, coarse resolution and spectral averaging still weaken ecological signals, underscoring the increasing necessity of higher-resolution or multi-sensor approaches to resolve fine-scale ecological patterns [
43,
45,
73].
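The principle of reconstructing cloud-contaminated NDVI series can be illustrated with a deliberately simplified stand-in: replace each missing observation with the multi-year mean for the same time step. This is not the TSR–PT algorithm itself, only the underlying intuition of borrowing phenological continuity across years; the NDVI values below are invented.

```python
import numpy as np

def fill_gaps_with_climatology(series_by_year):
    """Fill missing (NaN) NDVI observations in each year's time series with
    the multi-year mean for the same time step (a simplified stand-in for
    phenology-based reconstruction)."""
    stack = np.array(series_by_year, dtype="float64")  # years x time steps
    climatology = np.nanmean(stack, axis=0)            # per-step multi-year mean
    return np.where(np.isnan(stack), climatology, stack)

# Three years of a 5-step NDVI curve; year 3 has cloud gaps (NaN).
years = [
    [0.10, 0.30, 0.55, 0.40, 0.15],
    [0.12, 0.28, 0.53, 0.42, 0.13],
    [0.11, np.nan, np.nan, 0.41, 0.14],
]
filled = fill_gaps_with_climatology(years)
```

Methods such as TSR–PT go further by modelling the phenological curve explicitly, which is how they remain accurate even when most observations in a given year are missing.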
These examples illustrate how coarse-resolution datasets, while spatially constrained, can still yield valuable ecological insights when supported by long archives and robust temporal modelling. However, as discussed in subsequent sections, persistent challenges—including spectral ambiguity, structural insensitivity, and sub-pixel mixing in heterogeneous landscapes—have driven increasing demand for higher-resolution or multi-sensor approaches better suited to capturing the complexity of polar vegetation.
3.4.3. Medium-Resolution Optical Methods: Landsat and Sentinel-2
The limitations of coarse-resolution sensors have driven a shift toward medium-resolution optical platforms—particularly Landsat (30 m; 7–9 spectral bands) and Sentinel-2 (10–20 m; 13 bands), as illustrated in
Table 1—which offer a favourable trade-off between spatial and spectral resolution for polar vegetation monitoring [
19,
22,
75]. Compared to AVHRR and MODIS, these sensors reduce pixel mixing and enable the detection of fine-scale vegetation heterogeneity, including moss beds, cryptogamic crusts, and mixed-plant communities [
20,
54,
67].
For example, Fretwell et al. [
67] produced the first regional vegetation map of the Antarctic Peninsula using 30 m Landsat Enhanced Thematic Mapper Plus imagery, demonstrating that AVHRR’s 1.1 km resolution could not resolve fragmented moss and lichen cover. Similarly, Roland et al. [
12] used a Landsat NDVI time series (1986–2021) to detect significant greening trends in Antarctic vegetation, underscoring Landsat’s value for long-term trend detection even with basic indices.
Sentinel-2 expands on this by adding red-edge and SWIR bands, enabling differentiation of low-stature plant communities based on chlorophyll, moisture, and physiological variation [
43,
44,
58,
76]. These bands support advanced indices like the NDMI, GRVI, and NDRE, improving classification in spectrally ambiguous, cryptogam-dominated systems [
46,
60,
61].
Walshaw et al. [
33] leveraged these features in the first continent-wide, 10 m resolution map of photosynthetic life in Antarctica, using Sentinel-2 (2017–2023) and Google Earth Engine. Integrating the NDVI, NDMI, GRVI, and NDRE reduced false positives from bare ground and improved detection of dark-coloured mosses and lichens. The classification achieved strong agreement with ground-truth data for both terrestrial and cryospheric targets and established minimum detection thresholds of 52 m² for green vegetation, 79 m² for lichens, and just 11 m² for snow algae, all well below MODIS or AVHRR footprints. Despite these advances, spectral mixing still limits detection in highly fragmented patches, prompting use of sub-pixel methods.
Casanovas et al. [
35] addressed this using matched filtering with Landsat 7, identifying lichen presence based on spectral similarity to known endmembers. The method failed to detect lichens at only 6.6% of ground-truthed sites, compared to 46% omission using a standard NDVI threshold of 0.2. Although commission error could not be assessed without absence data, the performance gap illustrates the value of rule-based sub-pixel detection in complex spectral environments.
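Matched filtering scores each pixel by how strongly a known target spectrum stands out from the background statistics, so the score approximates the sub-pixel target fraction. The sketch below uses the standard matched-filter formulation with invented spectra; it is not the exact implementation of the cited study.

```python
import numpy as np

def matched_filter_scores(pixels, target):
    """Spectral matched filter: score approximates the sub-pixel fraction of
    the target spectrum relative to the scene background statistics."""
    mu = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(cov.shape[0]))  # regularised
    d = target - mu
    return (pixels - mu) @ cov_inv @ d / (d @ cov_inv @ d)

rng = np.random.default_rng(2)
background = rng.normal([0.20, 0.25, 0.30], 0.02, size=(500, 3))  # rock/soil
target = np.array([0.10, 0.30, 0.55])                             # lichen-like

# A pixel that is 30% lichen and 70% background:
mixed = 0.3 * target + 0.7 * np.array([0.20, 0.25, 0.30])
scene = np.vstack([background, mixed])
scores = matched_filter_scores(scene, target)
```

A threshold on these scores then flags sub-pixel target presence, which is why matched filtering detects lichens that a hard NDVI threshold misses.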
Complementing this, Singh et al. [
44] applied an RF model to Sentinel-2 red-edge and SWIR data to classify lichen abundance in East Antarctica. Trained on 92 lichen and 120 non-lichen points and validated on 343 independent samples, the model achieved 82.4% overall accuracy. Probabilistic outputs improved performance in spectrally ambiguous areas, reinforcing the importance of expanded spectral sensitivity and ML approaches.
Together, these studies show that medium-resolution sensors, when paired with advanced indices and classifiers, now achieve classification accuracies exceeding 80% in many polar settings. Historically focused on small regions (tens to hundreds of km²) due to data volume and processing demands [
59,
61,
77], these medium-resolution sensors are increasingly used for continental-scale applications. Notably, Walshaw et al. [
33] demonstrated how cloud-based workflows and multi-index strategies can scale Sentinel-2 vegetation mapping to the Antarctic continent.
Yet these advances are tempered by ongoing limitations: medium-resolution methods still depend on high-quality field data—often scarce in polar regions—and lack the spatial granularity to detect sub-meter vegetation patterns. Revisit rates of 5 days (Sentinel-2) and 16 days (Landsat) remain lower than MODIS or AVHRR, reducing temporal continuity and increasing susceptibility to cloud cover [
22,
59,
61,
66,
78]. While Landsat supports long-term monitoring via its historical archive [
66,
79], Sentinel-2 offers finer spatial and spectral resolution but covers only recent years. As shown in the following sections, these limitations have increasingly been addressed through data fusion and UAV–satellite integration.
To synthesise the evolution of platform capabilities discussed so far,
Table 3 summarises the progression from coarse-resolution satellites to UAV–satellite fusion, highlighting improvements in classification accuracy, patch detectability, and application scale. This comparative overview contextualises the methods and trade-offs addressed in the following sections.
3.4.4. Synthetic Aperture Radar Methods: Structural Monitoring and All-Weather Observations
SAR provides a weather- and light-independent alternative to optical remote sensing. Its sensitivity to vegetation structure, surface roughness, and dielectric properties such as moisture content makes it especially valuable in polar regions, where persistent cloud, polar night, and short growing seasons limit optical data availability [
80]. Backscatter signals, particularly in X- and C-band systems, correlate with shrub cover and height, though these relationships often saturate beyond 20% cover or 1 m height [
31].
SAR classification accuracy varies widely depending on polarisation configuration and analytical approach. Single-polarised SAR, which records backscatter from only one channel (e.g., VV or HH), captures limited structural detail and often produces moderate classification results. A’Campo et al. [
59] reported just 57.7% overall accuracy using single-pol TerraSAR-X data to classify tundra vegetation.
In contrast, dual-polarised systems improve structural sensitivity by capturing two polarisation combinations (e.g., HH/HV), allowing for partial characterisation of volume and surface scattering. In the same study, A’Campo et al. achieved 92.4% accuracy using dual-pol TerraSAR-X with Kennaugh element features, multi-temporal data, and an RF classifier—demonstrating the impact of both polarisation diversity and advanced methodology.
Quad-polarised SAR (e.g., RADARSAT-2), which measures all combinations (HH, HV, VH, and VV), can further improve vegetation discrimination, particularly between structurally similar classes. However, results remain method-dependent. Ullmann et al. [
64] found that quad-pol RADARSAT-2 yielded only 62.9–68.5% accuracy, partly due to reliance on single-date data and conventional decomposition with maximum likelihood classification. These comparisons show that increased polarisation alone does not guarantee higher accuracy; temporal coverage and classifier choice are often more decisive.
Ullmann et al. [
28] expanded on this by showing that polarisation performance varies by vegetation type—HH/VV for wetlands and HH/HV for shrub tundra—highlighting the need for acquisition strategies tailored to specific landscape contexts. Dual-pol data, when paired with decomposition, can approximate quad-pol performance at lower data volume and cost.
To improve SAR’s sensitivity to seasonal vegetation change, several studies have incorporated interferometric coherence. Unlike backscatter intensity, which responds to surface structure and dielectric contrast, coherence captures temporal stability between repeated acquisitions. Merchant et al. [
61] used dual-pol Sentinel-1 backscatter and coherence to classify six hydro-ecological tundra types. Coherence declined during the growing season in vegetated areas—reflecting structural change—while remaining high in stable classes, like wetlands and woody vegetation. Combined with topographic metrics (e.g., slope, HAND, and wetness index) in an RF model, the approach achieved 84% overall accuracy. This illustrates how temporal coherence enhances classification in moisture-sensitive or structurally dynamic systems.
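For context, the interferometric coherence used in such studies is conventionally the multilooked estimator between two co-registered complex acquisitions s₁ and s₂ over a window of N samples (this is the standard textbook definition, not a formula reproduced from the cited paper):

```latex
\hat{\gamma} = \frac{\left| \sum_{n=1}^{N} s_1^{(n)} \, s_2^{(n)\ast} \right|}
                    {\sqrt{\sum_{n=1}^{N} \left| s_1^{(n)} \right|^{2} \,
                           \sum_{n=1}^{N} \left| s_2^{(n)} \right|^{2}}},
\qquad 0 \le \hat{\gamma} \le 1
```

Values near 1 indicate a temporally stable surface between acquisitions, while growth, moisture change, or wind-driven movement in vegetated areas lowers coherence, which is why it complements backscatter intensity for seasonal classification.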
SAR vegetation studies have typically focused on local to regional extents (10–2000 km²), constrained by resolution and data availability. Sentinel-1 (10 m; C-band) supports consistent regional mapping with 6–12 day revisit intervals. Higher-resolution platforms such as TerraSAR-X (5–8 m) and RADARSAT-2 (5–12 m) offer finer spatial detail but are often limited to small footprints and tasking constraints [
81].
While SAR can resolve dominant vegetation classes—such as shrub tundra and graminoid wetlands—sub-meter features like moss patches and cryptogamic crusts remain below its detection threshold. Although comprehensive pan-Arctic or pan-Antarctic vegetation maps based solely on SAR data have been limited, recent advances—such as the launch of Sentinel-1C and the integration of data from missions like the RADARSAT Constellation Mission—make large-scale SAR-based vegetation mapping increasingly feasible. Nonetheless, vegetation classification using SAR alone remains challenging in cryptogam-rich or spectrally ambiguous environments.
SAR provides unique structural and temporal insights but struggles to distinguish low-stature or mixed-plant communities without supplemental data. Overlapping scattering responses and environmental confounders—such as moisture and roughness—reduce separability among plant types [
28,
82]. As a result, the most accurate SAR applications increasingly rely on multi-sensor fusion, combining SAR with optical or topographic data, or using coherence alongside contextual variables. These strategies, discussed in later sections, are critical for advancing SAR-based vegetation mapping in polar systems.
3.4.5. VHR Optical Methods: WorldView, QuickBird, and PlanetScope
Very high resolution (VHR) optical imagery (<5 m spatial resolution)—WorldView, QuickBird, PlanetScope—significantly improves vegetation detectability and classification accuracy at fine scales critical for cryptogamic communities in polar ecosystems [
20,
41,
58]. A key advantage of VHR data is the ability to detect vegetation patches smaller than 1 m²—important in fragmented, moss- and lichen-dominated landscapes. Jawak et al. [58] used WorldView-2 (2 m multispectral; 0.5 m panchromatic) to detect Antarctic patches as small as 0.25 m². Similarly, Gray et al. [
41] demonstrated that WorldView imagery detected snow algal bloom extent up to 17.5 times greater than Sentinel-2, revealing substantial vegetated area that coarser sensors missed entirely. However, even at sub-meter resolution, classification challenges persist. Spectral confusion among red algae, mineral-rich snow, and rock remained an issue despite using Spectral Angle Mapping (SAM). Kodl et al. [
42] further reported that over 80% of barren and shrub patches in the Arctic tundra were smaller than 0.5 m, underscoring that ecological heterogeneity often exists below the resolution threshold of most satellite sensors.
Beyond spatial granularity, several studies have shown that classification accuracy improves when VHR imagery is paired with advanced methods, such as spectral unmixing or custom indices. Shin et al. [
68] applied Spectral Mixture Analysis (SMA) to QuickBird (2.4 m) and KOMPSAT-2 (4 m) to estimate vegetation abundance in Antarctica, achieving strong agreement with field measurements. Gray et al. [
41] introduced the IB5 index using WorldView’s yellow band to distinguish red and green algae—a separation not possible with Sentinel-2 due to spectral overlap in the red and NIR. Similarly, Jawak et al. [
58] evaluated 16 feature extraction methods using WorldView-2’s 8-band data, finding that Mixture-Tuned Matched Filtering (MTMF) and SAM-based approaches outperformed traditional NDVI models. Bias errors ranged from 6.44% to 11.55%, with RMSE values near 0.41 km² and high cross-site consistency. These findings highlight that algorithm choice and spectral band configuration play a major role in realising the potential of VHR data.
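The SAM approach mentioned above assigns each pixel to the reference spectrum with the smallest spectral angle. A minimal sketch, using synthetic endmember spectra (the band values, class names, and 0.15 rad threshold are illustrative assumptions, not values from the cited studies):

```python
# Hedged sketch of Spectral Angle Mapping (SAM) classification.
import numpy as np

def spectral_angle(pixel, ref):
    """Angle (radians) between a pixel spectrum and a reference spectrum."""
    cos = np.dot(pixel, ref) / (np.linalg.norm(pixel) * np.linalg.norm(ref))
    return np.arccos(np.clip(cos, -1.0, 1.0))

refs = {                                      # hypothetical endmember library
    "red_algae":   np.array([0.10, 0.25, 0.30, 0.20]),
    "green_algae": np.array([0.08, 0.30, 0.15, 0.25]),
    "rock":        np.array([0.20, 0.22, 0.24, 0.26]),
}

def classify(pixel, refs, max_angle=0.15):
    """Best-matching class, or 'unclassified' if no angle is small enough."""
    name, ang = min(((k, spectral_angle(pixel, r)) for k, r in refs.items()),
                    key=lambda kv: kv[1])
    return name if ang <= max_angle else "unclassified"
```

Because the angle is insensitive to overall brightness, SAM tolerates illumination differences, but as noted above it can still confuse classes whose spectral shapes overlap, such as red algae and mineral-rich snow.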
VHR imagery is especially valuable when integrated with other datasets. High-resolution maps, such as those from Jawak et al. [
58], often serve as training or validation data for coarser-scale classifications using Sentinel-2 or SAR. VHR imagery also plays a key role in delineating boundaries, analysing fragmentation, and identifying sub-pixel heterogeneity in support of UAV or field campaigns.
Despite these advantages, temporal limitations persist. Most VHR systems require targeted tasking and are constrained by high acquisition costs, cloud cover, and infrequent revisit times. As a result, studies often rely on single or limited seasonal acquisitions. PlanetScope, with its near-daily revisit rate and 3 m spatial resolution, offers improved temporal coverage for short-season monitoring. However, its limited spectral range and variable radiometric quality can reduce its utility for detailed vegetation trait analysis [
20,
42].
3.4.6. Uncrewed Aerial Vehicles (UAVs)
Coarse- and medium-resolution satellite sensors (e.g., AVHRR, MODIS, Landsat, and Sentinel-2) have provided foundational insights into polar vegetation patterns but remain limited by spatial resolution and spectral ambiguity—particularly in heterogeneous landscapes dominated by low-stature cryptogams and mixed communities [
33,
35]. VHR satellite imagery improves sub-meter detection but is constrained by a narrow spectral range, high costs, and persistent cloud cover [
41,
58]. SAR, while weather-independent, often struggles to distinguish cryptogam-rich vegetation due to low structural contrast and complex scattering behaviour [
28,
61].
These persistent gaps have driven the rapid adoption of UAVs for high-resolution ecological monitoring in polar regions [
17,
72]. Pina and Vieira [
17] noted that Antarctic UAV studies increased from an average of three per year before 2013 to over 30 annually after 2019, reflecting advances in lightweight sensors and operational deployment.
UAVs have significantly expanded polar vegetation monitoring by enabling ultra-high spatial resolution imagery, 3D structural mapping, and flexible sensor integration [
36,
37,
45,
48,
83,
84]. They support applications beyond the reach of most satellite platforms, including fine-scale assessments of vegetation health, biomass, and species composition [
37,
85,
86].
A key advancement in UAV-based monitoring has been the integration of multi-sensor payloads—combining optical, thermal, and structural sensors on a single platform [
73,
84,
87]. This intra-platform fusion links spectral reflectance, canopy temperature, and 3D structural data to improve vegetation classification, biomass estimation, and physiological assessments in heterogeneous polar landscapes [
37,
38,
47,
83]. These approaches provide synergistic insights that are often unattainable using single-sensor systems and represent one of the fastest-growing areas in polar vegetation monitoring (
Figure 3), though most remain geographically concentrated in the Arctic.
However, spatial coverage for UAV studies remains limited, typically ranging from plot scale (1–100 m²) to site scale (1–10 ha), with some studies expanding further to several hundred hectares [
60,
78]. Zmarz et al. [
9] demonstrated this expanding range with the Beyond Visual Line-Of-Sight (BVLOS) mapping of Penguin Island from 30 km away, and Zmarz et al. [88] expanded on this with BVLOS surveys over a 7.5 km² area at Admiralty Bay (King George Island, ASPA 128). This highlights the potential of long-range UAVs to bridge the scale gap between field data and satellite observations while supporting baseline ecological monitoring in Antarctic protected areas.
Despite the many strengths of UAVs in polar vegetation studies, several limitations remain. Even with ultra-high-resolution sensors, mixed pixels can still pose challenges when vegetation features are coarser than the sensor’s ground sample distance [
89]. Flight endurance further constrains spatial coverage; most UAVs operate for only 20–60 min per flight. Multirotor systems offer high-precision, low-altitude imaging and perform well in rugged terrain, but they are limited in range and payload capacity. In contrast, fixed-wing UAVs provide greater coverage and efficiency but require open terrain for launch and are less manoeuvrable [
17,
18]. These trade-offs limit the ability to survey broad areas or conduct extended transects without complex logistical planning. Adaptive path planning could help overcome some of these challenges, though its application in polar vegetation monitoring remains largely unexplored [
90].
Temporally, UAVs offer high revisit flexibility. Some studies have captured short-term dynamics with multiple flights over a few days [
40,
85], while others have tracked vegetation across entire growing seasons [
52,
91]. However, multi-year UAV datasets remain rare, limiting their stand-alone utility for long-term ecological trend detection.
While scale, cost, and logistical constraints limit broad deployment, UAVs have proven especially valuable for targeted ecological applications. The following sections examine how UAV platforms have advanced our ability to assess vegetation health, quantify biomass, and map fine-scale community structure in polar landscapes.
Pigment and Chlorophyll Dynamics
Hyperspectral UAV sensors, with their high spectral fidelity, have proven very effective for assessing vegetation health through chlorophyll concentration and pigment variation at fine spatial scales. Malenovský et al. [
45] used UAV-based hyperspectral imaging and SVR to map chlorophyll in Antarctic mosses with field spectrometry–level accuracy. Similarly, Turner et al. [
26] applied RF models to UAV multispectral imagery, identifying NIR reflectance as a key predictor and achieving 5–10% error in chlorophyll estimation.
More recent ground-based studies have aimed to refine which spectral bands and indices are most effective for chlorophyll detection and stress monitoring in Arctic plants, with applications for future UAV and satellite studies [
92]. Levy et al. [
93] applied pigment-based UAV analysis to map chlorophyll-rich microbial mats, and their metabolic activity, in Antarctica’s Dry Valleys using hyperspectral and RGB imagery.
Moisture Content and Water Dynamics
Early UAV-based moisture assessments estimated stress indirectly by using indices like the NDWI and MTVI2, which rely on NIR and SWIR reflectance. Lucieer et al. [
83] and Turner et al. [
38] applied these indices to Antarctic mosses, with Lucieer et al. [
83] finding moderate correlations with moisture proxies. Turner et al. [
26] improved predictions by integrating terrain-derived flow accumulation metrics with multispectral imagery to better model moss health.
More recent approaches directly estimate vegetation water content using hyperspectral SWIR sensors. Turner et al. [
94] achieved <5% error in moss turf water prediction by targeting absorption features at 980 nm, 1200 nm, and 1450 nm—wavelengths strongly associated with plant water status [
39,
95]. Thermal infrared (TIR) UAV sensors further complement these efforts by detecting canopy temperature variation linked to evapotranspiration and water stress [
37,
38], offering additional physiological insight.
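The index-based moisture proxies described above are simple band ratios. As one example, the NIR/SWIR form of the NDWI (commonly attributed to Gao, 1996) can be sketched as follows; the reflectance values are synthetic stand-ins, not measurements from the cited moss studies:

```python
# Hedged sketch: NDWI from NIR and SWIR reflectance as a canopy
# water proxy. Wetter canopies absorb more SWIR, raising the index.
import numpy as np

def ndwi(nir, swir):
    """Normalised Difference Water Index = (NIR - SWIR) / (NIR + SWIR)."""
    nir, swir = np.asarray(nir, float), np.asarray(swir, float)
    return (nir - swir) / (nir + swir + 1e-12)  # epsilon avoids 0/0

nir = np.array([0.45, 0.40, 0.30])   # illustrative reflectance per pixel
swir = np.array([0.20, 0.25, 0.28])
vals = ndwi(nir, swir)               # higher values suggest more canopy water
```

Direct SWIR absorption-feature methods, as in the hyperspectral work above, replace this broadband proxy with fits to the 980, 1200, and 1450 nm water features.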
Biomass Studies
UAV-based biomass estimation in polar regions has increasingly leveraged structural data in addition to spectral data. One spectral-only study found that multispectral UAV imagery performed comparably to hyperspectral data in predicting aboveground biomass (R² up to 0.60) and the leaf area index (R² up to 0.65), with broadband indices driving model accuracy. Incorporating topographic data offered little improvement, especially struggling with low-stature vegetation, but the authors emphasised that adding canopy height could substantially enhance predictions, particularly in taller shrub systems [
96].
Structural methods—such as Structure-from-Motion (SfM) photogrammetry and LiDAR—have demonstrated this potential. Greaves et al. [
97] showed that airborne LiDAR acquired from crewed aircraft enabled harvested shrub biomass prediction with good accuracy, improving further when combined with spectral metrics. Subsequent work incorporated terrestrial LiDAR [
98] and modelled biomass uncertainty across Arctic landscapes.
UAV-mounted LiDAR now offers similar accuracy at finer spatial and temporal scales. Collins et al. [
99] collected >5000 pts/m² in the Arctic tundra, capturing subcanopy complexity and improved ground returns. Although formal accuracy metrics were limited, methodological enhancements—such as oblique scanning and dual returns—highlight UAV LiDAR’s strong potential despite logistical and cost constraints.
SfM photogrammetry provides a more accessible structural alternative. Cunliffe et al. [
85] found that SfM-derived canopy height predicted harvested biomass well, outperforming NDVI-based models. However, SfM heights showed systematic biases (+0.14 m), and performance depends on vegetation structure and terrain quality. Orndahl et al. [
100] similarly reported strong performance for deciduous shrubs but reduced accuracy in low-stature vegetation. These findings emphasise the need to match structural methods to specific vegetation types and landscape conditions.
While many UAV-based biomass studies emphasise structural mapping at a single point in time, UAVs also support temporal monitoring. Siewert and Olofsson [
91] related the UAV-derived NDVI to biomass using an exponential model but found better results when tracking Gross Primary Productivity, highlighting UAVs’ potential for capturing vegetation dynamics even without structural data.
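An exponential NDVI–biomass relation of the form biomass = a·exp(b·NDVI) can be fitted by linearising in log space. The data and fitted coefficients below are illustrative only, not values from the cited study:

```python
# Hedged sketch: fit biomass = a * exp(b * NDVI) via log-linear regression.
import numpy as np

ndvi = np.array([0.2, 0.35, 0.5, 0.6, 0.7, 0.8])
biomass = np.array([55, 90, 160, 240, 380, 600])   # g/m^2, toy field plots

b, log_a = np.polyfit(ndvi, np.log(biomass), 1)    # ln y = ln a + b * x
a = np.exp(log_a)

def predict(x):
    """Biomass predicted from NDVI under the fitted exponential model."""
    return a * np.exp(b * x)
```

Log-space fitting weights relative rather than absolute errors, which suits biomass data whose scatter grows with magnitude; a nonlinear least-squares fit is an alternative when absolute errors matter more.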
Fine-Scale Vegetation Mapping
UAVs have significantly advanced fine-scale vegetation monitoring by providing ultra-high resolution imagery. Their ability to achieve centimetre-level resolution is a key advantage over coarser satellite data, allowing for precise identification of individual plants, species, and functional types, which is essential for understanding the heterogeneity of polar ecosystems. This level of detail supports both automated classification and manual delineation of vegetation boundaries at fine ecological scales [
72,
83]. Other studies have also shown that combining data from multiple UAV sensors can improve performance over any individual sensor.
Early work using RGB imagery paired with SfM photogrammetry showed promising results. Fraser et al. [
40] achieved 82% accuracy classifying 11 Arctic vegetation classes by integrating spectral and 3D features, highlighting how structural information can enhance classification. However, the limited spectral range of RGB imagery constrained its ability to differentiate similar plant types. Also incorporating structural data, Greaves et al. [
86] used downscaled LiDAR-derived shrub biomass alongside 5 cm, 4-band airborne spectral layers as predictor variables in RF models to map tundra vegetation communities at 20 cm resolution. This study illustrates how structural biomass data can support broader ecological classification.
Multispectral approaches have since improved classification performance [
34]. Raniga et al. [
46] successfully classified five moss and lichen health classes in Antarctica by integrating XGBoost and U-Net models, achieving F1-scores up to 93%. Hyperspectral data have demonstrated an even greater ability to differentiate vegetation types. Sandino et al. [
47], for example, achieved 95–98% accuracy in distinguishing mosses, lichens, and non-vegetated surfaces.
UAVs also offer flexible revisit capabilities, supporting classification across phenological stages, herbivory events, or short-term environmental transitions [
37,
53,
101]. However, transferability remains a key challenge, as most UAV-based classification models require site-specific training due to variation in vegetation composition, illumination, and sensor calibration [
100]. While UAVs provide exceptional resolution and sensor flexibility,
Table 3 highlights how their limited coverage and logistical constraints restrict broader applicability.
3.4.7. Multi-Sensor Fusion
Multi-sensor fusion is emerging as a powerful tool in modern polar vegetation monitoring, enabling researchers to integrate complementary datasets—such as optical, radar, and UAV imagery—to overcome challenges related to spatial resolution, spectral ambiguity, and temporal continuity. These approaches enhance classification accuracy, support functional trait estimation, and enable vegetation trend analysis across fragmented and remote polar landscapes.
Figure 3 highlights key trends in the use of sensor fusion for polar vegetation monitoring. Since 2014, there has been a marked increase in multi-sensor integration studies, particularly those linking field data to satellite observations. However, fusion approaches that integrate multiple satellite sensors—or that combine UAV and satellite data—remain comparatively limited in polar contexts. Most applications focus on land cover and biomass estimation, with fewer addressing vegetation health or long-term dynamics. These patterns suggest both methodological and logistical constraints but also point to significant untapped potential. UAV–satellite fusion, in particular, offers a scalable pathway to bridge fine-resolution UAV observations with regional satellite data, while fusion between distinct satellite sensors enables consistent, broad-scale monitoring across seasons and sensor types. Expanding these under-represented approaches could improve both spatial and temporal vegetation monitoring across diverse polar ecosystems.
Multi-Sensor Satellite Fusion
Satellite-based fusion combines complementary datasets—typically optical, SAR, hyperspectral, and topographic inputs—to enhance vegetation classification, phenology tracking, and trait estimation. These approaches address key limitations of single-sensor systems by integrating spectral, structural, and temporal signals, making them well suited to the heterogeneous and cloud-prone environments of polar ecosystems.
SAR–optical fusion has also enhanced class separability by integrating structural and spectral information. In the Mackenzie Delta, Ullmann et al. [
64] found that combining quad-pol RADARSAT-2 with Landsat-8 increased classification accuracy from 75–83% (single sensors) to 87%, particularly improving distinction between structurally similar tundra types (e.g., shrublands vs. wetlands). Similarly, Stendardi et al. [
102] fused Sentinel-1 SAR and Sentinel-2 NDVI time series to track snow-melt and vegetation transitions in Svalbard, showing how radar–optical combinations capture both phenological and moisture-driven dynamics. Merchant et al. [
61] extended this by integrating Sentinel-1 SAR, Interferometric Synthetic Aperture Radar (InSAR) coherence, and topographic indices in an RF model, achieving 84% accuracy for classifying hydro-ecological tundra types. The inclusion of coherence captured changes associated with inundation and vegetation growth, highlighting how temporal stability metrics can add deeper ecological insight. Building on this, Collingwood et al. [57] showed that SAR–optical fusion can extend beyond classification by adapting structural information to model biomass: artificial neural networks trained on RADARSAT-2 and GeoEye-1 SAVI achieved strong agreement with field-measured phytomass (r² = 0.87), outperforming optical-only approaches and demonstrating the value of class-stratified, nonlinear modelling in sparse High Arctic vegetation.
Fusion has also improved the quality of training data for ML-based vegetation mapping. Across a series of studies, Langford et al. [
19,
60,
103] integrated optical (Landsat-8, SPOT-5, and WorldView-2), hyperspectral (EO-1 Hyperion), SAR (PALSAR), and elevation data to refine CNN models in the Alaskan Arctic. Using unsupervised clustering and Mapcurves-based relabelling, they increased the F1-scores from 0.67 to 0.81. Notably, Langford et al. [
60] showed that fusing multi-temporal WorldView-2 imagery improved plant functional type classification (R² = 0.75 vs. 0.63), highlighting the value of phenological signals in distinguishing vegetation classes. These efforts demonstrate how multi-sensor data can enhance not just model outputs, but the training datasets themselves in remote regions where field data are scarce.
UAV–Satellite Fusion
UAV–satellite fusion enables fine-scale ecological observations from UAVs to be scaled across broader spatial and temporal domains using satellite imagery. While UAVs offer high spatial resolution and flexible sensor payloads, their limited coverage and short deployment windows constrain regional monitoring. In contrast, satellite platforms provide consistent, large-area observations but lack the resolution to detect fine-scale vegetation patterns. Fusion bridges these complementary strengths, supporting scalable classification, trait estimation, and change detection in patchy, low-stature polar vegetation.
Simple fusion efforts have focused on spatial alignment and validation. For example, Cunliffe et al. [
101] used UAV imagery to delineate vegetation boundaries and validate historical aerial and satellite classifications, enhancing temporal consistency.
Subsequent studies progressed to calibration-based approaches. For instance, Riihimäki et al. [
20] modelled relationships between UAV-derived FCover and VIs from PlanetScope, Sentinel-2A, and Landsat 8 to improve sub-pixel vegetation mapping. Red-edge and SWIR indices outperformed the NDVI, with Sentinel-2 and Landsat 8 achieving lower errors (RMSE ≤ 0.15), likely due to more stable co-registration and reduced sensitivity to fine-scale heterogeneity.
These calibration approaches laid the foundation for more advanced fusion techniques that scale UAV classifications to regional extents. Jozdani et al. [
49] used a semi-supervised U-Net Teacher–Student framework to transfer 1 cm UAV-derived lichen classification (over 0.04 km²) to 195 km² of WorldView-2 imagery (50 cm resolution). Achieving 85.28% overall accuracy and an F1-score of 84.38%, the model outperformed conventional supervised approaches trained on downsampled UAV data. Importantly, it required no ground labels at the satellite scale, illustrating how UAV imagery can enhance large-scale classification in data-poor polar environments.
As fusion methods have evolved, studies have moved beyond land cover to incorporate biochemical and functional trait mapping. Thomson et al. [
104] combined field spectroscopy, UAV multispectral imagery, and Sentinel-2A data to classify plant functional types (72% accuracy) and estimate key traits such as water content, nitrogen, and phosphorus. Unlike NDVI-only methods, this approach captured ecophysiological variation relevant to cryptogamic and low-stature vegetation. Recalibrating PLSR models on Sentinel-2 data using UAV-derived values improved accuracy, emphasising the importance of sensor-specific adaptation during upscaling.
Together, these studies show that UAV–satellite fusion has progressed from spatial validation to trait-based modelling and historical inference. It now serves as a key framework for scalable, ecologically meaningful remote sensing applications in polar regions—especially in areas where access is limited, vegetation is small and patchy, and traditional satellite imagery struggles to resolve fine-scale ecological dynamics.
4. Discussion
Improvements in satellite and UAV technologies continue to expand the potential for polar vegetation monitoring, yet many of these capabilities remain underutilised. Hyperspectral satellite sensors have seen limited application in polar ecosystems. While legacy datasets, such as Hyperion, have demonstrated their value, newer missions—such as Tanager and the Environmental Mapping and Analysis Program (EnMAP)—remain largely unexplored in these regions, especially for low-stature vegetation. Future hyperspectral missions, such as the Copernicus Hyperspectral Imaging Mission (CHIME) and FLuorescence EXplorer (FLEX), may further enhance monitoring. Similarly, high-resolution optical satellites—such as PlanetScope Doves, WorldView-2/3, and QuickBird—have proven valuable for fine-scale mapping, yet most studies still rely on coarser-resolution sensors, like Sentinel-2 and Landsat. The scarcity of SAR-based studies of Antarctic vegetation remains a major research gap. In both the Arctic and Antarctica, despite the increasing availability of high-resolution SAR data, sensors such as Capella Space (0.5 m X-band) and ICEYE (0.25 m X-band in spotlight mode) remain untested for polar vegetation applications, as do UAV-based SAR sensors [
69,
105]. Expanding the use of these modern sensors, particularly through sensor fusion, could enhance both spatial and spectral insights.
4.1. Expanding UAV Applications
Beyond satellites, UAV technologies are advancing rapidly, offering new opportunities for large-scale monitoring. BVLOS UAV operations remain underutilised, despite their ability to expand coverage. Additionally, the coordinated use of multiple UAVs could improve spatial monitoring and facilitate multi-sensor integration. In particular, integrating fixed-wing and multirotor UAVs in coordinated missions has not been explored for polar vegetation studies. Advances in adaptive, real-time path planning could further enhance UAV operations by optimising flight paths based on terrain and vegetation patterns. These strategies directly inform how sensor fusion methods might be adapted to the logistical and operational constraints of Antarctica.
4.2. Improving Field Data Collection and Validation
Field data collection remains a significant limitation in remote sensing for polar vegetation, particularly for validating results and training ML models. Expensive field campaigns and challenging weather make it difficult to conduct extensive in situ data collection. Most studies rely on small-scale field sampling, which restricts the accuracy and scalability of vegetation studies. Furthermore, field campaigns are often geographically biased to more easily accessible regions. Expanding field data collection efforts through collaborative initiatives, long-term ecological monitoring networks, and automated ground-based sensors should be explored to enhance our overall understanding of polar vegetation changes. Additionally, UAV-based in situ calibration methods, spectral libraries, and low-cost distributed sensor networks could provide continuous ground-truth data, strengthening sensor fusion approaches reliant on high-quality training data.
4.3. Expanding Machine Learning and Deep Learning Approaches
ML has proven integral to polar vegetation remote sensing. While regression models and supervised classification methods, such as RF and SVM, have been widely used to relate in situ field data to airborne sensors or to incorporate single- or multi-sensor data for classification purposes, unsupervised classification and deep learning methods are not equally represented in polar vegetation studies. Yet they hold considerable promise for improving vegetation classification and trait prediction in environments where field data are limited or ground truth is sparse. As cloud computing improves processing capabilities, computationally heavy supervised and deep learning approaches can be scaled, enabling more sophisticated integration of multi-source datasets. Future work should prioritise hybrid approaches that combine interpretability, scalability, and accuracy—especially for temporally dynamic or structurally ambiguous vegetation.
4.4. Expanding Sensor Fusion Approaches
While this review highlights sensor fusion, its application in polar vegetation monitoring remains limited. Most polar studies focus on theoretical frameworks and methodologies, but few have implemented sensor fusion at scale. Research has largely been constrained to small study areas or single-time-point observations rather than broad, long-term applications. Future studies are needed to understand the transferability of sensor fusion techniques across variable regions. Comparative multi-site and multi-temporal studies, along with standardised fusion workflows, are needed for scaling these methods beyond isolated case studies and ensuring broader applicability across polar ecosystems.
Sensor fusion techniques, such as band alignment, have demonstrated how combining multiple sensors can improve temporal coverage and minimise cloud cover issues. However, more advanced fusion approaches between different sensor types remain underutilised, particularly for long-term temporal monitoring. Most research relies on single-date analyses, missing the opportunity to use multi-sensor fusion to track seasonal and inter-annual vegetation changes. Additionally, incorporating SAR data with UAV studies could hold substantial potential for combining structural and spectral insights in weather-challenged polar regions.
Beyond improving temporal coverage, sensor fusion also presents an opportunity to address geographic biases in polar vegetation studies. By integrating satellite data with UAV observations at key reference sites and extrapolating trends through ML models, sensor fusion could extend research into inaccessible regions that are rarely or never studied. Most current studies remain proof-of-concept rather than large-scale ecosystem assessments. Future research should focus on developing transferable frameworks that enable sensor fusion to be applied across broader polar landscapes.
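The upscaling strategy described above, training an ML model at UAV reference sites and extrapolating to satellite-only regions, can be sketched as a regression problem. Everything below is synthetic: the band ranges, the assumed link between a band contrast and fractional cover, and the site counts are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Reference sites: satellite band values paired with UAV-derived
# fractional vegetation cover (all values synthetic).
n_ref = 300
sat_bands = rng.uniform(0.0, 0.4, size=(n_ref, 4))
# Assume cover rises with an NDVI-like contrast between the last
# two bands; clip to the valid [0, 1] fractional-cover range.
cover = np.clip((sat_bands[:, 3] - sat_bands[:, 2]) * 2.5 + 0.3, 0.0, 1.0)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(sat_bands, cover)

# Extrapolate to pixels in an unvisited region where only
# satellite observations exist.
new_pixels = rng.uniform(0.0, 0.4, size=(50, 4))
pred_cover = model.predict(new_pixels)
print(pred_cover.min(), pred_cover.max())
```

In practice the transferability concerns raised above apply directly here: a model trained at a handful of reference sites should be validated on spatially independent sites before its predictions are trusted in regions that are rarely or never visited.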
4.5. Implications for Future Work
The knowledge gaps and technological opportunities outlined here provide a strong foundation for future research into operational-scale sensor fusion workflows. In particular, they inform the development of adaptive UAV–satellite integration strategies aimed at improving spatial and temporal vegetation monitoring in extreme environments such as Antarctica. Ongoing research will explore how these approaches can be tailored to overcome the ecological, logistical, and climatic constraints unique to polar regions.
5. Conclusions
This review demonstrates how recent advances in remote sensing—particularly the integration of satellite and UAV platforms, sensor fusion, and ML—are transforming our ability to monitor sparse vegetation in polar environments. By evaluating coarse-, medium-, and high-resolution methods across optical and SAR domains, we highlight the strengths and limitations of each approach in addressing the spatial, spectral, and temporal complexities of Arctic and Antarctic ecosystems.
Medium-resolution platforms like Sentinel-2 and Landsat have improved vegetation mapping through expanded spectral bands and cloud-based processing, while UAVs enable ultra-fine scale vegetation classification and trait estimation. SAR provides all-weather structural insight, and multi-sensor fusion increasingly enables ecologically meaningful upscaling. ML—especially when integrated with multi-sensor datasets—enhances classification accuracy and facilitates spatial transferability.
Despite these advances, substantial gaps remain. The Antarctic remains under-represented in SAR and hyperspectral research, and UAV–satellite fusion is underutilised for long-term monitoring. Field validation data remain scarce, limiting the scalability and accuracy of remote sensing applications. Deep learning, multi-temporal fusion, and coordinated UAV deployments offer significant untapped potential for ecosystem-scale monitoring.
To fully harness these technologies, future work should prioritise operational-scale sensor fusion, expanded field calibration campaigns, and the development of transferable ML workflows. These efforts will be critical for improving biodiversity assessments, informing conservation strategies, and supporting adaptive policy frameworks in a rapidly changing polar environment. Together, these findings establish a clear foundation for research that adapts multi-sensor workflows to the constraints of Antarctica. By leveraging UAV–satellite synergies, refining ML frameworks, and addressing geographic and temporal data gaps, future studies can advance from proof-of-concept research toward scalable, ecologically meaningful monitoring solutions for polar regions.
Author Contributions
Conceptualisation, A.P., J.S. (Juan Sandino) and F.G.; methodology, A.P.; software, A.P.; formal analysis, A.P.; investigation, A.P.; resources, F.G.; data curation, A.P. and J.S. (Juan Sandino); writing—original draft preparation, A.P.; writing—review and editing, A.P., J.S. (Juan Sandino) and F.G.; visualisation, A.P. and J.S. (Juan Sandino); supervision, J.S. (Juan Sandino), J.S. (Justine Shaw), B.B. and F.G.; project administration, F.G.; funding acquisition, F.G. All authors have read and agreed to the published version of the manuscript.
Funding
This research was funded by the Australian Research Council (ARC) SRIEAS Grant (grant number: SR200100005) Securing Antarctica’s Environmental Future.
Data Availability Statement
Not applicable.
Acknowledgments
We would like to thank Juan Sandino for providing Figure 2, captured during his field trip in the 2022–23 season in Antarctica, with the support of the Australian Antarctic Division (AAD) through AAS Project 4628, Securing Antarctica’s Environmental Future (SAEF), and the QUT Centre for Robotics (QCR). A.P. acknowledges scholarship support from SAEF.
Conflicts of Interest
The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.
Abbreviations
The following abbreviations are used in this manuscript:
| Abbreviation | Definition |
| --- | --- |
| AAD | Australian Antarctic Division |
| ARC | Australian Research Council |
| ASPA | Antarctic Specially Protected Area |
| AVHRR | Advanced Very High Resolution Radiometer |
| BVLOS | Beyond Visual Line-of-Sight |
| CHIME | Copernicus Hyperspectral Imaging Mission |
| CNN | Convolutional Neural Network |
| EnMAP | Environmental Mapping and Analysis Program |
| FLEX | FLuorescence EXplorer |
| HAND | Height Above Nearest Drainage |
| InSAR | Interferometric Synthetic Aperture Radar |
| LiDAR | Light Detection and Ranging |
| ML | Machine Learning |
| MODIS | Moderate Resolution Imaging Spectroradiometer |
| NDVI | Normalised Difference Vegetation Index |
| NIR | Near-Infrared |
| OBIA | Object-Based Image Analysis |
| PLSR | Partial Least Squares Regression |
| QCR | QUT Centre for Robotics |
| RF | Random Forest |
| RGB | Red, Green, and Blue |
| SAEF | Securing Antarctica’s Environmental Future |
| SAR | Synthetic Aperture Radar |
| SfM | Structure from Motion |
| SVM | Support Vector Machine |
| SVR | Support Vector Regression |
| SWIR | Shortwave Infrared |
| UAV | Uncrewed Aerial Vehicle |
| VIs | Vegetation Indices |
References
- Beer, E.; Eisenman, I.; Wagner, T.J.W. Polar amplification due to enhanced heat flux across the halocline. Geophys. Res. Lett. 2020, 47, e2019GL086706. [Google Scholar] [CrossRef]
- Chown, S.L.; Lee, J.E.; Hughes, K.A.; Barnes, J.; Barrett, P.J.; Bergstrom, D.M.; Convey, P.; Cowan, D.A.; Crosbie, K.; Dyer, G.; et al. Challenges to the Future Conservation of the Antarctic. Science 2012, 337, 158–159. [Google Scholar] [CrossRef] [PubMed]
- Bennett, J.R.; Shaw, J.D.; Terauds, A.; Smolders, J.P.; Lea, M.A.; Nielsen, V.; Paul, D.; Blake, J.M.; Cheung, W.W.; Crown, S.A.; et al. Polar lessons learned: Long-term management of polar environments based on shared threats in Arctic and Antarctic environments. Front. Ecol. Environ. 2015, 13, 316–324. [Google Scholar] [CrossRef]
- Stow, D.A.; Hope, A.; McGuire, D.; Verbyla, D.; Gamon, J.; Huemmrich, F.; Houston, S.; Racine, C.; Sturm, M.; Tape, K.; et al. Remote sensing of vegetation and land-cover change in Arctic Tundra Ecosystems. Remote Sens. Environ. 2004, 89, 281–308. [Google Scholar] [CrossRef]
- Robinson, S.A.; Klekociuk, A.R.; King, D.H.; Pizarro Rojas, M.; Zúñiga, G.E.; Bergstrom, D.M. The 2019/2020 summer of Antarctic heatwaves. Glob. Change Biol. 2020, 26, 3178–3180. [Google Scholar] [CrossRef]
- Brooks, S.T.; Jabour, J.; van den Hoff, J.; Bergstrom, D.M. Our footprint on Antarctica competes with nature for rare ice-free land. Nat. Sustain. 2019, 2, 185–190. [Google Scholar] [CrossRef]
- Billings, W.D.; Mooney, H.A. The Ecology of Arctic and Alpine Plants. Biol. Rev. 1968, 43, 481–529. [Google Scholar] [CrossRef]
- Lee, J.R.; Raymond, B.; Bracegirdle, T.J.; Chadès, I.; Fuller, R.A.; Shaw, J.D.; Terauds, A. Climate change drives expansion of Antarctic ice-free habitat. Nat. Clim. Change 2017, 7, 489–495. [Google Scholar] [CrossRef]
- Zmarz, A.; Rodzewicz, M.; Dąbski, M.; Karsznia, I.; Korczak-Abshire, M.; Chwedorzewska, K.J. Application of UAV BVLOS remote sensing data for multi-faceted analysis of Antarctic ecosystem. Remote Sens. Environ. 2018, 217, 88–102. [Google Scholar] [CrossRef]
- Convey, P.; Chown, S.L.; Clarke, A.; Barnes, D.K.; Cummings, V.; Ducklow, H.; Frati, F.; Green, T.G.; Hogg, I.D.; Johnston, N.M.; et al. Antarctic Biodiversity – Spatial and Temporal Patterns and Change: An Assessment. Ecol. Monogr. 2014, 84, 203–244. [Google Scholar] [CrossRef]
- Colesie, C.; Walshaw, C.V.; Sancho, L.G.; Davey, M.P.; Gray, A. Antarctica’s vegetation in a changing climate. WIREs Clim. Change 2023, 14, e810. [Google Scholar] [CrossRef]
- Roland, T.P.; Bartlett, O.T.; Charman, D.J.; Anderson, K.; Hodgson, D.A.; Amesbury, M.J.; Maclean, I.; Fretwell, P.T.; Fleming, A. Sustained greening of the Antarctic Peninsula observed from satellites. Nat. Geosci. 2024, 18, 41–47. [Google Scholar] [CrossRef]
- Callaghan, T.V.; Gatti, R.C.; Phoenix, G. The need to understand the stability of arctic vegetation during rapid climate change: An assessment of imbalance in the literature. Ambio 2022, 51, 1034–1044. [Google Scholar] [CrossRef] [PubMed]
- Berner, L.T.; Massey, R.; Jantz, P.; Forbes, B.C.; Macias-Fauria, M.; Myers-Smith, I.; Kumpula, T.; Gauthier, G.; Andreu-Hayles, L.; Gaglioti, B.V.; et al. Summer warming explains widespread but not uniform greening in the Arctic tundra biome. Nat. Commun. 2020, 11, 4621. [Google Scholar] [PubMed]
- Shaw, J.D.; Terauds, A.; Riddle, M.J.; Possingham, H.P.; Chown, S.L. Antarctica’s Protected Areas Are Inadequate, Unrepresentative, and at Risk. PLoS Biol. 2014, 12, e1001888. [Google Scholar] [CrossRef]
- Wilkinson, R.; Mleczko, M.; Brewin, R.; Gaston, K.; Mueller, M.; Shutler, J.; Yan, X.; Anderson, K. Environmental impacts of earth observation data in the constellation and cloud computing era. Sci. Total Environ. 2024, 909, 168584. [Google Scholar] [CrossRef] [PubMed]
- Pina, P.; Vieira, G. UAVs for Science in Antarctica. Remote Sens. 2022, 14, 1610. [Google Scholar] [CrossRef]
- Lucieer, A.; Robinson, S.; Turner, D.; Harwin, S.; Kelcey, J. Using a Micro-UAV for Ultra-High Resolution Multi-Sensor Observations of Antarctic Moss Beds. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 39, 429–433. [Google Scholar]
- Langford, Z.L.; Kumar, J.; Hoffman, F.M. Convolutional Neural Network Approach for Mapping Arctic Vegetation Using Multi-Sensor Remote Sensing Fusion. In Proceedings of the 2017 IEEE International Conference on Data Mining Workshops (ICDMW), New Orleans, LA, USA, 18–21 November 2017; pp. 322–331. [Google Scholar] [CrossRef]
- Riihimäki, H.; Luoto, M.; Heiskanen, J. Estimating fractional cover of tundra vegetation at multiple scales using unmanned aerial systems and optical satellite data. Remote Sens. Environ. 2019, 231, 111259. [Google Scholar] [CrossRef]
- Beamish, A.; Raynolds, M.K.; Epstein, H.; Frost, G.V.; Macander, M.J.; Bergstedt, H.; Bartsch, A.; Kruse, S.; Miles, V.; Tanis, C.M.; et al. Recent trends and remaining challenges for optical remote sensing of Arctic tundra vegetation: A review and outlook. Remote Sens. Environ. 2020, 246, 111872. [Google Scholar] [CrossRef]
- Runge, A.; Grosse, G. Comparing Spectral Characteristics of Landsat-8 and Sentinel-2 Same-Day Data for Arctic-Boreal Regions. Remote Sens. 2019, 11, 1730. [Google Scholar] [CrossRef]
- Chen, W.; Blain, D.; Li, J.; Keohler, K.; Fraser, R.; Zhang, Y.; Leblanc, S.; Olthof, I.; Wang, J.; McGovern, M. Biomass measurements and relationships with Landsat-7/ETM+ and JERS-1/SAR data over Canada’s western sub-arctic and low arctic. Int. J. Remote Sens. 2009, 30, 2355–2376. [Google Scholar] [CrossRef]
- Reiersen, L.O.; Vorkamp, K.; Kallenborn, R. The role of the Arctic Monitoring and Assessment Programme (AMAP) in reducing pollution of the Arctic and around the globe. Environ. Sci. Ecotechnol. 2024, 17, 100302. [Google Scholar] [CrossRef]
- Frenot, Y.; Chown, S.L.; Whinam, J.; Selkirk, P.M.; Convey, P.; Skotnicki, M.; Bergstrom, D.M. Biological invasions in the Antarctic: Extent, impacts and implications. Biol. Rev. 2005, 80, 45–72. [Google Scholar] [CrossRef] [PubMed]
- Turner, D.; Lucieer, A.; Malenovský, Z.; King, D.H.; Robinson, S.A. Assessment of Antarctic Moss Health from Multi-sensor UAS Imagery with Random Forest Modelling. Int. J. Appl. Earth Obs. Geoinf. 2018, 68, 168–179. [Google Scholar] [CrossRef]
- Turner, D.J.; Malenovský, Z.; Lucieer, A.; Turnbull, J.D.; Robinson, S.A. Optimizing Spectral and Spatial Resolutions of Unmanned Aerial System Imaging Sensors for Monitoring Antarctic Vegetation. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 4503–4515. [Google Scholar] [CrossRef]
- Ullmann, T.; Schmitt, A.; Jagdhuber, T. Two Component Decomposition of Dual Polarimetric HH/VV SAR Data: Case Study for the Tundra Environment of the Mackenzie Delta Region, Canada. Remote Sens. 2016, 8, 1027. [Google Scholar] [CrossRef]
- Pinzon, J.E.; Tucker, C.J. A Non-Stationary 1981–2012 AVHRR NDVI3g Time Series. Remote Sens. 2014, 6, 6929–6960. [Google Scholar] [CrossRef]
- Karlsen, S.R.; Elvebakk, A.; Høgda, K.A.; Grydeland, T. Spatial and temporal variability in the onset of the growing season on Svalbard, Arctic Norway—Measured by MODIS-NDVI satellite data. Remote Sens. 2014, 6, 8088–8106. [Google Scholar] [CrossRef]
- Duguay, Y.; Bernier, M.; Lévesque, E.; Tremblay, B. Potential of C and X Band SAR for Shrub Growth Monitoring in Sub-Arctic Environments. Remote Sens. 2015, 7, 9410–9430. [Google Scholar] [CrossRef]
- Roy, D.P.; Kovalskyy, V.; Zhang, H.; Vermote, E.F.; Yan, L.; Kumar, S.; Egorov, A. Characterization of Landsat-7 to Landsat-8 reflective wavelength and normalized difference vegetation index continuity. Remote Sens. Environ. 2016, 185, 57–70. [Google Scholar] [CrossRef] [PubMed]
- Walshaw, C.V.; Gray, A.; Fretwell, P.T.; Convey, P.; Davey, M.P.; Johnson, J.S.; Colesie, C. A satellite-derived baseline of photosynthetic life across Antarctica. Nat. Geosci. 2024, 17, 755–762. [Google Scholar] [CrossRef]
- Váczi, P.; Barták, M.; Bednaříková, M.; Hrbáček, F.; Hájek, J. Spectral properties of Antarctic and Alpine vegetation monitored by multispectral camera: Case studies from James Ross Island and Jeseníky Mts. Czech Polar Rep. 2020, 10, 297–312. [Google Scholar] [CrossRef]
- Casanovas, P.; Black, M.; Fretwell, P.; Convey, P. Mapping lichen distribution on the Antarctic Peninsula using remote sensing, lichen spectra and photographic documentation by citizen scientists. Polar Res. 2015, 34, 25633. [Google Scholar] [CrossRef]
- Lucieer, A.; Malenovský, Z.; Veness, T.; Wallace, L. HyperUAS—Imaging Spectroscopy from a Multirotor Unmanned Aircraft System. J. Field Robot. 2014, 31, 571–590. [Google Scholar] [CrossRef]
- Yang, D.; Morrison, B.D.; Davidson, K.J.; Lamour, J.; Li, Q.; Nelson, P.R.; Hantson, W.; Hayes, D.J.; Swetnam, T.L.; McMahon, A.; et al. Remote Sensing from Unoccupied Aerial Systems: Opportunities to Enhance Arctic Plant Ecology in a Changing Climate. J. Ecol. 2022, 110, 2812–2835. [Google Scholar] [CrossRef]
- Turner, D.; Lucieer, A.; Malenovskỳ, Z.; King, D.H.; Robinson, S.A. Spatial Co-Registration of Ultra-High Resolution Visible, Multispectral and Thermal Images Acquired with a Micro-UAV over Antarctic Moss Beds. Remote Sens. 2014, 6, 4003–4024. [Google Scholar] [CrossRef]
- Chi, J.; Lee, H.; Hong, S.G.; Kim, H.C. Spectral Characteristics of the Antarctic Vegetation: A Case Study of Barton Peninsula. Remote Sens. 2021, 13, 2470. [Google Scholar] [CrossRef]
- Fraser, R.H.; Olthof, I.; Lantz, T.C.; Schmitt, C. UAV Photogrammetry for Mapping Vegetation in the Low-Arctic. Arct. Sci. 2016, 2, 79–102. [Google Scholar] [CrossRef]
- Gray, A.; Krolikowski, M.; Fretwell, P.; Convey, P.; Peck, L.S.; Mendelova, M.; Smith, A.G.; Davey, M.P. Remote Sensing Phenology of Antarctic Green and Red Snow Algae Using WorldView Satellites. Front. Plant Sci. 2021, 12, 671981. [Google Scholar] [CrossRef]
- Kodl, G.; Streeter, R.; Cutler, N.; Bolch, T. Arctic tundra shrubification can obscure increasing levels of soil erosion in NDVI assessments of land cover derived from satellite imagery. Remote Sens. Environ. 2024, 301, 113935. [Google Scholar] [CrossRef]
- Fonseca, E.L.d.; Santos, E.C.d.; Figueiredo, A.R.d.; Simões, J.C. The use of sentinel-2 imagery to generate vegetation maps for the Northern Antarctic Peninsula and offshore islands. An. Acad. Bras. Ciênc. 2023, 95, e20230710. [Google Scholar] [CrossRef] [PubMed]
- Singh, C.P.; Joshi, H.; Kakadiya, D.; Bhatt, M.S.; Bajpai, R.; Paul, R.R.; Upreti, D.K.; Saini, S.; Beg, M.J.; Pande, A.; et al. Mapping lichen abundance in ice-free areas of Larsemann Hills, East Antarctica using remote sensing and lichen spectra. Polar Sci. 2023, 38, 100976. [Google Scholar] [CrossRef]
- Malenovský, Z.; Lucieer, A.; King, D.H.; Turnbull, J.D.; Robinson, S.A. Unmanned aircraft system advances health mapping of fragile polar vegetation. Methods Ecol. Evol. 2017, 8, 1842–1857. [Google Scholar] [CrossRef]
- Raniga, D.; Amarasingam, N.; Sandino, J.; Doshi, A.; Barthelemy, J.; Randall, K.; Robinson, S.A.; Gonzalez, F.; Bollard, B. Monitoring of Antarctica’s fragile vegetation using drone-based remote sensing, multispectral imagery and AI. Sensors 2024, 24, 1063. [Google Scholar] [CrossRef]
- Sandino, J.; Bollard, B.; Doshi, A.; Randall, K.; Barthelemy, J.; Robinson, S.A.; Gonzalez, F. A green fingerprint of Antarctica: Drones, hyperspectral imaging, and machine learning for moss and lichen classification. Remote Sens. 2023, 15, 5658. [Google Scholar] [CrossRef]
- Villoslada, M.; Ylänne, H.; Juutinen, S.; Kolari, T.H.; Korpelainen, P.; Tahvanainen, T.; Wolff, F.; Kumpula, T. Reindeer control over shrubification in subarctic wetlands: Spatial analysis based on unoccupied aerial vehicle imagery. Remote Sens. Ecol. Conserv. 2023, 9, 687–706. [Google Scholar] [CrossRef]
- Jozdani, S.; Chen, D.; Chen, W.; Leblanc, S.G.; Prévost, C.; Lovitt, J.; He, L.; Johnson, B.A. Leveraging deep neural networks to map caribou lichen in high-resolution satellite images based on a small-scale, noisy UAV-derived map. Remote Sens. 2021, 13, 2658. [Google Scholar] [CrossRef]
- Liu, N.; Treitz, P. Remote sensing of Arctic percent vegetation cover and fAPAR on Baffin Island, Nunavut, Canada. Int. J. Appl. Earth Obs. Geoinf. 2018, 71, 159–169. [Google Scholar] [CrossRef]
- Davidson, S.J.; Santos, M.J.; Sloan, V.L.; Watts, J.D.; Phoenix, G.K.; Oechel, W.C.; Zona, D. Mapping Arctic Tundra Vegetation Communities Using Field Spectroscopy and Multispectral Satellite Data in North Alaska, USA. Remote Sens. 2016, 8, 978. [Google Scholar] [CrossRef]
- Assmann, J.J.; Myers-Smith, I.H.; Kerby, J.T.; Cunliffe, A.M.; Daskalova, G.N. Drone data reveal heterogeneity in tundra greenness and phenology not captured by satellites. Environ. Res. Lett. 2020, 15, 125002. [Google Scholar] [CrossRef]
- Siewert, M.B.; Olofsson, J. UAV reveals substantial but heterogeneous effects of herbivores on Arctic vegetation. Sci. Rep. 2021, 11, 18472. [Google Scholar] [CrossRef]
- Nelson, P.R.; Bundy, K.; Smith, K.; Macander, M.; Chan, C. Predicting plants in the wild: Mapping arctic and boreal plants with UAS-based visible and near infrared reflectance spectra. Int. J. Appl. Earth Obs. Geoinf. 2024, 133, 104156. [Google Scholar] [CrossRef]
- Fraser, R.H.; Lantz, T.C.; Olthof, I.; Kokelj, S.V.; Sims, R.A. Warming-induced shrub expansion and lichen decline in the Western Canadian Arctic. Ecosystems 2014, 17, 1151–1168. [Google Scholar] [CrossRef]
- Liu, Z.; He, D.; Shi, Q.; Cheng, X. NDVI time-series data reconstruction for spatial-temporal dynamic monitoring of Arctic vegetation structure. Geo-Spat. Inf. Sci. 2024, 1–19. [Google Scholar] [CrossRef]
- Collingwood, A.; Treitz, P.; Charbonneau, F.; Atkinson, D.M. Artificial neural network modeling of high arctic phytomass using synthetic aperture radar and multispectral data. Remote Sens. 2014, 6, 2134–2153. [Google Scholar] [CrossRef]
- Jawak, S.D.; Luis, A.J.; Fretwell, P.T.; Convey, P.; Durairajan, U.A. Semiautomated Detection and Mapping of Vegetation Distribution in the Antarctic Environment Using Spatial-Spectral Characteristics of WorldView-2 Imagery. Remote Sens. 2019, 11, 1909. [Google Scholar] [CrossRef]
- A’Campo, W.; Bartsch, A.; Roth, A.; Wendleder, A.; Martin, V.S.; Durstewitz, L.; Lodi, R.; Wagner, J.; Hugelius, G. Arctic tundra land cover classification on the Beaufort coast using the Kennaugh element framework on dual-polarimetric TerraSAR-X imagery. Remote Sens. 2021, 13, 4780. [Google Scholar] [CrossRef]
- Langford, Z.; Kumar, J.; Hoffman, F.M.; Norby, R.J.; Wullschleger, S.D.; Sloan, V.L.; Iversen, C.M. Mapping Arctic plant functional type distributions in the Barrow Environmental Observatory using WorldView-2 and LiDAR datasets. Remote Sens. 2016, 8, 733. [Google Scholar] [CrossRef]
- Merchant, M.A.; Obadia, M.; Brisco, B.; DeVries, B.; Berg, A. Applying Machine Learning and Time-Series Analysis on Sentinel-1A SAR/InSAR for Characterizing Arctic Tundra Hydro-Ecological Conditions. Remote Sens. 2022, 14, 1123. [Google Scholar] [CrossRef]
- Eischeid, I.; Soininen, E.M.; Assmann, J.J.; Ims, R.A.; Madsen, J.; Pedersen, Å.Ø.; Pirotti, F.; Yoccoz, N.G.; Ravolainen, V.T. Disturbance Mapping in Arctic Tundra Improved by a Planning Workflow for Drone Studies: Advancing Tools for Future Ecosystem Monitoring. Remote Sens. 2021, 13, 4466. [Google Scholar] [CrossRef]
- Sotille, M.E.; Bremer, U.F.; Vieira, G.; Velho, L.F.; Petsch, C.; Simões, J.C. Evaluation of UAV and satellite-derived NDVI to map maritime Antarctic vegetation. Appl. Geogr. 2020, 125, 102322. [Google Scholar] [CrossRef]
- Ullmann, T.; Schmitt, A.; Roth, A.; Duffe, J.; Dech, S.; Hubberten, H.W.; Baumhauer, R. Land cover characterization and classification of arctic tundra environments by means of polarized synthetic aperture X- and C-band radar (PolSAR) and Landsat 8 multispectral imagery—Richards Island, Canada. Remote Sens. 2014, 6, 8565–8593. [Google Scholar] [CrossRef]
- Walker, D.A.; Gould, W.A.; Maier, H.A.; Raynolds, M.K. The Circumpolar Arctic Vegetation Map: AVHRR-derived base maps, environmental controls, and integrated mapping procedures. Int. J. Remote Sens. 2002, 23, 4551–4570. [Google Scholar] [CrossRef]
- Guay, K.C.; Beck, P.S.A.; Berner, L.T.; Goetz, S.J.; Baccini, A.; Buermann, W. Vegetation productivity patterns at high northern latitudes: A multi-sensor satellite data assessment. Glob. Change Biol. 2014, 20, 3147–3158. [Google Scholar] [CrossRef]
- Fretwell, P.T.; Convey, P.; Fleming, A.H.; Peat, H.J.; Hughes, K.A. Detecting and mapping vegetation distribution on the Antarctic Peninsula from remote sensing data. Polar Biol. 2011, 34, 273–281. [Google Scholar] [CrossRef]
- Shin, J.I.; Kim, H.C.; Kim, S.I.; Hong, S.G. Vegetation abundance on the Barton Peninsula, Antarctica: Estimation from high-resolution satellite images. Polar Biol. 2014, 37, 1579–1588. [Google Scholar] [CrossRef]
- Castelletti, D.; Farquharson, G.; Brown, J.; De, S.; Yague-Martinez, N.; Stringham, C.; Yalla, G.; Villarreal, A. Capella Space VHR SAR Constellation: Advanced Tasking Patterns and Future Capabilities. In Proceedings of the 2022 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Kuala Lumpur, Malaysia, 17–22 July 2022; pp. 4137–4140. [Google Scholar] [CrossRef]
- Storch, T.; Honold, H.P.; Chabrillat, S.; Habermeyer, M.; Tucker, P.; Brell, M.; Ohndorf, A.; Wirth, K.; Betz, M.; Kuchler, M.; et al. The EnMAP imaging spectroscopy mission towards operations. Remote Sens. Environ. 2023, 294, 113632. [Google Scholar] [CrossRef]
- Joshua, M.; Salvaggio, K.; Keremedjiev, M.; Roth, K.; Foughty, E. Planet’s Upcoming VIS-SWIR Hyperspectral Satellites. In Proceedings of the Optica Sensing Congress 2023 (AIS, FTS, HISE, Sensors, ES), Munich, Germany, 30 July–3 August 2023; Optica Publishing Group: Washington, DC, USA, 2023; p. HM3C.5. [Google Scholar] [CrossRef]
- Bollard-Breen, B.; Brooks, J.D.; Jones, M.R.L.; Robertson, J.; Betschart, S.; Kung, O.; Cary, C.S.; Lee, C.K.; Pointing, S.B. Application of an unmanned aerial vehicle in spatial mapping of terrestrial biology and human disturbance in the McMurdo Dry Valleys, East Antarctica. Polar Biol. 2015, 38, 573–578. [Google Scholar] [CrossRef]
- Yang, D.; Meng, R.; Morrison, B.D.; McMahon, A.; Hantson, W.; Hayes, D.J.; Breen, A.L.; Salmon, V.G.; Serbin, S.P. A Multi-Sensor Unoccupied Aerial System Improves Characterization of Vegetation Composition and Canopy Properties in the Arctic Tundra. Remote Sens. 2020, 12, 2638. [Google Scholar] [CrossRef]
- Walker, D.; Raynolds, M.; Daniels, F.; Einarsson, E.; Elvebakk, A.; Gould, W.; Katenin, A.; Kholod, S.; Markon, C.; Melnikov, E.; et al. The circumpolar Arctic vegetation map. J. Veg. Sci. 2005, 16, 267–282. [Google Scholar] [CrossRef]
- Wulder, M.A.; Roy, D.P.; Radeloff, V.C.; Loveland, T.R.; Anderson, M.C.; Johnson, D.M.; Healey, S.; Zhu, Z.; Scambos, T.A.; Pahlevan, N.; et al. Fifty years of Landsat science and impacts. Remote Sens. Environ. 2022, 280, 113195. [Google Scholar] [CrossRef]
- Fonseca, E.L.; Santos, E.C.D.; Figueiredo, A.R.D.; Simoes, J.C. Antarctic biological soil crusts surface reflectance patterns from Landsat and Sentinel-2 images. An. Acad. Bras. Ciênc. 2022, 94, e20210596. [Google Scholar] [CrossRef]
- Miranda, V.; Pina, P.; Heleno, S.; Vieira, G.; Mora, C.; Schaefer, C.E. Monitoring recent changes of vegetation in Fildes Peninsula (King George Island, Antarctica) through satellite imagery guided by UAV surveys. Remote Sens. 2020, 11, 1105. [Google Scholar] [CrossRef] [PubMed]
- Assmann, J.J.; Kerby, J.T.; Cunliffe, A.M.; Myers-Smith, I.H. Vegetation monitoring using multispectral sensors—Best practices and lessons learned from high latitudes. J. Unmanned Veh. Syst. 2019, 7, 54–75. [Google Scholar] [CrossRef]
- Bhatt, U.S.; Walker, D.A.; Raynolds, M.K.; Bieniek, P.A.; Epstein, H.E.; Comiso, J.C.; Pinzon, J.E.; Tucker, C.J.; Polyakov, I.V. Recent Declines in Warming and Vegetation Greening Trends over Pan-Arctic Tundra. Remote Sens. 2013, 5, 4229–4254. [Google Scholar] [CrossRef]
- Moreira, A.; Prats-Iraola, P.; Younis, M.; Krieger, G.; Hajnsek, I.; Papathanassiou, K.P. A tutorial on synthetic aperture radar. IEEE Geosci. Remote Sens. Mag. 2013, 1, 6–43. [Google Scholar] [CrossRef]
- Gabarró, C.; Hughes, N.; Wilkinson, J.; Bertino, L.; Bracher, A.; Diehl, T.; Dierking, W.; Gonzalez-Gambau, V.; Lavergne, T.; Madurell, T.; et al. Improving satellite-based monitoring of the polar regions: Identification of research and capacity gaps. Front. Remote Sens. 2023, 4, 952091. [Google Scholar] [CrossRef]
- Banks, S.N.; King, D.J.; Merzouki, A.; Duffe, J. Characterizing Scattering Behaviour and Assessing Potential for Classification of Arctic Shore and Near-Shore Land Covers with Fine Quad-Pol RADARSAT-2 Data. Can. J. Remote Sens. 2014, 40, 291–314. [Google Scholar] [CrossRef]
- Lucieer, A.; Turner, D.; King, D.H.; Robinson, S.A. Using an Unmanned Aerial Vehicle (UAV) to capture micro-topography of Antarctic moss beds. Int. J. Appl. Earth Obs. Geoinf. 2014, 27, 53–62. [Google Scholar] [CrossRef]
- Meng, R.; Yang, D.; McMahon, A.; Hantson, W.; Hayes, D.; Breen, A.; Serbin, S. A UAS Platform for Assessing Spectral, Structural, and Thermal Patterns of Arctic Tundra Vegetation. In Proceedings of the IGARSS 2019—2019 IEEE International Geoscience and Remote Sensing Symposium, Yokohama, Japan, 28 July–2 August 2019; pp. 9113–9116. [Google Scholar] [CrossRef]
- Cunliffe, A.M.; J Assmann, J.; N Daskalova, G.; Kerby, J.T.; Myers-Smith, I.H. Aboveground biomass corresponds strongly with drone-derived canopy height but weakly with greenness (NDVI) in a shrub tundra landscape. Environ. Res. Lett. 2020, 15, 125004. [Google Scholar] [CrossRef]
- Greaves, H.E.; Eitel, J.U.H.; Vierling, L.A.; Boelman, N.T.; Griffin, K.L.; Magney, T.S.; Prager, C.M. 20 cm resolution mapping of tundra vegetation communities provides an ecological baseline for important research areas in a changing Arctic environment. Environ. Res. Commun. 2019, 1, 105004. [Google Scholar] [CrossRef]
- Ford, D.; Beilman, D.; Groff, D. Land Surface Temperature Mapping and Analysis of Moss Banks on the Western Antarctic Peninsula using a multi-sensor UAV. Earth Space Sci. Open Arch. 2022. [Google Scholar]
- Zmarz, A.; Karlsen, S.R.; Kycko, M.; Korczak-Abshire, M.; Gołębiowska, I.; Karsznia, I.; Chwedorzewska, K. BVLOS UAV missions for vegetation mapping in maritime Antarctic. Front. Environ. Sci. 2023, 11, 1154115. [Google Scholar] [CrossRef]
- Yang, J.; Lee, Y.K.; Chi, J. Spectral unmixing-based Arctic plant species analysis using a spectral library and terrestrial hyperspectral Imagery: A case study in Adventdalen, Svalbard. Int. J. Appl. Earth Obs. Geoinf. 2023, 125, 103583. [Google Scholar] [CrossRef]
- Lockhart, K.; Sandino, J.; Amarasingam, N.; Hann, R.; Bollard, B.; Gonzalez, F. Unmanned Aerial Vehicles for Real-Time Vegetation Monitoring in Antarctica: A Review. Remote Sens. 2025, 17, 304. [Google Scholar] [CrossRef]
- Siewert, M.B.; Olofsson, J. Scale-dependency of Arctic ecosystem properties revealed by UAV. Environ. Res. Lett. 2020, 15, 094030. [Google Scholar] [CrossRef]
- Zagajewski, B.; Kycko, M.; Tømmervik, H.; Bochenek, Z.; Wojtuń, B.; Bjerke, J.W.; Kłos, A. Feasibility of hyperspectral vegetation indices for the detection of chlorophyll concentration in three high Arctic plants: Salix polaris, Bistorta vivipara, and Dryas octopetala. Acta Soc. Bot. Pol. 2018, 87, 3604. [Google Scholar] [CrossRef]
- Levy, J.; Cary, S.C.; Joy, K.; Lee, C.K. Detection and community-level identification of microbial mats in the McMurdo Dry Valleys using drone-based hyperspectral reflectance imaging. J. R. Soc. Interface 2020, 17, 20200243. [Google Scholar] [CrossRef]
- Turner, D.; Cimoli, E.; Lucieer, A.; Haynes, R.S.; Randall, K.; Waterman, M.J.; Lucieer, V.; Robinson, S.A. Mapping water content in drying Antarctic moss communities using UAS-borne SWIR imaging spectroscopy. Remote Sens. Ecol. Conserv. 2024, 10, 296–311. [Google Scholar] [CrossRef]
- Liu, N.; Budkewitsch, P.; Treitz, P. Examining spectral reflectance features related to Arctic percent vegetation cover: Implications for hyperspectral remote sensing of Arctic tundra. Remote Sens. Environ. 2017, 192, 58–72. [Google Scholar] [CrossRef]
- Putkiranta, P.; Räsänen, A.; Korpelainen, P.; Erlandsson, R.; Kolari, T.H.; Pang, Y.; Villoslada, M.; Wolff, F.; Kumpula, T.; Virtanen, T. The value of hyperspectral UAV imagery in characterizing tundra vegetation. Remote Sens. Environ. 2024, 308, 114175. [Google Scholar] [CrossRef]
- Greaves, H.E.; Vierling, L.A.; Eitel, J.U.; Boelman, N.T.; Magney, T.S.; Prager, C.M.; Griffin, K.L. High-resolution mapping of aboveground shrub biomass in Arctic tundra using airborne lidar and imagery. Remote Sens. Environ. 2016, 184, 361–373. [Google Scholar] [CrossRef]
- Greaves, H.E.; Vierling, L.A.; Eitel, J.U.H.; Boelman, N.T.; Magney, T.S.; Prager, C.M.; Griffin, K.L. Applying terrestrial lidar for evaluation and calibration of airborne lidar-derived shrub biomass estimates in Arctic tundra. Remote Sens. Lett. 2017, 8, 175–184. [Google Scholar] [CrossRef]
- Collins, A.; Andresen, C.; Charsley-Groffman, L.; Cochran, T.; Dann, J.; Lathrop, E.; Riemersma, G.; Swanson, E.; Tapadinhas, A.; Wilson, C. UAS Lidar Mapping of an Arctic Tundra Watershed: Challenges and Opportunities. ISPRS—Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2020, 44, 1–8. [Google Scholar] [CrossRef]
- Orndahl, K.M.; Ehlers, L.P.; Herriges, J.D.; Pernick, R.E.; Hebblewhite, M.; Goetz, S.J. Mapping tundra ecosystem plant functional type cover, height, and aboveground biomass in Alaska and northwest Canada using unmanned aerial vehicles. Arct. Sci. 2022, 8, 401–423. [Google Scholar] [CrossRef]
- Cunliffe, A.M.; Tanski, G.; Radosavljevic, B.; Palmer, W.F.; Sachs, T.; Lantuit, H.; Kerby, J.T.; Myers-Smith, I.H. Rapid retreat of permafrost coastline observed with aerial drone photogrammetry. Cryosphere 2019, 13, 1513–1528. [Google Scholar] [CrossRef]
- Stendardi, L.; Karlsen, S.R.; Malnes, E.; Nilsen, L.; Tømmervik, H.; Cooper, E.; Notarnicola, C. Multi-Sensor Analysis of Snow Seasonality and a Preliminary Assessment of SAR Backscatter Sensitivity to Arctic Vegetation: Limits and Capabilities. Remote Sens. 2022, 14, 1866. [Google Scholar] [CrossRef]
- Langford, Z.L.; Kumar, J.; Hoffman, F.M.; Breen, A.L.; Iversen, C.M. Arctic Vegetation Mapping Using Unsupervised Training Datasets and Convolutional Neural Networks. Remote Sens. 2019, 11, 69. [Google Scholar] [CrossRef]
- Thomson, E.R.; Spiegel, M.P.; Althuizen, I.H.; Bass, P.; Chen, S.; Chmurzynski, A.; Halbritter, A.H.; Henn, J.J.; Jónsdóttir, I.S.; Klanderud, K.; et al. Multiscale mapping of plant functional groups and plant traits in the High Arctic using field spectroscopy, UAV imagery and Sentinel-2A data. Environ. Res. Lett. 2021, 16, 055006. [Google Scholar] [CrossRef]
- Ignatenko, V.; Laurila, P.; Radius, A.; Lamentowski, L.; Antropov, O.; Muff, D. ICEYE Microsatellite SAR Constellation Status Update: Evaluation of First Commercial Imaging Modes. In Proceedings of the 2020 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Waikoloa, HI, USA, 26 September–2 October 2020; pp. 3581–3584. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).