Review

Eyes in the Sky: Drones Applications in the Built Environment under Climate Change Challenges

Environmental Solutions Initiative, Massachusetts Institute of Technology, Cambridge, MA 02139, USA
* Author to whom correspondence should be addressed.
Drones 2023, 7(10), 637; https://doi.org/10.3390/drones7100637
Submission received: 25 August 2023 / Revised: 22 September 2023 / Accepted: 22 September 2023 / Published: 16 October 2023

Abstract

This paper reviews the diverse applications of drone technologies in the built environment and their role in climate change research. Drones, or unmanned aerial vehicles (UAVs), have emerged as valuable tools for environmental scientists, offering new possibilities for data collection, monitoring, and analysis in the urban environment. The paper begins by providing an overview of the different types of drones used in the built environment, including quadcopters, fixed-wing drones, and hybrid models. It explores their capabilities and features, such as high-resolution cameras, LiDAR sensors, and thermal imaging, which enable detailed data acquisition for studying climate change impacts in urban areas. The paper then examines the specific applications of drones in the built environment and their contribution to climate change research. These applications include mapping urban heat islands, assessing the energy efficiency of buildings, monitoring air quality, and identifying sources of greenhouse gas emissions. UAVs enable researchers to collect spatially and temporally rich data, allowing for detailed analysis and the identification of trends and patterns. Furthermore, the paper discusses integrating UAVs with artificial intelligence (AI) to derive insights and develop predictive models for climate change mitigation and adaptation in urban environments. Finally, the paper addresses the challenges facing drone technologies and their future directions in the built environment. These challenges encompass regulatory frameworks, privacy concerns, data management, and the need for interdisciplinary collaboration. By harnessing the potential of drones, environmental scientists can enhance their understanding of climate change impacts in urban areas and contribute to developing sustainable strategies for resilient cities.

1. Introduction

The proliferation of drone technologies has revealed new frontiers for climate change research and analysis in urban environments. Drones, also known as unmanned aerial vehicles (UAVs), provide scientists with unprecedented abilities to collect detailed spatial and temporal data about the built environment, enabling more robust studies of the impacts of climate change on cities [1]. Whereas satellite imagery previously allowed for large-scale data accessibility, drones now facilitate close-range data capture, monitoring, and mapping at new scales and from multiple angles [2]. With high-resolution cameras, LiDAR sensors, and other payload instruments, UAVs can generate precise 3D models, temperature measurements, and air pollution readings that uncover the granular patterns and trends associated with global warming [3]. As urban centers grapple with intensifying climate change effects, from urban heat islands to infrastructure vulnerabilities, drones have become integral to the efforts aimed at adaptation, mitigation, and developing resilience.
UAVs play an instrumental role in climate change research in the built environment, with tasks ranging from mapping urban heat islands and assessing building energy efficiency to monitoring air quality and identifying sources of greenhouse gas emissions [4]. By enabling hyperlocal real-time data collection across cities, UAVs can precisely capture heat differentials, thermal leakage, and atmospheric changes [5]. UAVs also allow for the regular inspection of building insulation and the detection of weak links in building envelopes that exacerbate energy consumption [6,7]. Additionally, the aerial mobility of drones facilitates cost-effective air-quality sampling at different altitudes and locations across metro areas [8]. Such granular quantitative data are pivotal for cities to diagnose climate vulnerabilities and layer risk profiles. UAVs have become especially valuable for climate change assessment and planning in dense urban areas, where their ability to collect nuanced data amidst complex built environments provides advantages over conventional data collection methods and also allows cities to regularly monitor the efficiency of green infrastructure projects designed to mitigate climate change impacts through detailed inspection and thermal imaging [9]. Moreover, the combination of rich UAV datasets with artificial intelligence and machine learning algorithms offers additional capabilities for predictive climate change modeling and scenario planning in urban contexts [10]. By processing drone-captured imagery and readings using neural networks, researchers can identify climate change-related patterns, generate simulations, and assess the efficacy of potential adaptation strategies [11]. As UAVs become more popular in environmental research, they are poised to provide the forensic-level insights cities need to respond to intensifying climate change impacts.
This paper provides a comprehensive overview of drone technologies being leveraged in cities globally to assess and address climate change challenges. The paper details the types of sensing payloads and data analytics combined with drones to reveal insights while examining the existing research where UAVs inform urban climate change strategies. Additionally, the regulatory and ethical considerations surrounding broader UAV deployments for environmental research are discussed. By highlighting UAVs and artificial intelligence (AI) as an emerging lynchpin in cities’ climate change mitigation and adaptation efforts, this paper aims to further spur innovation in using UAVs to create more sustainable and resilient urban futures.

1.1. Motivation and Purpose

UAVs are increasingly recognized as critical to improving completion time, performance, and flexibility for numerous tasks supporting ground infrastructure and networks. As climate change increases the risks to the built environment, properly integrating UAVs into these systems is vital. The effective utilization of UAVs to bolster resilience requires careful consideration of the relevant requirements. UAV assistance shows promise for expediting structural assessments, coordinating emergency response, assessing damage, and mitigating climate threats. Strategic UAV implementation can strengthen the adaptability and durability of buildings, transportation networks, utilities, and communication systems against intensifying climate impacts, specifically within the built environment. Further research and planning are essential to fully leverage UAVs’ capabilities for safeguarding the built environment as climate risks escalate. UAVs have significant potential to support the rapidly changing urban infrastructure; however, effectively accomplishing these tasks presents challenges. Specific applications require tailoring UAV capabilities to address distinct difficulties. For example, deploying UAVs to assess urban damage and coordinate emergency responses after climate disasters demands careful planning to maximize resilience benefits. Further research on integrating UAVs into built environments can enable them to serve ground users better and fulfill assigned tasks amidst intensifying climate change threats.
The escalating impacts of climate change and the rapid development of AI present new opportunities to leverage UAVs for the resilience of the built environment. As extreme weather intensifies, AI-enabled UAVs have considerable potential to provide real-time hazard assessments, accurately monitor built infrastructure, and coordinate emergency responses after climate hazards. However, fully capitalizing on emerging UAV and AI capacities requires a thoughtful integration into existing systems. Key questions remain regarding the optimal roles, design priorities, and deployment strategies for UAVs in built environments under climate change, how UAVs and AI can be utilized to evaluate shifting risk profiles and climate change vulnerabilities, and what new sensing, computing, and communication capabilities are needed. As computing power, battery technology, and automation advance, strategic UAV implementation can provide adaptive, comprehensive climate threat assessments and emergency responses. However, further research and planning are essential to unlock the full potential of UAVs and AI in creating a climate-resilient infrastructure.
These questions motivate assessing the existing and proposed UAV-assisted solutions for built environments under climate change challenges. Potential UAV applications include real-time hazard monitoring, emergency response coordination, connectivity restoration for damaged networks, and climate vulnerability assessments. Integrating UAVs and AI requires categorizing capabilities based on mission types, such as routing, deployment coordination, cellular communications, disaster management, data gathering, surveillance, and secure communications. Capitalizing on the rapid advancement of AI and UAV technologies can provide adaptive climate threat mitigation and emergency responses. However, additional research is needed to devise optimal strategies for the UAV and AI integration that maximize the resilience of the built environment under intensifying climate risks. A comprehensive evaluation of UAV-based solutions elucidates the high-impact applications to strengthen the built environment.
This paper aims to provide a comprehensive overview of how drone or UAV technologies are transforming climate change research and environmental analyses in the urban built environment. As cities grapple with growing climate change impacts, from urban heat islands to infrastructure vulnerability, UAVs are emerging as a vital tool for planners, architects, engineers, and environmental scientists seeking to understand and mitigate these challenges. The paper examines how UAVs allow new kinds of climate-related data collection, monitoring, and modeling in the complex vertical landscapes of urban areas. Their capacity to flexibly survey the built environment from low altitudes, with customizable sensor payloads, enables climate studies at new spatial and temporal detail levels that are challenging to achieve with satellites or ground observations alone. This paper highlights the demonstrated global applications of UAVs, illustrating how they are becoming integral to urban climate change research. Their contributions span mapping heat islands, assessing building energy use, identifying emissions sources, monitoring green infrastructure efficacy, and surveying climate vulnerabilities. The paper also discusses UAVs’ potential for climate modeling and predictive analyses when integrated with emerging digital technologies, such as AI. Finally, the paper examines the practical challenges and future directions for broader UAV deployment. The transformative capabilities of UAVs for climate change research in complex urban environments are explored, underscoring their importance in developing resilient and sustainable cities.

1.2. UAVs: Roots and Advancements

UAVs have undergone significant advancements and have become increasingly prevalent in various fields. This section explores the history and development of UAV technology, highlighting its evolution and the factors contributing to its widespread adoption. The development of UAV technology was driven by the need for improved capabilities in military operations and the emergence of new applications in civilian sectors, such as agriculture, environmental monitoring, and infrastructure management. Additionally, the integration of advanced technologies, such as artificial intelligence (AI), Internet of Things (IoT), and machine learning (ML), further enhanced the capabilities and potential of UAVs.

1.2.1. Military Influence and Technological Advancements in UAVs

The roots of UAV technology can be traced back to the early 20th century, when the first attempts at unmanned flight were made. However, it was not until the early 2000s that UAV technology started gaining popularity for civilian applications, including agriculture and aerial photography [12]. Advancements in miniaturization, sensor technology, and communication systems played a crucial role in developing UAVs. Initially, UAVs were used primarily for military purposes, providing surveillance and reconnaissance capabilities without risking human lives. Over time, the technology evolved and UAVs became more sophisticated, capable of conducting complex missions and tasks. The military has played a significant role in driving the development and advancement of UAV technology. The need for unmanned systems that could perform surveillance and survey tasks led to substantial investments in research and development, resulting in technological advancements in materials, propulsion systems, avionics, and payload capacity [13]. These advancements also led to the miniaturization of UAVs, making them more agile and versatile in various environments.
One of the key advantages of UAVs in military applications is their ability to reduce risks to military personnel. By deploying UAVs for reconnaissance missions, military forces can gather critical intelligence without endangering soldiers’ lives. This has led to increased targeting accuracy and reduced collateral damage, which has important implications for the ethical and legal aspects of military operations. Additionally, UAVs have the potential to enhance situational awareness and support decision-making processes in complex and dynamic battlefield scenarios [13]. The ability to deploy fleets or swarms of UAVs further enhances their capabilities, enabling collaborative operations and data fusion for improved situation management.
The military’s influence on UAV technology has also significantly impacted its expansion into civilian applications. The advancements made for military purposes have paved the way for using UAVs in non-military contexts, such as logistics, environmental studies, civil protection, and disaster response [13]. The decreasing cost of components, improved battery technology, and advancements in autonomous navigation systems have made UAVs more accessible and versatile for civilian applications [14]. For example, in agriculture, UAVs are used for crop monitoring, precision spraying, and livestock management [15,16]. In environmental monitoring, UAVs enable researchers to collect high-resolution data for studying climate change impacts, mapping urban heat islands, and monitoring air quality [17,18]. The versatility and adaptability of UAVs have made them valuable tools in various industries, contributing to increased efficiency, reduced costs, and improved safety [19].

1.2.2. UAVs and Civilian Applications

UAVs have found numerous applications in civilian sectors, revolutionizing various industries and addressing a wide range of challenges.
Moreover, UAVs have proven to be valuable environmental monitoring and research tools. They enable the collection of high-resolution data for studying ecosystems, biodiversity, and environmental changes. For example, UAVs with remote sensing technologies have been used to monitor vegetation health, assess land-cover changes, and map natural habitats [20]. In addition, UAVs have been employed for wildlife monitoring, providing insights into animal behavior, population dynamics, and habitat assessment. A study [21] discussed the remarkable rise of unmanned aerial vehicles in environmental research and their potential benefits compared to traditional data collection methods. The study focused on wildlife population monitoring, where the precision and accuracy of population counts are of utmost importance, and it provided compelling evidence for the practical benefits of using UAVs for this purpose. The precision, cost-effectiveness, and data-quality improvements offered by drones have the potential to revolutionize how researchers gather critical information about wildlife populations. This technology is promising not only for improving the accuracy of population estimates, but also for advancing our understanding of ecological dynamics, ultimately contributing to more effective conservation efforts.
The use of UAVs in environmental research offers a cost-effective and efficient means of data collection, allowing for an improved understanding and management of ecosystems. UAVs have the advantage of being able to access remote or challenging terrain, making them ideal for studying hard-to-reach areas or areas that may be hazardous for humans [22]. They can fly at low altitudes and capture detailed, high-resolution images, allowing for the identification and monitoring of specific features or species within an ecosystem [23]. This capability is instrumental in monitoring wildlife in heterogeneous habitats and topographically challenging areas. Furthermore, UAVs have been employed in imaging and surveillance applications, providing aerial views and the real-time monitoring of events and locations [24]. They have been used for aerial photography, cinematography, and mapping, offering unique perspectives and capturing striking visuals. In the realm of public safety and emergency response, UAVs have proven to be invaluable tools. They have been used for search-and-rescue operations, providing real-time aerial views and assisting in locating missing persons or assessing disaster-affected areas [25]. Moreover, UAVs equipped with infrared/thermal imaging cameras can detect heat signatures and aid in locating individuals in challenging environments [26]. Infrared radiation is invisible electromagnetic radiation emitted by all objects based on their temperature, with hotter objects, such as people, emitting more radiation than cooler objects, such as buildings.
Thermal imaging cameras contain special sensors that detect infrared radiation and create images showing temperature differences, with warmer objects appearing brighter. This allows thermal cameras to visualize heat variations not visible to the human eye [27]. For example, a human body normally emits more infrared radiation than its surroundings; therefore, it would appear as a bright shape against a darker background to a thermal camera, even if hidden behind bushes or in darkness. Because of this ability to detect heat signatures, thermal cameras on UAVs can identify and track the unique infrared radiation emitted by human bodies, aiding search-and-rescue or surveillance efforts in challenging conditions with poor visibility [28]. In addition, they can be used for traffic monitoring, crowd control, and surveillance in law enforcement applications. The widespread adoption of UAVs in civilian applications faces numerous challenges. Regulatory frameworks and guidelines need to be developed to ensure the safe and responsible use of drones in civilian airspace [29]. Privacy concerns and ethical considerations must also be addressed, particularly in applications, such as surveillance and data collection [30]. Additionally, processing the extensive data collected by drones can be complex and requires specialized technical skills and computing capacities.
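To make the heat-signature contrast described above concrete, the short sketch below applies the Stefan–Boltzmann law to compare the thermal power radiated by a warm body and by cooler night-time surroundings. The temperatures and emissivity are illustrative assumptions, not values drawn from the cited studies.

```python
# Illustrative sketch (assumed values): why a warm body stands out in thermal imagery.
# Stefan-Boltzmann law: radiant exitance M = epsilon * sigma * T^4 (W/m^2).

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiant_exitance(temp_kelvin: float, emissivity: float = 0.98) -> float:
    """Total thermal power emitted per unit area for a grey-body surface."""
    return emissivity * SIGMA * temp_kelvin ** 4

body = radiant_exitance(310.0)          # human skin at roughly 37 degC
surroundings = radiant_exitance(288.0)  # cool night-time background at roughly 15 degC
print(f"body: {body:.0f} W/m^2, surroundings: {surroundings:.0f} W/m^2, "
      f"contrast: {body - surroundings:.0f} W/m^2")
```

Even this modest temperature difference yields a radiance contrast of roughly a hundred watts per square meter, which is what a microbolometer-based thermal camera renders as a bright silhouette against a darker background.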

1.2.3. Advanced Technologies and UAVs Advancements

UAVs range vastly in size, design, and capabilities but generally contain advanced sensors, navigation systems, remote communication links, and autopilot technology, enabling increasingly autonomous flight and data capture [31]. While initially constrained by high costs, short flight times, and limited payloads, UAVs have benefited immensely from numerous technological advancements, such as improved battery technology, lightweight composite materials, satellite navigation integration, and computing power supporting real-time data processing [32]. The expanding availability of affordable consumer-grade UAVs has further fueled applications by providing customizable airborne platforms for environmental sensing.
A key innovation spurring drone adoption has been the development of flexible, lightweight battery technology that can store the electrical charge needed to power rotors, sensors, and navigation systems while minimizing the overall aircraft weight [33]. Lithium polymer batteries provide the energy-to-weight ratios needed for 30 min or more of sustained flight on many UAV platforms [34]. Charging technologies have also improved to enable rapid recharging between missions [35]. Advanced composite materials, such as carbon fiber, also provide high strength-to-weight ratios for drone airframes and components. The hardening of microprocessors and sensors has further enabled drones to operate in harsh environmental conditions related to factors, such as temperature, humidity, and vibration [2]. Satellite-based navigation integration has been pivotal to developing UAVs, facilitating automated flight control and precise positioning capabilities [36]. Global positioning systems (GPS) and inertial measurement units (IMUs) allow drones to stabilize, calculate altitude, and follow pre-programmed routes or dynamic flight paths [37]. Onboard processing capacity leveraging graphics processing units (GPUs) supports the analysis of incoming sensor data as well as computer vision techniques, such as simultaneous localization and mapping (SLAM), for situational awareness during flight. Advanced onboard artificial intelligence can even optimize missions based on the variables encountered mid-flight or integrate the findings from previous flights [38]. Control and feedback are provided to users via radio links over a few kilometers, with some systems enabling first-person-view (FPV) flight through real-time video transmission [39].
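As a rough illustration of the energy-to-weight trade-off discussed above, the sketch below estimates hover endurance from usable battery energy and average power draw. The pack capacity, power figure, and usable fraction are hypothetical values chosen for illustration, not specifications of any particular platform.

```python
# Back-of-the-envelope endurance estimate; all figures below are hypothetical.

def hover_endurance_minutes(battery_wh: float, avg_power_w: float,
                            usable_fraction: float = 0.8) -> float:
    """Estimate flight time (minutes) from usable battery energy and average power draw."""
    return 60.0 * (battery_wh * usable_fraction) / avg_power_w

# Example: an assumed 90 Wh lithium-polymer pack and a quadcopter drawing ~140 W in hover
print(f"{hover_endurance_minutes(90, 140):.0f} min")  # roughly 30 min
```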

1.3. UAVs and Data Capabilities

The urban environment plays a significant role in climate change, both as a contributor to greenhouse gas emissions and as a site of vulnerability to its impacts. Understanding the dynamics of climate change in urban areas is crucial for developing effective mitigation and adaptation strategies. Data collection, monitoring, and analysis in the urban environment are essential components of climate change research, providing valuable insights into the interactions between urbanization, climate, and human activities.
Satellite imagery has traditionally been a valuable tool for monitoring and analyzing climate change in urban areas. However, there are existing gaps in satellite imagery that limit its effectiveness in capturing fine-scale details and monitoring dynamic urban environments. These gaps include limitations in spatial resolution, temporal coverage, and cloud cover interference [40,41]. These limitations hinder the accurate assessment and monitoring of climate-related phenomena in urban areas. An essential characteristic of UAVs is their coverage and image resolution. Figure 1 shows the scale of detail that can be achieved with drones and other imaging systems. The figure shows an application in an agricultural field; however, it is helpful for identifying the scales that can be managed with UAVs. Satellite imagery covers the globe and can be used in applications at the kilometer scale. On the other hand, UAVs can cover areas spanning several kilometers and, at the same time, can identify details at a 1 cm scale or less. Therefore, UAVs can detect fine-grained defects or compile detailed inventories over areas of several kilometers [42].
Moreover, UAVs offer unique advantages over conventional geomatics approaches for capturing the highly detailed 3D spatial data of study sites, as illustrated in Figure 1. Compared to terrestrial laser scanning (TLS) systems, UAVs with LiDAR provide faster area coverage and greater flexibility to survey complex topographies and vertical structures [2]. While TLS involves fixed scan positions with occlusion gaps, UAVs can fly multiple ideal trajectories to minimize shadowing and occlusions in point clouds [43]. UAV LiDAR additionally captures geometries inaccessible from the ground, such as rooftops. Meanwhile, versus crewed aircraft LiDAR, UAVs achieve an improved low-altitude measurement resolution, down to centimeters, and cost savings of up to 80% [44]. Their capacity to hover and maneuver facilitates denser precision scanning.
Figure 1. (a) UAVs and different geomatics technologies; (b) UAV coverage scale compared to other data collection technologies [42,45].
For data acquisition technologies, satellites, UAVs, helicopters, and airplanes each have advantages and limitations regarding the field of view, payload capacity, operating costs, and availability. As illustrated in Figure 2, satellites can cover vast areas with revisit times of days to weeks, but provide relatively low-resolution imagery from their high-altitude orbits at 400–800 km above Earth [46]. In contrast, UAVs fly at altitudes below 7 km, achieving centimeter-scale image resolutions over areas up to a few square kilometers [2]. Their flexibility and low cost have fueled UAV adoption; however, the limited battery life restricts their flight duration to about 30 min. Manned helicopters exceed UAV flight endurance, with large models capable of carrying heavy sensors for hours of operation. However, their acquisition and operational expenses are over 50 times higher than those of UAVs [47]. Airplanes also have extended flight times and can carry substantial payloads with lower costs than helicopters, but lack hover and low-altitude flight capabilities.
Additionally, while satellites achieve comprehensive coverage, UAVs fill a vital niche by providing ultra-high-resolution images below what most commercial satellites offer [49]. Their imagery supports 3D point cloud creation, down to centimeter accuracy, once processed using computer vision techniques [50]. In addition to data collection, UAVs enable real-time monitoring and analysis in the urban environment. They can be equipped with sensors and instruments to measure climate-related information, such as temperature, humidity, air quality, and solar radiation, in real time [51,52]. This real-time monitoring allows for immediate feedback and decision making, facilitating timely responses to climate-related events and informing urban planning and management strategies. The flexibility, automation, and miniaturization of UAV platforms enable nimble, customizable 3D data capture tailored to project sites. Multi-copter drones can hover and rotate over targets of interest, acquiring geometrically accurate imagery that is challenging to match via alternative approaches [46]. Meanwhile, achieving comparable perspectives and positional flexibility using manned aircraft would involve substantially higher costs and logistic complexities. UAVs’ capacity for on-demand, close-range data acquisition makes them a uniquely valuable geomatics tool providing efficient, scalable, and high-quality 3D spatial data.
Additionally, unlike satellite platforms that have fixed orbital periods, UAVs can be deployed on demand to collect targeted data in response to evolving needs. Their low-altitude flight allows for centimeter-scale image resolutions, capturing details, such as individual trees, buildings, and transportation infrastructures. Customizable sensor payloads can integrate high-precision RGB, multispectral, thermal, and LiDAR instruments to match the specific data requirements of each study. A study by Meisam et al. [53] highlighted that drones provided data of unprecedented spatial, spectral, and temporal resolutions, making them ideal for monitoring the environment. They bridge the gap between field observations and remote sensing by providing high-quality spatial-resolution details over large areas in a cost-effective way. Moreover, the researchers [54] developed a simulation platform called InDrone to model UAV flight behavior and human–drone interactions for indoor infrastructure inspections. By visualizing the spatiotemporal relationships between UAV dynamics, sensor coverage, environment reconstructions, and operator commands, InDrone enables the analysis and optimization of collision avoidance algorithms and pilot training protocols. Refining these capabilities can improve the UAV navigation stability and data collection accuracy when mapping indoor thermal and air-quality conditions that influence building energy usage. The study demonstrated how leveraging UAV flight simulations could further develop automated, human-assisted, and collaborative methodologies for UAVs to safely and precisely acquire climate-relevant data at new interior vantage points. This supports the emerging potential of UAVs for the fine-grained, versatile, and comprehensive monitoring of built infrastructure emissions and hazards linked to climate change resilience. These studies exemplify how UAVs can host customized payloads to acquire geospatial data at centimeter resolutions. Their ability to flexibly sample the built environment at low altitudes provides climate researchers with unprecedented details for energy use, emissions, and environmental hazards. This supports the development of spatially targeted, timely interventions to advance local climate resilience.

2. UAV Platform Evolution

UAVs applied to climate research in urban areas have leveraged a diverse range of aircraft configurations optimized for localized data collection in complex built environments. This section examines the main UAV platforms and configurations utilized for climate change research and environmental analysis missions across urban built environments. With the proliferation of UAV technologies in recent years, a diverse range of customizable aerial platforms has emerged, each with distinct capabilities and attributes to address specialized data collection, infrastructure inspection, and monitoring needs within complex cityscapes. By reviewing the key characteristics, advantages, and limitations of multirotor, fixed-wing, and hybrid VTOL drones, this section offers a perspective into optimal UAV selections tailored to climate study requirements in densely built urban settings.

2.1. UAV Platforms Based on Aerodynamic Features

UAVs exhibit great diversity in their aerodynamic characteristics, which enables them to serve varied roles and applications. Based on attributes such as their lift generation mechanism, flight speed, and range, UAVs can be categorized into quadcopters, fixed-wing aircraft, hybrid VTOL platforms, lighter-than-air craft, and flapping-wing designs. Quadcopters have rapidly become one of the most ubiquitous drone categories deployed across metropolitan areas due to their exceptional maneuverability and ability to take off, land, and hover in confined spaces. As rotorcraft, quadcopters utilize four horizontally oriented lifting rotors to generate the vertical thrust needed for hovering and low-speed flight [55]. Unlike helicopters relying on a single main rotor for lift, quadcopters distribute thrust across four smaller rotors, providing a higher payload capacity for a given size and improved control redundancy and safety [3]. Adjusting the speed of each rotor enables omnidirectional stability and maneuvering in tight urban areas not accessible to fixed-wing UAVs. This makes quadcopters ideal for the proximate scanning and inspection of buildings, infrastructure, urban canyons, parks, and other environmental elements that comprise complex cityscapes. Table 1 summarizes the differences between UAV platforms based on their aerodynamic features, applications, and technical limitations.
While quadcopters were initially limited by short flight times, advances in batteries, motors, and materials have extended their endurance to 30 min or longer, even for small models [56]. Larger octocopter or hexacopter UAVs can extend their flight duration further and lift heavier sensor payloads. Camera gimbals minimize image blurring during flight maneuvers and hovering. When choosing multirotors for climate change research in urban areas, factors, such as maximum sensor weight, wind resilience, noise profiles, safety features, and integration options for data communications and processing, warrant consideration.
Table 1. Features of different UAV platforms and their applications in the built environment [15,39,57].
Platform | Flight Speed | Flight Range | Applications in the Built Environment | Limitations
Quadcopters | 0–35 mph | 1–3 km | Urban inspection; urban microclimate | Limited payload capacity; short flight times
Ducted fan | 0–60 mph | 2–7 km | Utility inspection; vertical infrastructure mapping | Limited payload; complex maintenance
Fixed wing | 50–90 mph | 10–40 km | Urban thermal mapping; air pollution monitoring | Requires assisted launch/landing; minimal maneuverability
Hybrid VTOL | 0–80 mph | 5–25 km | Large-scale mapping; environmental monitoring | Complex transition mechanism; heavier than fixed wings
Ducted-fan UAVs utilize an enclosed rotor system combining the benefits of multirotor and fixed-wing drones. They employ enclosed fans mounted within circular ducts to generate vertical lift, eliminating exposed blades that can pose safety risks [58]. Ducting also streamlines the airflow, increasing efficiency over open rotors [59]. This provides greater endurance than quadcopters, with flight times of around 45–60 min, depending on the conditions [55]. Ducted fans typically have a fixed wing for forward flight, like conventional fixed-wing UAVs. Transitioning between hovering and high-speed cruise flight modes gives ducted fans flexibility for constrained takeoff and landing tasks as well as sustained transits [60]. Compared to quadcopters, they can cover larger areas rapidly while still allowing close inspection through stable hovering, as illustrated in Figure 3. Ducting shields the rotors from obstructions, enabling drone operations in confined spaces inaccessible to fixed wings [39].
Fixed-wing UAVs offer streamlined airframe designs that generate lift through forward airspeed, like a conventional airplane. Their aerodynamic efficiency, combined with non-rotating propulsion, enables substantially longer flight times and distances than multirotor drones, making them suitable for wide-area mapping [57]. However, fixed-wing UAVs cannot take off or land vertically, nor hover, relying on catapults or hand launches for takeoff and on skids or belly landings for recovery. This limits their utility for inspecting and scanning localized urban structures where hovering proximity is critical. Their maneuvering flexibility is also reduced compared to multirotor UAVs. Yet, for extensive bird’s-eye urban imaging, fixed-wing UAVs play an important role.
Hybrid VTOL platforms aim to deliver the best attributes of multirotor takeoff and landing capabilities combined with the cruising range and speed of fixed wings. This is achieved by lifting off vertically using rotor thrust before transitioning the rotors to forward flight propulsion once an adequate altitude is reached [61]. Some advanced hybrids can even optimize the transition speed and angle based on factors, such as wind conditions, using onboard AI [62]. This blend of VTOL flexibility and improved flight efficiency makes hybrid drones well-suited to diverse climate sensing roles, from localized infrastructure inspection to wide-area thermographic mapping. However, their mechanical complexity remains a reliability challenge compared to other UAV platforms, as shown in Figure 3.
Figure 3. UAV classifications based on their aerodynamic features, level of autonomy, sensors, and power sources. Based on [60,63,64,65].
Across UAV types, flight times of 30–60 min are now commonplace, enabled by developments in battery technology. Lithium polymer and lithium-ion packs can deliver the electrical storage-to-weight ratio needed for sustained UAV operations. UAV airframes have also become more durable, yet lightweight, by utilizing carbon-fiber composites and engineered thermoplastics, such as PLA (polylactic acid) [39]. Enhanced satellite navigation integration provides robust autonomous control and stability. These airframe, avionics, and battery innovations underpin drones’ expanding capabilities for flexible and responsive climate research flights, even in turbulence-prone urban settings. Furthermore, UAVs offer highly customizable and scalable platforms for integrating the specialized sensor payloads needed for diverse climate data collection roles. Visible-spectrum cameras support creating detailed 2D and 3D maps of urban morphology, vegetation, and infrastructure [4]. Thermal infrared imaging enables microclimate variability analysis, heat island mapping, and building insulation assessments [8]. Miniaturized gas sensors measure pollutants, such as carbon dioxide and particulate matter [66]. LiDAR scanners generate precise 3D point clouds of buildings and terrain [67]. Swapping these sensors between missions adapts UAVs to differing data capture needs. Onboard processing and AI further empower sensor integration and real-time analytics during flights.

2.2. UAV Sensors

Recent advances in sensor miniaturization, onboard processing, and UAV platforms have enabled specialized airborne data capture tailored to the multi-dimensional information needs of climate change assessments in urban environments. UAVs at present carry diverse instrument payloads to map, measure, and monitor the complex factors influencing the resilience, greenhouse gas emissions, adaptation efficacy, and sustainability across the built landscape. This section provides an overview of key UAV sensor types and their importance for climate research.
A.
High-Resolution Visible-Spectrum Cameras
High-resolution visible-spectrum cameras have become ubiquitous payloads on UAVs, providing highly detailed aerial images for mapping urban morphology and topography. Miniaturized cameras using CMOS sensors can capture still images or videos at resolutions down to centimeters per pixel from low UAV altitudes [32]. This facilitates precision 3D modeling and digital surface generation to analyze urban form factors correlated with climate impacts, such as heat retention. For example, Naughton et al. [68] used UAV-derived orthomosaics at a 2 cm resolution to correlate urban geometry with heat island intensity by tracking land-surface temperature variations in the urban canyon. High-frequency UAV mapping further enables monitoring incremental changes in the built environment, urban green infrastructure, land use, and urban form relevant to designing and evaluating adaptation strategies.
Many leading commercial UAV platforms offer high-resolution visual-band camera options tailored to remote sensing and mapping applications. DJI platforms dominate the commercial UAV market due to their technical capabilities and product diversity. For example, the DJI Phantom 4 Pro has a 1-inch CMOS sensor capturing 20-megapixel still images, making it suitable for sub-decimeter GSD mapping from low altitudes [46]. Larger models, including the DJI Inspire 2, leverage Micro Four Thirds cameras to provide an even higher resolution and image quality. Some manufacturers offer modular payloads, such as AgEagle’s S.O.D.A. camera line, spanning small to large formats to customize resolution needs [69]. While most UAV cameras use nadir perspectives, adding oblique capabilities provides more immersive urban structural data. Figure 4 compares the most common commercial UAVs, including the sensor specifications, weight, cost, and operation platform. When selecting UAV visible cameras for built environment assessments, the key factors include the intended mapping resolution, area coverage needs, available payload capacity, and processing requirements. Larger sensors typically offer a higher resolution but are heavier, and smaller drones may require lower-resolution cameras to stay within their payload limits. The photogrammetric processing time also increases exponentially with higher-resolution imagery. Therefore, balancing resolution, payload, and area coverage is essential for efficient mapping. Radiometric processing for reflectance data adds further complexity. Overall, turnkey fixed-camera UAVs offer simplicity, while customizable gimbal and swappable payload options provide greater flexibility to tailor visual data capture to the research needs.
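The trade-off between flight altitude, sensor geometry, and mapping resolution noted above follows directly from the ground sample distance (GSD) relationship. The minimal sketch below computes GSD for a nadir-pointing camera; the sensor width, focal length, image width, and flight height are illustrative values only, not a recommendation for any specific product.

```python
# Hedged sketch: ground sample distance (GSD) from camera and flight parameters.

def gsd_cm_per_px(sensor_width_mm: float, focal_length_mm: float,
                  flight_height_m: float, image_width_px: int) -> float:
    """Ground sample distance in cm/pixel for a nadir-pointing camera."""
    # (sensor_width / focal_length) gives the swath-to-height ratio;
    # multiplying by height yields swath width in meters, then convert to cm/pixel.
    return (sensor_width_mm * flight_height_m * 100.0) / (
        focal_length_mm * image_width_px)

# Illustrative example: a 13.2 mm wide sensor, 8.8 mm lens, 5472 px image width, flown at 80 m
print(f"GSD ~ {gsd_cm_per_px(13.2, 8.8, 80, 5472):.2f} cm/px")  # about 2.2 cm/px
```

Doubling the flight height doubles the GSD (coarser detail) but quadruples the ground area covered per image, which is the balance between resolution, payload, and coverage discussed above.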
B.
Multispectral Sensors
Multispectral sensors extend optical imaging into non-visible wavelengths, such as near-infrared, capturing unique data for vegetation health, water stress, biomass, and material characteristics unavailable from standard RGB cameras [75]. As UHI mitigation strategies often rely on expanding urban greenery, multispectral UAV data help monitor these heat-regulating plants’ growth, canopy size, and condition over time [76]. Near-infrared bands also enhance the differentiation of urban ground surfaces and materials relevant to urban energy modeling. While the resolution is coarser than for visible cameras, multispectral UAV-based mapping provides a rapid wide-area assessment, not requiring extensive processing. Figure 5 compares the different multispectral sensors commonly used in the market, resolution, costs, and compatibility with different UAV platforms.
Multispectral UAVs present immense opportunities for assessing and addressing climate vulnerabilities in cities. One of the main applications of UAVs with multispectral sensors in climate change research is monitoring urban vegetation health and its dynamics in cities. Multispectral sensors can capture data related to chlorophyll content, leaf area index, and vegetation stress, providing insights into the health and vitality of urban vegetation [77,78]. This information is crucial for assessing the impact of climate change on urban ecosystems, identifying areas of vulnerability, and developing strategies for urban greening and adaptation. Additionally, high-resolution thermal mapping coupled with meteorological data can significantly assist in modeling urban heat dynamics [79]. Tracking vegetation NIR reflectance reveals drought-stressed areas needing water conservation and additional shade trees [80]. Combined thermal and visible imagery facilitates quantifying the cooling provided by green spaces [81]. UAV spectral data also enable the construction of 3D urban models distinguishing rooftop materials to target solar panel deployment [82]. Moreover, repeat multispectral surveys allow the monitoring of the efficacy of resilience strategies, such as cool roofs and stormwater-retention landscaping.
Another application of multispectral sensors in climate change research is the analysis of land-cover changes in cities. By capturing the data in different spectral bands, these sensors can differentiate between land-cover types, such as impervious surfaces, vegetation, and water bodies [77,83]. This information helps monitor urban expansion, changes in land-use patterns, and the loss of green spaces. These are all critical factors for understanding the urban heat island effect and its contribution to climate change. In addition to vegetation and land-cover analyses, multispectral sensors on UAVs can also contribute to assessing urban heat island effects. By capturing thermal infrared data, these sensors can measure surface temperatures and identify areas of excessive heat in urban environments [84]. This information is crucial for understanding the spatial distribution of heat in cities, identifying heat mitigation strategies, and assessing the effectiveness of urban cooling interventions.
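Much of the vegetation monitoring described above relies on simple band-ratio indices computed from co-registered multispectral bands. The sketch below shows the widely used normalized difference vegetation index (NDVI) as one representative example; the toy reflectance values are assumptions for illustration and do not correspond to any dataset cited here.

```python
# Minimal NDVI sketch from red and near-infrared reflectance bands.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), in [-1, 1]; higher values indicate denser, healthier canopy."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / np.clip(nir + red, 1e-6, None)  # clip avoids division by zero

# Toy reflectance values: a stressed pixel vs. a vigorous vegetation pixel
nir = np.array([0.45, 0.60])
red = np.array([0.20, 0.05])
print(ndvi(nir, red))  # approximately [0.38, 0.85]
```

Repeating the same computation over successive surveys is what allows drought stress or greening interventions to be tracked through time, as described above.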
Figure 5. Comparison between standard multispectral UAV sensors, resolution, costs, and compatibility [70,85,86,87,88,89].
C.
Hyperspectral Sensors
Hyperspectral sensors enable UAVs to capture hundreds of contiguous spectral bands across a wide range of wavelengths [90]. Typical hyperspectral sensors cover wavelengths from 400 to 2500 nm with a spectral resolution below 10 nm [91]. Thus, they can capture a wide range of light spectrum bands to generate rich datasets that detect features invisible to sensors with limited bandwidths. The large number of narrow spectral bands enables the identification of specific conditions and characteristics. Sensors with hundreds of bands, especially in the visible and infrared wavelengths, are increasingly utilized in diverse applications. Hyperspectral sensor developers, such as Headwall Photonics, specialize in small, rugged hyperspectral systems viable for UAV platforms, as illustrated in Figure 6. Integrating hyperspectral imaging into standard UAV electro-optical and infrared sensor suites poses engineering challenges; however, recent advances have made high-resolution UAV-based hyperspectral analysis achievable. This allows exploiting hyperspectral data’s detailed spectral signatures for UAV remote sensing tasks.
Compared to multispectral data, hyperspectral data offer superior identification and discrimination of targets due to their narrow-bandwidth information acquisition. The key benefit of hyperspectral data is the capacity for precise spectral signature detection and monitoring; however, effective utilization requires determining the most informative bands for discriminating specific features while minimizing redundant data. Careful hyperspectral sensor selection, calibration, and data processing make it possible to realize the benefits of rich spectral information for target identification and change detection. Within the built environment, high spectral resolution facilitates identifying specific materials through absorption features and mapping urban composition, including vegetation type, water quality, and infrastructure materials [92]. Hyperspectral sensors also help construct 3D models distinguishing building materials for assessing energy efficiency opportunities [93], where roof types and insulation defects prone to energy deficiencies can become more apparent.
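One common way to exploit these spectral signatures, offered here as a generic illustration rather than the method of the studies cited above, is the spectral angle mapper (SAM), which matches each pixel spectrum to the closest entry in a reference library. The sketch below uses a made-up two-material library and a five-band toy spectrum.

```python
# Hedged illustration of the spectral angle mapper (SAM) for material matching.
import numpy as np

def spectral_angle(pixel: np.ndarray, reference: np.ndarray) -> float:
    """Angle (radians) between two spectra; smaller means a closer material match."""
    cos_sim = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos_sim, -1.0, 1.0)))

library = {  # hypothetical reference spectra, one reflectance value per band
    "asphalt": np.array([0.10, 0.12, 0.14, 0.15, 0.16]),
    "vegetation": np.array([0.04, 0.08, 0.05, 0.45, 0.50]),
}
pixel = np.array([0.05, 0.09, 0.06, 0.42, 0.48])  # toy pixel spectrum
best_match = min(library, key=lambda name: spectral_angle(pixel, library[name]))
print(best_match)  # "vegetation"
```

Because SAM compares spectral shape rather than magnitude, it is relatively tolerant of illumination differences, which is one reason it is often used for urban material mapping of the kind described above.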
Figure 6. Different hyperspectral sensors and their spatial and spectral resolutions [90,94,95,96,97].
D.
Meteorological, Chemical, and LiDAR Sensors
Meteorological sensors for UAVs encompass a range of technical specifications that enable the collection of crucial weather data. These sensors typically include instruments for measuring parameters, such as air temperature, relative humidity, atmospheric pressure, wind speed, and wind direction [98]. The accuracy and precision of these measurements are essential for understanding the dynamics of the atmosphere and its interactions with the environment. Common meteorological UAV sensors include miniaturized air temperature, humidity, barometric pressure, and wind speed and direction sensors that weigh as little as 25 g [66]. UAVs facilitate sampling at low altitudes, in contrast to traditional weather balloons and weather stations, which are relatively remote from the urban layer [99]. This enables the detailed measurement of near-surface urban heat islands, air pollution layers, and urban microclimates. For example, UAVs equipped with thermometers, hygrometers, and anemometers can monitor outdoor thermal comfort levels and help identify areas needing improved ventilation at the neighborhood level [4]. Furthermore, meteorological sensors on UAVs can contribute to assessing air-quality and pollution levels in urban areas. By measuring parameters, such as particulate matter, carbon dioxide, and other pollutants, these sensors provide valuable data for understanding the impacts of climate change on air quality and human health [100]. Moreover, UAVs with meteorological sensors can profile vertical air temperature and humidity stratifications along building facades to optimize passive cooling and ventilation.
UAVs can also carry a range of miniature chemical sensors to detect gases, particulates, and environmental pollutants. Gas sensors, such as metal oxide semiconductors and electrochemical and infrared spectrometers, enable UAVs to identify and map concentrations of CO, CO2, SO2, NO2, and more [66]. Particulate matter sensors using laser scattering or embedded quartz crystal microbalances can measure PM2.5 and PM10 levels [101]. UAV flexibility allows the targeted sampling and gradient mapping of industrial emission plumes, volcanic outgassing, and air pollution, even in hazardous settings [12]. UAV chemical sensing shows immense promise for environmental research and regulation. Networked UAVs can provide rapid, widespread air-quality assessments and help enforce pollution standards [4,101]. For climate change research, UAV methane mapping can improve our understanding of gas emission sources, such as permafrost thaw, and quantify carbon cycle feedbacks [102]. Equipping UAVs with radon sensors can also improve earthquake predictions by tracking underground gas seepages [103,104]. As sensors become miniaturized, UAVs will likely grow into a standard airborne platform for environmental chemical sensing across scales from localized events to global processes. Figure 7 summarizes the different meteorological, chemical, and LiDAR sensors, resolution, size, and weight.
Figure 7. Meteorological, chemical, and LiDAR UAV sensors and their resolutions [33,105,106,107,108].
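As a sketch of the gradient-mapping workflow mentioned above for gas and particulate sensing, the example below interpolates scattered geotagged concentration samples onto a coarse grid using inverse-distance weighting. The coordinates, CO2 values, and grid spacing are hypothetical, and the interpolation method is a generic choice for illustration, not one prescribed by the cited studies.

```python
# Assumed workflow: gridding scattered UAV gas-sensor samples with inverse-distance weighting.
import numpy as np

def idw(x: float, y: float, samples, power: float = 2.0, eps: float = 1e-9) -> float:
    """Inverse-distance-weighted estimate at (x, y) from (xi, yi, value) samples."""
    num, den = 0.0, 0.0
    for xi, yi, value in samples:
        w = 1.0 / ((x - xi) ** 2 + (y - yi) ** 2 + eps) ** (power / 2)
        num += w * value
        den += w
    return num / den

# Hypothetical geotagged CO2 samples (x m, y m, ppm) along a low-altitude transect
samples = [(0, 0, 415.0), (50, 0, 422.0), (100, 0, 440.0), (50, 60, 418.0)]
grid = [[idw(x, y, samples) for x in range(0, 101, 50)] for y in range(0, 61, 30)]
print(np.round(grid, 1))  # coarse 3x3 concentration surface
```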
Light-detection and ranging (LiDAR) sensors utilize laser radiation to survey environmental features by measuring reflected laser pulses. By precisely timing the interval between emission and detection of the reflected beam, known as the time of flight, the range or distance between the scanner and the object can be calculated based on the speed of light. The angles of the emitted and reflected beams are also measured to determine precise 3D point locations. Detailed 3D maps of the surrounding geometry and terrain can be rapidly constructed from millions of timed laser pulse reflections. LiDAR’s active sensing approach using laser range determination enables the highly accurate mapping of physical environments. LiDAR sensors on UAV platforms (LiDAR-UAV) provide high- to very-high-spatial-resolution terrain and surface data. Unlike satellite data, LiDAR UAVs offer a superior resolution and are not limited by cloud cover conditions [109]. UAV-based LiDAR can also acquire accurate ground elevation data, even in areas with dense vegetation or restricted accessibility [110]. Studies show that UAV LiDAR generates data of quality comparable to terrestrial laser scanning (TLS) for terrain mapping [111]. RGB photogrammetry from UAVs offers a lower-cost alternative to LiDAR for some vegetation surveys [112,113]. However, photogrammetry has limitations in analyzing ground surfaces obscured by dense vegetation [114]. UAV LiDAR also provides greater accessibility, ease of planning, and cost-effectiveness compared to airborne laser scanning (ALS) [115,116].
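The time-of-flight principle described above reduces to a single relation, range = c·t/2, since each pulse travels to the target and back. A minimal sketch follows, using an assumed round-trip time of 800 ns purely for illustration.

```python
# Time-of-flight ranging: distance from the measured round-trip pulse time.

C = 299_792_458.0  # speed of light, m/s

def range_from_tof(round_trip_seconds: float) -> float:
    """Distance to target; divide by 2 because the pulse travels out and back."""
    return C * round_trip_seconds / 2.0

# A return detected 800 ns after emission corresponds to roughly 120 m
print(f"{range_from_tof(800e-9):.1f} m")
```

Combining each range with the recorded beam angles and the UAV's GNSS/IMU pose is what turns millions of such measurements into the georeferenced point clouds discussed above.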
E.
Infrared Sensors
Infrared (IR) imaging measures land-surface temperature (LST) and sea-surface temperature (SST) by detecting the infrared radiation emitted from surfaces. The integration of lightweight thermal cameras on UAVs has proven transformative across industries by enabling accessible aerial thermal data collection. UAVs with thermal sensors can readily identify heat signatures and efficiently gather accurate surface temperature measurements over large areas. This provides new capabilities for applications, such as assessing commercial building insulation, monitoring wildfires, and inspecting solar installations. The combination of infrared thermography and UAV mobility offers an accessible and cost-effective approach to thermal mapping that unlocks new possibilities across sectors. Common IR sensors include microbolometers detecting longwave IR (7.5–14 μm) and InGaAs detectors for shortwave IR (0.5–2.55 μm) [117]. The resolution varies from a centimeter scale for object identification to meters for thermal mapping, with FLIR sensors considered the most commonly used thermal sensors for UAVs, as illustrated in Figure 8 [118]. High-resolution IR thermography from UAVs can pinpoint building envelope deficiencies prone to energy loss. IR cameras identify areas of heat leakage, moisture accumulation, and insufficient insulation when temperature gradients exist between interior and exterior surfaces [119]. Frequent UAV IR surveys can help track building deterioration and prioritize retrofits. In addition, thermal mapping enhances urban heat island research by linking surface temperatures to land-cover types [79]. Combined with elevation data, UAV IR enables the modeling of microclimate variations across neighborhoods to guide heat mitigation research [120]. IR sensors can help quantify the effects of urban vegetation on outdoor thermal comfort and verify thermal anomalies in building envelopes through time-lapse thermography techniques [121].
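Because real surfaces emit less radiation than an ideal blackbody, raw thermal readings are usually corrected for emissivity before being reported as surface temperatures. The heavily simplified sketch below illustrates the idea for a broadband sensor, ignoring reflected and atmospheric terms; the emissivity and reading are assumed values, and practical radiometric workflows are considerably more involved.

```python
# Simplified emissivity correction: treating the reading as a broadband brightness
# temperature, sigma*Tb^4 = eps*sigma*Ts^4, so Ts = Tb * eps^(-1/4).
# Reflected background and atmospheric transmission are deliberately ignored here.

def surface_temp_k(brightness_temp_k: float, emissivity: float) -> float:
    """Approximate true surface temperature from a brightness-temperature reading."""
    return brightness_temp_k * emissivity ** -0.25

# A facade read at 300 K with an assumed emissivity of 0.92 is actually somewhat warmer
print(f"{surface_temp_k(300.0, 0.92):.1f} K")  # roughly 306 K
```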

3. UAV Applications in Climate Change Research

UAVs are emerging as an instrumental tool for cities seeking to understand and respond to escalating climate change challenges through adaptation and mitigation initiatives. This section explores the diverse ways UAV technologies are enabling more effective urban climate research to inform policy decisions and resilience strategies.

3.1. UAVs in Climate Change Research

The potential of UAVs in climate change research emerged in the early 2000s as miniaturized sensors, GPS navigation, and battery technology enabled lightweight remote sensing platforms. NASA scientists first demonstrated the utility of small fixed-wing UAVs for studying atmospheric dynamics, composition, and thermodynamics in the early 2000s [124]. This pioneering work sparked the recognition that UAVs could fill in the gaps in climate observation networks, given their flexibility, cost-effectiveness, and ability to sample hazardous environments.
UAVs were first deployed for humanitarian action in the early 2000s by nonprofit organizations seeking aerial damage assessments after disasters when manned aircraft flights were limited. Humanitarian groups operated small UAVs to survey post-hurricane flooding outcomes in 2004 and later to assess infrastructure damage from the 2010 Haitian earthquake [125]. These initial efforts highlighted the potential for UAVs to support disaster relief by providing rapid emergency mapping when access was constrained. The early demonstrations of UAVs for post-disaster mapping and relief accelerated into more operational deployments in the early 2010s. By 2015, UNICEF had also tested UAVs in Malawi and the Dominican Republic for transporting medical samples, vaccines, and other cargo as part of the agency’s UAV evaluation initiative [126]. This was followed by the launch of the Kazakhstan Drone Corridor as a testing ground for drone delivery systems [127]. These initiatives helped pave the way for more robust and ethical frameworks for using UAVs for humanitarian aid and crisis responses. Figure 9 illustrates UAVs’ innovation timeline and integration in humanitarian research since the early 2000s.
In climate change research, UAVs are emerging as a valuable tool for monitoring environmental impacts through flexible, high-resolution aerial mapping and have been deployed in multiple climate change research domains, as shown in Figure 10. One example is a case study conducted in the Yukon–Kuskokwim Delta region of Alaska [128]. The researchers implemented multi-sensor-equipped UAVs to autonomously survey and map the challenging terrain, acquiring high-resolution visible, multispectral, and thermal infrared images. The versatility and customizability of the UAV platforms allowed the research team to tailor the systems to the specific scientific requirements of evaluating landscape changes related to permafrost degradation. Geospatial data analytics generated from the UAV-acquired remote sensing datasets provided efficient monitoring of the impacts of global warming on critical cultural and ecological heritage sites in the Arctic tundra ecosystem. The study demonstrated the emerging potential of UAV remote sensing to enable rapid, cost-effective, and frequent landscape characterizations, furthering the capabilities for climate change assessment and adaptation strategies.
Recent urban expansion coupled with the impacts of climate change has created a need for enhanced methods of monitoring and managing sustainable city developments, as investigated by Djimantoro and Suhardjanto [129]. The researchers experimentally implemented UAVs equipped with photogrammetric imaging payloads to autonomously survey and reconstruct 3D digital surface models of building infrastructures at sub-meter resolutions. Compared to conventional manned aerial mapping and other existing means of geospatial data acquisition, UAV mapping demonstrated greater flexibility, higher cost-efficiency, improved time-effectiveness, and increased safety. The study results highlighted the emerging potential of UAVs to provide city planning departments with rapid, up-to-date 3D-mapping capabilities to better control urban development amidst the constraints of limited governmental budgets and personnel. The findings emphasize the importance of developing UAV and photogrammetric innovations to support frequent, precise 3D urban infrastructure mapping to inform data-driven policies for sustainable city growth and resilience.
Additionally, UAVs with multispectral imagery have been used to assess losses from herbicide-induced stress by monitoring decreases in vegetation greenness as an indicator of stress [130]. Furthermore, UAVs have found numerous applications in Antarctic environmental research. By capturing detailed imagery and data, UAVs provide valuable insights into the impacts of climate change on Antarctic ecosystems, including changes in vegetation distribution, ice dynamics, and wildlife habitats [131]. UAVs have been used to monitor erosion and sediment control practices [124]. In addition, UAVs have been deployed to assist in mapping shrubland structural attributes, monitoring grassland ecosystems, and studying cryospheric components, such as glaciers and ice shelves [132]. Using UAVs in Antarctic environmental research has enhanced data collection capabilities, improved spatial resolution, and enabled the efficient monitoring and assessment of the unique and fragile Antarctic environment.
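As an illustration of how multispectral UAV data are turned into a vegetation stress indicator, the sketch below computes the widely used normalized difference vegetation index (NDVI) from co-registered red and near-infrared bands. The band arrays and the 0.4 stress threshold are hypothetical placeholders, not values taken from the cited studies.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(np.float32)
    red = red.astype(np.float32)
    return (nir - red) / (nir + red + eps)

# Hypothetical co-registered reflectance rasters from a multispectral UAV survey.
nir_band = np.random.rand(512, 512).astype(np.float32)
red_band = np.random.rand(512, 512).astype(np.float32)

index = ndvi(nir_band, red_band)
stressed = index < 0.4          # illustrative threshold for stressed or sparse vegetation
print(f"Share of pixels flagged as stressed: {stressed.mean():.1%}")
```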
UAVs have proven useful for mapping the erosion and accretion patterns in coastal environments vulnerable to sea-level rise. A study in Ghana combined LiDAR elevation data and visual imagery to analyze shoreline changes from extreme storms [133]. UAVs also enable the rapid assessment of coastal flooding extents during storm surges and tide cycles using visual cameras or thermal sensors to identify inundated areas [134]. These high-resolution shoreline data help improve coastal flood modeling and predictions. In the domain of emission monitoring, researchers have used UAV-based methane detection and quantification to track wetland emissions and offshore gas seeps [135]. Moreover, multiple studies have utilized UAVs to measure carbon dioxide concentrations and fluxes in complex urban environments [136,137,138]. Such flexible aerial sampling can improve emissions inventories and our understanding of climate feedback. Furthermore, UAVs equipped with gas sensors have been used to monitor and evaluate the environmental impacts of specific industries, such as mining operations. These UAV-based monitoring systems enable the measurement of greenhouse gases and particulate matter, providing insights into the emissions generated by mining activities and their potential effects on the surrounding environment [139].
UAVs are emerging as a critical tool for wildfire agencies to map active blazes and their progression in real time. Equipped with visual and infrared cameras, UAVs flown over fires provide continuous video footage and imagery to track the leading edge of the fire perimeter [140]. This helps personnel on the ground identify threatening spread directions and allocate resources effectively. Thermal sensors on UAVs can penetrate smoke to pinpoint hotspots and flare-ups [141]. Immediate aerial views of unfolding wildfires aid tactical planning and help maintain responder safety.
In climate hazard assessments, UAVs provide a versatile tool for studying landslide hazards and terrain instability exacerbated by climate change. High-resolution orthomosaic and digital surface model reconstructions from aerial photogrammetry enable the detection and measurement of ground deformations indicative of landslide activity [142,143]. Furthermore, UAVs have been proven to be effective in mapping landslides in highly dense vegetation areas [144]. Combining LiDAR scans and visual data allows mapping discontinuities in rock masses to model failure potentials [145]. UAVs contribute to the characterization of rock masses by extracting the geometric properties of rock mass discontinuities, such as orientation and spacing, providing insights into the stability conditions of rock slopes [146]. Recently, a considerable amount of research has deployed UAVs with infrared thermography, further enhancing the monitoring capabilities, identifying thermal anomalies, assessing rock mass conditions in complex environments, and identifying slope hydrology dynamics influencing instability [147,148]. For characterizing debris flows, UAVs can capture high-resolution images and generate digital terrain models (DTMs) that identify and map debris flow paths, help estimate deposited volumes, and quantify torrent evolution [149].
UAVs have also been deployed for climate modeling and air-quality monitoring, providing access to more granular data. Combining microclimate sensors and air-quality instruments allows UAVs to relate local weather, pollution, and environmental factors to community health. For instance, a study in California gathered hyper-local air-quality data to model pollution exposure risks and guide policies for protecting vulnerable populations. UAVs can also be integrated with advanced technologies, such as artificial intelligence and computer vision, to analyze and interpret the collected data, identifying pollution hotspots, predicting air-quality trends, and supporting early warning systems for potential health risks. As climate change leads to more frequent extreme heat and air pollution episodes, UAV sensor networks can enable real-time monitoring to issue health warnings.

3.2. Urban Challenges and UAV Opportunities

As previously mentioned, climate change places significant pressure on the built environment, with escalating impacts ranging from climate hazards, such as floods, severe storms, and heatwaves, to infrastructure vulnerabilities. UAVs have emerged as a transformative technology to meet cities’ sustainability challenges. With flexible deployment and data capture at neighborhood scales, UAVs enable cities to monitor their built environments and climate risks in real time, while helping diagnose problems and evaluate solutions. The rich spatial and temporal perspectives offered by UAVs have become integral across diverse urban climate research domains, unlocking the actionable intelligence needed by cities on the frontlines of climate disruption. The precision and real-time monitoring capabilities of drones, equipped with advanced sensors, facilitate the acquisition of high-quality data on climate parameters, pollution levels, and environmental factors, enabling informed decision making, efficient resource management, and proactive measures to address climate change challenges. In a world marked by increasing urbanization and environmental pressures, this high-resolution data accessibility is pivotal for enhancing urban planning, infrastructure design, and policy making, and for crafting cities that are both well-prepared for climate-induced threats and genuinely sustainable and livable for their inhabitants.
A.
Urban Microclimate Assessment
One of the main domains where UAVs play an integral role in climate change research in cities is mapping and analyzing localized phenomena, such as urban heat islands (UHIs). UAVs with thermal infrared cameras can rapidly map fine-scale urban surface temperature variations to delineate heat islands and cooler areas linked to land cover. Flights at altitudes below 300 meters provide pixel resolutions down to 10–50 cm, capturing microclimate gradients within neighborhoods. A study by Naughton [68] leveraged drone thermal cameras to map micro-scale temperature variations down to individual city blocks in Texas, USA. By combining these UAV thermal data with land-cover information, the researchers identified localized hotspots correlated with limited greenery, poorly performing roofs, and urban materials. This granular UHI quantification and diagnosis assisted targeted interventions in the built environment to mitigate heat risks. Moreover, UAV thermal data, coupled with satellite imagery, have been used to correlate temperatures with vegetation, impervious surfaces, building density, and road types across cities to provide a comprehensive assessment of the dynamics of urban microclimates [150]. This helps model microclimate interactions with the built environment and identify the localized areas most vulnerable to extreme heat.
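The relationship between flight altitude and pixel resolution follows from simple camera geometry. The hedged sketch below estimates the ground sample distance (GSD) of a nadir-pointing thermal camera; the sensor width, pixel count, and focal length are assumed example values rather than a specific product specification.

```python
def ground_sample_distance(altitude_m, focal_length_mm, sensor_width_mm, image_width_px):
    """Ground sample distance (m/pixel) for a nadir-pointing camera (pinhole approximation).

    GSD = (altitude * sensor_width) / (focal_length * image_width_px)
    """
    return (altitude_m * sensor_width_mm) / (focal_length_mm * image_width_px)

# Hypothetical thermal camera: 640 pixels across a 10.88 mm sensor behind a 13 mm lens.
for altitude in (50, 100, 300):
    gsd = ground_sample_distance(altitude, focal_length_mm=13.0,
                                 sensor_width_mm=10.88, image_width_px=640)
    print(f"{altitude:>3} m AGL -> {gsd * 100:.1f} cm/pixel")
```

With these assumed parameters, the output spans roughly 7–40 cm per pixel between 50 m and 300 m above ground level, consistent with the resolution range quoted above.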
In addition to thermal imaging applications, UAVs have been deployed to collect high-resolution aerial imagery and LiDAR data to construct detailed 3D models of urban morphology. Photogrammetry processing generates point clouds, mesh models, and orthomosaics, classifying features such as buildings, roads, and vegetation [2]. LiDAR complements the visual data with urban elevation mapping [151]. Classified 3D models enable the analysis of urban geometry, including sky view factors, building heights, and canyon orientations influencing solar exposure and heat retention [152]. By correlating land-surface temperatures with land-cover types derived from UAV-based 3D models, researchers can gain insights into the factors contributing to urban heat island formation and develop strategies for mitigating their effects [153]. Furthermore, a study by Yang et al. [154] examined the spatial and temporal variations of heat islands in Zhengzhou, China, considering the influence of urbanization and urban forestry. The drone imagery helped assess the effectiveness of urban forest construction in mitigating the urban heat island effect.
B.
Building Envelope Performance
UAVs equipped with infrared sensors have emerged as an essential tool for building envelope inspections and building energy audits. One major application is using thermographic cameras mounted on drones to detect heat leaks, air infiltration, exfiltration, and insulation deficiencies in roofs and facades. UAVs can be easily deployed to survey every side of a building’s structure from optimal proximity compared to ground-based thermography, providing more accurate insights into the thermal performance of building envelopes [155]. A study by Rathinam et al. [156] presented a comprehensive exploration of the utilization of UAVs for real-time structure detection and tracking, focusing on linear structures, such as roads, highways, and canals, based on visual feedback. The research tackled two critical components of this challenge: vision-based structure detection and controlling the UAV to follow the structure accurately. The insights provided by this paper have broad applications in different domains. For instance, in infrastructure monitoring, the paper highlights the potential for autonomous UAVs to inspect and monitor structures, such as pipelines, roads, and power grids. This capability can enhance safety and reduce operational costs. The paper also emphasizes the significance of this technology in disaster responses. Following events such as earthquakes or other natural disasters, where infrastructures can be severely affected, autonomous UAVs can rapidly assess the damage and provide real-time visual feedback, aiding in efficient recovery efforts. Additionally, the ability to autonomously monitor critical infrastructure can have implications for homeland security, ensuring the integrity of key assets, such as bridges, roads, railways, and power transmission corridors.
Producing orthorectified thermal maps reveals the location and severity of envelope flaws causing energy waste, which can be addressed through suitable retrofit strategies. A study by Falorca and Lanzinha [157] explored the use of drones for facade inspections and building envelope pathology diagnoses. Drone technology proved to be an effective and promising alternative methodology for supporting the technical inspection and diagnosis of building envelope pathologies. UAVs also enable low-cost frequent inspections to monitor repairs or catch new leaks as they emerge post-occupancy or post-retrofit. Moreover, UAVs can help facilitate rapid scanning to map cracking and spalling that undermine structural integrity. The photogrammetry processing of visual imagery can construct 3D point clouds capable of millimeter-scale crack detections [158]. Utilizing infrared sensors also highlights areas of moisture intrusion within wall systems signaled by temperature differentials [119]. A study by Alzarrad et al. [159] presented a promising approach to automating sloped roof inspections using UAVs and deep learning. It highlighted the significance of such technology in enhancing efficiency, reducing safety risks, and providing accurate assessments of roof conditions. While acknowledging the limitation of a small dataset, the paper suggested that, with more data, the model’s accuracy could be further improved. Overall, this study contributed to the advancement of automated inspection techniques, paving the way for more efficient and safer practices in the construction and insurance industries.
Detailed UAV-based envelope inspections enable proactive maintenance to avoid expensive repairs and enhance resilience to climate impacts. In addition, hyperspectral data can detect moisture content and vegetation health in green roofs [160]. Photogrammetry models produced from UAV data help quantify surface buckling, blistering, and deformation over large areas. For moisture intrusion diagnosis, combining thermal and visible imagery allows precise inspection. Thermal sensors lack a visual context, while moisture below surfaces remains invisible in visible bands; fused thermal–visible data, however, confirm and localize the anomalous heat signatures related to water damage [161]. UAVs also enable safer roof and envelope inspections than manual auditing, reducing the time and cost spent on traditional manual surveys.
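As a rough illustration of how fused thermal–visible data can flag envelope defects, the following sketch applies a z-score threshold to a thermal raster restricted to a facade mask derived from the visible image. The arrays, the mask, and the 2.5σ threshold are hypothetical; a real inspection workflow would add emissivity correction, weather screening, and expert review.

```python
import numpy as np

def thermal_anomalies(thermal_k: np.ndarray, facade_mask: np.ndarray, z_thresh: float = 2.5):
    """Flag facade pixels whose temperature deviates strongly from the facade mean.

    thermal_k   : per-pixel surface temperature raster (kelvin), co-registered with the RGB image
    facade_mask : boolean mask of facade pixels (e.g., from segmenting the visible image)
    Returns a boolean raster of candidate heat-loss or moisture anomalies.
    """
    facade_temps = thermal_k[facade_mask]
    mu, sigma = facade_temps.mean(), facade_temps.std() + 1e-6
    z = np.zeros_like(thermal_k, dtype=np.float32)
    z[facade_mask] = (thermal_k[facade_mask] - mu) / sigma
    return np.abs(z) > z_thresh

# Hypothetical inputs: a 480x640 thermal frame and a mask derived from the RGB image.
thermal = 283.0 + np.random.randn(480, 640).astype(np.float32)
mask = np.ones((480, 640), dtype=bool)
print(int(thermal_anomalies(thermal, mask).sum()), "anomalous pixels flagged")
```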
C.
Inspection and Monitoring of Urban Infrastructures
UAVs play a significant role in urban infrastructure assessments as they can rapidly map infrastructure conditions at high resolutions to identify risks and needs before failures occur. For example, UAVs enable the detailed inspection of bridges, roads, railways, and waterways to detect corrosion, cracks, scour, or structural issues exacerbated by escalating extreme weather [156,162]. Multi-angle imagery provides comprehensive documentation of asset states across surfaces that are otherwise challenging to inspect. UAVs similarly facilitate assessing power grid assets, such as above-ground transmission towers and lines, for weathering and vegetation overgrowth under altered climate conditions [36]. Integrating electromechanical sensors on drones enables electromagnetic field measurements around components to detect developing faults. For pipeline rights of way, LiDAR and hyperspectral UAV data help maintain vegetation clearing for asset access and safety, and support rapid damage assessments of utilities after severe climate events.
UAVs are useful for periodic inspections, and their automated flight capabilities enable the continuous monitoring of infrastructure conditions to identify issues as they emerge. For example, UAVs equipped with thermal and visible cameras can autonomously conduct weekly flights over bridges to track the progression of fatigue cracks or corrosion damage on concrete and steel elements [163]. This frequent imaging creates time-series data to identify when deterioration exceeds the allowable thresholds, triggering needed repairs. UAVs similarly facilitate the regular assessment of road surfaces, rail tracks, buried pipelines, and canal levees to quantify incremental weathering, subsidence, or erosion [164]. Operators can implement more frequent inspections of assets identified as higher priority or already in degraded states. Researchers have also explored various approaches for UAV-based traffic monitoring, as surveyed in [165]. With their mobility and sensor integration, UAVs offer new opportunities to continuously assess road conditions and traffic flows beyond what static systems allow. The time-series data provided by repeat UAV flights can assist in modeling climate impact accumulation rates and targeting retrofits to extend asset lifetimes. Continuous monitoring also supports the evaluation of repair durability and new materials under actual field conditions. By enabling low-cost, persistent oversight, UAVs are a pivotal tool for adaptive infrastructure management amidst intensifying climate stresses.
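One simple way to turn repeat-inspection measurements into a maintenance trigger is to fit a trend to a monitored defect dimension and estimate when it will cross an allowable threshold, as in the hedged sketch below. The crack widths, survey interval, and 0.5 mm limit are illustrative assumptions rather than values from any cited study.

```python
import numpy as np

def days_until_threshold(days: np.ndarray, widths_mm: np.ndarray, limit_mm: float):
    """Fit a linear trend to periodic crack-width measurements and estimate the day
    on which the allowable limit is reached. Returns None if the trend is not growing."""
    slope, intercept = np.polyfit(days, widths_mm, deg=1)   # mm per day, mm
    if slope <= 0:
        return None
    return (limit_mm - intercept) / slope

# Hypothetical widths extracted from weekly UAV photogrammetry of a bridge girder.
days = np.array([0, 7, 14, 21, 28, 35])
widths = np.array([0.31, 0.32, 0.34, 0.35, 0.37, 0.38])     # mm
print(f"Estimated day the 0.5 mm limit is reached: {days_until_threshold(days, widths, 0.5):.0f}")
```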
D.
Assessment of Climate Hazard Impacts and Emergency Response Coordination
In the domain of climate hazard assessments, UAVs are emerging as an invaluable tool for rapidly assessing infrastructure damage and environmental impacts following climate-related disasters. Following floods, storms, wildfires, and other events, UAVs provide rapid aerial-based assessments when manual assessments are restricted due to safety concerns [166,167]. Visible and multispectral UAV sensors supply detailed visual evidence of destruction, which is critical for compiling robust post-disaster reports [168,169]. UAVs also enable the scoping of disaster extents across neighborhoods and cities to allocate response capacity. Another application is coordinating emergency response logistics using UAVs as communication relays and cargo transporters. UAVs can provide wireless connectivity to areas with destroyed infrastructures and monitor personnel deployments [170]. Using thermal and multispectral sensors, UAVs help locate victims during search-and-rescue operations, especially in inaccessible urban areas [171]. For cargo delivery, UAVs assist in transporting critical supplies, such as medical samples, water testing kits, radio equipment, and batteries, when ground transport is hindered post-disaster [172].
Furthermore, UAVs can be utilized to assess flood events and their impacts on urban areas. By collecting high-resolution images and mapping the extent of flooding, UAVs provide valuable information for estimating direct tangible losses to residential properties and assessing the overall impact of floods. These data can support emergency response coordination, aid in resource allocation, and inform decision-making processes for future flood mitigation and adaptation measures [173]. Another critical value of deploying UAVs is providing real-time information on the behavior and impacts of natural hazards, such as wildfires, landslides, and coastal erosion. These data are crucial for emergency response coordination, enabling authorities to make informed decisions and perform actions in a timely manner to protect communities and critical infrastructure [174]. UAVs also aid impact attribution studies by providing baseline mapping to compare with post-disaster conditions. As the climate risks increase, UAVs are becoming integral to assessing hazards, coordinating responses, and building climate resilience.

3.3. Data Collection and Processing Platforms

As outlined above, UAVs provide ample opportunities for more accurate and rapid monitoring and assessment of numerous climate change challenges in the built environment. The existing literature reveals various frameworks to structure UAV operations and data collection workflows. Khan et al. [175] proposed a comprehensive multi-stage model for UAV-based sensing encompassing: (i) defining the operational scope and objectives, (ii) flight planning and route design, (iii) UAV deployment for planned data acquisition, (iv) sensor data gathering, (v) the processing and analysis of the collected data, and (vi) interpretation and communication of analysis results. Similar staged methodologies aim to provide end-to-end guidance for executing UAV remote sensing missions from initial planning through data extraction and interpretation. A systematic workflow is critical for effectively leveraging UAV capabilities across diverse monitoring and inspection applications. This multistep framework can be aggregated into three primary stages: (1) mission planning, (2) data acquisition, and (3) data output or processing. Figure 11 illustrates the key parameters associated with each stage and the average time spent.
The mission planning stage involves flight planning based on the sensing objectives. Key mission parameters, such as altitude, speed, overlap, routing, sensor payload, flight duration, and required resolution, are determined to ensure the desired area is covered at an appropriate resolution. Sensor payload selection is critical to gathering relevant information, whether with RGB, multispectral, thermal, or LiDAR systems. After mission scoping, detailed flight plans are designed specifying automated flight routes, sensor triggering locations, altitudes, and speeds; these parameters ensure full coverage at the desired resolutions, orientations, and overlap percentages between captured images. With the flight mapped out, the UAV can later be deployed to autonomously follow the programmed path and trigger sensor recording [176]. Additionally, mission planning includes data collection parameters, such as sensor calibrations and ground control point (GCP) design. The mission planning time ranges from hours to weeks depending on the mission scale and complexity, representing around 5% of the overall data acquisition process [176]. Thorough planning is crucial for successful data acquisition.
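To illustrate how overlap targets translate into concrete flight parameters, the following sketch computes image footprints, flight-line spacing, and photo spacing from a pinhole-camera approximation. The sensor dimensions, focal length, and overlap percentages are assumed example values, not a prescribed configuration.

```python
def flight_plan_spacing(altitude_m, focal_length_mm, sensor_w_mm, sensor_h_mm,
                        front_overlap=0.80, side_overlap=0.70):
    """Image footprint and spacing for a nadir survey at a given altitude.

    Footprint = altitude * sensor_dimension / focal_length (pinhole approximation)
    Spacing   = footprint * (1 - overlap fraction)
    Returns (footprint_w, footprint_h, flight_line_spacing, photo_spacing) in metres.
    """
    footprint_w = altitude_m * sensor_w_mm / focal_length_mm    # across-track
    footprint_h = altitude_m * sensor_h_mm / focal_length_mm    # along-track
    line_spacing = footprint_w * (1.0 - side_overlap)
    photo_spacing = footprint_h * (1.0 - front_overlap)
    return footprint_w, footprint_h, line_spacing, photo_spacing

# Hypothetical 1-inch RGB sensor (13.2 x 8.8 mm) behind an 8.8 mm lens, flown at 100 m AGL.
w, h, line, photo = flight_plan_spacing(100, 8.8, 13.2, 8.8)
print(f"Footprint {w:.0f} x {h:.0f} m; fly lines every {line:.0f} m, trigger a photo every {photo:.0f} m")
```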
The data acquisition stage encompasses UAV deployment, in-field pre-flight checks, data capture flights, recording sensor data, and calibration of GCPs. A certified pilot operates the UAV according to the mission plan. Test flights validate aircraft performance before the full flights are conducted. The UAV then autonomously navigates the planned routes using GPS, while recording imagery, video, LiDAR, or other sensor data. The total flight time depends on the sampling scale, from less than an hour for a small site to days for large areas. Sensor and telemetry data are retrieved post-flight for cleaning and processing, a step that can account for up to 20% of the total process time.
In the data processing stage, the collected aerial data enter processing workflows to generate analysis-ready outputs. As shown in Figure 11, the processing steps vary by data type, but often involve photogrammetry, geospatial referencing, 3D modeling, topography modeling, feature detection and classification, and/or index calculation. For example, overlapping UAV images are stitched into orthomosaic maps and digital surface models through structure-from-motion algorithms. LiDAR point clouds also undergo classification and terrain modeling. Specialized software streamlines the processing; however, skilled analysts are required to ensure data quality. The results are interpreted to derive actionable insights based on the mission objectives and different application needs, as shown in Figure 11.
The proliferation of UAV-based aerial mapping sparked the development of specialized photogrammetry software to process the captured imagery into 3D reconstructions, point clouds, and orthomosaic maps. As UAV adoption expands across sectors ranging from construction to conservation, selecting the appropriate data processing workflow is pivotal to efficiently transforming raw sensor data into actionable insights. A range of photogrammetry solutions now exists, from user-friendly commercial packages designed for simplified mapping workflows to open-source options offering advanced customizations, as shown in Figure 12. Cloud computing has also created new web-based processing capabilities. Determining the optimal solutions requires aligning processing capabilities with program needs and resources. However, as sensors and airframes advance, the continued evolution of computation power and algorithms will further unlock the potential of UAV-derived data across applications.
Popular commercial platforms include Pix4D, DroneDeploy, PrecisionMapper, and ArcGIS Drone2Map for general mapping needs [1]. These streamline workflows for common applications, such as agriculture, surveys, and inspections. Photogrammetry-focused, open-source options, such as OpenDroneMap, provide more flexibility for advanced users. Some sensor manufacturers also offer specialized processing suited to visual, thermal, LiDAR, or spectral data, as illustrated in Figure 12.
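As a hedged example of an open-source workflow, the snippet below invokes the OpenDroneMap container from Python to produce an orthomosaic and a digital surface model from a folder of survey images. The dataset path and project name are hypothetical, and flag names or defaults may differ across ODM releases, so the invocation should be checked against the version in use.

```python
import subprocess

# Hypothetical local dataset layout: /home/user/datasets/campus_survey/images/*.JPG
# Options follow the OpenDroneMap CLI; verify against the installed release.
cmd = [
    "docker", "run", "--rm",
    "-v", "/home/user/datasets:/datasets",       # mount the survey folder into the container
    "opendronemap/odm",
    "--project-path", "/datasets", "campus_survey",
    "--dsm",                                     # also produce a digital surface model
    "--orthophoto-resolution", "5",              # target orthophoto resolution (cm/pixel)
]
subprocess.run(cmd, check=True)
```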
Cloud-based processing through web services offered by industry leaders, such as Pix4D and DroneDeploy, allows rapid analysis without local computing resources [2,46]. However, cloud processing incurs ongoing subscription fees versus one-time software purchases. For projects dealing with sensitive data, local processing may be preferred or required to ensure security and prevent external data storage [40]. However, this requires an investment in high-performance workstations for intensive computation. Selecting optimal processing platforms depends on weighing factors, such as usage costs, processing times, data security needs, analysis requirements, and team experience level. For most inspection and mapping applications, user-friendly commercial photogrammetry suites require minimal training while providing speed, scalability, and robust outputs [24,177]. Open-source tools better suit research needs for customized analyses, such as environmental monitoring applications and climate change research. Cloud platforms offer convenience but may raise security concerns; weighing these tradeoffs is important to match processing solutions to application goals.

4. UAVs and Artificial Intelligence

Artificial intelligence (AI) refers to computer systems designed to perform tasks and exhibit behaviors typically requiring human intelligence. AI incorporates machine learning and deep learning techniques to develop algorithms that can improve through experience and data analysis rather than explicit programming. By mimicking abilities like reasoning, perception, prediction, and decision making, AI enables the automation and augmentation of complex cognitive tasks across diverse fields. From powering medical diagnoses to optimizing supply chains, AI is transforming how many human-centric problems are addressed. As AI research continues to advance, the technology is expected to become even more deeply integrated into society, enhancing services, operations, and quality of life in ways not otherwise possible. AI has already delivered innovations that make our living conditions more pleasant, efficient, and safe by handling complicated challenges beyond the scope of traditional computational approaches.
AI is advancing rapidly, with daily innovations integrating AI into diverse tasks and systems. While previous AI applications focused on modest capabilities, such as image classification or auto-piloting functions, researchers now aim to develop more complex and general AI that matches or exceeds human-level proficiency. This involves tackling more cognitively demanding skills, ranging from strategic games, such as chess and Go, to proving abstract mathematical theorems. Through machine and deep learning advances, AI systems are becoming increasingly sophisticated at perceiving their environments, reasoning about optimal actions, and generalizing knowledge across domains. While current AI still falls short of human flexibility and contextual understanding, the field continues to achieve breakthroughs in natural language processing, computer vision, robotics, and other facets required for more broadly intelligent systems. As research progresses, AI is expected to produce transformative technologies that complement and enhance human capabilities in revolutionary ways across education, business, transportation, healthcare, and many aspects of daily life.
UAVs are increasingly augmented with AI capabilities that mimic human-like cognition to perform more sophisticated operations. AI enables drones to exhibit advanced abilities, such as independent decision making, navigating unfamiliar environments, predicting outcomes, and coordinating with humans or other drones [180]. Rather than just following pre-programmed routines, AI algorithms allow drones to dynamically process sensor data, identify patterns, and determine optimal actions based on the situation at hand. This facilitates more flexible autonomous flight, complex data analysis, and coordination on previously infeasible missions without human oversight. As drone hardware improves in tandem with AI advances, UAVs can take on ever more challenging tasks to assist people in applications ranging from infrastructure monitoring to emergency response. The integration of AI is transforming drones beyond basic flying machines into intelligent robotic systems capable of performing advanced roles across many industries.
Machine learning, a prominent AI technology, has recently been widely applied in UAVs to improve their wireless communication, routing, collision avoidance, and navigation [181]. Moreover, AI is transforming UAV capabilities by enabling advanced aerial data analysis; for example, deep learning algorithms can rapidly process the visual data captured by UAVs to identify objects, detect changes, and model environments [182]. This allows the extraction of meaningful information from UAV imagery at large scales. For instance, convolutional neural networks classify land cover using UAV orthomosaics with high accuracy [183]. Machine learning also assists in 3D reconstructions from UAV data through improved feature recognition and matching. AI enables UAVs to process large amounts of data rapidly and accurately, making them more efficient and robust. It also enables UAVs to optimize their resources, such as computing, caching, and communication, improving their efficiency and performance [184].
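As a minimal sketch of the land-cover classification idea, the following PyTorch model assigns class labels to small tiles cut from a UAV orthomosaic. The architecture, tile size, and five-class scheme are illustrative assumptions, not the networks used in the cited studies.

```python
import torch
import torch.nn as nn

class LandCoverCNN(nn.Module):
    """Tiny CNN that classifies 64x64 RGB orthomosaic tiles into land-cover classes."""
    def __init__(self, n_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# Hypothetical batch of 8 tiles cut from a UAV orthomosaic
# (example classes: building, road, tree canopy, grass, water).
model = LandCoverCNN(n_classes=5)
tiles = torch.rand(8, 3, 64, 64)
logits = model(tiles)
print(logits.argmax(dim=1))   # predicted class index per tile
```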

4.1. AI and UAVs: Breakthroughs and Crossovers

The rapid evolution of AI and UAV technologies has led to a growing crossover between the two fields. As AI techniques, such as deep learning and machine learning, continue to progress, they are being integrated into UAV technology to expand the capabilities of these systems significantly. AI enables drones to operate with more autonomy, adaptability, and intelligence than ever before. In turn, UAVs serve as ideal testbeds for applying and validating cutting-edge AI innovations, ranging from computer vision to real-time object identification. This fusion is transforming UAVs from remotely controlled devices into intelligent aerial robotic systems, and unlocking emergent applications that were not previously feasible. Key areas where AI advances are intersecting with UAVs include computer vision, autonomous navigation and flight control, human–drone teaming, swarm coordination, mission planning, and air combat tactics.
Machine Vision: Powerful computer vision algorithms enable drones to perceive and map their surroundings for navigation and data capture. Deep neural networks can be trained on visual data to identify objects, interpret scenes, detect obstacles, and track targets. This allows UAVs to autonomously navigate unfamiliar environments, avoid collisions, and follow dynamic objects. AI-enabled computer vision also facilitates automated inspection, surveillance, and mapping tasks by UAVs. A study by Flammini et al. [185] emphasized the significant role that UAVs equipped with AI-driven systems played in improving the efficiency and effectiveness of railway monitoring. The synergistic relationship between UAVs and AI in this context was particularly relevant to climate change research, as it enabled more comprehensive and cost-effective data collection and analyses. Deep learning-empowered analysis of UAV imagery facilitates the rapid mapping of new developments, roads, and buildings, which is critical for applications such as disaster response and urban planning. Automated feature extraction through AI promises to make mapping infrastructure faster, cheaper, and more up to date.
Autonomous Flight Control: AI techniques, such as reinforcement learning, allow drone flight controllers to improve stability, efficiency, and safety through experience. Instead of pre-programmed control, drones can dynamically optimize flight paths and handling in response to aerodynamic effects and environments. Reinforcement learning has been applied to teach complex aerial maneuvers, smooth trajectories, wind gust compensation, and obstacle avoidance [186]. In a study [187], the researchers explored the challenging task of enabling UAVs to autonomously follow forest trails, a problem of considerable importance for various applications, including wilderness mapping and search-and-rescue missions. The authors emphasized that UAVs, especially micro-aerial vehicles (MAVs), have become a viable option for navigating forested environments due to recent technological advancements, making them a valuable tool for environmental monitoring and climate change research. UAVs equipped with advanced AI models, such as deep neural networks (DNNs), can navigate complex forested environments, gather data on vegetation, monitor changes in the terrain, and contribute valuable insights into the impact of climate change on forests and ecosystems. The ability to autonomously follow trails facilitates efficient data collection and enhances the role of UAVs in environmental research. More broadly, AI pilots can coordinate complex maneuvers and contingencies without human guidance. This is critical for long-duration or long-distance flights, where continuous human operation is infeasible.
Human–Drone Teaming: To collaborate more seamlessly with human operators, AI provides drones with natural language and gesture recognition, intention prediction, and adjustable autonomy. This allows intuitive human–drone communication, delegation of high-level goals rather than step-by-step directions, and fluid handoffs of control between automated and manual modes [188]. AI facilitates human–drone team coordination similar to that of all-human teams.
Swarm Coordination: For drone swarms, AI enables decentralized coordination among units to achieve collective goals. Algorithms allow large-scale swarms to cooperatively survey areas, navigate tightly in formation, dynamically self-organize, and exhibit collective intelligence [189]. More advanced swarming AI leverages emergent intelligence principles to complete global objectives. Each drone follows simple rules for assessing its environment and self-organizes based on distributed feedback. This opens applications ranging from distributed sensing to impressive aerial displays.
Mission Planning: AI planning and reasoning algorithms empower drones to translate high-level mission objectives into customized, optimized flight plans accounting for prevailing conditions [190]. Automated mission planning with contingencies reduces the operator workload. During missions, drones can utilize AI algorithms to adjust routes and actions based on sensor data.
Air Combat: In simulated dogfights, AI-controlled drones have defeated human pilots using maneuvering and tactics learned through deep reinforcement learning [191]. While still nascent, AI could eventually revolutionize aerial combat by outthinking human opponents. Drone dogfighting offers a proving ground for advanced air combat AI. As AI research accelerates, UAVs are poised to benefit immensely from embedded intelligence, making them smarter, safer, and more useful across industrial, civil, and defense domains.

4.2. AI-Empowered UAVs: Supporting Adaptation and Mitigation Strategies

AI and machine learning techniques can enhance multiple aspects of UAV-based data collection and analysis vital to climate change research, ranging from data acquisition to analytics. Advanced computer vision enables UAVs to autonomously detect features, such as forest fire boundaries or flood extents, improving monitoring efficiency [192]. In addition, processing large volumes of UAV images using deep learning supports land-cover classifications at regional scales relevant to carbon and hydrologic cycle research. AI agents also show promise for optimizing UAV flight planning and coordination when collecting climate data [193]. Onboard AI computing can guide autonomous target tracking, sensor triggering, and dynamic obstacle avoidance to reduce pilot workload during long UAV climate research flights [187]. Numerous studies have examined how UAVs can utilize artificial intelligence and automated control for navigation and collision avoidance [187,194,195], employing deep neural networks as a machine learning technique to train drone systems using data from sensors and cameras. In a study by Gandhi et al. [187], the researchers built a dataset of over 11,500 crash scenarios to develop a deep learning model that categorized collision risks and instructed drones on how to avoid them. As the datasets grow through groups sharing drone data, deep learning is expected to progress rapidly, leading to drones with more advanced situational analysis, decision-making, and autonomous flight capabilities.
AI creates additional opportunities to integrate UAVs into multivariate climate and Earth system models, assimilating aerial data streams to strengthen model predictive capabilities [196]. As climate impacts accelerate, deep neural networks fed with diverse labeled UAV training data will become invaluable for managing and understanding the climate threats. Investing in AI and UAV integration can significantly extend the value of drones for addressing interconnected environmental challenges. UAVs provide an agile means of gathering high-resolution aerial data to feed into AI-driven predictive models for climate change analysis. For instance, machine learning can help construct robust models forecasting the impacts based on recognizing patterns and correlations in the multivariate time-series data captured by UAVs [197]. Moreover, UAV surveys tracking urban vegetation health, temperatures, demographics, and land use over time can train models to forecast heat illness vulnerability at neighborhood scales under warming projections [198].
The role of neural networks in leveraging UAV datasets also shows promising potential in predicting extreme weather and hazards exacerbated by climate shifts. To refine the predictions, models can continually ingest updated UAV inputs on forest moisture, storm energetics, coastal erosion, or other indicators [197]. Real-time UAV video feeds before disasters, such as hurricanes or wildfires, can improve AI damage and loss models to guide emergency responses [199]. Capturing diverse training data through UAV monitoring enables more accurate AI model encoding of complex climate–environment relationships [200]. Deep learning algorithms can also help uncover, from UAV data, localized variables and relationships driving microclimate patterns that legacy models overlook [201]. For cities developing adaptation plans, machine learning can help strategically optimize new green space, cool surface, and green infrastructure investments to affordably mitigate warming [202]. Moreover, AI enables a more automated and insightful analysis of UAV-derived data to track climate shifts in forest ecosystems. Repeated UAV surveys allow training AI agents to identify subtle spectral indicators of various climate hazards [81]. This AI-generated intelligence helps model climate impacts on urban biodiversity, the built environment’s performance, and infrastructure susceptibility. For reforestation initiatives, AI-assisted UAVs show promise for optimizing planting patterns and monitoring restoration. Machine learning can help design heterogeneous native species mixes and spacing for climate resilience based on local soil, moisture, and topography data [203]. UAVs empower the frequent high-resolution tracking of regeneration rates to identify issues needing intervention. As climate risks accelerate, AI and UAVs are poised to provide comprehensive, scalable solutions for mapping, modeling, restoring, and sustaining natural ecosystems and the built environment.

5. Challenges and Future Directions

The rapid evolution of UAV systems has unlocked transformative possibilities across industries, research domains, and public services. However, myriad technical, regulatory, and ethical challenges remain that constrain the further advancement and integration of UAVs. Regulatory and legal restrictions may constrain the areas and methods for UAV operations. Technical limitations, such as flight duration, sensor resolution, and interference, can affect data quality. Ongoing improvements in areas, such as battery life, robust autonomy, payload capacity, weather resilience, and human–drone interaction, are needed to overcome the current limitations. Furthermore, safety, privacy, and appropriate use cases require resolution through legal frameworks and policies. Realizing the immense potential of UAVs demands focused research and innovation to surmount the existing barriers while aligning technology growth with social values. This section provides an overview of the critical challenges facing UAV integration and the promising directions for future progress that responsibly harness UAV capabilities to address climate change challenges in the built environment.

5.1. Technical Challenges

UAVs face multiple technical constraints that can limit their effectiveness for climate studies in complex urban environments. Restricted flight endurance poses challenges for extended-duration sampling of heat patterns, air quality, and other dynamics across metro regions [36]. As discussed above, most commercial UAVs have less than 30 min of flight time, though advances in batteries and efficiency are helping to extend this. Payload restrictions also constrain the size, weight, and combinations of scientific sensors used to characterize urban climate factors. While rotary-wing UAVs are highly agile and adaptable, their flight endurance remains limited due to battery life constraints. A single battery charge often does not provide sufficient power for long-distance missions. Factors, such as hovering, maneuvering, payload weight, speed, and headwinds, can negatively impact the battery’s life and range [204]. In practice, rotary drones may need to stop for recharging multiple times before reaching faraway destinations.
A.
Energy Use and Payload
Efforts have been made to improve drone energy efficiency and extend flight times. Airframes composed of lightweight carbon fiber reduce power needs for lift [205]. Brushless motors also offer superior power-to-weight ratios that minimize energy draw. However, rotary drones cannot simply add more batteries to increase endurance due to strict weight limits. Research shows that an optimal mass exists, above which any added weight reduces stability and flight time for rotorcraft [206]. While incremental gains in efficiency continue being made, the fundamental constraints on rotary drone endurance remain difficult to overcome given current battery technologies. Long-range missions may require fixed-wing platforms, advanced hybrid designs, or foundational innovations in energy storage.
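A back-of-the-envelope endurance estimate helps explain why added payload or battery mass yields diminishing returns: flight time is roughly usable battery energy divided by average power draw. The sketch below illustrates this with figures that are assumed example values for a small quadcopter, not a specific product.

```python
def hover_endurance_min(battery_wh, usable_fraction, hover_power_w, payload_power_w=0.0):
    """Rough flight-time estimate: usable battery energy divided by average power draw."""
    usable_wh = battery_wh * usable_fraction
    return 60.0 * usable_wh / (hover_power_w + payload_power_w)

# Hypothetical quadcopter: 90 Wh pack, 80% usable depth of discharge,
# 250 W average hover draw, plus 15 W for a thermal camera and gimbal.
print(f"{hover_endurance_min(90, 0.8, 250, 15):.0f} min")   # ~16 min
```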
The limited payload capacity of lightweight UAVs also poses challenges. Their inability to carry heavier onboard equipment, such as cameras, multiple sensors, and scanning devices, restricts functionality, since applications often require heavy sensor systems, including laser scanners, ultrasonic sensors, RADAR, and LADAR. Additionally, cameras used on drones can be costly yet offer low resolutions; for instance, most thermal cameras provide resolutions around 640 × 480 pixels and cost USD 2000–5000. Thermal aerial imaging also contends with factors impairing its quality, including emitted/reflected radiation, distance to target, and atmospheric moisture. Enhancing UAV payload capacity, sensors, and imaging therefore continues to pose technology and cost challenges.
B.
Automation and Data
While UAVs offer tremendous potential for autonomous data capture and analysis, limited onboard computational resources pose challenges. The restrictive size, weight, power, and cost budgets of UAV platforms constrain the processing that can be conducted on board. This impacts the feasibility of performing intensive computations, such as high-definition video encoding, 3D modeling, AI inferencing, and other algorithms requiring substantial computing performance in real time during flight. Since UAVs must react rapidly to dynamic environments, optimal path planning algorithms are often not viable because they require heavy processing unsuitable for real-time operation. While advances are being made with onboard GPUs to expand in-flight computation [207], developing efficient drone path planning and control methods that work within tight processing constraints remains a significant challenge. As drones take on more sophisticated roles, their limited onboard computational resources require careful management through strategic workload balancing and system optimization. Limited autonomy and artificial intelligence capabilities pose another barrier, requiring intensive human piloting and analytic effort. Advanced computer vision and AI would enable more intelligent target identification, adaptive sampling, and decision making by UAVs over cities. Weather-resilience shortcomings, such as limited wind tolerance, restrict reliable and regular UAV data capture for climate trend analysis [208]. Ongoing improvements to automated stability and envelope protection will help operations persist in diverse conditions. Finally, the uncertainty surrounding UAV impacts on privacy and noise may limit the urban deployment potential.
C.
Weather, Communication, and Path Planning
Recent research explores solutions to improve UAV path planning and navigation in harsh weather conditions. Wind, rain, fog, and other extreme weather conditions can impair drone sensors, reduce visibility, and decrease endurance [209,210]. To address these challenges, some studies propose risk assessment and path planning algorithms that avoid severe weather regions [211]. Other works apply numerical analyses using the Euler method to predict drone trajectories under wind loads [212]. Tilting drones based on wind direction can minimize camera deviations when tracking ground objects. Since fog severely limits visibility, it also poses problems for camera-based obstacle avoidance [213]. Overall, advances in drone design and path planning algorithms show promise for enabling successful navigation despite precipitation, wind, fog, and other adverse weather.
Collision avoidance in constantly changing environments has also posed challenges for UAVs’ path planning [214]. Since complete environment information is unavailable before drone takeoff, unknown stationary or moving obstacles may hinder planned flight paths. One research area involves developing sensors, such as LiDAR and ultrasonic sensors, to detect obstacles during flight [215]. Other research focuses on real-time obstacle avoidance by incorporating sensor data. For instance, [216] proposed using a Gaussian process occupancy map to predict collision probabilities along a path from collected sensor data; the algorithm then selected candidate waypoints for path generation. More recently, learning-based algorithms were suggested for navigation in dynamic environments [217,218]. By learning from past drone actions, these techniques can generate paths without prior knowledge of the surrounding environment. Coordinating multiple drones without inter-drone collisions in unknown environments has also emerged as a challenge. Lastly, integrating UAVs with AI and IoT technologies presents both opportunities and challenges. While AI can enhance the capabilities of UAVs in data processing and analysis, it also introduces challenges related to data privacy, security, and real-time decision making. Integrating UAVs with IoT systems requires robust communication networks and protocols to ensure seamless data transmission and integration [219]. Overcoming these technical challenges is essential for maximizing the potential of UAVs in climate change research in cities.
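To make the grid-based planning idea concrete, the hedged sketch below runs a textbook A* search over a small occupancy grid in which obstacle or no-fly cells are marked. It assumes unit step costs and a static map, and is not a reproduction of the cited methods.

```python
import heapq
from itertools import count

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (1 = obstacle / no-fly cell, 0 = free).
    Returns a list of (row, col) cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])        # Manhattan heuristic
    tie = count()                                               # tie-breaker for the heap
    open_set = [(h(start, goal), next(tie), start)]
    g_best, parent = {start: 0}, {start: None}
    while open_set:
        _, _, node = heapq.heappop(open_set)
        if node == goal:                                        # reconstruct the path
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        r, c = node
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nb
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g_best[node] + 1
                if ng < g_best.get(nb, float("inf")):
                    g_best[nb], parent[nb] = ng, node
                    heapq.heappush(open_set, (ng + h(nb, goal), next(tie), nb))
    return None

# Hypothetical 5x6 map with a wall of no-fly cells; plan from top-left to bottom-right.
grid = [[0, 0, 0, 1, 0, 0],
        [0, 1, 0, 1, 0, 0],
        [0, 1, 0, 1, 0, 1],
        [0, 1, 0, 0, 0, 1],
        [0, 1, 1, 1, 0, 0]]
print(astar(grid, (0, 0), (4, 5)))   # prints a shortest collision-free route
```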

5.2. Regulations and Security

Despite the rapid growth of UAVs, there is an urgent need for regulatory bodies to develop standardized regulations for operating drones across different countries. A major obstacle to broader UAV adoption is the ambiguity or lack of clear standards and regulations around permitted airspace, size, height limits, privacy considerations, cost, safety requirements, and operating characteristics. In addition, there is a lack of consistency in government rules for UAV deployment. UAVs can impact commercial airplane navigation; therefore, countries should implement proper regulations and procedures for drone operations. For example, in the US, the Federal Aviation Administration (FAA) has issued certificates and air traffic rules for UAV operations [220]. The regulatory and policy frameworks surrounding UAV operations in urban areas pose additional challenges. The lack of complete and operational regulatory frameworks and traffic management infrastructure hinders the widespread use of UAVs in urban airspace [219].
Addressing these challenges requires the development of comprehensive regulations and standards that ensure the safe and efficient operation of UAVs in urban environments. Additionally, policy frameworks need to be designed to support the integration of UAVs into climate change research and mitigation strategies in cities. AI algorithms for regulatory compliance can provide a viable approach to support risk mitigation, improved confidential data security, and rapid adaptation to new regulations. With AI-enabled systems, UAVs can achieve authorization for domestic use across various countries by ensuring they meet the specific operational rules of each geographic location, allowing drones to be rapidly deployed in many nations while satisfying each region’s requirements for risk, data privacy, and flight operations. Ensuring the security of sensitive UAV data, such as position and location, presents another critical challenge. Many UAV communication links lack encryption, leaving drones vulnerable to hijacking, hacking, and data theft; hackers can gain complete control of a UAV to steal data, invade privacy, or conduct illegal activities, such as smuggling. This poses significant security risks, especially for military UAVs, where sensitive information can be leaked. Protecting position, navigation, and other critical data is therefore an urgent priority.
Another concern regarding the use of UAVs, especially in cities, is privacy. UAV cameras and equipment that can take photographs or record videos raise significant privacy and security concerns, potentially violating individual rights [221]. In the USA, the Center for Democracy and Technology (CDT) advised the Federal Aviation Administration (FAA) to develop regulations explicitly preserving privacy. Privacy by Design (PbD) was introduced to address this challenge, supporting remedies for privacy breaches [222]. PbD regulations notably restrict intrusive UAV monitoring. Additionally, persistent UAV flights over company facilities can damage market value by exposing trade secrets and strategic plans. Drones equipped with cameras and sensors pose risks of infringing on privacy rights and commercial secrets through persistent surveillance.

5.3. The Future of UAVs

While AI approaches, such as machine learning and neural networks, have been implemented in UAVs, deep learning and reinforcement learning solutions have yet to be fully adopted [223,224], mainly due to the power and processing limitations of drone platforms. Researchers should develop novel deep learning strategies tailored to UAVs, especially for search-and-rescue (SAR) missions, where such schemes can provide context-aware decisions and trajectory-based learning. Developing deep learning solutions that work within the tight power and processing constraints of drones remains an open challenge, and represents an opportunity for AI research to unlock autonomous decision-making and learning capabilities for UAVs used in search-and-rescue and other applications.
A.
Power Supply and Flight Duration
To enable extended UAV missions, researchers should investigate novel strategies for energy harvesting, advanced battery materials, and efficient power control and consumption mechanisms. Lightweight, high-density battery chemistries and intelligent power management can help optimize energy usage during flights and support increased flight times. Moreover, further research into optimizing aerodynamic designs, lightweight structures, and energy-efficient components can improve power consumption and range, while mechanisms for in-flight charging or swappable power packs offer additional flexibility to extend flight duration. As researchers make progress in harvesting energy, storing power, and managing consumption efficiently, UAVs will be able to take on more ambitious applications in science, commerce, and remote sensing.
B.
Data and Sensor Capabilities
To support efficient anomaly detection, urban mapping, and other environmental applications, researchers should focus on implementing new sensing devices, specialized cameras, multi-criteria decision algorithms, multispectral imagery, edge/fog computing coexistence [225], and remote sensing and positioning mechanisms for UAVs. Advancing drone-based deforestation mapping, for instance, requires integrating emerging technologies, such as multi-sensor systems, hyper/multispectral cameras, distributed computing architectures, and high-precision geospatial tracking. These capabilities can enable the advanced detection of tree species, biodiversity richness, and urban ecosystem characteristics by aerial UAV platforms. Another promising area for UAV applications in climate change research is data filtering, which can be implemented on UAVs to prevent redundant data, limit duplicates, block illegal access, and filter false location information. UAVs can also be integrated with emerging intelligent reflecting surfaces (IRSs) to enhance the positioning, localization, and sensing (PLS) performance of IRS-assisted UAV systems in diverse scenarios, boosting navigation and sensing abilities in complex urban environments without substantially increasing computational complexity or compromising overall system capabilities [226].
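As a simple illustration of onboard data filtering, the sketch below drops duplicate GPS fixes and physically implausible position jumps from a telemetry stream. The coordinates, distance and time thresholds, and speed limit are hypothetical values chosen for the example, not parameters from the cited work.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in metres."""
    r = 6_371_000.0
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))

def filter_fixes(fixes, max_speed_mps=30.0):
    """Drop duplicate or implausible GPS fixes from a UAV telemetry stream.

    fixes: list of (timestamp_s, lat, lon). A fix is rejected if it repeats the previous
    position or implies a ground speed above max_speed_mps (a likely false location)."""
    kept = []
    for t, lat, lon in fixes:
        if kept:
            t0, lat0, lon0 = kept[-1]
            dist = haversine_m(lat0, lon0, lat, lon)
            if dist < 0.5 and t - t0 < 1.0:                    # duplicate report
                continue
            if t > t0 and dist / (t - t0) > max_speed_mps:     # implausible jump
                continue
        kept.append((t, lat, lon))
    return kept

# Hypothetical telemetry: the third fix repeats, the fourth jumps roughly 1 km in one second.
stream = [(0.0, 42.3601, -71.0942), (1.0, 42.3602, -71.0941),
          (1.2, 42.3602, -71.0941), (2.0, 42.3700, -71.0941)]
print(len(filter_fixes(stream)))   # 2 fixes kept
```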
C. Safety and Privacy
Further research should explore the use of UAVs in resilient public safety networks. Key areas include public safety communications, urban disaster response for public health, and blockchain integration to support health monitoring, healthcare delivery, and supply chain tracking during public health emergencies [227]. Such capabilities can enhance the resilience and responsiveness of safety networks built on autonomous drone platforms. However, implementing security algorithms, such as blockchain, in UAV swarms demands additional computing power and storage onboard, which can reduce flight time and introduce latency, so the resource-constrained nature of UAVs must be considered [228]. Further research is needed on optimized blockchain architectures that provide security with lower overhead, and on secure implementations tailored to drones that do not degrade their limited payloads, processing capabilities, and flight endurance. Advancing UAV swarm security will therefore require co-designing security protocols and hardware to maintain capabilities within strict weight, processing, and power limits.
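To illustrate the underlying idea at low overhead, the following sketch implements a toy hash-chained telemetry log, which is the core tamper-evidence mechanism behind blockchain-style logging; it omits consensus, networking, and key management, and is not a protocol from the cited works.

```python
# Minimal sketch (illustrative only) of hash-chained logging for UAV swarm
# telemetry: each record commits to the previous one, so tampering with any
# entry breaks verification downstream. A toy append-only chain, not a
# distributed consensus protocol.
import hashlib
import json
import time

def _digest(fields: dict) -> str:
    return hashlib.sha256(json.dumps(fields, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, payload: dict) -> None:
    """Append a telemetry record linked to the hash of the previous block."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"t": time.time(), "payload": payload, "prev": prev_hash}
    block["hash"] = _digest({k: block[k] for k in ("t", "payload", "prev")})
    chain.append(block)

def verify(chain: list) -> bool:
    """Re-hash every block and check the links; False indicates tampering."""
    for i, block in enumerate(chain):
        expected = _digest({k: block[k] for k in ("t", "payload", "prev")})
        if block["hash"] != expected:
            return False
        if i and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain: list = []
append_block(chain, {"drone": "uav-01", "status": "battery 72%"})
append_block(chain, {"drone": "uav-02", "status": "position fix ok"})
print("chain valid:", verify(chain))
```

Even this stripped-down version shows where the overhead comes from: every record requires hashing and extra storage, costs that grow quickly when full consensus and cryptographic signing are added on resource-limited platforms.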
With further advances in standardized regulations, privacy protections, image-processing algorithms, low-cost sensors, extended flight times, and larger payload capacities, there will be more opportunities to integrate drone technology into diverse applications, such as field crop phenotyping. However, image-processing techniques for UAVs still face critical challenges, including varying image orientations, high overlaps, variable scales, and changing altitudes. Overcoming these issues should be a priority for future UAV research: robust image-processing pipelines that handle overlaps, scales, and orientations will be vital to fully unlocking the value of UAVs with enhanced flight endurance, sensors, and payloads in real-world agricultural, commercial, and scientific applications.
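As one minimal illustration of coping with varying orientations and scales between overlapping frames, the sketch below matches ORB features between two UAV images with a ratio test; the image paths are placeholders, and the pipeline is a generic OpenCV example rather than a method proposed in this paper.

```python
# Minimal sketch (illustrative, placeholder image paths): matching ORB keypoints
# between overlapping UAV frames. ORB descriptors are rotation-invariant and
# moderately scale-invariant, which helps with varying image orientations and
# altitudes. Requires the opencv-python package.
import cv2

img_a = cv2.imread("frame_0001.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder path
img_b = cv2.imread("frame_0002.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder path
assert img_a is not None and img_b is not None, "replace placeholder paths with real frames"

orb = cv2.ORB_create(nfeatures=2000)
kp_a, des_a = orb.detectAndCompute(img_a, None)
kp_b, des_b = orb.detectAndCompute(img_b, None)

# Hamming-distance matcher for ORB's binary descriptors, with Lowe's ratio test
# to discard ambiguous matches caused by repetitive urban texture.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
pairs = [p for p in matcher.knnMatch(des_a, des_b, k=2) if len(p) == 2]
good = [m for m, n in pairs if m.distance < 0.75 * n.distance]
print(f"{len(good)} reliable matches between overlapping frames")
```

In practice such matches feed photogrammetry pipelines (e.g., structure-from-motion) that must remain robust as overlap, scale, and viewing angle change along a flight path.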

6. Conclusions

The rapid growth of UAVs in various research domains, including climate change, has led to increasing numbers of patents and scientific publications related to drone technology. This expanding R&D is driven by innovations enabling greater mobility, autonomy, and range for UAV platforms, and the demand for enhanced capabilities has produced new designs for battery swapping systems, docking stations, and precision landing. These advancements have opened up drone applications in areas such as fire detection, disaster monitoring, precision agriculture, wireless communication, remote sensing, built environment assessment, power-line inspection, and traffic control. As research expands and UAV technologies progress, drones are being adopted across more industries and public services that benefit from their high mobility and autonomy, taking on critical civilian, commercial, and governmental roles.
This paper highlighted the considerable potential of UAVs to advance climate change research and monitoring in the built environment. Owing to their mobility, flexibility, and low cost, drones equipped with thermal sensors, multispectral cameras, and other instruments can perform high-resolution mapping of cities. In the future, ubiquitous and persistent drone monitoring could revolutionize how cities track and model their local climate. Enabled by improved batteries, drones may patrol above cities, streaming real-time thermal and emissions data to inform early warning systems, infrastructure upgrades, and policy adjustments. As regulations evolve, autonomous drones could map constantly shifting heat and pollutant hotspots to protect vulnerable residents. With regular UAV-based 3D modeling, cities can visualize local warming and resilience efforts in microscale detail, and on-demand drone data will likely transform urban climate adaptation through hyperlocal actions.
UAVs offer tremendous potential to support cities worldwide in crafting targeted, sustainable strategies for urban climate resilience. With regular drone mapping, cities can track indicators and vulnerabilities in granular detail to assess the climate risks for each neighborhood. UAVs equipped with multispectral sensors can identify where urban greenery interventions would most effectively cool down vulnerable communities. Drones also enable the rapid assessment of infrastructure integrity and renewable energy potential across rooftops to guide upgrades and decentralized energy growth. For disaster responses, UAVs allow real-time damage mapping so repairs can rapidly restore crucial services.
Furthermore, persistent UAV monitoring provides early warning of emerging threats, from rising temperatures in heat islands to pollutants from infrastructure failures, and continuously streamed drone sensor data can trigger alerts and protective measures. UAVs also support smarter urban design, with drone-enabled 3D modeling and simulation of ventilation corridors, albedo changes, and green infrastructure to optimize warming resilience. When recovering from climate disasters, UAVs have proven their worth for search and rescue, needs assessment, and guiding rebuilt infrastructure to be low carbon and hazard resistant. However, challenges remain in data processing, flight endurance, privacy, and airspace management. To fully leverage UAVs, cities must address data security and air traffic management, regulators must balance innovation, safety, privacy, and the public interest, and researchers should develop robust data fusion and modeling techniques to synthesize large drone-based datasets for cities. Energy and battery limitations also constrain continuous operations. Nonetheless, the outlook is bright for UAVs to provide unprecedented spatiotemporal detail on urban climate change, giving cities worldwide new tools to create targeted, neighborhood-level strategies for mitigation and adaptation.
As UAV and AI technologies continue advancing, drones are poised to become integral to cities’ arsenal against climate change. Swarms of interconnected, AI-enabled drones can eventually conduct autonomous, real-time mapping of hyperlocal emissions, urban heat patterns, infrastructure stresses, and extreme weather impacts, while advanced computer vision algorithms rapidly translate drone-captured visual, thermal, and hyperspectral data into actionable insights and alerts for city leaders. When augmented with AI for intelligent navigation, swarm coordination, and data modeling, UAVs can revolutionize urban planning and climate adaptation decision making. Thus, UAVs are set to provide urban leaders worldwide with an unprecedented window into local climate risks, vulnerabilities, and mitigation options, enabling targeted resilience strategies that improve sustainability and advance climate justice.
In the future, cities may deploy multi-drone fleets with AI oversight that operate continuously to provide persistent environmental intelligence. Community concerns and air traffic risks must be addressed; however, cities worldwide stand to benefit immensely from the 24/7 bird’s-eye view of local climate threats that only tireless, AI-enhanced UAVs can provide. With strategic adoption, drones and AI can enable cities to enact precisely tailored interventions while continuously evaluating their effectiveness, creating an agile, responsive climate adaptation loop. UAVs are thus set to shape the next generation of urban climate resilience worldwide. In conclusion, this paper underscored how the fusion of UAVs and AI can provide unprecedented insights to help cities confront the growing threat of climate change. Through continued innovation, standardization, and responsible use, UAVs are poised to transform how cities understand and adapt to accelerating local climate impacts in the coming decades.

Author Contributions

Conceptualization, N.B. and J.E.F.; methodology, N.B. and J.E.F.; formal analysis, N.B. and J.E.F.; investigation, N.B. and J.E.F.; resources, N.B. and J.E.F.; writing—original draft preparation, N.B. and J.E.F.; writing—review and editing, N.B. and J.E.F.; visualization, N.B.; supervision, J.E.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Acknowledgments

The authors acknowledge the support of Mohammed VI Polytechnic University and the Bank of Bangkok for this research.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Anderson, K.; Gaston, K.J. Lightweight unmanned aerial vehicles will revolutionize spatial ecology. Front. Ecol. Environ. 2013, 11, 138–146. [Google Scholar] [CrossRef] [PubMed]
  2. Nex, F.; Remondino, F. UAV for 3D mapping applications: A review. Appl. Geomat. 2014, 6, 1–15. [Google Scholar] [CrossRef]
  3. Pajares, G. Overview and Current Status of Remote Sensing Applications Based on Unmanned Aerial Vehicles (UAVs). Photogramm. Eng. Remote Sens. 2015, 81, 281–330. [Google Scholar] [CrossRef]
  4. Tmušić, G.; Manfreda, S.; Aasen, H.; James, M.R.; Gonçalves, G.; Ben-Dor, E.; Brook, A.; Polinova, M.; Arranz, J.J.; Mészáros, J.; et al. Current Practices in UAS-based Environmental Monitoring. Remote Sens. 2020, 12, 1001. [Google Scholar] [CrossRef]
  5. Roldán, J.J.; Joossen, G.; Sanz, D.; Del Cerro, J.; Barrientos, A. Mini-UAV Based Sensory System for Measuring Environmental Variables in Greenhouses. Sensors 2015, 15, 3334–3350. [Google Scholar] [CrossRef] [PubMed]
  6. Zhang, J.; Jung, J.; Sohn, G.; Cohen, M. Thermal Infrared Inspection Of Roof Insulation Using Unmanned Aerial Vehicles. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, XL-1/W4, 381–386. [Google Scholar] [CrossRef]
  7. Rakha, T.; Liberty, A.; Gorodetsky, A.; Kakillioglu, B.; Velipasalar, S. Heat Mapping Drones: An Autonomous Computer-Vision-Based Procedure for Building Envelope Inspection Using Unmanned Aerial Systems (UAS). Technol. Des. 2018, 2, 30–44. [Google Scholar] [CrossRef]
  8. Villa, T.F.; Gonzalez, F.; Miljievic, B.; Ristovski, Z.D.; Morawska, L. An Overview of Small Unmanned Aerial Vehicles for Air Quality Measurements: Present Applications and Future Prospectives. Sensors 2016, 16, 1072. [Google Scholar] [CrossRef]
  9. Isibue, E.W.; Pingel, T.J. Unmanned aerial vehicle based measurement of urban forests. Urban For. Urban Green. 2019, 48, 126574. [Google Scholar] [CrossRef]
  10. Yigitcanlar, T.; Desouza, K.C.; Butler, L.; Roozkhosh, F. Contributions and Risks of Artificial Intelligence (AI) in Building Smarter Cities: Insights from a Systematic Review of the Literature. Energies 2020, 13, 1473. [Google Scholar] [CrossRef]
  11. Wagner, B.; Egerer, M. Application of UAV remote sensing and machine learning to model and map land use in urban gardens. J. Urban Ecol. 2022, 8, juac008. [Google Scholar] [CrossRef]
  12. Olson, D.; Anderson, J. Review on unmanned aerial vehicles, remote sensors, imagery processing, and their applications in agriculture. Agron. J. 2021, 113, 971–992. [Google Scholar] [CrossRef]
  13. Bouvry, P.; Chaumette, S.; Danoy, G.; Guerrini, G.; Jurquet, G.; Kuwertz, A.; Muller, W.; Rosalie, M.; Sander, J. Using heterogeneous multilevel swarms of UAVs and high-level data fusion to support situation management in surveillance scenarios. In Proceedings of the 2016 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), Baden-Baden, Germany, 19–21 September 2016; pp. 424–429. [Google Scholar] [CrossRef]
  14. Khosiawan, Y.; Park, Y.; Moon, I.; Nilakantan, J.M.; Nielsen, I. Task scheduling system for UAV operations in indoor environment. Neural Comput. Appl. 2018, 31, 5431–5459. [Google Scholar] [CrossRef]
  15. Alanezi, M.A.; Shahriar, M.S.; Hasan, B.; Ahmed, S.; Sha’Aban, Y.A.; Bouchekara, H.R.E.H. Livestock Management With Unmanned Aerial Vehicles: A Review. IEEE Access 2022, 10, 45001–45028. [Google Scholar] [CrossRef]
  16. Lan, Y.; Chen, S. Current status and trends of plant protection UAV and its spraying technology in China. Int. J. Precis. Agric. Aviat. 2018, 1, 1–9. [Google Scholar] [CrossRef]
  17. Cai, H.; Geng, Q. Research on the Development Process and Trend of Unmanned Aerial Vehicle. In Proceedings of the 2015 International Industrial Informatics and Computer Engineering Conference, Xi'an, Shaanxi, China; Advances in Computer Science Research; Atlantis Press: Paris, France, 2015. [Google Scholar] [CrossRef]
  18. Qi, N.; Wang, W.; Ye, D.; Wang, M.; Tsiftsis, T.A.; Yao, R. Energy-efficient full-duplex UAV relaying networks: Trajectory design for channel-model-free scenarios. ETRI J. 2021, 43, 436–446. [Google Scholar] [CrossRef]
  19. Szalanczi-Orban, V.; Vaczi, D. Use of Drones in Logistics: Options in Inventory Control Systems. Interdiscip. Descr. Complex Syst. 2022, 20, 295–303. [Google Scholar] [CrossRef]
  20. Ventura, D.; Grosso, L.; Pensa, D.; Casoli, E.; Mancini, G.; Valente, T.; Scardi, M.; Rakaj, A. Coastal benthic habitat mapping and monitoring by integrating aerial and water surface low-cost drones. Front. Mar. Sci. 2023, 9, 1096594. [Google Scholar] [CrossRef]
  21. Hodgson, J.C.; Baylis, S.M.; Mott, R.; Herrod, A.; Clarke, R.H. Precision wildlife monitoring using unmanned aerial vehicles. Sci. Rep. 2016, 6, 22574. [Google Scholar] [CrossRef]
  22. Mangewa, L.J.; Ndakidemi, P.A.; Munishi, L.K. Integrating UAV Technology in an Ecological Monitoring System for Community Wildlife Management Areas in Tanzania. Sustainability 2019, 11, 6116. [Google Scholar] [CrossRef]
  23. Jiang, W.; Liu, L.; Xiao, H.; Zhu, S.; Li, W.; Liu, Y. Composition and distribution of vegetation in the water level fluctuating zone of the Lantsang cascade reservoir system using UAV multispectral imagery. PLoS ONE 2021, 16, e0247682. [Google Scholar] [CrossRef]
  24. Serafinelli, E. Imagining the social future of drones. Converg. Int. J. Res. N. Media Technol. 2022, 28, 1376–1391. [Google Scholar] [CrossRef]
  25. Hobbs, A.; Lyall, B. Human factors guidelines for unmanned aircraft systems. Ergon. Des. 2016, 24, 23–28. [Google Scholar] [CrossRef]
  26. Abdelkader, M.; Koubaa, A. Unmanned Aerial Vehicles Applications: Challenges and Trends; Springer: Berlin/Heidelberg, Germany, 2023. [Google Scholar]
  27. Minkina, W. Theoretical basics of radiant heat transfer—Practical examples of calculation for the infrared (IR) used in infrared thermography measurements. Quant. Infrared Thermogr. J. 2020, 18, 269–282. [Google Scholar] [CrossRef]
  28. Deane, S.; Avdelidis, N.P.; Ibarra-Castanedo, C.; Williamson, A.A.; Withers, S.; Zolotas, A.; Maldague, X.P.V.; Ahmadi, M.; Pant, S.; Genest, M.; et al. Development of a thermal excitation source used in an active thermographic UAV platform. Quant. Infrared Thermogr. J. 2022, 1–32. [Google Scholar] [CrossRef]
  29. Radmanesh, M.; Kumar, M.; Nemati, A.; Sarim, M. Dynamic optimal UAV trajectory planning in the National Airspace System via mixed integer linear programming. Proc. Inst. Mech. Eng. Part G J. Aerosp. Eng. 2015, 230, 1668–1682. [Google Scholar] [CrossRef]
  30. Sarim, M.; Radmanesh, M.; Dechering, M.; Kumar, M.; Pragada, R.; Cohen, K. Distributed Detect-and-Avoid for Multiple Unmanned Aerial Vehicles in National Air Space. J. Dyn. Syst. Meas. Control 2019, 141, 071014. [Google Scholar] [CrossRef]
  31. Manfreda, S.; McCabe, M.F.; Miller, P.E.; Lucas, R.; Madrigal, V.P.; Mallinis, G.; Ben Dor, E.; Helman, D.; Estes, L.; Ciraolo, G.; et al. On the Use of Unmanned Aerial Systems for Environmental Monitoring. Remote Sens. 2018, 10, 641. [Google Scholar] [CrossRef]
  32. Shakhatreh, H.; Sawalmeh, A.H.; Al-Fuqaha, A.; Dou, Z.; Almaita, E.; Khalil, I.; Othman, N.S.; Khreishah, A.; Guizani, M. Unmanned Aerial Vehicles (UAVs): A Survey on Civil Applications and Key Research Challenges. IEEE Access 2019, 7, 48572–48634. [Google Scholar] [CrossRef]
  33. Mader, D.; Blaskow, R.; Westfeld, P.; Maas, H.-G. Uav-based acquisition of 3d point cloud—A comparison of a low-cost laser scanner and sfm-tools. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, XL-3/W3, 335–341. [Google Scholar] [CrossRef]
  34. Wang, Q.; Mao, B.; Stoliarov, S.I.; Sun, J. A review of lithium ion battery failure mechanisms and fire prevention strategies. Prog. Energy Combust. Sci. 2019, 73, 95–131. [Google Scholar] [CrossRef]
  35. Radoglou-Grammatikis, P.; Sarigiannidis, P.; Lagkas, T.; Moscholios, I. A compilation of UAV applications for precision agriculture. Comput. Netw. 2020, 172, 107148. [Google Scholar] [CrossRef]
  36. Ahmed, F.; Mohanta, J.C.; Keshari, A.; Yadav, P.S. Recent Advances in Unmanned Aerial Vehicles: A Review. Arab. J. Sci. Eng. 2022, 47, 7963–7984. [Google Scholar] [CrossRef] [PubMed]
  37. Ranyal, E.; Jain, K. Unmanned Aerial Vehicle’s Vulnerability to GPS Spoofing a Review. J. Indian Soc. Remote Sens. 2020, 49, 585–591. [Google Scholar] [CrossRef]
  38. Koubaa, A.; Ammar, A.; Abdelkader, M.; Alhabashi, Y.; Ghouti, L. AERO: AI-Enabled Remote Sensing Observation with Onboard Edge Computing in UAVs. Remote Sens. 2023, 15, 1873. [Google Scholar] [CrossRef]
  39. Mohsan, S.A.H.; Othman, N.Q.H.; Li, Y.; Alsharif, M.H.; Khan, M.A. Unmanned aerial vehicles (UAVs): Practical aspects, applications, open challenges, security issues, and future trends. Intell. Serv. Robot. 2023, 16, 1–29. [Google Scholar] [CrossRef]
  40. Piekkoontod, T.; Pachana, B.; Hrimpeng, K.; Charoenjit, K. Assessments of Nipa Forest Using Landsat Imagery Enhanced with Unmanned Aerial Vehicle Photography. Appl. Environ. Res. 2020, 42, 49–59. [Google Scholar] [CrossRef]
  41. Suran, N.A.; Shafri, H.Z.M.; Shaharum, N.S.N.; Radzali, N.A.W.M.; Kumar, V. Uav-based hyperspectral data analysis for urban area mapping. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, XLII-4/W16, 621–626. [Google Scholar] [CrossRef]
  42. Dario, J.; Millian, R. Towards the Application of UAS for Road Maintenance at the Norvik Port. Master’s Thesis, KTH Royal Institute of Technology School of Architecture and the Built Environment, Stockholm, Sweden, 2019. [Google Scholar]
  43. Zohdi, T.I. Multiple UAVs for Mapping: A Review of Basic Modeling, Simulation, and Applications. Annu. Rev. Environ. Resour. 2018, 43, 523–543. [Google Scholar] [CrossRef]
  44. Chisholm, R.A.; Cui, J.; Lum, S.K.Y.; Chen, B.M. UAV LiDAR for below-canopy forest surveys. J. Unmanned Veh. Syst. 2013, 1, 61–68. [Google Scholar] [CrossRef]
  45. Ader, M.; Axelsson, D. Drones in Arctic Environments. Master’s Thesis, KTH School of Industrial Engineering and Management (ITM), Stockholm, Sweden, 2017. [Google Scholar]
  46. Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97. [Google Scholar] [CrossRef]
  47. Merkert, R.; Bushell, J. Managing the drone revolution: A systematic literature review into the current use of airborne drones and future strategic directions for their effective control. J. Air Transp. Manag. 2020, 89, 101929. [Google Scholar] [CrossRef]
  48. Bansod, B.; Singh, R.; Thakur, R.; Singhal, G. A comparision between satellite based and drone based remote sensing technology to achieve sustainable development: A review. J. Agric. Environ. Int. Dev. 2017, 111, 383–407. [Google Scholar] [CrossRef]
  49. Myburgh, A.; Botha, H.; Downs, C.T.; Woodborne, S.M. The Application and Limitations of a Low-Cost UAV Platform and Open-Source Software Combination for Ecological Mapping and Monitoring. Afr. J. Wildl. Res. 2021, 51, 166–177. [Google Scholar] [CrossRef]
  50. Cook, K.L. An evaluation of the effectiveness of low-cost UAVs and structure from motion for geomorphic change detection. Geomorphology 2017, 278, 195–208. [Google Scholar] [CrossRef]
  51. Awais, M.; Li, W.; Hussain, S.; Cheema, M.J.M.; Li, W.; Song, R.; Liu, C. Comparative Evaluation of Land Surface Temperature Images from Unmanned Aerial Vehicle and Satellite Observation for Agricultural Areas Using In Situ Data. Agriculture 2022, 12, 184. [Google Scholar] [CrossRef]
  52. Al-Farabi, M.; Chowdhury, M.; Readuzzaman, M.; Hossain, M.R.; Sabuj, S.R.; Hossain, M.A. Smart Environment Monitoring System using Unmanned Aerial Vehicle in Bangladesh. EAI Endorsed Trans. Smart Cities 2018, 5, e1. [Google Scholar] [CrossRef]
  53. Gordan, M.; Ismail, Z.; Ghaedi, K.; Ibrahim, Z.; Hashim, H.; Ghayeb, H.H.; Talebkhah, M. A Brief Overview and Future Perspective of Unmanned Aerial Systems for In-Service Structural Health Monitoring. Eng. Adv. 2021, 1, 9–15. [Google Scholar] [CrossRef]
  54. Eiris, R.; Albeaino, G.; Gheisari, M.; Benda, W.; Faris, R. InDrone: A 2D-based drone flight behavior visualization platform for indoor building inspection. Smart Sustain. Built Environ. 2021, 10, 438–456. [Google Scholar] [CrossRef]
  55. Sabour, M.; Jafary, P.; Nematiyan, S. Applications and classifications of unmanned aerial vehicles: A literature review with focus on multi-rotors. Aeronaut. J. 2022, 127, 466–490. [Google Scholar] [CrossRef]
  56. Idrissi, M.; Salami, M.; Annaz, F. A Review of Quadrotor Unmanned Aerial Vehicles: Applications, Architectural Design and Control Algorithms. J. Intell. Robot. Syst. 2022, 104, 1–33. [Google Scholar] [CrossRef]
  57. Wang, J.; Ma, C.; Chen, P.; Yao, W.; Yan, Y.; Zeng, T.; Chen, S.; Lan, Y. Evaluation of aerial spraying application of multi-rotor unmanned aerial vehicle for Areca catechu protection. Front. Plant Sci. 2023, 14, 1093912. [Google Scholar] [CrossRef]
  58. Johnson, E.N.; Mooney, J.G. A Comparison of Automatic Nap-of-the-earth Guidance Strategies for Helicopters. J. Field Robot. 2014, 31, 637–653. [Google Scholar] [CrossRef]
  59. Amorim, M.; Lousada, A. Tethered Drone for Precision Agriculture. Master’s Thesis, University of Porto, Porto, Portugal, 2021. [Google Scholar]
  60. Winnefeld, J.A., Jr.; Kendall, F. Unmanned Systems Integrated Roadmap FY2011–2036; Technical Report 14-S-0553; United States Department of Defence: Washington, DC, USA, 2017. [Google Scholar]
  61. Tang, H.; Zhang, D.; Gan, Z. Control System for Vertical Take-off and Landing Vehicle’s Adaptive Landing Based on Multi-Sensor Data Fusion. Sensors 2020, 20, 4411. [Google Scholar] [CrossRef]
  62. Misra, A.; Jayachandran, S.; Kenche, S.; Katoch, A.; Suresh, A.; Gundabattini, E.; Selvaraj, S.K.; Legesse, A.A. A Review on Vertical Take-Off and Landing (VTOL) Tilt-Rotor and Tilt Wing Unmanned Aerial Vehicles (UAVs). J. Eng. 2022, 2022, 1–27. [Google Scholar] [CrossRef]
  63. Vergouw, B.; Nagel, H.; Bondt, G.; Custers, B. Drone technology: Types, payloads, applications, frequency spectrum issues and future developments. In The Future of Drone Use; TMC Asser Press: Hague, The Netherlands, 2016; pp. 21–45. [Google Scholar] [CrossRef]
  64. Hayat, S.; Yanmaz, E.; Muzaffar, R. Survey on Unmanned Aerial Vehicle Networks for Civil Applications: A Communications Viewpoint. IEEE Commun. Surv. Tutor. 2016, 18, 2624–2661. [Google Scholar] [CrossRef]
  65. Gupta, L.; Jain, R.; Vaszkun, G. Survey of Important Issues in UAV Communication Networks. IEEE Commun. Surv. Tutor. 2016, 18, 1123–1152. [Google Scholar] [CrossRef]
  66. Villa, T.F.; Salimi, F.; Morton, K.; Morawska, L.; Gonzalez, F. Development and Validation of a UAV Based System for Air Pollution Measurements. Sensors 2016, 16, 2202. [Google Scholar] [CrossRef]
  67. Wallace, L.O.; Lucieer, A.; Watson, C.S. Assessing the feasibility of uav-based lidar for high resolution forest change detection. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, XXXIX-B7, 499–504. [Google Scholar] [CrossRef]
  68. Naughton, J.; McDonald, W. Evaluating the Variability of Urban Land Surface Temperatures Using Drone Observations. Remote Sens. 2019, 11, 1722. [Google Scholar] [CrossRef]
  69. AgEagle. S.O.D.A—eBee Series. 2023. Available online: https://ageagle.com/solutions/ebee-series/ (accessed on 15 May 2023).
  70. DJI. DJI Drones. 2023. Available online: https://www.dji.com/ (accessed on 8 July 2023).
  71. Parrot. Parrot Drones—Anafi. 2023. Available online: https://www.parrot.com/us/drones (accessed on 22 July 2023).
  72. Yuneec. Yuneec Drones. 2023. Available online: https://yuneec.online/ (accessed on 23 June 2023).
  73. Walkera Tech. Voyager 3. 2023. Available online: http://www.walkeratech.com/25.html (accessed on 10 July 2023).
  74. Rajan, J.; Shriwastav, S.; Kashyap, A.; Ratnoo, A.; Ghose, D. Disaster Management Using Unmanned Aerial Vehicles; Elsevier: Amsterdam, The Netherlands, 2021; pp. 129–155. [Google Scholar] [CrossRef]
  75. Honkavaara, E.; Saari, H.; Kaivosoja, J.; Pölönen, I.; Hakala, T.; Litkey, P.; Mäkynen, J.; Pesonen, L. Processing and Assessment of Spectrometric, Stereoscopic Imagery Collected Using a Lightweight UAV Spectral Camera for Precision Agriculture. Remote Sens. 2013, 5, 5006–5039. [Google Scholar] [CrossRef]
  76. Zarco-Tejada, P.J.; Guillén-Climent, M.L.; Hernández-Clemente, R.; Catalina, A.; González, M.R.; Martín, P. Estimating leaf carotenoid content in vineyards using high resolution hyperspectral imagery acquired from an unmanned aerial vehicle (UAV). Agric. Forest Meteorol. 2013, 171–172, 281–294. [Google Scholar] [CrossRef]
  77. Cillero Castro, C.; Domínguez Gómez, J.A.; Delgado Martín, J.; Hinojo Sánchez, B.A.; Cereijo Arango, J.L.; Cheda Tuya, F.A.; Díaz-Varela, R. An UAV and Satellite Multispectral Data Approach to Monitor Water Quality in Small Reservoirs. Remote Sens 2020, 12, 1–33. [Google Scholar] [CrossRef]
  78. Wei, G.; Li, Y.; Zhang, Z.; Chen, Y.; Chen, J.; Yao, Z.; Lao, C.; Chen, H. Estimation of soil salt content by combining UAV-borne multispectral sensor and machine learning algorithms. PeerJ 2020, 8, e9087. [Google Scholar] [CrossRef]
  79. Meier, F.; Scherer, D.; Richters, J.; Christen, A. Atmospheric correction of thermal-infrared imagery of the 3-D urban environment acquired in oblique viewing geometry. Atmos. Meas. Tech. 2011, 4, 909–922. [Google Scholar] [CrossRef]
  80. Zarco-Tejada, P.J.; González-Dugo, V.; Berni, J.A.J. Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera. Remote Sens. Environ. 2012, 117, 322–337. [Google Scholar] [CrossRef]
  81. Feng, Q.; Liu, J.; Gong, J. UAV Remote Sensing for Urban Vegetation Mapping Using Random Forest and Texture Analysis. Remote Sens. 2015, 7, 1074–1094. [Google Scholar] [CrossRef]
  82. Ren, S.; Malof, J.; Fetter, R.; Beach, R.; Rineer, J.; Bradbury, K. Utilizing Geospatial Data for Assessing Energy Security: Mapping Small Solar Home Systems Using Unmanned Aerial Vehicles and Deep Learning. ISPRS Int. J. GeoInf. 2022, 11, 222. [Google Scholar] [CrossRef]
  83. Jia, J.; Cui, W.; Liu, J. Urban Catchment-Scale Blue-Green-Gray Infrastructure Classification with Unmanned Aerial Vehicle Images and Machine Learning Algorithms. Front. Environ. Sci. 2022, 9, 734. [Google Scholar] [CrossRef]
  84. Ahmad, J.; Eisma, J.A. Capturing Small-Scale Surface Temperature Variation across Diverse Urban Land Uses with a Small Unmanned Aerial Vehicle. Remote Sens. 2023, 15, 2042. [Google Scholar] [CrossRef]
  85. Sentera. High-Precision Single Sensor. 2023. Available online: https://sentera.com/products/fieldcapture/sensors/single/ (accessed on 10 August 2023).
  86. Mapir. Survey3 Cameras. 2023. Available online: https://www.mapir.camera/collections/survey3 (accessed on 5 July 2023).
  87. GeoSpatial PhaseOne. Phaseone iXM-100|iXM-50. 2023, p. 100. Available online: https://geospatial.phaseone.com/cameras/ixm-100/ (accessed on 5 August 2023).
  88. Ricoh Imaging. RICOH GR III/GR IIIx. 2023. Available online: https://www.ricoh-imaging.co.jp/ (accessed on 15 May 2023).
  89. Sentek Systems. Gems Sensor. 2023. Available online: http://precisionaguavs.com/ (accessed on 18 July 2023).
  90. Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J.J. Hyperspectral imaging: A review on UAV-based sensors, data processing and applications for agriculture and forestry. Remote Sens. 2017, 9, 1110. [Google Scholar] [CrossRef]
  91. Green, D.R.; Hagon, J.J.; Gómez, C.; Gregory, B.J. Using Low-Cost UAVs for Environmental Monitoring, Mapping, and Modelling: Examples From the Coastal Zone. In Coastal Management; Elsevier: Amsterdam, The Netherlands, 2019; pp. 465–501. [Google Scholar] [CrossRef]
  92. Popescu, D.; Ichim, L.; Stoican, F. Unmanned Aerial Vehicle Systems for Remote Estimation of Flooded Areas Based on Complex Image Processing. Sensors 2017, 17, 446. [Google Scholar] [CrossRef] [PubMed]
  93. Kuras, A.; Brell, M.; Rizzi, J.; Burud, I. Hyperspectral and Lidar Data Applied to the Urban Land Cover Machine Learning and Neural-Network-Based Classification: A Review. Remote Sens. 2021, 13, 3393. [Google Scholar] [CrossRef]
  94. Headwall. Hyperspectral Sensors. 2023. Available online: https://www.headwallphotonics.com/products/hyperspectral-sensors (accessed on 22 May 2023).
  95. Corning Optics. Nova Sol. 2023. Available online: https://www.corning.com/asean/en/products/advanced-optics/product-materials/aerospace-defense/spectral-sensing.html (accessed on 19 April 2023).
  96. Cubert. Hyperspectral Sensors. 2023. Available online: https://www.cubert-hyperspectral.com (accessed on 25 April 2023).
  97. Resonon. Hyperspectral Imaging Cameras|Hyperspectral Imaging Solutions. 2023. Available online: https://resonon.com/objective-lenses (accessed on 13 June 2023).
  98. Carotenuto, F.; Brilli, L.; Gioli, B.; Gualtieri, G.; Vagnoli, C.; Mazzola, M.; Viola, A.P.; Vitale, V.; Severi, M.; Traversi, R.; et al. Long-Term Performance Assessment of Low-Cost Atmospheric Sensors in the Arctic Environment. Sensors 2020, 20, 1919. [Google Scholar] [CrossRef] [PubMed]
  99. Wildmann, N.; Ravi, S.; Bange, J. Towards higher accuracy and better frequency response with standard multi-hole probes in turbulence measurement with remotely piloted aircraft (RPA). Atmos. Meas. Tech. 2014, 7, 1027–1041. [Google Scholar] [CrossRef]
  100. Brosy, C.; Krampf, K.; Zeeman, M.; Wolf, B.; Junkermann, W.; Schäfer, K.; Emeis, S.; Kunstmann, H. Simultaneous multicopter-based air sampling and sensing of meteorological variables. Atmos. Meas. Tech. 2017, 10, 2773–2784. [Google Scholar] [CrossRef]
  101. Fumian, F.; Chierici, A.; Bianchelli, M.; Martellucci, L.; Rossi, R.; Malizia, A.; Gaudio, P.; D’errico, F.; Di Giovanni, D. Development and performance testing of a miniaturized multi-sensor system combining MOX and PID for potential UAV application in TIC, VOC and CWA dispersion scenarios. Eur. Phys. J. Plus 2021, 136, 1–19. [Google Scholar] [CrossRef]
  102. Scheller, J.H.; Mastepanov, M.; Christensen, T.R. Toward UAV-based methane emission mapping of Arctic terrestrial ecosystems. Sci. Total. Environ. 2022, 819, 153161. [Google Scholar] [CrossRef]
  103. Luo, Z.; Che, J.; Wang, K. Detection of UAV target based on Continuous Radon transform and Matched filtering process for Passive Bistatic Radar. Authorea Preprints; 7 April 2022. [CrossRef]
  104. Tian, B.; Liu, W.; Mo, H.; Li, W.; Wang, Y.; Adhikari, B.R. Detecting the Unseen: Understanding the Mechanisms and Working Principles of Earthquake Sensors. Sensors 2023, 23, 5335. [Google Scholar] [CrossRef]
  105. Li-Cor. TriSonica Weather Sensors. 2023. Available online: https://anemoment.com/shop/sensors/trisonica-mini-wind-and-weather-sensor/ (accessed on 18 August 2023).
  106. AirMar. AIRMAR Sensors 2023. Available online: https://www.airmar.com/ (accessed on 23 August 2023).
  107. FLIR. MUVE C360 2023. Available online: https://www.flir.com/products/muve-c360/ (accessed on 5 August 2023).
  108. Teledyne Optech. Teledyne LiDAR. 2023. Available online: https://www.teledyneoptech.com/en/HOME/ (accessed on 10 June 2023).
  109. Esin, A.I.; Akgul, M.; Akay, A.O.; Yurtseven, H. Comparison of LiDAR-based morphometric analysis of a drainage basin with results obtained from UAV, TOPO, ASTER and SRTM-based DEMs. Arab. J. Geosci. 2021, 14, 1–15. [Google Scholar] [CrossRef]
  110. Torresan, C.; Berton, A.; Carotenuto, F.; Di Gennaro, S.F.; Gioli, B.; Matese, A.; Miglietta, F.; Vagnoli, C.; Zaldei, A.; Wallace, L. Forestry applications of UAVs in Europe: A review. Int. J. Remote Sens. 2017, 38, 2427–2447. [Google Scholar] [CrossRef]
  111. Brede, B.; Lau, A.; Bartholomeus, H.M.; Kooistra, L. Comparing RIEGL RiCOPTER UAV LiDAR Derived Canopy Height and DBH with Terrestrial LiDAR. Sensors 2017, 17, 2371. [Google Scholar] [CrossRef] [PubMed]
  112. Gao, M.; Yang, F.; Wei, H.; Liu, X. Individual Maize Location and Height Estimation in Field from UAV-Borne LiDAR and RGB Images. Remote Sens. 2022, 14, 2292. [Google Scholar] [CrossRef]
  113. Thiel, C.; Schmullius, C. Comparison of UAV photograph-based and airborne lidar-based point clouds over forest from a forestry application perspective. Int. J. Remote Sens. 2016, 38, 2411–2426. [Google Scholar] [CrossRef]
  114. Ressl, C.; Brockmann, H.; Mandlburger, G.; Pfeifer, N. Dense Image Matching vs. Airborne Laser Scanning—Comparison of two methods for deriving terrain models. Photogramm. Fernerkund. Geoinf. 2016, 2016, 57–73. [Google Scholar] [CrossRef]
  115. Štroner, M.; Urban, R.; Línková, L. A New Method for UAV Lidar Precision Testing Used for the Evaluation of an Affordable DJI ZENMUSE L1 Scanner. Remote Sens. 2021, 13, 4811. [Google Scholar] [CrossRef]
  116. Yan, K.; Di Baldassarre, G.; Solomatine, D.P.; Schumann, G.J. A review of low-cost space-borne data for flood modelling: Topography, flood extent and water level. Hydrol. Process. 2015, 29, 3368–3387. [Google Scholar] [CrossRef]
  117. Casas-Mulet, R.; Pander, J.; Ryu, D.; Stewardson, M.J.; Geist, J. Unmanned Aerial Vehicle (UAV)-Based Thermal Infra-Red (TIR) and Optical Imagery Reveals Multi-Spatial Scale Controls of Cold-Water Areas Over a Groundwater-Dominated Riverscape. Front. Environ. Sci. 2020, 8, 1–16. [Google Scholar] [CrossRef]
  118. Rakha, T.; Gorodetsky, A. Review of Unmanned Aerial System (UAS) applications in the built environment: Towards automated building inspection procedures using drones. Autom. Constr. 2018, 93, 252–264. [Google Scholar] [CrossRef]
  119. Eschmann, C.; Kuo, C.M.; Kuo, C.H.; Boller, C. Unmanned aircraft systems for remote building inspection and monitoring. In Proceedings of the 6th European Workshop—Structural Health Monitoring—Th.2.B.1, Dresden, Germany, 3–6 July 2012; pp. 1179–1186. [Google Scholar]
  120. Feng, L.; Liu, Y.; Zhou, Y.; Yang, S. A UAV-derived thermal infrared remote sensing three-temperature model and estimation of various vegetation evapotranspiration in urban micro-environments. Urban For. Urban Green. 2022, 69, 127495. [Google Scholar] [CrossRef]
  121. Rakha, T.; El Masri, Y.; Chen, K.; De Wilde, P. 3D drone-based time-lapse thermography: A case study of roof vulnerability characterization using photogrammetry and performance simulation implications. In Proceedings of the 17th IBPSA Conference, Bruges, Belgium, 1–3 September 2021; pp. 2023–2030. [Google Scholar] [CrossRef]
  122. FLIR. FLIR IR Sensors n.d. Available online: https://www.flir.com/ (accessed on 17 May 2023).
  123. Workswell. Thermal Imaging Cameras for UAV Systems. 2023. Available online: https://workswell-thermal-camera.com/ (accessed on 12 August 2023).
  124. Cox, T.H.; Somers, I.; Fratello, S. Earth Observations and the Role of UAVs: A Capabilities Assessment, Version 1.1; Technical Report; Civil UAV Team, NASA: Washington, DC, USA, 2006; p. 346. [Google Scholar]
  125. Lessard-Fontaine, A.; Alschner, F.; Soesilo, D. Using High-Resolution Imagery to Support the Post-Earthquake Census in Port-au-Prince, Haiti; Drones in Humanitarian Action Case Study No. 7; European Civil Protection and Humanitarian Aid Operations: Brussels, Belgium, 2013. Available online: https://reliefweb.int/report/haiti/drones-humanitarian-action-case-study-no7-using-high-resolution-imagery-support-post (accessed on 7 July 2023).
  126. UNICEF Innovation. Low-Cost Drones Deliver Medicines in Malawi; UNICEF: New York, NY, USA, 2017. [Google Scholar]
  127. UNICEF. Drone Testing Corridors Established in Kazakhstan; UNICEF: New York, NY, USA, 2018. [Google Scholar]
  128. Lim, J.S.; Gleason, S.; Williams, M.; Matás, G.J.L.; Marsden, D.; Jones, W. UAV-Based Remote Sensing for Managing Alaskan Native Heritage Landscapes in the Yukon-Kuskokwim Delta. Remote Sens. 2022, 14, 728. [Google Scholar] [CrossRef]
  129. Djimantoro, M.; Suhardjanto, G. The Advantage by Using Low-Altitude UAV for Sustainable Urban Development Control. IOP Conf. Ser. Earth Environ. Sci. 2017, 109, 012014. [Google Scholar] [CrossRef]
  130. Dash, J.P.; Pearse, G.D.; Watt, M.S. UAV Multispectral Imagery Can Complement Satellite Data for Monitoring Forest Health. Remote Sens. 2018, 10, 1216. [Google Scholar] [CrossRef]
  131. Gaffey, C.; Bhardwaj, A. Applications of Unmanned Aerial Vehicles in Cryosphere: Latest Advances and Prospects. Remote Sens. 2020, 12, 948. [Google Scholar] [CrossRef]
  132. Musso, R.F.G.; Oddi, F.J.; Goldenberg, M.G.; Garibaldi, L.A. Applying unmanned aerial vehicles (UAVs) to map shrubland structural attributes in northern Patagonia, Argentina. Can. J. For. Res. 2020, 50, 615–623. [Google Scholar] [CrossRef]
  133. Addo, K.A.; Jayson-Quashigah, P.-N.; Codjoe, S.N.A.; Martey, F. Drone as a tool for coastal flood monitoring in the Volta Delta, Ghana. Geoenviron. Disasters 2018, 5, 1–13. [Google Scholar] [CrossRef]
  134. Shaw, A.; Hashemi, M.R.; Spaulding, M.; Oakley, B.; Baxter, C. Effect of Coastal Erosion on Storm Surge: A Case Study in the Southern Coast of Rhode Island. J. Mar. Sci. Eng. 2016, 4, 85. [Google Scholar] [CrossRef]
  135. Gålfalk, M.; Påledal, S.N.; Bastviken, D. Sensitive Drone Mapping of Methane Emissions without the Need for Supplementary Ground-Based Measurements. ACS Earth Space Chem. 2021, 5, 2668–2676. [Google Scholar] [CrossRef]
  136. Yao, H.; Qin, R.; Chen, X. Unmanned Aerial Vehicle for Remote Sensing Applications—A Review. Remote Sens. 2019, 11, 1443. [Google Scholar] [CrossRef]
  137. Shaw, J.T.; Shah, A.; Yong, H.; Allen, G. Methods for quantifying methane emissions using unmanned aerial vehicles: A review. Philos. Trans. R. Soc. A Math. Phys. Eng. Sci. 2021, 379, 20200450. [Google Scholar] [CrossRef]
  138. Gullett, B.; Aurell, J.; Mitchell, W.; Richardson, J. Use of an unmanned aircraft system to quantify NOx emissions from a natural gas boiler. Atmos. Meas. Tech. 2021, 14, 975–981. [Google Scholar] [CrossRef] [PubMed]
  139. Raval, S. Smart Sensing for Mineral Exploration through to Mine Closure. Int. J. Georesour. Environ. 2018, 4, 115–119. [Google Scholar] [CrossRef]
  140. Namburu, A.; Selvaraj, P.; Mohan, S.; Ragavanantham, S.; Eldin, E.T. Forest Fire Identification in UAV Imagery Using X-MobileNet. Electronics 2023, 12, 733. [Google Scholar] [CrossRef]
  141. Carvajal-Ramírez, F.; da Silva, J.R.M.; Agüera-Vega, F.; Martínez-Carricondo, P.; Serrano, J.; Moral, F.J. Evaluation of Fire Severity Indices Based on Pre- and Post-Fire Multispectral Imagery Sensed from UAV. Remote Sens. 2019, 11, 993. [Google Scholar] [CrossRef]
  142. Yavuz, M.; Koutalakis, P.; Diaconu, D.C.; Gkiatas, G.; Zaimes, G.N.; Tufekcioglu, M.; Marinescu, M. Identification of Streamside Landslides with the Use of Unmanned Aerial Vehicles (UAVs) in Greece, Romania, and Turkey. Remote Sens. 2023, 15, 1006. [Google Scholar] [CrossRef]
  143. Brook, M.S.; Merkle, J. Monitoring active landslides in the Auckland region utilising UAV/structure-from-motion photogrammetry. Jpn. Geotech. Soc. Spec. Publ. 2019, 6, 1–6. [Google Scholar] [CrossRef]
  144. Ilinca, V.; Șandric, I.; Chițu, Z.; Irimia, R.; Gheuca, I. UAV applications to assess short-term dynamics of slow-moving landslides under dense forest cover. Landslides 2022, 19, 1717–1734. [Google Scholar] [CrossRef]
  145. Mora, O.E.; Lenzano, M.G.; Toth, C.K.; Grejner-Brzezinska, D.A.; Fayne, J.V. Landslide Change Detection Based on Multi-Temporal Airborne LiDAR-Derived DEMs. Geosciences 2018, 8, 23. [Google Scholar] [CrossRef]
  146. Migliazza, M.; Carriero, M.T.; Lingua, A.; Pontoglio, E.; Scavia, C. Rock Mass Characterization by UAV and Close-Range Photogrammetry: A Multiscale Approach Applied along the Vallone dell’Elva Road (Italy). Geosciences 2021, 11, 436. [Google Scholar] [CrossRef]
  147. Mineo, S.; Caliò, D.; Pappalardo, G. UAV-Based Photogrammetry and Infrared Thermography Applied to Rock Mass Survey for Geomechanical Purposes. Remote Sens. 2022, 14, 473. [Google Scholar] [CrossRef]
  148. Loiotine, L.; Andriani, G.F.; Derron, M.-H.; Parise, M.; Jaboyedoff, M. Evaluation of InfraRed Thermography Supported by UAV and Field Surveys for Rock Mass Characterization in Complex Settings. Geosciences 2022, 12, 116. [Google Scholar] [CrossRef]
  149. Fu, X.; Ding, H.; Sheng, Q.; Chen, J.; Chen, H.; Li, G.; Fang, L.; Du, W. Reproduction Method of Rockfall Geologic Hazards Based on Oblique Photography and Three-Dimensional Discontinuous Deformation Analysis. Front. Earth Sci. 2021, 9, 755876. [Google Scholar] [CrossRef]
  150. Dimitrov, S.; Popov, A.; Iliev, M. Mapping and assessment of urban heat island effects in the city of Sofia, Bulgaria through integrated application of remote sensing, unmanned aerial systems (UAS) and GIS. In Proceedings of the Eighth International Conference on Remote Sensing and Geoinformation of the Environment (RSCy2020), Paphos, Cyprus, 16–18 March 2020. [Google Scholar] [CrossRef]
  151. Erenoglu, R.C.; Erenoglu, O.; Arslan, N. Accuracy Assessment of Low Cost UAV Based City Modelling for Urban Planning. Teh. Vjesn. Tech. Gaz. 2018, 25, 1708–1714. [Google Scholar] [CrossRef]
  152. Trepekli, K.; Balstrøm, T.; Friborg, T.; Allotey, A.N. UAV-Borne, LiDAR-Based Elevation Modelling: An Effective Tool for Improved Local Scale Urban Flood Risk Assessment. Nat. Hazards 2021, 113, 423–451. [Google Scholar] [CrossRef]
  153. Pratomo, J.; Widiastomo, T. Implementation of the markov random field for urban land cover classification of uav vhir data. Geoplanning J. Geomat. Plan. 2016, 3, 127–136. [Google Scholar] [CrossRef]
  154. Yang, Y.; Song, F.; Ma, J.; Wei, Z.; Song, L.; Cao, W. Spatial and temporal variation of heat islands in the main urban area of Zhengzhou under the two-way influence of urbanization and urban forestry. PLoS ONE 2022, 17, e0272626. [Google Scholar] [CrossRef] [PubMed]
  155. Bayomi, N.; Nagpal, S.; Rakha, T.; Fernandez, J.E. Building envelope modeling calibration using aerial thermography. Energy Build. 2020, 233, 110648. [Google Scholar] [CrossRef]
  156. Rathinam, S.; Kim, Z.; Soghikian, A.; Sengupta, R. Vision Based Following of Locally Linear Structures using an Unmanned Aerial Vehicle. In Proceedings of the 44th IEEE Conference on Decision and Control, Seville, Spain, 15 December 2005; pp. 6085–6090. [Google Scholar] [CrossRef]
  157. Falorca, J.F.; Lanzinha, J.C.G. Facade inspections with drones–theoretical analysis and exploratory tests. Int. J. Build. Pathol. Adapt. 2020, 39, 235–258. [Google Scholar] [CrossRef]
  158. Ding, W.; Yang, H.; Yu, K.; Shu, J. Crack detection and quantification for concrete structures using UAV and transformer. Autom. Constr. 2023, 152, 104929. [Google Scholar] [CrossRef]
  159. Alzarrad, A.; Awolusi, I.; Hatamleh, M.T.; Terreno, S. Automatic assessment of roofs conditions using artificial intelligence (AI) and unmanned aerial vehicles (UAVs). Front. Built Environ. 2022, 8, 1026225. [Google Scholar] [CrossRef]
  160. Shao, H.; Song, P.; Mu, B.; Tian, G.; Chen, Q.; He, R.; Kim, G. Assessing city-scale green roof development potential using Unmanned Aerial Vehicle (UAV) imagery. Urban For. Urban Green. 2020, 57, 126954. [Google Scholar] [CrossRef]
  161. Vance, S.J.; Richards, M.E.; Walters, M.C. Evaluation of Roof Leak Detection Utilizing Unmanned Aircraft Systems Equipped with Thermographic Sensors; Construction Engineering Research Laboratory (U.S.): Champaign, IL, USA; The U.S. Army Engineer Research and Development Center (ERDC): Vicksburg, MS, USA, 2018. [Google Scholar]
  162. Seo, J.; Duque, L.; Wacker, J. Drone-enabled bridge inspection methodology and application. Autom. Constr. 2018, 94, 112–126. [Google Scholar] [CrossRef]
  163. Ellenberg, A.; Kontsos, A.; Moon, F.; Bartoli, I. Bridge related damage quantification using unmanned aerial vehicle imagery. Struct. Control. Health Monit. 2016, 23, 1168–1179. [Google Scholar] [CrossRef]
  164. Duque, L.; Seo, J.; Wacker, J. Synthesis of Unmanned Aerial Vehicle Applications for Infrastructures. J. Perform. Constr. Facil. 2018, 32, 04018046. [Google Scholar] [CrossRef]
  165. AgEagle Aerial Systems Inc. Drones vs. Traditional Instruments: Corridor Mapping in Turkey Using UAVs vs. Classical Surveying. 2015; pp. 1–4. Available online: https://geo-matching.com/articles/corridor-mapping-in-turkey-using-drones-versus-traditional-instruments (accessed on 25 June 2023).
  166. Nikhil, N.; Shreyas, S.M.; Vyshnavi, G.; Yadav, S. Unmanned Aerial Vehicles (UAV) in Disaster Management Applications. In Proceedings of the 2020 Third International Conference on Smart Systems and Inventive Technology (ICSSIT), Tirunelveli, India, 20–22 August 2020. [Google Scholar]
  167. Nugroho, G.; Taha, Z.; Nugraha, T.S.; Hadsanggeni, H. Development of a Fixed Wing Unmanned Aerial Vehicle (UAV) for Disaster Area Monitoring and Mapping. J. Mechatron. Electr. Power Veh. Technol. 2015, 6, 83–88. [Google Scholar] [CrossRef]
  168. Gao, Y.; Lyu, Z.; Assilzadeh, H.; Jiang, Y. Small and low-cost navigation system for UAV-based emergency disaster response applications. In Proceedings of the 4th Joint International Symposium on Deformation Monitoring (JISDM), Athens, Greece, 15–17 May 2019; pp. 15–17. [Google Scholar]
  169. Suzuki, T.; Miyoshi, D.; Meguro, J.-I.; Amano, Y.; Hashizume, T.; Sato, K.; Takiguchi, J.-I. Real-time hazard map generation using small unmanned aerial vehicle. In Proceedings of the 2008 SICE Annual Conference, Chofu, Japan, 20–22 August 2008. [Google Scholar]
  170. Erdelj, M.; Natalizio, E.; Chowdhury, K.R.; Akyildiz, I.F. Help from the Sky: Leveraging UAVs for Disaster Management. IEEE Pervasive Comput. 2017, 16, 24–32. [Google Scholar] [CrossRef]
  171. Półka, M.; Ptak, S.; Kuziora, Ł. The Use of UAV's for Search and Rescue Operations. Procedia Eng. 2017, 192, 748–752. [Google Scholar] [CrossRef]
  172. Sheng, T.; Jin, R.; Yang, C.; Qiu, K.; Wang, M.; Shi, J.; Zhang, J.; Gao, Y.; Wu, Q.; Zhou, X.; et al. Unmanned Aerial Vehicle Mediated Drug Delivery for First Aid. Adv. Mater. 2023, 35, e2208648. [Google Scholar] [CrossRef]
  173. Casado, M.R.; Irvine, T.; Johnson, S.; Palma, M.; Leinster, P. The Use of Unmanned Aerial Vehicles to Estimate Direct Tangible Losses to Residential Properties from Flood Events: A Case Study of Cockermouth Following the Desmond Storm. Remote Sens. 2018, 10, 1548. [Google Scholar] [CrossRef]
  174. Giordan, D.; Manconi, A.; Remondino, F.; Nex, F. Use of unmanned aerial vehicles in monitoring application and management of natural hazards. Geomat. Nat. Hazards Risk 2016, 8, 1–4. [Google Scholar] [CrossRef]
  175. Khan, M.A.; Ectors, W.; Bellemans, T.; Janssens, D.; Wets, G. UAV-Based Traffic Analysis: A Universal Guiding Framework Based on Literature Survey. Transp. Res. Procedia 2017, 22, 541–550. [Google Scholar] [CrossRef]
  176. Cermakova, I.; Komarkova, J. Modelling a process of UAV data collection and processing. In Proceedings of the 2016 International Conference on Information Society (i-Society), Dublin, Ireland, 10–13 October 2016. [Google Scholar]
  177. Mohanty, S.N.; Ravindra, J.V.; Narayana, G.S.; Pattnaik, C.R.; Sirajudeen, Y.M. Drone Technology: Future Trends and Practical Applications; Wiley: Hoboken, NJ, USA, 2023. [Google Scholar] [CrossRef]
  178. Pix4D. Surveying and Mapping. 2023. Available online: https://www.pix4d.com/ (accessed on 10 May 2023).
  179. DroneDeploy Platform. DroneDeploy. 2023. Available online: https://drondeploy.com (accessed on 17 April 2023).
  180. Hartmann, J.; Jueptner, E.; Matalonga, S.; Riordan, J.; White, S. Artificial Intelligence, Autonomous Drones and Legal Uncertainties. Eur. J. Risk Regul. 2022, 14, 31–48. [Google Scholar] [CrossRef]
  181. Rezwan, S.; Choi, W. Artificial Intelligence Approaches for UAV Navigation: Recent Advances and Future Challenges. IEEE Access 2022, 10, 26320–26339. [Google Scholar] [CrossRef]
  182. Zhang, S.; Zhuo, L.; Zhang, H.; Li, J. Object Tracking in Unmanned Aerial Vehicle Videos via Multifeature Discrimination and Instance-Aware Attention Network. Remote Sens. 2020, 12, 2646. [Google Scholar] [CrossRef]
  183. McEnroe, P.; Wang, S.; Liyanage, M. A Survey on the Convergence of Edge Computing and AI for UAVs: Opportunities and Challenges. IEEE Internet Things J. 2022, 9, 15435–15459. [Google Scholar] [CrossRef]
  184. Hu, L.; Tian, Y.; Yang, J.; Taleb, T.; Xiang, L.; Hao, Y. Ready Player One: UAV-Clustering-Based Multi-Task Offloading for Vehicular VR/AR Gaming. IEEE Netw. 2019, 33, 42–48. [Google Scholar] [CrossRef]
  185. Flammini, F.; Naddei, R.; Pragliola, C.; Smarra, G. Towards automated drone surveillance in railways: State-of-the-art and future directions. In Advanced Concepts for Intelligent Vision Systems 17th International Conference, ACIVS 2016, Lecce, Italy, 24–27 October 2016; Springer: Berlin/Heidelberg, Germany, 2016. [Google Scholar] [CrossRef]
  186. AlMahamid, F.; Grolinger, K. Autonomous Unmanned Aerial Vehicle navigation using Reinforcement Learning: A systematic review. Eng. Appl. Artif. Intell. 2022, 115, 105321. [Google Scholar] [CrossRef]
  187. Giusti, A.; Guzzi, J.; Ciresan, D.C.; He, F.-L.; Rodriguez, J.P.; Fontana, F.; Faessler, M.; Forster, C.; Schmidhuber, J.; Di Caro, G.; et al. A Machine Learning Approach to Visual Perception of Forest Trails for Mobile Robots. IEEE Robot. Autom. Lett. 2015, 1, 661–667. [Google Scholar] [CrossRef]
  188. Hummel, K.A.; Pollak, M.; Krahofer, J. A Distributed Architecture for Human-Drone Teaming: Timing Challenges and Interaction Opportunities. Sensors 2019, 19, 1379. [Google Scholar] [CrossRef]
  189. Kumari, S.; Tripathy, K.K.; Kumbhar, V. Data Science and Analytics; Emerald Publishing Limited: Bingley, UK, 2020. [Google Scholar] [CrossRef]
  190. Yue, L.; Yang, R.; Zhang, Y.; Yu, L.; Wang, Z. Deep Reinforcement Learning for UAV Intelligent Mission Planning. Complexity 2022, 2022, 3551508. [Google Scholar] [CrossRef]
  191. Yang, Q.; Zhang, J.; Shi, G.; Hu, J.; Wu, Y. Maneuver Decision of UAV in Short-Range Air Combat Based on Deep Reinforcement Learning. IEEE Access 2020, 8, 363–378. [Google Scholar] [CrossRef]
  192. Kinaneva, D.; Hristov, G.; Raychev, J.; Zahariev, P. Early Forest Fire Detection Using Drones and Artificial Intelligence. In Proceedings of the 42nd International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO), Opatija, Croatia, 20–24 May 2019. [Google Scholar] [CrossRef]
  193. Rovira-Sugranes, A.; Razi, A.; Afghah, F.; Chakareski, J. A review of AI-enabled routing protocols for UAV networks: Trends, challenges, and future outlook. Ad Hoc Netw. 2022, 130, 102790. [Google Scholar] [CrossRef]
  194. Ross, S.; Melik-Barkhudarov, N.; Shankar, K.S.; Wendel, A.; Dey, D.; Bagnell, J.A.; Hebert, M. Learning monocular reactive UAV control in cluttered natural environments. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013. [Google Scholar] [CrossRef]
  195. Chakravarty, P.; Kelchtermans, K.; Roussel, T.; Wellens, S.; Tuytelaars, T.; Van Eycken, L. CNN-based single image obstacle avoidance on a quadrotor. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017. [Google Scholar] [CrossRef]
  196. Ditria, E.M.; Buelow, C.A.; Gonzalez-Rivero, M.; Connolly, R.M. Artificial intelligence and automated monitoring for assisting conservation of marine ecosystems: A perspective. Front. Mar. Sci. 2022, 9, 918104. [Google Scholar] [CrossRef]
  197. Verendel, V. Tracking AI in climate inventions with patent data. Nat. Clim. Change 2021, 13, 40–47. [Google Scholar] [CrossRef]
  198. Gupta, R.; Goodman, B.; Patel, N.; Hosfelt, R.; Sajeev, S.; Heim, E.; Doshi, J.; Lucas, K.; Choset, H.; Gaston, M. Creating XBD: A dataset for assessing building damage from satellite imagery. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019. [Google Scholar]
  199. MacDonald, N.; Howell, G. Killing Me Softly: Competition in Artificial Intelligence and Unmanned Aerial Vehicles. Prism 2020, 8, 102–126. [Google Scholar]
  200. Reichstein, M.; Camps-Valls, G.; Stevens, B.; Jung, M.; Denzler, J.; Carvalhais, N.; Prabhat, F. Deep learning and process understanding for data-driven Earth system science. Nature 2019, 566, 195–204. [Google Scholar] [CrossRef]
  201. Molina, M.J.; O’brien, T.A.; Anderson, G.; Ashfaq, M.; Bennett, K.E.; Collins, W.D.; Dagon, K.; Restrepo, J.M.; Ullrich, P.A.; Henry, A.J.; et al. A Review of Recent and Emerging Machine Learning Applications for Climate Variability and Weather Phenomena. Artif. Intell. Earth Syst. 2023, 1, 1–46. [Google Scholar] [CrossRef]
  202. Kontokosta, C.E.; Tull, C. A data-driven predictive model of city-scale energy use in buildings. Appl. Energy 2017, 197, 303–317. [Google Scholar] [CrossRef]
  203. Puhm, M.; Deutscher, J.; Hirschmugl, M.; Wimmer, A.; Schmitt, U.; Schardt, M. A Near Real-Time Method for Forest Change Detection Based on a Structural Time Series Model and the Kalman Filter. Remote Sens. 2020, 12, 3135. [Google Scholar] [CrossRef]
  204. Abeywickrama, H.V.; Jayawickrama, B.A.; He, Y.; Dutkiewicz, E. Comprehensive Energy Consumption Model for Unmanned Aerial Vehicles, Based on Empirical Studies of Battery Performance. IEEE Access 2018, 6, 58383–58394. [Google Scholar] [CrossRef]
  205. Morbidi, F.; Cano, R.; Lara, D. Minimum-energy path generation for a quadrotor UAV. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016. [Google Scholar] [CrossRef]
  206. Abdilla, A.; Richards, A.; Burrow, S. Power and endurance modelling of battery-powered rotorcraft. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015. [Google Scholar] [CrossRef]
  207. Rojas-Perez, L.O.; Martinez-Carranza, J. On-board processing for autonomous drone racing: An overview. Integration 2021, 80, 46–59. [Google Scholar] [CrossRef]
  208. Hassanalian, M.; Abdelkefi, A. Classifications, applications, and design challenges of drones: A review. Prog. Aerosp. Sci. 2017, 91, 99–131. [Google Scholar] [CrossRef]
  209. Derrouaoui, S.H.; Bouzid, Y.; Guiatni, M.; Dib, I. Comprehensive Review on Reconfigurable Drones: Classification, Characteristics, Design and Control. Unmanned Syst. 2022, 10, 3–29. [Google Scholar] [CrossRef]
  210. Roseman, C.A.; Argrow, B.M. Weather Hazard Risk Quantification for sUAS Safety Risk Management. J. Atmospheric Ocean. Technol. 2020, 37, 1251–1268. [Google Scholar] [CrossRef]
  211. Gianfelice, M.; Aboshosha, H.; Ghazal, T. Real-time Wind Predictions for Safe Drone Flights in Toronto. Results Eng. 2022, 15, 100534. [Google Scholar] [CrossRef]
  212. Hu, S.; Mayer, G. Three-dimensional Euler solutions for drone delivery trajectory prediction under extreme environments. Soc. Photo Opt. Instrum. Eng. 2022, 12259, 1185–1190. [Google Scholar] [CrossRef]
  213. Lin, H.-Y.; Peng, X.-Z. Autonomous Quadrotor Navigation With Vision Based Obstacle Avoidance and Path Planning. IEEE Access 2021, 9, 102450–102459. [Google Scholar] [CrossRef]
  214. Cheng, C.; Sha, Q.; He, B.; Li, G. Path planning and obstacle avoidance for AUV: A review. Ocean Eng. 2021, 235, 109355. [Google Scholar] [CrossRef]
  215. Singh, J.; Dhuheir, M.; Refaey, A.; Erbad, A.; Mohamed, A.; Guizani, M. Navigation and Obstacle Avoidance System in Unknown Environment. In Proceedings of the 2020 IEEE Canadian Conference on Electrical and Computer Engineering (CCECE), London, ON, Canada, 30 August–2 September 2020. [Google Scholar] [CrossRef]
  216. Lin, Y.; Saripalli, S. Path planning using 3D dubins curve for unmanned aerial vehicles. In Proceedings of the 2014 International Conference on Unmanned Aircraft Systems (ICUAS), Orlando, FL, USA, 27–30 May 2014; pp. 296–304. [Google Scholar] [CrossRef]
  217. Zhao, Y.; Zheng, Z.; Zhang, X.; Liu, Y. Q learning algorithm based UAV path learning and obstacle avoidence approach. In Proceedings of the 2017 36th Chinese Control Conference (CCC), Dalian, China, 26–28 July 2017; pp. 3397–3402. [Google Scholar] [CrossRef]
  218. Hou, X.; Liu, F.; Wang, R.; Yu, Y. A UAV dynamic path planning algorithm. In Proceedings of the 2020 35th Youth Academic Annual Conference of Chinese Association of Automation (YAC), Zhanjiang, China, 16–18 October 2020; pp. 127–131. [Google Scholar] [CrossRef]
  219. Labib, N.S.; Brust, M.R.; Danoy, G.; Bouvry, P. The Rise of Drones in Internet of Things: A Survey on the Evolution, Prospects and Challenges of Unmanned Aerial Vehicles. IEEE Access 2021, 9, 115466–115487. [Google Scholar] [CrossRef]
  220. FAA. Small Unmanned Aircraft Systems (UAS) Regulations (Part 107). 2023. Available online: https://www.faa.gov/newsroom/small-unmanned-aircraft-systems-uas-regulations-part-107 (accessed on 22 August 2023).
  221. Krichen, M.; Adoni, W.Y.H.; Mihoub, A.; Alzahrani, M.Y.; Nahhal, T. Security Challenges for Drone Communications: Possible Threats, Attacks and Countermeasures. In Proceedings of the 2022 2nd International Conference of Smart Systems and Emerging Technologies (SMARTTECH), Riyadh, Saudi Arabia, 9–11 May 2022; pp. 184–189. [Google Scholar] [CrossRef]
  222. Vattapparamban, E.; Güvenç, I.; Yurekli, A.I.; Akkaya, K.; Uluağaç, S. Drones for smart cities: Issues in cybersecurity, privacy, and public safety. In Proceedings of the 2016 International Wireless Communications and Mobile Computing Conference (IWCMC), Paphos, Cyprus, 5–9 September 2016; pp. 216–221. [Google Scholar] [CrossRef]
  223. Lv, Z.; Li, Y.; Feng, H.; Lv, H. Deep Learning for Security in Digital Twins of Cooperative Intelligent Transportation Systems. IEEE Trans. Intell. Transp. Syst. 2021, 23, 16666–16675. [Google Scholar] [CrossRef]
  224. Zhang, L.; Gao, T.; Cai, G.; Hai, K.L. Research on electric vehicle charging safety warning model based on back propagation neural network optimized by improved gray wolf algorithm. J. Energy Storage 2022, 49, 104092. [Google Scholar] [CrossRef]
  225. Cao, B.; Fan, S.; Zhao, J.; Tian, S.; Zheng, Z.; Yan, Y.; Yang, P. Large-Scale Many-Objective Deployment Optimization of Edge Servers. IEEE Trans. Intell. Transp. Syst. 2021, 22, 3841–3849. [Google Scholar] [CrossRef]
  226. Mohsan, S.A.H.; Khan, M.A.; Alsharif, M.H.; Uthansakul, P.; Solyman, A.A.A. Intelligent Reflecting Surfaces Assisted UAV Communications for Massive Networks: Current Trends, Challenges, and Research Directions. Sensors 2022, 22, 5278. [Google Scholar] [CrossRef] [PubMed]
  227. Wan, S.; Lu, J.; Fan, P.; Letaief, K.B. To Smart City: Public Safety Network Design for Emergency. IEEE Access 2017, 6, 1451–1460. [Google Scholar] [CrossRef]
  228. Ko, Y.; Kim, J.; Duguma, D.G.; Astillo, P.V.; You, I.; Pau, G. Drone Secure Communication Protocol for Future Sensitive Applications in Military Zone. Sensors 2021, 21, 2057. [Google Scholar] [CrossRef]
Figure 2. Comparison between UAVs and conventional aerial technologies regarding resolution, payload capacity, and operation costs. Based on [48].
Figure 4. Comparison between common commercial UAV platforms and their high-resolution sensor capabilities [70,71,72,73,74].
Figure 8. Common UAV thermal imaging sensors and their resolutions [122,123].
Figure 9. UAV innovation timeline and integration with humanitarian action research.
Figure 10. Summary of UAV applications in climate change research.
Figure 11. Typical data acquisition and processing frameworks and output needs for different UAV application categories.
Figure 12. Different processing platforms and their applications, advantages, and disadvantages [177,178,179].
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
