Review

Internal Wind Turbine Blade Inspections Using UAVs: Analysis and Design Issues

by Andrius Kulsinskas, Petar Durdevic * and Daniel Ortiz-Arroyo
Department of Energy Technology, Aalborg University, 6700 Esbjerg, Denmark
* Author to whom correspondence should be addressed.
Energies 2021, 14(2), 294; https://doi.org/10.3390/en14020294
Submission received: 6 December 2020 / Revised: 30 December 2020 / Accepted: 5 January 2021 / Published: 7 January 2021
(This article belongs to the Section A3: Wind, Wave and Tidal Energy)

Abstract:
Interior and exterior wind turbine blade inspections are necessary to extend the lifetime of wind turbine generators. Unmanned vehicles are an alternative to exterior wind turbine blade inspections performed by technicians, which require the use of cranes and ropes. Interior wind turbine blade inspections are even more challenging due to the confined spaces, lack of illumination, and the presence of potentially harmful internal structural components. Additionally, the cost of manned interior wind turbine blade inspections is a major limiting factor. This paper analyzes all aspects of the viability of using manually controlled or autonomous aerial vehicles for interior wind turbine blade inspections. We discuss why the size, weight, and flight time of a vehicle, in addition to the structure of the wind turbine blade, are the main limiting factors in performing internal blade inspections. We also describe the design issues that must be considered to provide autonomy to unmanned vehicles, the control system and sensors that can be used, and introduce some of the algorithms for localization, obstacle avoidance, and path planning that are best suited to the task. Lastly, we briefly describe the non-destructive test instrumentation that can be used for this purpose.

1. Introduction

Wind turbines (WT), machines that convert wind power into electricity, have been around for over a century, with the first ones appearing in 1887 and 1888 [1,2]. Lately, wind turbines have been gaining more attention as a renewable source of energy, rapidly growing in numbers [3,4]. At present, WTs can cover 6% of global electricity demand [5], and 10% in the USA alone in 2020; this share is expected to grow closer to 20% by 2030 [6]. WTs are large structures that are subject to various defects, requiring periodic maintenance [7]. Damage such as blade delamination or erosion may not cause complete failure of the WT, but it will affect its efficiency [8,9,10]. Without proper inspections and repairs, turbines can fail completely, resulting in both energy generation loss and damage to the surrounding area [11]. It is estimated that offshore turbines can suffer approximately 10 failures per year by their third operational year [12]. On the other hand, proper maintenance can increase annual energy production by 5%, increase profits by 20% and extend a wind turbine's life span [13,14]. With damage to WTs being unavoidable, an optimal balance is needed between maintenance efforts and the costs of failures [7,15]. Wind Turbine Blades (WTBs) are among the most expensive parts of the structure [16], both in terms of material and labour costs [17,18], and replacing them is a costly operation; therefore, preventative maintenance is necessary.
Non-Destructive Testing (NDT) is the standard way of performing inspection, and it is usually done by manned operations [19]. However, as the number of WTs grows, so does the number of accidents and fatalities during maintenance [20], calling for the use of unmanned inspection methods. Unmanned inspection is a cheaper [21] and more efficient alternative, with exterior inspections being done in 15 min using Unmanned Aerial Vehicles (UAVs) [22]. Air pollution is another reason to choose Unmanned Vehicles (UVs): while a WT is down for inspection, the energy loss needs to be compensated for using other energy sources, such as coal. While WTs emit 0.011 to 0.014 kg CO2/kWh, coal emissions are 0.87 to 1 kg CO2/kWh, according to [23,24,25,26]. For offshore wind farms, the CO2 emissions of the marine vessels used for inspection need to be considered as well. For a 400 MW offshore wind farm in Denmark, it is estimated that total annual emissions would be 7274 tonnes of CO2, or 0.01% of Denmark's annual emissions in 2011, from a fleet of 16 crew transportation vessels and 1 offshore support vessel [27]. Performing regular inspections using UVs could reduce CO2 emissions by prolonging the lifetime of the WT and requiring a smaller inspection crew. It is also estimated that this 400 MW offshore wind farm would save about 690,000 tonnes of CO2 per year even with the least efficient turbines available [27]. UVs could help perform inspections faster, reducing reliance on coal and other energy sources during WT downtime.
UAVs have been applied in other inspection tasks such as pipelines [28,29], search and rescue missions [30,31], air quality measurement [32], agriculture [33,34] and other fields [35]. Unmanned inspection platforms have been used in WT inspection as well [22,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52]. However, both land and aerial UVs tend to focus on exterior and NDT inspections [53,54,55,56]. While exterior inspections and NDT provide the majority of information about possible defects in WTBs, some defects can occur in the interior only [57], requiring inspection from inside the turbine as well.
The goal of this paper is to analyze the problems in designing UVs and especially UAVs for interior WTB inspections. The usage of UAVs for exterior WT inspections has shown positive results, hence we reason that using UAVs for interior WTB inspections could reduce inspection time, cost and associated risk to human life. Faster and cheaper interior WTB inspections could potentially improve the lifetime and efficiency of the WT. In this paper we analyze all the aspects involved in performing interior WTB inspections using manually operated and autonomous UAVs. We also describe some of the algorithms and techniques that can be used to design an UAV capable of performing internal WTB inspections. The information contained in this paper is aimed at researchers and specialists interested in all design aspects related to the use of manually controlled and autonomous vehicles for the specific task of internal wind turbine blade inspections.
The remainder of the paper is structured as follows: Section 2 describes manned exterior and interior WTB inspections and their challenges. Section 3 discusses unmanned platforms for exterior and interior WTB inspections as well as the problems of designing an UV for interior WTB inspections. Section 4 formulates the design specifications for an internal WTB inspection UAV based on the known challenges. A discussion is provided in Section 5, where we cover future trends and fields where more research is needed. Finally, in Section 6, we conclude with our findings and how they reflect the goal of this paper.

2. Inspections of Wind Turbine Blades

WT structural components may be produced with manufacturing defects [7], but they also suffer constant damage from the environment while in operation, especially the blades, which are commonly affected by lightning strikes, collisions with insects, accumulation of ice and dirt, corrosion due to saltwater, or erosion due to rain and small dirt particles in the wind [16,57,58,59,60]. Manned interior and exterior inspections can be carried out by trained expert technicians who use ropes or cranes to move around and inside the turbines to perform tests using special tools [61]. These technicians use various NDT methods such as ultrasound, thermography, shearography, visual inspection or radiography to gather damage information about various parts of the turbine [57,62,63]. These methods are applied in different situations and work better under certain conditions [64]. For example, infrared thermography is affected by ambient heat, ultrasound by electrical noise, and applying radiography requires special permits. For interior inspections, crawler robots can be used as well [65]. Alternatively, sensors can be embedded in the blades, providing continuous monitoring of the status of the structure [66]. On the other hand, these sensors increase the cost of the blade considerably, as the cost scales with the size of the blade as well as the quality and quantity of the sensors. Moreover, the sensors can suffer damage along with the rest of the blade, for example, when the blade is hit by a lightning strike, making this inspection method less attractive [16].
In order to ensure a safe and thorough inspection process, maintenance technicians should follow standard guidelines during inspection [59]. This includes the use of safety procedures as well as recommendations on which sections to inspect and which types of defects to look for. Depending on the severity of the damages or defects, repairs should be carried out immediately or a follow-up inspection performed at a later time [57]. Moreover, all types of inspections are required: exterior, interior and NDT. Even in a best-case scenario, exterior inspection and NDT can detect only 90% of the defects [57].
A WTB, as seen in Figure 1, is made up of two aero shells bonded together, with a shear web located in the middle. The shear webs, seen in Figure 1, can also come in different formations, consisting of wider or narrower beams placed closer together or further apart, as seen in Figure 2. These shear webs are bonded to the WTB and reinforce it, but they also create obstacles for anyone inside the WTB. Besides having constrained space, the WTB is also an enclosed location. With no direct sunlight, visibility is reduced and artificial light sources are needed in order to avoid hazards. Space inside a WTB is limited and becomes narrower towards the tip, as seen in Figure 1. For example, on a 61.5 m blade, the width is about 3 m and the thickness about 0.54 m at 2/3 of the blade's length [67]. As such, manned inspections are usually limited to two thirds of the blade. With WTBs reaching 107 m in length [68], it also becomes difficult to evacuate in the case of an emergency, as the workers need to travel large distances in a confined space. There is also a risk of falling for the maintenance workers, as reaching the WTB requires first climbing up the turbine tower, which can be 100 m in height [69], or entering through an access point on the nacelle.
An interior WTB inspection may reveal damage to the shear webs located in the WTB, interior WTB cracks, structural manufacturing defects, corrosion of the interior due to infiltration of saltwater, or erosion of the bonding material due to accumulation of liquids or dirt inside the WTB. Despite this, internal WTB inspections are not common due to their costs. Moreover, to the best of our knowledge, a manned inspection of the WTB interior can take 1 to 2 h per blade, while an entire 8 MW WT inspection can take roughly 8 h, including additional time needed for shutting off the turbine and safety procedures. Although there is no data available on how much time it takes for an UV to do the same inspection in the interior, UV approaches are known to be faster than manned inspections on the exterior of the WTBs [14,22,40]. A single turbine inspection using UAVs, including transportation and other operations, is estimated to take 1.5 h, while a rope access-based inspection would take 8 h [70]. Besides reducing inspection time, there are other motivations for using unmanned solutions. In particular, while working inside the WTB, workers face risks due to fires, explosions, falls from height or slipping, high voltage equipment, static electricity build-up, presence of hazardous chemicals and lack of oxygen [71,72,73]. Unmanned inspections of the WTB interior could also reduce inspection costs, as they did for exterior inspections [21], by reducing the required crew size and the number of marine vessels needed for offshore WT inspections, and by lowering WT downtime. Therefore, there is a need to analyze the viability of using UVs for inspection of the interior of WTBs.

3. Unmanned Wind Turbine Blade Inspections

Research on the usage of UVs for exterior inspection of WTBs includes robotic vehicles using wheels that travel along the turbine's parts [53,56,74,75]. Similarly, various UAV approaches have also been reported in [54,55,76,77,78], including the use of machine learning models for defect detection based on data gathered by UAVs [79,80]. Machine learning speeds up the inspection process by detecting defects and generating reports as the UV performs the inspection.
While the topic of exterior WTB inspection using UVs is well researched, to our best knowledge, there are no academic works focusing on UVs for internal WT blade inspections. However, attempts to realize such technology have been made in now-withdrawn patents. For instance, an invention proposed in [81] uses a platform with sensors that is lowered into a WTB by a crane located in the tower of the turbine. In [82], an UAV for turbine inspection is proposed, which could be used to inspect the turbine from the outside, but the UAV could also enter the turbine from an access point and inspect the nacelle from inside. In [83], an UAV operating entirely inside the wind turbine, waiting in stand-by mode and remotely operated from outside, is proposed. The UAV proposed in [83] would be equipped with necessary sensors for navigation and gathering inspection data, as well as communication methods to send inspection data back to the operating station outside.
However, to potentially use UVs for interior WTB inspection, the challenges of operating an UV inside a WTB need to be explored. Besides the problems faced by maintenance workers mentioned in the previous section, an UV operating inside the turbine cannot be seen by the operator directly. The UV needs to be equipped with additional cameras or sensors so that the platform can navigate the enclosed space, either remotely operated by a person or autonomously. Moreover, localization is a problem in Global Navigation Satellite System (GNSS)-denied environments [84,85], such as a WTB. An Indoor Positioning System (IPS) is needed for obstacle avoidance, navigation and localization. The vehicle may also need additional light sources to illuminate the area for easier navigation. Lastly, the UV also needs to carry the sensors and tools required to detect the defects. Hence, the UV needs to be small and robust enough to navigate freely in an enclosed space with obstacles, yet powerful enough to carry all of the necessary equipment.
Another issue is that as a WT operates, it builds up static electricity due to the wind's interaction with the WTBs. Static electricity as well as lightning strikes in WTBs are grounded by the grounding stations at the bottom of the WT [72]. However, if it is not properly mitigated, the static electricity may affect an UV operating inside the WTB.
To address some of these problems, commercial solutions are available [86] or in development [87]. A platform equipped with cameras, similar to the one proposed in [81], is used by [86], which still requires maintenance workers to set up and operate the equipment from inside the WT. In [87], a wheeled UV navigates in the WTB and inspects it. However, a land-based UV using wheels [87], a crawler robot [65] or a similar approach has limited maneuverability, as they require either being attached to, or travelling on, a surface. Travelling over an uneven surface can be difficult for crawler and wheeled robots [71,88]. In particular, navigating around shear webs, seen in Figure 1 and Figure 2, could be difficult for a small wheeled platform due to the uneven surface, or challenging for a large platform due to lack of space. Meanwhile, UAVs have more degrees of freedom, being capable of moving in any direction. However, an UAV operating in a confined space also faces the challenge of obstacle avoidance, as any collision could cause a crash [71]. Some solutions have already been developed and applied, allowing UAVs to operate in confined spaces. A protective frame [89] can prevent the UAV from damaging the interior of the structure and provide easier navigation by letting the UAV bounce off obstacles.
On the other hand, UAVs are not perfect either. UAVs operating inside WTBs need to be small and lightweight, limiting the battery capacity and flight time. A manually operated UAV also requires constant radio communication, which has a limited range and may be absorbed or deflected by the walls of the WT. Manual flight is challenging, as the UAV's movement needs to be precise to avoid hitting obstacles, yet fast enough to conserve battery life. Moreover, oscillations in the WTB caused by the wind [90] create another risk of the UAV colliding with the walls of the WTB when hovering. The UAV also requires cameras and sensors to help with navigation, which reduces the flight time due to the increased weight. However, these problems could be partially solved with advancing technologies, such as autonomous flight or improved batteries. The design of an UAV for internal WTB inspection should address all of these issues.

4. UAVs for Internal Wind Turbine Blade Inspections

An UAV performing interior WTB inspections can enter the blade through a manhole, fly into each section of the blade and gather footage of the defects with on-board cameras. To achieve this, the challenges discussed in Section 2 and Section 3 must be considered. For an UAV operating inside a WTB, size is a limiting factor, as there is constrained space inside the blade, as shown in Figure 1. Other design choices such as flight time, weight, aerodynamic disturbances and navigation methods must be considered as well.

4.1. Physical Design

A quad-copter, a popular type of UAV, offers a good trade-off between maneuverability, size and lift force [91]. Although UAVs with more motors provide redundancy and produce more thrust, they come at the cost of increased weight, size and power consumption. Off-the-shelf flight controllers are used in commercial quad-copters to provide the compact hardware and firmware needed for stabilization and movement of the UAV. In [92], an overview of popular open source flight controllers is given; however, they have limited hardware support and their control methods are not designed for confined-space maneuverability. Commercial quad-copters under 10 cm in diameter exist [93,94] and could navigate closer to the tip of the blade seen in Figure 1. However, these quad-copters can carry only light loads, limiting their flight time to 3–5 min [93,94]. Larger quad-copters can extend the flight time to 10 min or more [95,96] and can be equipped with additional sensors and cameras, but the length of the blade they can inspect would be limited due to their increased size. Therefore, the physical design of an UAV for interior WTB inspections starts with a trade-off between size, lift force and flight time.

4.1.1. Required Flight Time

The UAV needs enough flight time to fly into all accessible sections of the blade and back. To calculate the flight time, an assumption can be made that each shear web produces a new section in the blade that extends throughout the entire blade length that the UAV needs to inspect. This assumption stems from Figure 1 and Figure 2, where each shear web is shown to split the WTB into two uneven parts. Therefore, the flight time required for a single blade is defined by Equation (1):
T_flight = (2 · x · L · (1 + N) + D) / v_avg + T_plus
where T_flight is the flight time required for the inspection of the blade, x is the specified fraction of the length of the WTB that the UAV can navigate through, L is the length of the blade, N is the number of shear webs, D is the overhead distance between the point of origin and the entrance of the blade, v_avg is the average speed of the UAV and T_plus is overhead time, defined by three parameters, seen in Equation (2):
T_plus = T_b + T_p + T_of
where T_b is the time needed to change a battery, which could happen multiple times throughout a turbine inspection, T_p is the equipment positioning time and T_of is the time introduced by any other factors.
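As a quick sanity check, Equations (1) and (2) can be evaluated numerically. The sketch below is illustrative only: the function name and all example values (blade length, speed, overhead distance and time) are assumptions, not figures from this paper, and T_plus is passed as a single pre-summed overhead.

```python
def required_flight_time(x, L, N, D, v_avg, T_plus):
    """Flight time (s) needed to inspect one blade, per Equation (1).

    x      -- fraction of the blade length the UAV can navigate (0..1)
    L      -- blade length (m)
    N      -- number of shear webs (each adds a section to traverse)
    D      -- overhead distance from take-off point to blade entrance (m)
    v_avg  -- average UAV speed (m/s)
    T_plus -- summed overhead time T_b + T_p + T_of (s), per Equation (2)
    """
    # Out-and-back distance per section, for (1 + N) sections, plus overhead.
    distance = 2 * x * L * (1 + N) + D
    return distance / v_avg + T_plus

# Example with assumed numbers: 61.5 m blade, one shear web,
# UAV reaches 2/3 of the blade length at 1 m/s with 2 min of overhead.
t = required_flight_time(x=2/3, L=61.5, N=1, D=10.0, v_avg=1.0, T_plus=120.0)
```

Even with these optimistic assumptions, the result (a few minutes of flight per blade) already exceeds the 3–5 min endurance of the smallest quad-copters mentioned below, which illustrates the trade-off this section develops.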

4.1.2. Achievable Flight Time

The flight time required for a WTB inspection varies with the blade specifications. However, the required flight time may not be achievable if the UAV is too heavy, requiring more power to produce the necessary thrust to take off and move. For an UAV to hover, the lift force F_lift produced by the UAV needs to equal the gravitational force F_gravity, as shown in Equation (3).
F_lift = F_gravity = m_UAV · g
Here, m_UAV is the mass of the UAV and g is the gravitational acceleration. The mass of the UAV is divided into the mass of the battery m_battery, which can change with higher capacity batteries, and the mass of the body of the UAV m_body, which can be assumed to be static, as shown in Equation (4).
m_UAV = m_body + m_battery
The mass of the battery can be written as the product of the battery's energy capacity C and a mass-per-energy constant ϵ (the reciprocal of the battery's specific energy) [97], as shown in Equation (5).
m_battery = ϵ · C
Therefore, the mass of the battery scales linearly with its capacity. The lift force, however, is the product of the square of the motor angular velocity ω and a constant k, which lumps together the motor torque and back-emf constants, the torque-to-thrust proportionality constant, the air pressure and the area swept by the propeller [98], as shown in Equation (6).
F_lift = k · ω²
The angular velocity of a motor is the ratio of its power P to its torque τ, as shown in Equation (7).
ω = P / τ
We can then rewrite the hovering Equation (3), as shown in Equation (8).
k · (P / τ)² = (m_body + ϵ · C) · g
Equation (8) shows that lift force grows with the square of motor power, so the power required to hover grows with the square root of the total mass; every increase in battery capacity therefore adds mass that consumes part of the capacity gained, yielding diminishing returns in flight time. Moreover, the achievable lift force is dictated by the limits of the motor and propeller combination. Furthermore, increasing motor power consumption also increases the heat produced by the motors, potentially damaging them if the heat is too high [99]. Therefore, there is a trade-off between the weight carried by the UAV and the achievable flight time.
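Under the simplifying assumptions behind Equations (3)-(8) (constant torque τ and a single lumped thrust constant k), the hover power can be sketched as below. The function name and every parameter value in the example are invented for illustration, not taken from any referenced motor or battery datasheet.

```python
import math

def hover_power(m_body, eps, C, k, tau, g=9.81):
    """Power (W) needed to hover, solving Equation (8) for P.

    m_body -- UAV body mass without the battery (kg)
    eps    -- battery mass per unit of energy capacity (kg/Wh), Equation (5)
    C      -- battery energy capacity (Wh)
    k      -- lumped thrust constant relating omega^2 to lift, Equation (6)
    tau    -- motor torque, assumed constant as in Equation (7)
    """
    m_total = m_body + eps * C               # Equations (4) and (5)
    return tau * math.sqrt(m_total * g / k)  # from k * (P / tau)^2 = m * g

# Doubling the capacity C raises the hover power by less than a factor
# of two, since power scales with the square root of the total mass.
p_small = hover_power(m_body=0.5, eps=0.008, C=50, k=3e-5, tau=0.05)
p_large = hover_power(m_body=0.5, eps=0.008, C=100, k=3e-5, tau=0.05)
```

The extra capacity still has to power that larger hover draw, which is why battery growth gives diminishing returns in flight time.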

4.1.3. Size

The optimal size of the UAV can be calculated based on the dimensions of the blade, which gets narrower moving towards the tip. For the 61.5 m blade [67] seen in Figure 1, the width is 3 m at 2/3 of the blade's length, but the shear webs split the blade into three uneven sections. The thickness at the middle of the cross-section of the reference blade is given as 0.54 m, as seen in Figure 1, leaving limited space for an UAV operating close to the blade's walls. Moreover, since the length of the blade can vary [68,69,100] and the shear webs can come in different arrangements, as seen in Figure 2, one UAV size may not fit all WTBs. On one hand, a small UAV design that still offers enough lift force and flight time would be optimal, allowing smaller blades to be inspected. At the narrow ends of the blades, small commercially available manually controlled UAVs with limited capabilities, such as [93] or [94], could be used. With a width of around 10 cm, they can be flown into the narrow ends of the blades while still being capable of recording the inspection with an on-board camera. On the other hand, with WTs growing in size [101], we can expect larger blades in the future, and interior WTB inspection UAVs could be larger for those WTs, allowing them to carry more sensors and inspection equipment for better navigation and damage detection.

4.2. Navigation and Control

In Section 2 and Section 3, the navigation challenges inside a WTB for an UAV were introduced. To reiterate, the challenges are: (1) the GNSS-denied environment is a constrained and enclosed space with irregular surfaces and no natural illumination; (2) an IPS is needed for localization, path planning and obstacle avoidance; (3) blade oscillations due to the wind act as a moving obstacle; (4) there are aerodynamic disturbances to consider for an UAV operating inside a constrained space; (5) the UAV could be autonomous or manually controlled by an operator outside of the WT, and both control methods present challenges, discussed in Section 4.2.3 and Section 4.2.2, respectively.

4.2.1. Aerodynamic Disturbances and Mitigation

Inside a WTB, the wind does not disturb the UAV; however, due to the confined space, the UAV will be in close proximity to the walls of the WTB, especially when nearing the tip of the blade. As propellers spin, they create pressure changes below and above the propeller, causing changes in airflow. The walls of the WTB act as obstacles to this airflow and alter its trajectory [102], causing disturbances for the UAV in the form of additional thrust. This behaviour is referred to as the ground, ceiling and wall effects, named after the surfaces that the UAV is in close proximity to [103].
In contrast with the other two effects, the ground effect is well researched; the Cheeseman-Bennett model [104], seen in Equation (9), has been derived for the ground effect for a helicopter:
F_IGE / F_OGE = 1 / (1 − (R / (4 · Z))²),  if Z/R > 0.25
Here, F_IGE is the in-ground-effect thrust, the force created by the motors while the vehicle is close enough to the ground, and F_OGE is the out-of-ground-effect thrust, when the vehicle is high enough that the ground effect no longer applies; R is the radius of the propeller and Z is the distance between the vehicle and the ground. The model relates the ratio of the vehicle's height to its propeller radius to the additional thrust produced when the vehicle is close to the ground. Experiments with quad-copters in [105,106] have shown that the height-to-propeller-radius ratio Z/R at which the ground effect becomes negligible, so that F_IGE ≈ F_OGE, can be between 3 and 5. More thorough experimentation in [103] has shown that the Z/R ratio until F_IGE ≈ F_OGE can be larger than 5, and that it depends on the placement of the motors relative to the body centre as well as the type of propellers used. While the ground effect is present, the UAV requires less power to hover due to the additional thrust; however, the ground effect also makes stabilization with conventional controllers challenging due to the unknown disturbances. In [107], it was shown that a controller used to stabilize the UAV in mid-air produces oscillations while the UAV is affected by the ground effect. In a WTB scenario, when the UAV is close to the ground, especially towards the tip of the blade, it will experience the ground effect, affecting navigation due to oscillations. The authors of [107] proposed an adaptive control method using rapidly switching controllers when the UAV is in close proximity to the ground. While the control method used in [107] reduced the altitude error produced by the ground effect, the height threshold for switching controllers was chosen arbitrarily based on the known Z/R range. In [108], the ground effect is treated as an always-present unknown disturbance, estimated by a Kalman filter and compensated for by a Model Predictive Control (MPC).
The advantage of a robust control strategy such as [108] over the adaptive control approach used in [107] is that the Z/R ratio does not need to be known beforehand, as external disturbances such as aerodynamic effects are treated as noise affecting the system at all times.
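A direct transcription of the Cheeseman-Bennett ratio in Equation (9) shows how quickly the effect decays with height. The function below is a sketch with an invented name; the example thrust ratios follow from the formula itself.

```python
def ground_effect_ratio(Z, R):
    """Cheeseman-Bennett thrust ratio F_IGE / F_OGE, Equation (9).

    Z -- rotor height above the ground (m)
    R -- propeller radius (m)
    The model is only valid for Z / R > 0.25.
    """
    if Z / R <= 0.25:
        raise ValueError("Cheeseman-Bennett model valid only for Z/R > 0.25")
    return 1.0 / (1.0 - (R / (4.0 * Z)) ** 2)

# Close to the ground (Z/R = 0.5) the thrust gain is about 33%,
# while at Z/R = 5 it has almost vanished, consistent with the
# negligibility ranges reported in [105,106].
near = ground_effect_ratio(Z=0.5, R=1.0)  # ~1.33
far = ground_effect_ratio(Z=5.0, R=1.0)   # ~1.0025
```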
The ceiling effect displays similarities to the ground effect, producing additional lifting thrust when the UAV is near the ceiling due to changes in airflow trajectory [102,103,105]. The ceiling effect is also characterized by a Z/R ratio [103]. However, while the ceiling effect produces oscillations when attempting to hover near the ceiling, there is also a risk of crashing into the ceiling, as the UAV gains more thrust the closer it gets. Due to the constrained space in a WTB and the low thickness of the blade, especially near the tip, the UAV will experience the ceiling effect often. To ensure that the UAV does not collide with the ceiling, the effect needs to be compensated for before the UAV can operate inside a WTB. Due to the similarities between the ground and ceiling effects, the controllers proposed in [107,108] could be adapted for the ceiling effect.
The third disturbance is the wall effect, which differs from the ground and ceiling effects. While wall effect experiments in [102] suggested that the effect is negligible, more thorough experimentation in [103] has shown that the wall effect both pulls the UAV towards the wall and alters its attitude, causing it to tilt towards the wall. This effect presents a hazard for an UAV operating inside a WTB, as both the walls of the blade and the shear webs can produce the wall effect. Due to the constrained space in the blade, the UAV will experience a stronger wall effect as it approaches the tip of the WTB, presenting a risk of collision if the effect is not compensated for. The authors of [109] have proposed a method for detecting when the UAV experiences the wall effect, allowing the UAV to keep its distance. The controllers used in [107,108] may also be adapted for the relevant angles.
Although there are several works exploring the aerodynamic disturbances affecting an UAV operating in a constrained space, there are two additional challenges present in the WTB. Firstly, the surfaces in the WTB are curved and complex, while works in [102,103,105,106,107,108,109] have only considered flat surfaces. To the best of our knowledge, there are insufficient academic works dealing with the aerodynamic disturbances for quad-copters operating near uneven surfaces. Secondly, an UAV operating inside a WTB will be subject to multiple aerodynamic disturbances at the same time, as the blade gets narrower towards the tip. To the best of our knowledge, there is a lack of academic papers that have analyzed the interactions between all three aerodynamic disturbances and their effects on an UAV operating in a constrained space. However, while the aerodynamic disturbances present stabilization challenges and collision hazards, the ground and ceiling effects could potentially be beneficial. The additional thrust produced due to close proximity to the ground and ceiling in a WTB would reduce the power requirements for the UAV to hover, conserving battery lifetime. Therefore, aerodynamic disturbances for an UAV in a WTB present a unique control and optimization challenge that requires further research.

4.2.2. Manual Navigation

Compared to autonomous navigation, manual UAV navigation can be a cheap and easy-to-implement solution that leaves the navigation challenges in the hands of the operator. However, an UAV operating inside a WTB cannot be seen by an operator outside of it, requiring a camera to transmit the UAV's surroundings for the operator to navigate by. Moreover, the enclosed space acts as an obstacle for the radio signals used by the camera and the control input receiver. For better signal propagation, lower frequency signals could be used [110]. There is also a lack of illumination inside the blade, requiring light sources to be mounted on the UAV for the operator to be able to see the camera's footage. A manually controlled UAV in a constrained space also requires obstacle detection and avoidance methods to avoid collisions with obstacles in the UAV's blind spots. Time-of-Flight (ToF) sensors such as ultrasound or active infrared sensors could provide a means of detecting when the UAV is close to an obstacle, preventing the operator from flying closer to it or moving the UAV away from it. A protective frame could also be mounted on the UAV to allow it to bounce off obstacles in the case of a collision [96]; however, the protective frame would increase the dimensions of the UAV, reducing the portion of the WTB the UAV can access. A manually controlled UAV for interior WTB inspection would also require localization methods to provide detailed information about the location of the damage detected by the UAV. Localization methods and more advanced obstacle avoidance approaches will be discussed in Section 4.2.3.
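One way to picture the ToF-based operator assist described above is a simple veto rule: any commanded motion towards a direction whose range reading falls below a clearance threshold is blocked. This is a hypothetical sketch; the direction names, data layout and the 0.5 m threshold are all assumptions, not values from any cited system.

```python
# Hypothetical safety-veto logic for a manually flown UAV;
# the clearance value is an assumption to be tuned per frame size.
SAFETY_DISTANCE_M = 0.5

def blocked_directions(tof_readings, safety=SAFETY_DISTANCE_M):
    """Return the set of directions in which motion should be vetoed.

    tof_readings -- dict mapping a direction name to its ToF range (m)
    """
    return {d for d, r in tof_readings.items() if r < safety}

# With these readings, forward and upward motion would be vetoed
# while the operator can still translate left.
vetoed = blocked_directions({"front": 0.3, "left": 1.2, "up": 0.4})
```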

4.2.3. Autonomous Navigation

Autonomous navigation for an interior WTB inspection UAV could offer more efficient inspections than a manually controlled UAV. With algorithms handling navigation, obstacle avoidance and damage inspection simultaneously, the UAV could perform inspections faster, conserve battery lifetime and reduce labour costs by removing the need for an operator to control the UAV. On the other hand, localization and mapping methods need to provide real-time information about the UAV’s surroundings, paired with high-frequency sensors that cover all directions in order to gather full information about the surrounding area. Autonomous navigation is therefore a computationally expensive operation that requires more hardware than a manually controlled UAV. For example, computers weighing 45 g [111] and multiple cameras or LiDARs weighing 75 g [112] and 95 g [113], respectively, could provide mapping of the surrounding area. However, the increased weight would result in higher energy requirements for the motors, and the power consumption would be affected by the additional hardware as well, demanding larger batteries. Therefore, an autonomous UAV has increased weight and cost due to the necessary hardware compared to a manually controlled UAV, but offers improved inspection speed and lower inspection labour costs.
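The weight/endurance trade-off above can be made concrete with ideal momentum-theory hover power, P = sqrt((mg)^3 / (2ρA)). This is a lower bound on the real power draw (it ignores figure of merit and motor/ESC losses), and the masses, disk area, battery capacity and efficiency used below are assumed values for illustration only:

```python
import math

RHO = 1.225   # air density, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def ideal_hover_power(mass_kg: float, disk_area_m2: float) -> float:
    """Ideal induced hover power from momentum theory,
    P = sqrt((m*g)^3 / (2*rho*A)); real UAVs need more, so treat
    this as a lower bound."""
    thrust = mass_kg * G
    return math.sqrt(thrust ** 3 / (2.0 * RHO * disk_area_m2))

def flight_time_min(mass_kg: float, disk_area_m2: float,
                    battery_wh: float, efficiency: float = 0.5) -> float:
    """Rough hover endurance: battery energy over electrical hover power."""
    p_elec = ideal_hover_power(mass_kg, disk_area_m2) / efficiency
    return battery_wh * 60.0 / p_elec
```

Because hover power grows with mass to the power 3/2, adding a few hundred grams of computers and sensors costs disproportionately more flight time than the payload fraction alone would suggest.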

Obstacle Avoidance and Localization

An autonomous UAV operating in a constrained space needs to have precise information about the obstacles surrounding it. Knowing the exact position of the defects is vital for the WT owners, so the autonomous UAV needs precise localization as well. Localization allows the UAV to know how close the obstacles are to it and the relative location of the detected defects. Simultaneous Localization and Mapping (SLAM) is a popular solution for obstacle avoidance and localization in autonomous UAVs [114,115,116]. SLAM generates a map of the surrounding area and estimates the distance to each obstacle based on sensor data. ToF sensors such as Light Detection and Ranging (LiDAR) can provide the high-frequency data that SLAM uses to create real-time online mapping of the surrounding area [117]. However, a large amount of data from the surrounding environment needs to be updated constantly, connecting each data point perceived by the sensors to form a map. As a real-time solution, SLAM requires a large amount of computing power to estimate the distances between objects several times every second [118] in order for the UAV to constantly avoid collisions. Moreover, LiDARs and other ToF sensors are subject to scattering or multipath interference [119]. In a WTB scenario where the surfaces are uneven, signal scattering or multipath interference may present a challenge for a SLAM implementation, as the sensors could return incorrect information. Furthermore, having full vision of the surrounding area requires multiple ToF sensors or LiDARs with a large sweeping angle, which adds cost and weight and increases the power consumption of the UAV [120]. However, many types of SLAM exist [114], with Visual SLAM (V-SLAM) offering a popular alternative to LiDAR sensor-based SLAM implementations [121,122,123,124].
Vision-based approaches capture information-rich scenes, offering more data than ToF and similar sensors that only provide information about detected points. Monocular, stereo, depth and other cameras can be used for V-SLAM [125,126], providing high flexibility in terms of the hardware used. On the other hand, V-SLAM operates by recognizing distinguishing features of the objects detected by the camera, such as edges, corners or colours [125,126], which can present challenges in scenarios such as a WTB. Inside a WTB, the surfaces are curved and low-texture and there is no sunlight, which are known challenges for V-SLAM [127]. Due to the constant motion of the UAV, the camera footage may also appear blurry, which presents another challenge for V-SLAM algorithms [125,126]. On the other hand, the WTB environment is static and lacks moving obstacles, which allows V-SLAM to treat all changes between camera frames as movement of the UAV. Both LiDAR SLAM and V-SLAM have their advantages and disadvantages [120,128,129]. While LiDAR SLAM offers more accurate mapping, V-SLAM uses cheaper and lighter sensors and could re-purpose cameras used for inspection or navigation. On the other hand, both methods may encounter problems in a WTB environment due to the curved surfaces, low-texture environment and lack of sunlight. Nevertheless, localization and obstacle avoidance are crucial for any UAV operating inside a WTB, and either SLAM implementation could fulfill this task if the operational limitations can be overcome.
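As a minimal sketch of the mapping half of SLAM (the localization half and all real-world complications are omitted), a single ToF range beam can be fused into a sparse log-odds occupancy grid: cells the beam passes through accumulate free-space evidence, and the cell at the measured range accumulates obstacle evidence. The cell size and log-odds increments below are tuning assumptions:

```python
import math

L_OCC, L_FREE = 0.85, -0.4   # log-odds increments (tuning assumptions)
CELL = 0.1                   # grid resolution in metres (assumption)

def update_grid(grid, origin, angle, rng, max_range):
    """Fuse one range beam (sensor at `origin`, bearing `angle`, measured
    range `rng`) into a sparse dict-based log-odds occupancy grid."""
    for i in range(int(rng / CELL)):
        cx = round((origin[0] + math.cos(angle) * i * CELL) / CELL)
        cy = round((origin[1] + math.sin(angle) * i * CELL) / CELL)
        grid[(cx, cy)] = grid.get((cx, cy), 0.0) + L_FREE
    if rng < max_range:  # an actual hit, not a max-range miss
        cx = round((origin[0] + math.cos(angle) * rng) / CELL)
        cy = round((origin[1] + math.sin(angle) * rng) / CELL)
        grid[(cx, cy)] = grid.get((cx, cy), 0.0) + L_OCC

def occupied(grid, cell, threshold=0.5):
    """Treat a cell as an obstacle once its log-odds exceed the threshold."""
    return grid.get(cell, 0.0) > threshold
```

Accumulating log-odds rather than overwriting cells is what lets repeated measurements average out the scattering and multipath errors discussed above, at the cost of the constant map-update workload noted in [118].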

Path Planning

Global path planning for an UAV requires a map of the area, with its boundaries serving as constraints [130]. With a known map, accurate and precise navigation can be achieved [131,132]. For a single WTB, a map consisting of blade dimensions such as the one seen in Figure 1 could be used. On the other hand, many different blades exist [17,18,68,69,100] and the different arrangements of shear webs, seen in Figure 2, present another mapping challenge. Therefore, generating a map for every blade becomes infeasible. Path planning in unknown environments is a broad subject with many challenges and solutions [130], as it can include path generation and optimization based on parameters such as time or distance, and obstacle-free navigation to the destination. One approach could be the use of signals from emitters embedded in a WTB to provide a path for the UAV. In [133], emitters were embedded at target locations, allowing the UAV to navigate by measuring the signals emitted by the targets. In a WTB scenario, these emitters could be embedded at key locations such as the manhole entrance or the safe region for the UAV to fly into. On the other hand, this would also require making modifications to the WTB and maintaining the emitters, making it an infeasible approach on a large scale. Another possible approach could be a Bug algorithm, a common collision-free solution used in robotics [134,135]. Using a local map generated by a SLAM algorithm, an UAV operating in a WTB could navigate the blade by flying in close proximity to a wall of the blade as a means of path planning. However, this approach poses the risk of collision with the walls of the WTB if the blade starts oscillating. Moreover, Bug-algorithm-based path planning lacks optimization with respect to aerodynamic disturbances and distance travelled. To optimize the path generation based on the local map, MPC could be used instead [130].
The map generated by SLAM could be used to define the prediction horizon and constraints, providing an optimal solution for an UAV inside the WTB based on the information that the cameras and sensors acquire. MPC can update the path when new information is available [108,136,137,138], while also taking the aerodynamic disturbances or the UAV’s dynamics into consideration. Finally, with a large enough map of the environment, MPC could generate the quickest or least power-consuming collision-free path for the UAV to take while performing the interior WTB inspection.
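The receding-horizon idea above (replan on the latest map, commit only to the first move) can be sketched with a breadth-first search standing in for the MPC optimizer. This is a heavily simplified stand-in: a real MPC would optimize over the UAV's dynamics and disturbance models rather than grid steps:

```python
from collections import deque

def shortest_path(free, start, goal):
    """Breadth-first search over the currently known free cells;
    returns the cell sequence from start to goal, or None if blocked."""
    prev, queue = {start: None}, deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in free and nxt not in prev:
                prev[nxt] = cell
                queue.append(nxt)
    return None

def replan_step(free, pos, goal):
    """Receding-horizon style step: replan on the latest map and commit
    only to the first move; new sensor data updates `free` between calls."""
    path = shortest_path(free, pos, goal)
    return path[1] if path and len(path) > 1 else pos
```

Calling `replan_step` once per control cycle reproduces the behaviour described in [108,136,137,138]: when SLAM marks a previously free cell as an obstacle, the next call immediately routes around it.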

4.3. Damage Detection

To perform the inspection, an UAV operating inside a WTB should also have NDT instrumentation to detect the damages in the blade. Various NDT tools are used for exterior WT inspections [64,70,139,140] and could be implemented on an internal WTB inspection UAV. However, given the structure of a WTB, the NDT instrumentation used on an UAV needs to be small and lightweight, have relatively low power consumption, and perform well without sunlight. Several popular NDT tools have been implemented on exterior WTB inspection UVs before and could be considered as potential damage detection instrumentation to be carried by an internal WTB inspection UAV. A summary of these methods can be seen in Table 1. RGB cameras have been used by UAVs [22,40,45], but their damage detection capabilities are limited to surface damages only [70] and require proper illumination. In [54], optical thermography was used on a helicopter for exterior WTB inspections, allowing subsurface damages to be detected as well. Ultrasound NDT has been used in manual inspections [19,141] and for exterior WT inspections using an UAV in [55]. However, the ultrasound NDT tool used in [55] shifted the centre of mass of the body, making stabilization challenging. Moreover, the sensor readings were affected by the precision of the UAV as well as the electrical noise produced by the motors. Shearography is another NDT tool used for WT inspections and has been implemented on a robotic exterior WT inspection platform in [56]. However, subsurface defect detection by shearography is limited [70]. Radiography for exterior WT inspections has been implemented on a robotic platform in [53]; however, the radiation emitted by radiographic NDT is dangerous to humans and requires special permits to operate, increasing operating costs and risks. While more commonly used in agriculture, multi-spectral cameras have been fitted on UAVs [33,142] and could prove to be a viable NDT instrumentation for an internal WTB inspection UAV.
However, the cost of multi-spectral cameras is a limiting factor, with near-infrared multi-spectral camera prices ranging from $4500 [143] to $16,000 [144] or more, depending on the spectrum range.
Defect detection rates could be improved using AI and deep learning to automatically classify and localize the damages observed in inspection footage. In water pipeline inspections, it is estimated that around 20% to 25% of defects are missed due to human errors, which are attributed to stress and differences in inspection worker skill levels [145,146,147,148,149,150,151]. In the same domain, AI-based defect detection rates were shown to be comparable to human detection rates [145,146,147,148,149,150,151,152,153,154,155]. Deep learning is utilised in exterior WT inspections [22,45,80] and could be utilised in interior WTB inspections as well, with an AI classifying defects, localizing them within the inspection frame and inferring bounding boxes for the defects in the inspection footage. Reports for WT owners could then be generated automatically, displaying each defect, its location relative to the WTB and its type. However, the challenge of designing and training an accurate deep learning model is obtaining a large, diverse and consistent training set based on samples from previous inspections, with defects manually labeled beforehand [150].
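The report-generation step described above can be sketched by filtering and ordering hypothetical model detections into human-readable lines. The `Detection` structure, its fields and the confidence threshold are invented for illustration; a real system would also map image coordinates to blade coordinates:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One hypothetical model output for a single inspection frame."""
    label: str          # predicted defect class, e.g. "delamination"
    confidence: float   # model confidence in [0, 1]
    box: tuple          # (x, y, w, h) bounding box in image pixels
    span_pos_m: float   # UAV position along the blade span at capture time

def make_report(detections, min_conf=0.5):
    """Discard low-confidence predictions and emit report lines ordered
    from blade root to tip."""
    kept = [d for d in detections if d.confidence >= min_conf]
    kept.sort(key=lambda d: d.span_pos_m)
    return [f"{d.span_pos_m:.1f} m from root: {d.label} "
            f"(confidence {d.confidence:.0%}, box {d.box})" for d in kept]
```

Ordering the findings by span position gives the WT owner the location context the text above calls for, while the confidence filter trades missed defects against false alarms.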

5. Discussion and Future of Interior WTB Inspections

In previous sections, we described some of the challenges in designing and implementing an UAV for interior WTB inspection. Among other issues, autonomous interior UAVs have been limited by the computing power of the embedded computers [118]. To partially alleviate this problem, cameras such as Intel Realsense [156] or OpenMV [157] could be used. Intel’s Realsense cameras come with a vision processing unit, a microprocessor used to help with machine vision computations. The rate at which the data is processed on the camera and passed to the flight controller allows the UAV to meet real-time requirements for navigational data. While Intel Realsense cameras have been successfully applied in UAVs [158,159], the downside of these cameras is their weight and the limited access to the code. While less powerful than Intel’s Realsense cameras, OpenMV is an open source camera that can be used to implement custom machine vision tasks directly on the camera. Such cameras would allow the UAV to receive already processed vision-based sensor information, such as V-SLAM outputs, and instead focus on applying the data in real-time. As these cameras mature, we can expect to see more lightweight cameras that can perform complex vision-based calculations in real-time.
With other hardware also evolving, UAVs could become lighter due to advances in battery technology, the materials used in UAV frames or propellers, or more powerful motors. Giving the UAVs the ability to carry more weight would allow a reduction in size or the use of more sensors, cameras and computers. Conversely, as the research in WTs continues, the size of the WTBs continues to grow [101]. A larger blade would allow an UAV with longer propellers to operate without colliding with the walls of the blade. Longer propellers would produce more thrust, allowing the UAV to both carry heavier equipment and travel faster, at the cost of increased power consumption and size. Alternatively, the UAV could be equipped with more motors to achieve similar results while also introducing some redundancy. As the demand for confined space UAVs grows, specialized algorithms could also be developed. For example, a better understanding of aerodynamic disturbances and other forces affecting UAVs in confined spaces could allow the UAV to harness these forces to its advantage.
If the market for interior WTB inspections using UAVs starts to develop, newly built WTs could include special accommodations for these UAVs. For instance, charging stations proposed in [83] could be included in the nacelle, allowing the UAV to recharge its battery between blade inspections without the need to leave the WT. Moreover, these charging stations would allow the UAV to remain inside the WT at all times, so the inspections could be performed remotely, even if the weather conditions are unfavourable for a crew to be present. Relay stations could also be built in the nacelle, boosting the signal strength. These would allow the data transmitted by the UAV to be received in better quality and at larger distances, potentially allowing an inspection of an offshore WT from land with nearly instantaneous feedback. WTBs could also be equipped with special markers, such as Apriltags [160], or emitters similar to the ones used in [133], allowing the UAV to navigate the blade more easily by following a known path instead. Furthermore, if interior WTB inspection using UAVs becomes the new standard, WT owners could also upgrade existing WTs to better accommodate the UAVs. This can be compared to the electrical vehicle infrastructure, which continues to expand as the market grows [161,162]. However, in order for WT owners to consider making changes to new or existing WTs to better accommodate the inspection UAVs, the inspection method needs to be proven viable and affordable. Therefore, autonomous or manually controlled UAVs that rely only on their own hardware and information need to be developed first, in order to prove the viability of interior WTB inspections using UAVs and attract enough attention.
Therefore, interior WTB inspection UAVs can benefit from further research in machine vision hardware and software. Better understanding of aerodynamic disturbances and other forces affecting UAVs in confined space could also bring benefits. As the demand increases, WTs could include infrastructure that benefits the interior WTB inspection UAVs.

6. Conclusions

Given the risks of manned interior WTB inspections and the benefits seen in the usage of UVs for exterior WT inspections, in this paper we analysed the benefits and the challenges of using UAVs for interior WTB inspections. UAVs are potentially capable of handling the constrained and uneven space of the WTB better than wheeled or robotic UVs. However, operating an UAV in an enclosed, constrained space also produces aerodynamic disturbances in terms of ground, wall and ceiling effects. An enclosed space with no sunlight and uneven, low-texture surfaces presents challenges for localization and navigation, and the size of the blade was identified to be a limiting factor for the size of the UAV. A brief comparison between manually controlled and autonomous UAVs has revealed that autonomous UAVs have the advantage of more efficient and cheaper inspections at the price of higher production costs and design challenges. On the other hand, navigation and localization in an enclosed and constrained space for UAVs is a broad topic and its implementation is a challenging task, even for manually controlled UAVs. Several potential methods for navigation and localization can be considered; however, due to the unique environment of a WTB interior, the viability and efficiency of these methods is unknown. Nevertheless, interior WTB inspection UAVs could create a new market, attracting more researchers and investors. Future WTs could be constructed with accommodation of interior inspection UAVs in mind, improving the UAVs’ capabilities. Therefore, the potential benefits of using manually controlled or autonomous UAVs for the task of interior WTB inspection warrant further research.

Author Contributions

Conceptualization, A.K. and P.D.; methodology, A.K.; validation, A.K., P.D. and D.O.-A.; formal analysis, A.K.; investigation, A.K., P.D. and D.O.-A.; writing—original draft preparation, A.K.; writing—review and editing, P.D. and D.O.-A.; visualization, A.K., P.D.; supervision, P.D.; project administration, P.D.; funding acquisition, P.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research is part of a project that was funded by Innovation Fund Denmark grant number 9077-01507B.

Acknowledgments

We would like to thank the project partner and leader René Merrild from Clobotics A/S for his administrative and technical support, and Clobotics A/S for their co-funding of this project.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
WT	Wind turbine
WTB	Wind turbine blade
GNSS	Global navigation satellite system
IPS	Indoor positioning system
AI	Artificial intelligence
UV	Unmanned vehicle
UAV	Unmanned aerial vehicle
ToF	Time-of-flight
NDT	Non-destructive testing
SLAM	Simultaneous localization and mapping
V-SLAM	Visual simultaneous localization and mapping
LiDAR	Light detection and ranging
MPC	Model predictive control

References

  1. Price, T.J. James Blyth—Britain’s first modern wind power pioneer. Wind. Eng. 2005, 29, 191–200. [Google Scholar] [CrossRef]
  2. Righter, R.W. Wind Energy in America—A History; University of Oklahoma Press: Norman, OK, USA, 1996. [Google Scholar]
  3. Wind Energy International. Global Wind Installations. Available online: https://library.wwindea.org/global-statistics/ (accessed on 31 July 2020).
  4. U.S. Department of Energy. Projected Growth of the Wind Industry From Now Until 2050. 2011. Available online: https://www.energy.gov/maps/map-projected-growth-wind-industry-now-until-2050 (accessed on 7 July 2020).
  5. Wind Energy International. World Wind Capacity at 650,8 GW, Corona Crisis will Slow Down Markets in 2020, Renewables to be Core of Economic Stimulus Programmes. Available online: https://wwindea.org/blog/2020/04/16/world-wind-capacity-at-650-gw/ (accessed on 31 July 2020).
  6. Office of Energy Efficiency and Renewable Energy. Wind Vision Study Scenario Viewer. Available online: https://openei.org/apps/wv_viewer/ (accessed on 31 July 2020).
  7. Sheng, S.; O’Connor, R. Chapter 15—Reliability of Wind Turbines. In Wind Energy Engineering; Academic Press: Cambridge, MA, USA, 2017; pp. 299–327. [Google Scholar]
  8. Sareen, A.; Sapre, C.A.; Selig, M.S. Effects of leading edge erosion on wind turbine blade performance. Wind Energy 2014, 17, 1531–1542. [Google Scholar] [CrossRef]
  9. Haselbach, P.; Bitsche, R.; Branner, K. The effect of delaminations on local buckling in wind turbine blades. Renew. Energy 2016, 85, 295–305. [Google Scholar] [CrossRef]
  10. Castorrini, A.; Corsini, A.; Rispoli, F.; Venturini, P.; Takizawa, K.; Tezduyar, T.E. Computational analysis of wind-turbine blade rain erosion. Advances in Fluid-Structure Interaction. Comput. Fluids 2016, 141, 175–183. [Google Scholar] [CrossRef]
  11. Lin, Y.; Tu, L.; Liu, H.; Li, W. Fault analysis of wind turbines in China. Renew. Sustain. Energy Rev. 2016, 55, 482–490. [Google Scholar] [CrossRef]
  12. Carroll, J.; McDonald, A.; McMillan, D. Failure rate, repair time and unscheduled O&M cost analysis of offshore wind turbines. Wind Energy 2016, 19, 1107–1119. [Google Scholar]
  13. GE Renewable Energy. GROWTH AND POTENTIAL The Onshore Wind Power Industry. Available online: https://www.ge.com/renewableenergy/wind-energy/onshore-wind (accessed on 7 July 2020).
  14. Megavind. Strategy for Extending the Useful Lifetime of a Wind Turbine. Available online: https://megavind.winddenmark.dk/sites/megavind.windpower.org/files/media/document/Strategy%20for%20Extending%20the%20Useful%20Lifetime%20of%20a%20Wind%20Turbine.pdf (accessed on 7 July 2020).
  15. Andrawus, J.A.; Watson, J.; Kishk, M. Wind Turbine Maintenance Optimisation: Principles of Quantitative Maintenance Optimisation. Wind Eng. 2007, 31, 101–110. [Google Scholar] [CrossRef]
  16. Mishnaevsky, L.; Branner, K.; Petersen, H.; Beauson, J.; McGugan, M.; Sørensen, B. Materials for Wind Turbine Blades: An Overview. Materials 2017, 10, 1285. [Google Scholar] [CrossRef] [Green Version]
  17. Sandia National Laboratories. Cost Study for Large Wind Turbine Blades: WindPACT Blade System Design Studies; Technical Report; Sandia National Laboratories: Warren, RI, USA, 2003. [Google Scholar]
  18. Bortolotti, P.; Berry, D.; Murray, R.; Gaertner, E.; Jenne, D.; Damiani, R.; Barter, G.; Dykes, K. A Detailed Wind Turbine Blade Cost Model; Technical Report; National Renewable Energy Laboratory: Golden, CO, USA, 2019. [Google Scholar]
  19. Juengert, A. Damage Detection in Wind Turbine Blades Using Two Different Acoustic Techniques. NDT Database J. 2008. Available online: https://www.ndt.net/article/v13n12/juengert.pdf (accessed on 7 July 2020).
  20. Caithness Windfarm Information Forum. Summary of Wind Turbine Accident Data to 31 March 2020. 2020. Available online: http://www.caithnesswindfarms.co.uk (accessed on 7 July 2020).
  21. Deign, J. Fully Automated Drones Could Double Wind Turbine Inspection Rates. 2016. Available online: https://analysis.newenergyupdate.com/wind-energy-update/fully-automated-drones-could-double-wind-turbine-inspection-rates (accessed on 29 July 2020).
  22. Skyspecs. Autonomous Inspection. Available online: https://skyspecs.com/skyspecs-solutions/autonomous-inspection/ (accessed on 7 July 2020).
  23. Dolan, S.L.; Heath, G.A. Life Cycle Greenhouse Gas Emissions of Utility-Scale Wind Power. J. Ind. Ecol. 2012, 16, S136–S154. [Google Scholar] [CrossRef]
  24. Rhodes, J. Nuclear and Wind Power Estimated to have Lowest Levelized CO2 Emissions. 2017. Available online: https://energy.utexas.edu/news/nuclear-and-wind-power-estimated-have-lowest-levelized-co2-emissions (accessed on 3 August 2020).
  25. U.S. Energy Information Administration. How Much Carbon Dioxide is Produced Per Kilowatthour of U.S. Electricity Generation? 2020. Available online: https://www.eia.gov/tools/faqs/faq.php?id=74&t=11 (accessed on 3 August 2020).
  26. Whitaker, M.; Heath, G.A.; O’Donoughue, P.; Vorum, M. Life Cycle Greenhouse Gas Emissions of Coal-Fired Electricity Generation. J. Ind. Ecol. 2012, 16, S53–S72. [Google Scholar] [CrossRef]
  27. McLean, A. Horns Rev 3 Offshore Wind Farm Technical Report no.22—AIR EMISSIONS; Technical Report; Energinet.dk: Fredericia, Denmark, 2014. [Google Scholar]
  28. Inagaki, Y.; Ikeda, H.; Takeuchi, P.K.; Yato, Y.; Sawai, T. An effective measure for evaluating sewer condition: UAV screening in comparison with CCTVS and manhole cameras. Water Pract. Technol. 2020, 15, 482–488. [Google Scholar] [CrossRef]
  29. Gao, J.; Yan, Y.; Wang, C. Research on the Application of UAV Remote Sensing in Geologic Hazards Investigation for Oil and Gas Pipelines; ASCE: Reston, VA, USA, 2011. [Google Scholar]
  30. Birk, A.; Wiggerich, B.; Bülow, H.; Pfingsthorn, M.; Schwertfeger, S. Safety, Security, and Rescue Missions with an Unmanned Aerial Vehicle (UAV). J. Intell. Robot. Syst. 2011, 64, 57–76. [Google Scholar] [CrossRef]
  31. Alotaibi, E.T.; Alqefari, S.S.; Koubaa, A. LSAR: Multi-UAV Collaboration for Search and Rescue Missions. IEEE Access 2019, 7, 55817–55832. [Google Scholar] [CrossRef]
  32. Gonzalez, L.; Lee, D.; Walker, R.; Periaux, J. Optimal Mission Path Planning (MPP) For An Air Sampling Unmanned Aerial System. In Proceedings of the 2009 Australasian Conference on Robotics & Automation, Sydney, Australia, 2–4 December 2009. [Google Scholar]
  33. Navia, J.; Mondragon, I.; Patino, D.; Colorado, J. Multispectral mapping in agriculture: Terrain mosaic using an autonomous quadcopter UAV. In Proceedings of the 2016 International Conference on Unmanned Aircraft Systems (ICUAS), Arlington, VA, USA, 7–10 June 2016; pp. 1351–1358. [Google Scholar]
  34. Honkavaara, E.; Saari, H.; Kaivosoja, J.; Pölönen, I.; Hakala, T.; Litkey, P.; Mäkynen, J.; Pesonen, L. Processing and assessment of spectrometric, stereoscopic imagery collected using a lightweight UAV spectral camera for precision agriculture. Remote Sens. 2013, 5, 5006–5039. [Google Scholar] [CrossRef] [Green Version]
  35. Greenwood, W.W.; Lynch, J.P.; Zekkos, D. Applications of UAVs in Civil Infrastructure. J. Infrastruct. Syst. 2019, 25, 04019002. [Google Scholar] [CrossRef]
  36. TSRWind. Improving the Wind—High Technology for Wind Turbine Maintenance. Available online: http://tsrwind.com/en (accessed on 23 September 2020).
  37. Cornis. Onshore & Offshore Panoblade Inspection—External Blade Inspection Made Easy! Available online: https://home.cornis.fr/panoblade/ (accessed on 23 September 2020).
  38. Arthwind. Available online: http://arthwind.com.br/ (accessed on 23 September 2020).
  39. Aetos Drones. Available online: http://www.aetosdrones.be/ (accessed on 23 September 2020).
  40. ABJDrones. Drone Wind Turbine and Blade Inspection—Or Offshore and Onshore Wind Farms. Available online: https://abjdrones.com/drone-wind-turbine-inspection/ (accessed on 23 September 2020).
  41. Terra-Drone Europe. Available online: https://terra-drone.eu/en/ (accessed on 23 September 2020).
  42. Sulzer Schmid. Available online: https://www.sulzerschmid.ch/ (accessed on 23 September 2020).
  43. Blade Edge. Available online: https://bladeedge.net (accessed on 23 September 2020).
  44. Alerion. Available online: https://www.aleriontec.com (accessed on 23 September 2020).
  45. Clobotics. Available online: https://www.clobotics.com/ (accessed on 23 September 2020).
  46. Aero-Enterprise. Available online: https://www.aero-enterprise.com/ (accessed on 23 September 2020).
  47. Aerial Tronics. Available online: https://www.aerialtronics.com (accessed on 23 September 2020).
  48. Radii Robotics. Available online: https://www.radiirobotics.com/ (accessed on 23 September 2020).
  49. Flytbase. Available online: https://flytbase.com/ (accessed on 23 September 2020).
  50. Prodrone. Available online: https://www.pro-drone.eu/ (accessed on 23 September 2020).
  51. CyberHawk. Available online: https://thecyberhawk.com/ (accessed on 23 September 2020).
  52. Force Technology. Available online: https://forcetechnology.com/ (accessed on 23 September 2020).
  53. Sattar, T.; Leon Rodriguez, H.; Bridge, B. Climbing ring robot for inspection of offshore wind turbines. Ind. Robot. 2009, 36, 326–330. [Google Scholar] [CrossRef]
  54. Galleguillos, C.; Zorrilla, A.; Jimenez, A.; Diaz, L.; Montiano, A.L.; Barroso, M.; Viguria, A.; Lasagni, F. Thermographic non-destructive inspection of wind turbine blades using unmanned aerial systems. Plast. Rubber Compos. 2015, 44, 98–103. [Google Scholar] [CrossRef]
  55. Zhang, D.; Watson, R.; Dobie, G.; MacLeod, C.; Pierce, G. Autonomous Ultrasonic Inspection Using Unmanned Aerial Vehicle. In Proceedings of the 2018 IEEE International Ultrasonics Symposium (IUS), Kobe, Japan, 22–25 October 2018; pp. 1–4. [Google Scholar]
  56. Seton, J.; Frosas, V.; Gao, J. Report on Demonstrations in a Working Environment. 2019. Available online: https://ec.europa.eu/research/participants/documents/downloadPublic?documentIds=080166e5c6ba24d6&appId=PPGMS (accessed on 25 July 2020).
  57. Bladena and KIRT x THOMSEN. The Blade Handbook. 2019. Available online: https://www.bladena.com/uploads/8/7/3/7/87379536/cortir_handbook_2019.pdf (accessed on 31 July 2020).
  58. Corten, G.; Veldkamp, H. Insects can halve wind-turbine power. Nature 2001, 412, 41–42. [Google Scholar] [CrossRef]
  59. Bladena and Vattenfall and EON and Statkraft and KIRT x THOMSEN. INSTRUCTION—Blade Inspections. 2018. Available online: https://www.bladena.com/uploads/8/7/3/7/87379536/blade_inspections_report.pdf (accessed on 31 July 2020).
  60. Force Technology. Successful Corrosion Protection of Offshore Wind Farms. Available online: https://forcetechnology.com/en/articles/successful-corrosion-protection-of-offshore-wind-farms (accessed on 6 November 2020).
  61. Juengert, A.; Grosse, C.U. Inspection techniques for wind turbine blades using ultrasound and sound waves. In Proceedings of the Non-Destructive Testing in Civil Engineering, Nantes, France, 30 June–3 July 2009. [Google Scholar]
  62. Force Technology. Services—NDT on Wind Turbines. Available online: https://forcetechnology.com/en/services/ndt-on-wind-turbines (accessed on 31 July 2020).
  63. Windnostics. Windnostics—Services. Available online: https://www.windnostics.com/services (accessed on 31 July 2020).
  64. Amenabar, I.; Mendikute, A.; López-Arraiza, A.; Lizaranzu, M.; Aurrekoetxea, J. Comparison and analysis of non-destructive testing techniques suitable for delamination inspection in wind turbine blades. Compos. Part B Eng. 2011, 42, 1298–1305. [Google Scholar] [CrossRef]
  65. iPEK International. Rovion—Internal Inspectionof Wind Rotor Blades. 2014. Available online: https://www.ipek.at/fileadmin/FILES/downloads/brochures-datasheets/brochures/iPEK-Industrial-Application-Wind_EN_web.pdf (accessed on 10 August 2020).
  66. Sørensen, B.F.; Lading, L.; Sendrup, P.; McGugan, M.; Debel, C.P.; Kristensen, O.J.; Larsen, G.C.; Hansen, A.M.; Rheinländer, J.; Rusborg, J.; et al. Fundamentals for Remote Structural Health Monitoring of Wind Turbine Blades—A Preproject; DTU Library: Roskilde, Denmark, 2002. [Google Scholar]
  67. Jonkman, J.; Butterfield, S.; Musial, W.; Scott, G. Definition of a 5-MW Reference Wind Turbine for Offshore System Development; Technical Report; National Renewable Energy Laboratory: Golden, CO, USA, 2009. [Google Scholar]
  68. Froese, M. LM Wind Unveils 107-m Turbine Blade, Currently the World’s Largest. 2019. Available online: https://www.windpowerengineering.com/lm-wind-unveils-107-m-turbine-blade-currently-the-worlds-largest/ (accessed on 19 August 2020).
  69. GE Renewable Energy. GE’s Haliade 150-6MW High Yield Offshore Wind Turbine. 2015. Available online: https://www.ge.com/renewableenergy/sites/default/files/related_documents/wind-offshore-haliade-wind-turbine.pdf (accessed on 31 July 2020).
  70. Fauteux, L.; Jolin, N. Drone Solutions for Wind Turbine Inspections; Technical Report; Nergica: Gaspe, QC, Canada, 2018. [Google Scholar]
  71. Flyability. Confined Spaces Inspection. Available online: https://www.flyability.com/articles-and-media/confined-spaces-inspection (accessed on 7 July 2020).
  72. Davis, M.S.; Madani, M.R. Investigation into the Effects of Static Electricity on Wind Turbine Systems. In Proceedings of the 2018 6th International Renewable and Sustainable Energy Conference (IRSEC), Rabat, Morocco, 5–8 December 2018; pp. 1–7. [Google Scholar]
  73. Stojković, A. Occupational Safety In Hazardous Confined Space. Inženjerstvo Zaštite 2013, 137. [Google Scholar] [CrossRef]
  74. Lim, S.; Park, C.; Hwang, J.; Kim, D.; Kim, T. The inchworm type blade inspection robot system. In Proceedings of the 2012 9th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI), Daejeon, Korea, 26–28 November 2012; pp. 604–607. [Google Scholar]
  75. Netland, Ø.; Jenssen, G.; Schade, H.M.; Skavhaug, A. An Experiment on the Effectiveness of Remote, Robotic Inspection Compared to Manned. In Proceedings of the 2013 IEEE International Conference on Systems, Man, and Cybernetics, Manchester, UK, 13–16 October 2013; pp. 2310–2315. [Google Scholar]
  76. Schäfer, B.E.; Picchi, D.; Engelhardt, T.; Abel, D. Multicopter unmanned aerial vehicle for automated inspection of wind turbines. In Proceedings of the 2016 24th Mediterranean Conference on Control and Automation (MED), Athens, Greece, 21–24 June 2016; pp. 244–249. [Google Scholar]
  77. Stokkeland, M.; Klausen, K.; Johansen, T.A. Autonomous visual navigation of Unmanned Aerial Vehicle for wind turbine inspection. In Proceedings of the 2015 International Conference on Unmanned Aircraft Systems (ICUAS), Denver, CO, USA, 9–12 June 2015; pp. 998–1007. [Google Scholar]
  78. Jung, S.; Shin, J.; Myeong, W.; Myung, H. Mechanism and system design of MAV(Micro Aerial Vehicle)-type wall-climbing robot for inspection of wind blades and non-flat surfaces. In Proceedings of the 2015 15th International Conference on Control, Automation and Systems (ICCAS), Busan, Korea, 13–16 October 2015; pp. 1757–1761. [Google Scholar]
  79. Shihavuddin, A.; Chen, X.; Fedorov, V.; Nymark Christensen, A.; Andre Brogaard Riis, N.; Branner, K.; Bjorholm Dahl, A.; Reinhold Paulsen, R. Wind Turbine Surface Damage Detection by Deep Learning Aided Drone Inspection Analysis. Energies 2019, 12, 676. [Google Scholar] [CrossRef] [Green Version]
  80. Martinez, C.; Asare Yeboah, F.; Herford, S.; Brzezinski, M.; Puttagunta, V. Predicting Wind Turbine Blade Erosion using Machine Learning. SMU Data Sci. Rev. 2019, 2, 17. [Google Scholar]
  81. Fritz, P.J.; Harding, K.G.; Song, G.; Yang, Y.; Tao, L.; Wan, X. System and Method for Performing an Internal Inspection on A Wind Turbine Rotor Blade. U.S. Patent Application 13/980,345, 14 November 2013. [Google Scholar]
  82. Murphy, J.T.; Mishra, D.; Silliman, G.R.; Kumar, V.P.; Mandayam, S.T.; Sharma, P. Method and System for Wind Turbine Inspection. U.S. Patent Application 13/021,056, 31 May 2012. [Google Scholar]
  83. Pedersen, H. Internal Inspection of a Wind Turbine. European Patent EP3287367A1, 28 February 2018. [Google Scholar]
  84. Rizos, C. Locata: A Positioning System for Indoor and Outdoor Applications Where GNSS Does Not Work. In Proceedings of the 18th Association of Public Authority Surveyors Conference (APAS2013), Canberra, Australia, 12–14 March 2013; pp. 73–83. [Google Scholar]
  85. Lachapelle, G. GNSS Indoor Location Technologies. J. Glob. Position. Syst. 2004, 3, 2–11. [Google Scholar] [CrossRef] [Green Version]
  86. Cornis. Onshore & Offshore Intrablade Inspection—Internal Blade Inspection Made Easy! Available online: https://home.cornis.fr/intrablade/ (accessed on 31 July 2020).
  87. TSRWind. CERBERUS—Internal Blade Inspection. Available online: https://tsrwind.com/cerberus/ (accessed on 27 December 2020).
  88. ROBINS Project. Crawler Platform for NDT Measurements. Available online: https://www.robins-project.eu/geir-crawler-platform/ (accessed on 7 July 2020).
  89. Nichols, G. A Drone Designed to Fly in Dark, Confined Spaces. Available online: https://www.zdnet.com/article/a-drone-designed-to-fly-in-dark-confined-spaces/ (accessed on 7 July 2020).
  90. WInspector. Oscillation Measurement on Wind Turbine Blade. 2016. Available online: http://www.winspector.eu/news-and-events/oscillation-measurement-on-wind-turbine-blade/ (accessed on 25 July 2020).
  91. David. Hexacopter vs. Quadcopter: The Pros and Cons. Available online: https://skilledflyer.com/hexacopter-vs-quadcopter/ (accessed on 29 September 2020).
  92. Ebeid, E.; Skriver, M.; Terkildsen, K.H.; Jensen, K.; Schultz, U.P. A survey of Open-Source UAV flight controllers and flight simulators. Microprocess. Microsyst. 2018, 61, 11–20. [Google Scholar] [CrossRef]
  93. BetaFPV. Beta65S BNF Micro Whoop Quadcopter. Available online: https://betafpv.com/products/beta65s-bnf-micro-whoop-quadcopter (accessed on 29 July 2020).
  94. Emax USA. Tinyhawk Indoor FPV Racing Drone BNF. Available online: https://emax-usa.com/products/tinyhawk2 (accessed on 29 July 2020).
  95. DJI. DJI Mavic 2. Available online: https://www.dji.com/dk/mavic-2 (accessed on 29 July 2020).
  96. Flyability. Elios 2—Indoor Drone for Confined Space Inspections. Available online: https://www.flyability.com/elios-2 (accessed on 21 July 2020).
  97. MIT Electric Vehicle Team. A Guide to Understanding Battery Specifications. 2018. Available online: http://web.mit.edu/evt/summary_battery_specifications.pdf (accessed on 10 September 2020).
  98. Gibiansky, A. Quadcopter Dynamics and Simulation. 2012. Available online: https://andrew.gibiansky.com/blog/physics/quadcopter-dynamics/ (accessed on 19 August 2020).
  99. Liang, O. Why Mini Quad Motors Getting too Hot? Available online: https://oscarliang.com/mini-quad-motors-overheat/ (accessed on 30 September 2020).
  100. Siemens Gamesa. SWT-6.0-154 Offshore Wind Turbine. Available online: https://www.siemensgamesa.com/en-int/products-and-services/offshore/wind-turbine-swt-6-0-154 (accessed on 31 July 2020).
  101. Wiser, R.; Hand, M.; Seel, J.; Paulos, B. The Future of Wind Energy, Part 3: Reducing Wind Energy Costs through Increased Turbine Size: Is the Sky the Limit? 2016. Available online: https://emp.lbl.gov/news/future-wind-energy-part-3-reducing-wind (accessed on 27 August 2020).
  102. Sanchez-Cuevas, P.J.; Heredia, G.; Ollero, A. Experimental Approach to the Aerodynamic Effects Produced in Multirotors Flying Close to Obstacles. In Proceedings of the ROBOT 2017: Third Iberian Robotics Conference, Seville, Spain, 22–24 November 2017; pp. 742–752. [Google Scholar]
  103. Conyers, S.A. Empirical Evaluation of Ground, Ceiling, and Wall Effect for Small-Scale Rotorcraft. Master’s Thesis, University of Denver, Denver, CO, USA, 2019. [Google Scholar]
  104. Cheeseman, I.; Bennett, W. The effect of the ground on a helicopter rotor. R & M 1957, 3021. [Google Scholar]
  105. Powers, C.; Mellinger, D.; Kushleyev, A.; Kothmann, B.; Kumar, V. Influence of aerodynamics and proximity effects in quadrotor flight. Exp. Robot. 2013, 289–302. [Google Scholar] [CrossRef]
  106. Sharf, I.; Nahon, M.; Harmat, A.; Khan, W.; Michini, M.; Speal, N.; Trentini, M.; Tsadok, T.; Wang, T. Ground effect experiments and model validation with Draganflyer X8 rotorcraft. In Proceedings of the 2014 International Conference on Unmanned Aircraft Systems (ICUAS), Orlando, FL, USA, 27–30 May 2014; pp. 1158–1166. [Google Scholar]
  107. Matus-Vargas, A.; Rodríguez-Gómez, G.; Martínez-Carranza, J. Aerodynamic Disturbance Rejection Acting on a Quadcopter Near Ground. In Proceedings of the 2019 6th International Conference on Control, Decision and Information Technologies (CoDIT), Paris, France, 23–26 April 2019; pp. 1516–1521. [Google Scholar]
  108. Hentzen, D.; Stastny, T.; Siegwart, R.; Brockers, R. Disturbance estimation and rejection for high-precision multirotor position control. arXiv 2019, arXiv:1908.03166. [Google Scholar]
  109. McKinnon, C.D.; Schoellig, A.P. Estimating and reacting to forces and torques resulting from common aerodynamic disturbances acting on quadrotors. Robot. Auton. Syst. 2020, 123, 103314. [Google Scholar] [CrossRef]
  110. FrSky. FrSky 900MHz Long Range RC System—R9&R9M. 2017. Available online: https://www.frsky-rc.com/frsky-900mhz-long-range-rc-system-r9r9m/ (accessed on 29 July 2020).
  111. Raspberry Pi Foundation. FAQs—Raspberry Pi Documentation. Available online: https://www.raspberrypi.org/documentation/faqs/ (accessed on 27 December 2020).
  112. Intel Realsense Technology. Intel RealSense Product Family D400 Series. Available online: https://www.intelrealsense.com/wp-content/uploads/2020/06/Intel-RealSense-D400-Series-Datasheet-June-2020.pdf (accessed on 27 December 2020).
  113. Intel Realsense Technology. Intel RealSense LiDAR Camera L515 Datasheet. Available online: https://www.intelrealsense.com/lidar-camera-l515/ (accessed on 27 December 2020).
  114. Huang, B.; Zhao, J.; Liu, J. A Survey of Simultaneous Localization and Mapping. arXiv 2019, arXiv:1909.05214. [Google Scholar]
  115. Einsiedler, J.; Radusch, I.; Wolter, K. Vehicle indoor positioning: A survey. In Proceedings of the 2017 14th Workshop on Positioning, Navigation and Communications (WPNC), Bremen, Germany, 25–26 October 2017; pp. 1–6. [Google Scholar]
  116. Thrun, S.; Burgard, W.; Fox, D.; Arkin, R. Probabilistic Robotics; Intelligent Robotics and Autonomous Agents Series; MIT Press: Cambridge, MA, USA, 2005. [Google Scholar]
  117. Droeschel, D.; Behnke, S. Efficient Continuous-Time SLAM for 3D Lidar-Based Online Mapping. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 5000–5007. [Google Scholar]
  118. Abouzahir, M.; Elouardi, A.; Latif, R.; Bouaziz, S.; Tajer, A. Embedding SLAM algorithms: Has it come of age? Robot. Auton. Syst. 2018, 100, 14–26. [Google Scholar] [CrossRef]
  119. Naik, N.; Kadambi, A.; Rhemann, C.; Izadi, S.; Raskar, R.; Bing Kang, S. A Light Transport Model for Mitigating Multipath Interference in Time-of-Flight Sensors. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015. [Google Scholar]
  120. Luo, S. Laser SLAM vs. VSLAM. Available online: https://www.linkedin.com/pulse/laser-slam-vs-vslam-selina-luo/ (accessed on 29 November 2020).
  121. Skoda, J.; Bartak, R. Camera-Based Localization and Stabilization of a Flying Drone. In Proceedings of the Twenty-Eighth International Florida Artificial Intelligence Research Society Conference, Hollywood, FL, USA, 18–20 May 2015. [Google Scholar]
  122. García, S.; López, M.E.; Barea, R.; Bergasa, L.M.; Gómez, A.; Molinos, E.J. Indoor SLAM for Micro Aerial Vehicles Control Using Monocular Camera and Sensor Fusion. In Proceedings of the 2016 International Conference on Autonomous Robot Systems and Competitions (ICARSC), Bragança, Portugal, 4–6 May 2016; pp. 205–210. [Google Scholar]
  123. Von Stumberg, L.; Usenko, V.; Engel, J.; Stückler, J.; Cremers, D. From monocular SLAM to autonomous drone exploration. In Proceedings of the 2017 European Conference on Mobile Robots (ECMR), Paris, France, 6–8 September 2017; pp. 1–8. [Google Scholar]
  124. Tiemann, J.; Ramsey, A.; Wietfeld, C. Enhanced UAV Indoor Navigation through SLAM-Augmented UWB Localization. In Proceedings of the 2018 IEEE International Conference on Communications Workshops (ICC Workshops), Kansas City, MO, USA, 20–24 May 2018; pp. 1–6. [Google Scholar]
  125. Taketomi, T.; Uchiyama, H.; Ikeda, S. Visual SLAM algorithms: A survey from 2010 to 2016. IPSJ Trans. Comput. Vis. Appl. 2017, 9, 16. [Google Scholar] [CrossRef]
  126. Fuentes-Pacheco, J.; Ruiz-Ascencio, J.; Rendón-Mancha, J.M. Visual simultaneous localization and mapping: A survey. Artif. Intell. Rev. 2015, 43, 55–81. [Google Scholar] [CrossRef]
  127. Zhao, S.; Fang, Z. Direct depth SLAM: Sparse geometric feature enhanced direct depth SLAM system for low-texture environments. Sensors 2018, 18, 3339. [Google Scholar] [CrossRef] [Green Version]
  128. Ouellette, R.; Hirasawa, K. A comparison of SLAM implementations for indoor mobile robots. In Proceedings of the 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Diego, CA, USA, 29 October–2 November 2007; pp. 1479–1484. [Google Scholar]
  129. Debeunne, C.; Vivet, D. A Review of Visual-LiDAR Fusion based Simultaneous Localization and Mapping. Sensors 2020, 20, 2068. [Google Scholar] [CrossRef] [Green Version]
  130. Hoy, M.; Matveev, A.S.; Savkin, A.V. Algorithms for collision-free navigation of mobile robots in complex cluttered environments: A survey. Robotica 2015, 33, 463–497. [Google Scholar] [CrossRef] [Green Version]
  131. Wang, F.; Wang, K.; Lai, S.; Phang, S.K.; Chen, B.M.; Lee, T.H. An efficient UAV navigation solution for confined but partially known indoor environments. In Proceedings of the 11th IEEE International Conference on Control Automation (ICCA), Taichung, Taiwan, 18–20 June 2014; pp. 1351–1356. [Google Scholar]
  132. Phang, S.K.; Lai, S.; Wang, F.; Lan, M.; Chen, B.M. UAV calligraphy. In Proceedings of the 11th IEEE International Conference on Control Automation (ICCA), Taichung, Taiwan, 18–20 June 2014; pp. 422–428. [Google Scholar]
  133. Ciftler, B.S.; Tuncer, A.; Guvenc, I. Indoor UAV navigation to a Rayleigh fading source using Q-learning. arXiv 2017, arXiv:1705.10375. [Google Scholar]
  134. Lumelsky, V.; Stepanov, A. Dynamic path planning for a mobile automaton with limited information on the environment. IEEE Trans. Autom. Control 1986, 31, 1058–1063. [Google Scholar] [CrossRef]
  135. Maravall, D.; de Lope, J.; Fuentes, J.P. Navigation and Self-Semantic Location of Drones in Indoor Environments by Combining the Visual Bug Algorithm and Entropy-Based Vision. Front. Neurorobot. 2017, 11, 46. [Google Scholar] [CrossRef] [Green Version]
  136. Kamel, M.; Burri, M.; Siegwart, R. Linear vs nonlinear MPC for trajectory tracking applied to rotary wing micro aerial vehicles. IFAC Pap. 2017, 50, 3463–3469. [Google Scholar] [CrossRef]
  137. Brooks, A.; Kaupp, T.; Makarenko, A. Randomised MPC-based motion-planning for mobile robot obstacle avoidance. In Proceedings of the 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009; pp. 3962–3967. [Google Scholar]
  138. Sun, Z.; Dai, L.; Liu, K.; Xia, Y.; Johansson, K.H. Robust MPC for tracking constrained unicycle robots with additive disturbances. Automatica 2018, 90, 172–184. [Google Scholar] [CrossRef]
  139. Roach, D.; Rice, T.; Neidigk, S.; Duvall, R. Non-Destructive Inspection of Blades. 2015. Available online: https://www.osti.gov/servlets/purl/1244426 (accessed on 14 July 2020).
  140. Yang, R.; He, Y.; Zhang, H. Progress and trends in nondestructive testing and evaluation for wind turbine composite blade. Renew. Sustain. Energy Rev. 2016, 60, 1225–1250. [Google Scholar] [CrossRef]
  141. Raišutis, R.; Jasiūnienė, E.; Žukauskas, E. Ultrasonic NDT of wind turbine blades using guided waves. Ultragarsas Ultrasound 2008, 63, 7–11. [Google Scholar]
  142. Corrigan, F. Multispectral Imaging Camera Drones in Farming Yield Big Benefits. 2020. Available online: https://www.dronezon.com/learn-about-drones-quadcopters/multispectral-sensor-drones-in-farming-yield-big-benefits/ (accessed on 21 July 2020).
  143. Vespadrones. Tetracam ADC Lite. Available online: http://vespadrones.com/product/tetracam-adc-lite/ (accessed on 15 September 2020).
  144. MAIA. MAIA WV—the Multispectral Camera. Available online: https://www.spectralcam.com/maia-tech/ (accessed on 15 September 2020).
  145. Kumar, S.S.; Wang, M.; Abraham, D.M.; Jahanshahi, M.R.; Iseley, T.; Cheng, J.C.P. Deep Learning—Based Automated Detection of Sewer Defects in CCTV Videos. J. Comput. Civ. Eng. 2020, 34, 04019047. [Google Scholar] [CrossRef]
  146. Li, D.; Cong, A.; Guo, S. Sewer damage detection from imbalanced CCTV inspection data using deep convolutional neural networks with hierarchical classification. Autom. Constr. 2019, 101, 199–208. [Google Scholar] [CrossRef]
  147. Cheng, J.C.; Wang, M. Automated detection of sewer pipe defects in closed-circuit television images using deep learning techniques. Autom. Constr. 2018, 95, 155–171. [Google Scholar]
  148. Kumar, S.S.; Abraham, D.M.; Jahanshahi, M.R.; Iseley, T.; Starr, J. Automated defect classification in sewer closed circuit television inspections using deep convolutional neural networks. Autom. Constr. 2018, 91, 273–283. [Google Scholar] [CrossRef]
  149. Hassan, S.I.; Dang, L.M.; Mehmood, I.; Im, S.; Choi, C.; Kang, J.; Park, Y.S.; Moon, H. Underground sewer pipe condition assessment based on convolutional neural networks. Autom. Constr. 2019, 106, 102849. [Google Scholar]
  150. Meijer, D.; Scholten, L.; Clemens, F.; Knobbe, A. A defect classification methodology for sewer image sets with convolutional neural networks. Autom. Constr. 2019, 104, 281–298. [Google Scholar]
  151. Myrans, J.; Everson, R.; Kapelan, Z. Automated detection of fault types in CCTV sewer surveys. J. Hydroinform. 2018, 21, 153–163. [Google Scholar] [CrossRef]
  152. Moselhi, O.; Shehab-Eldeen, T. Automated detection of surface defects in water and sewer pipes. Autom. Constr. 1999, 8, 581–588. [Google Scholar] [CrossRef]
  153. Moselhi, O.; Shehab-Eldeen, T. Classification of Defects in Sewer Pipes Using Neural Networks. J. Infrastruct. Syst. 2000, 6, 97–104. [Google Scholar] [CrossRef]
  154. Chae, M.J.; Abraham, D.M. Neuro-Fuzzy Approaches for Sanitary Sewer Pipeline Condition Assessment. J. Comput. Civ. Eng. 2001, 15, 4–14. [Google Scholar] [CrossRef]
  155. Yang, M.D.; Su, T.C. Automated diagnosis of sewer pipe defects based on machine learning approaches. Expert Syst. Appl. 2008, 35, 1327–1337. [Google Scholar] [CrossRef]
  156. Intel. Intel RealSense Technology. Available online: https://www.intel.com/content/www/us/en/architecture-and-technology/realsense-overview.html (accessed on 9 November 2020).
  157. OpenMV. Machine Vision with Python. Available online: https://openmv.io/ (accessed on 9 November 2020).
  158. Chien, W.Y. Stereo-Camera Occupancy Grid Mapping. Master’s Thesis, Pennsylvania State University, State College, PA, USA, 2020. [Google Scholar]
  159. Aluckal, C.; Mohan, B.K.; Turkar, Y.; Agarwadkar, Y.; Dighe, Y.; Surve, S.; Deshpande, S.; Daga, B. Dynamic real-time indoor environment mapping for Unmanned Autonomous Vehicle navigation. In Proceedings of the 2019 International Conference on Advances in Computing, Communication and Control (ICAC3), Mumbai, India, 20–21 December 2019; pp. 1–6. [Google Scholar]
  160. AprilRobotics. AprilTag 3. Available online: https://github.com/AprilRobotics/apriltag (accessed on 9 November 2020).
  161. IDTechEx. Charging Infrastructure for Electric Vehicles 2020–2030. Available online: https://www.idtechex.com/en/research-report/charging-infrastructure-for-electric-vehicles-2020-2030/729 (accessed on 9 November 2020).
  162. Howell, D.; Boyd, S.; Cunningham, B.; Gillard, S.; Slezak, L. Enabling Fast Charging: A Technology Gap Assessment; Technical Report; U.S. Department of Energy: Idaho Falls, ID, USA, 2017.
Figure 1. Dimensions of a sample 61.5 m Wind Turbine Blade (WTB), using the NREL 5 MW blade as a reference [67].
Figure 2. Cross-section of the WTB with different shear web formations.
Table 1. Summary of various Non-Destructive Testing (NDT) methods for wind turbine (WT) inspections.
| NDT Method | Advantages | Disadvantages | Used in UVs |
| --- | --- | --- | --- |
| RGB cameras | Easy to use, cheap | Surface damage only; requires proper illumination | UAVs [22,40,45] |
| Optical thermography | Sub-surface damage detection | - | UAVs [54] |
| Ultrasound | Sub-surface damage detection | Heavy equipment that requires contact; sensitive to noise and aim accuracy | UAVs [55] |
| Shearography | - | Limited sub-surface detection | Robotic platform [56] |
| Radiography | Sub-surface damage detection | Dangerous to use | Robotic platforms [53] |
| Multi-spectral cameras | Various spectral ranges | Expensive | UAVs [33,142] |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Kulsinskas, A.; Durdevic, P.; Ortiz-Arroyo, D. Internal Wind Turbine Blade Inspections Using UAVs: Analysis and Design Issues. Energies 2021, 14, 294. https://doi.org/10.3390/en14020294

