Article

Raptor: A Design of a Drain Inspection Robot

by M. A. Viraj J. Muthugala 1, Povendhan Palanisamy 1, S. M. Bhagya P. Samarakoon 1,*, Saurav Ghante Anantha Padmanabha 1, Mohan Rajesh Elara 1 and Dylan Ng Terntzer 2

1 Engineering Product Development Pillar, Singapore University of Technology and Design, 8 Somapah Rd, Singapore 487372, Singapore
2 LionsBot International Pte. Ltd., #03-02, 11 Changi South Street 3, Singapore 486122, Singapore
* Author to whom correspondence should be addressed.
Sensors 2021, 21(17), 5742; https://doi.org/10.3390/s21175742
Submission received: 3 August 2021 / Revised: 21 August 2021 / Accepted: 23 August 2021 / Published: 26 August 2021
(This article belongs to the Section Sensors and Robotics)

Abstract
Frequent inspections are essential for drains to maintain proper function and to ensure public health and safety. Robots have been developed to aid the drain inspection process. However, existing robots designed for drain inspection require improvements in their design and autonomy. This paper proposes a novel design of a drain inspection robot named Raptor. The robot has been designed with a manually reconfigurable wheel axle mechanism, which allows the ground clearance height to be changed. Design aspects of the robot, such as the mechanical design, control architecture, and autonomy functions, are comprehensively described in the paper, and design insights are included. Maintaining the robot’s position in the middle of a drain while moving along it is essential for the inspection process. Thus, a fuzzy logic controller has been introduced to the robot to cater to this demand. Experiments have been conducted by deploying a prototype of the design in drain environments under a set of diverse test scenarios. The experimental results show that the proposed controller effectively maintains the robot in the middle of a drain while moving along it. Therefore, the proposed robot design and controller would be helpful in improving the productivity of robot-aided inspection of drains.

1. Introduction

Drains are a crucial infrastructure in every contemporary city, mitigating flooding at the surface and removing excess water or sullage in an obscured and efficient way. Stormwater drains manage stormwater flow, while sewers manage sullage flow. Sometimes, drains are combined to transfer both stormwater and sullage [1,2]. Flooding can damage private property and public water supply infrastructure, causing contamination of water reservoirs [1,2]. Drainage networks in most countries are mostly underground. The benefits of drainage systems are continuous water supply, distribution, and storage with minimal wastage, as well as controlled sullage disposal. However, drainage systems also come with many challenges [3]. Drains require regular inspections to monitor silt levels, structural maintenance due to exposure or damage, and routine dredging to keep them free-flowing [4]. Clogging is common, even in new drains, and can result in water stagnation that creates mosquito breeding sites; drains thus need to be cleaned frequently to reduce the risk of malaria and dengue [5]. Coupled with pathogenic pollution in drainage systems, the human labour needed to maintain them may be deemed unsafe and costly [6].
Frequent inspections should be conducted to mitigate the issues mentioned above. Several robotic solutions have been introduced to resolve issues associated with the maintenance and inspection of built environments [7,8,9,10]. Similarly, robots have been used for the inspection of confined spaces such as pipes and air ducts [11,12]. In [13], the authors introduced several types of inspection robots used for pipe inspections, such as long-distance inspection of sewer pipes. Beyond these robotic solutions, robot-aided inspection methods play a vital role in urban infrastructure inspection and maintenance [14].
Drain inspection robots are specifically designed for the internal geometry of drainage systems and can be classified into three forms: tracked, wheeled, and legged [15]. Over time, drain inspection robots have evolved into autonomous and semi-autonomous robots to reduce human involvement in inspection and cleaning. For example, PIRAT [16] is a semi-autonomous pipe vehicle that surveys sewer systems quantitatively in real time, using artificial intelligence to measure the drain geometry and detect defects. KARO [17] is another semi-autonomous, wheeled, tethered robot with sophisticated multi-sensors for smart sensor-based sewer inspection. KANTARO [18] is an untethered autonomous wheeled platform that can traverse straight and even bent pipes without the use of sensors or controllers, using a specific mechanism known as the “naSIR Mechanism”. KURT [19] is an untethered autonomous six-wheeled vehicle that can fit into 600 mm diameter pipelines. MAKRO [20] is an untethered, autonomous, worm-shaped robot with multiple wheeled segments and an autonomous engine for navigating drainage systems. However, these robots are unable to adapt to varying drain dimensions and morphologies. Tarantula [15,21] is a self-configurable drain mapping robot that can adapt and morph according to a drain with multiple level shifts. However, this robot has a complex mechanical design with many actuators, which leads to a high implementation cost and makes it difficult to control. In addition, the robot’s large dimensions make manoeuvring within some drain segments difficult.
The condition assessment, or the classification of faults, can generally be divided into sewer and water pipelines [22]. Sewer pipeline faults are subdivided into blockages, cracks, sag, corrosion, collapsed mortar, open joints, and root intrusion, while water pipeline faults are subdivided into deformations, leakage, dislocation, corrosion, and fractures. This hierarchy can serve as a general guideline when developing algorithms to identify faults in drainage systems. The development of deep learning techniques in computer vision has proven to be effective in these condition assessments. In this regard, Deep Convolutional Neural Networks (CNNs) have been explored to recognize cracks and defects [23]. In the cited work, a CNN is used to construct high-level features from low-level features, which are then utilized by a multi-layer perceptron to perform detection. An alternative approach was to use only CNNs to recognize cracks, without combining them with image processing techniques. However, this model needed a large dataset and was computationally demanding [24]. To keep costs down, vision cameras are often used instead of expensive Lidar or ultrasound sensors [25]. Stereoscopic and monocular camera-based vision systems are also used for object detection. A stereoscopic algorithm for autonomous vehicle navigation was created by Broggi et al. [26] using V-disparity image computation, pitch estimation, disparity image computation, obstacle localization, and real-world coordinate mapping. This algorithm was robust in computing the camera pitch angle during data acquisition; however, it needed further development of individual colour channels to reduce false positives. Notably, a monocular camera-based algorithm for Unmanned Aerial Vehicles (UAVs) was suggested by Al-Kaff et al. [27] to detect object size change, with the ratio of size expansion computed from the UAV’s movement; it performed well only under uniform lighting conditions. The limitation of these algorithms appears to be the lack of general usability under lighting variations. However, the scope of the work cited above is limited to the development of algorithms for computer-vision-aided condition classification and detection; the design and development of robots and control features are not discussed.
This paper proposes a novel design of a drain inspection robot, Raptor, equipped with an autonomous drain following functionality. Reconfigurability can eliminate some of a robot’s limitations and increase its capabilities [28,29]. Thus, the mechanical system of the robot is designed to facilitate manual reconfigurability, which can be used to adjust the ground clearance according to the requirements. The robot has been designed in such a way that it satisfies all the essential features required for a drain inspection robot, where existing robots have limitations. This design consideration is the core contribution of the proposed design with respect to the state of the art. The autonomous drain following functionality has been developed using a fuzzy controller, which generates control actions to maintain the robot in the middle of a drain while moving along it. The particulars of the mechanical, control, and autonomy design aspects of the robot are presented in Section 2. The fuzzy controller proposed to navigate the robot in the middle of a drain is explained in Section 3. The experimental validation is discussed in Section 4. The conclusions of the work are given in Section 5.

2. Robot Design

In this section, the details of the robot platform designed for drain inspection are discussed. The design principles, reconfigurability, structural analysis, control, and autonomy layers of the robot are also outlined.

2.1. Mechanical Design

2.1.1. Design Principle

The design requirements considered for the drain inspection robot, Raptor, are derived from a literature study of existing drain inspection robots, their limitations, and design considerations. The requirements are as follows: (a) ability to manoeuvre in narrow drains; (b) suitability for travel on various terrains; (c) lightweight and compact construction; (d) capability to make sharp turns; (e) flexibility to mount sensors and actuators; and (f) capability to carry a payload. All these features were considered in designing the platform to be versatile and robust for drain inspection applications. In contrast, existing robot designs do not comply with all these requirements. Comparisons between the existing robots and the proposed design based on these design considerations are given in Table 1 to highlight the improvements of the proposed design.
Figure 1 shows an exploded view of Raptor with the primary components annotated. The backbone structure of Raptor, the chassis, is 3D printed in nylon for greater flexibility, durability, and higher tensile strength. The chassis is designed with a cylindrical carriage for the payload at the centre. All other components are designed around it to stabilize the platform by placing the centre of gravity in the middle. The structure holding the wheels to the chassis is manufactured with additional thickness to bear the direct impulse forces from the ground. All wheels are individually powered by geared motors. The 12 V DC motors with carbon brushes and a 391:1 gear ratio deliver 5 kg-cm of torque. With a diameter of 20 mm, the motors are compact yet powerful. All-wheel drive enables faster turning and easier obstacle crossing.
Twenty-centimetre rubber wheels with thick grips offer the stick-slip friction required for wet drain terrain. Magnetic wheel encoders are fitted to the drive motors to provide velocity and position feedback. With a platform dimension of 390 × 350 × 200 mm, Raptor can cope with a maximum vertical gradient of 20–25°. The push-button mechanism and the pull-lever mechanism work in combination to reconfigure the position of the wheels. The Lidar module acts as the primary sensor for perception and control.

2.1.2. Reconfiguration Module

As depicted in Figure 2, reconfiguration between the Tuck-In and Tuck-Out configurations is triggered through a combination of the push-button mechanism and the pull-lever mechanism. In step (a), the button is pressed while the spring is pushed down to open the provision to insert and hold the pin in place; in step (b), the pull-lever mechanism is triggered, pulling the lock open on both wheels; in step (c), the wheel assemblies are free to be rotated around the reconfiguration axis shown in Figure 2; finally, once the pin is in place, both mechanisms are released, and the wheels are secured at a modified height and length with respect to the Raptor. The ground clearance is 7 cm in the Tuck-In configuration and 11 cm in the Tuck-Out configuration. With this variable ground clearance option, the appropriate configuration can be used depending on situational demands.
The reconfiguration ability allows the robot to adapt to different drains. Various deployments, including drains with flat floors (as shown in Figure 3a) which are designed for excessive flood water drainage, open drains with a curved bottom as shown in Figure 3b, and cut-off drains as shown in Figure 3c have been considered to demonstrate the adaptability of Raptor. The demonstrations show that flat drains with concrete floors need more traction, control and speed. For this application, Raptor with tucked-in configurations is identified as more suitable because of its rigid dynamics and the structural stability of the configuration. For the cases of drains with wide separation and water flow in the middle, the tuck-out configuration is identified as being more suitable with its structural flexibility.

2.1.3. Kinematics

This section describes how high-level velocity commands are converted into low-level wheel angular velocities through the kinematic model. The free body diagram of Raptor representing the linear velocity, $v$, and angular velocity, $\dot{\theta}$, of the robot is shown in Figure 4. $L$ represents the distance between the left and right wheels, and $V_{WL}$ and $V_{WR}$ represent the linear velocities of the left and right wheels, respectively.
Since the wheels on each side are driven in parallel, the angular velocity of Raptor is varied through the difference between the angular velocities of the right-side wheels, $\omega_R$, and the left-side wheels, $\omega_L$. When making a turn, the two wheels on the inside spin at a lower velocity than the outer wheels, and the difference in velocity determines the radius of curvature of the turn. For an in-pivot turn, i.e., rotating around its centre, the wheels on the two sides spin in opposite directions with the same magnitude. The angular velocities of the individual wheels for a given velocity command $v$ and $\dot{\theta}$ are calculated through the simple kinematic model given in (1), where $r_W$ is the wheel radius. These velocity commands are provided by the fuzzy logic controller explained in Section 3.
$$\omega_R = \frac{v}{r_W} + \frac{\dot{\theta}\,L}{2 r_W}, \qquad \omega_L = \frac{v}{r_W} - \frac{\dot{\theta}\,L}{2 r_W}. \tag{1}$$
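The differential-drive conversion in (1) can be sketched in a few lines of Python; the wheel radius and track width below are illustrative placeholder values, not Raptor's actual dimensions.

```python
def wheel_speeds(v, theta_dot, r_w=0.05, L=0.30):
    """Convert body velocities (v [m/s], theta_dot [rad/s]) into the
    left/right wheel angular velocities [rad/s] per Eq. (1).
    r_w (wheel radius) and L (track width) are assumed values."""
    omega_r = v / r_w + (theta_dot * L) / (2 * r_w)
    omega_l = v / r_w - (theta_dot * L) / (2 * r_w)
    return omega_l, omega_r

# Straight-line motion: both wheels spin at the same rate.
wl, wr = wheel_speeds(0.08, 0.0)
# In-pivot turn (v = 0): the two sides spin in opposite directions
# with the same magnitude, as described in the text.
pl, pr = wheel_speeds(0.0, 0.5)
```

For positive $\dot{\theta}$ the right wheels spin faster than the left, turning the robot toward the left under the usual sign convention.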

2.1.4. Structural Analysis

The structural load-bearing capacity of the platform is an essential factor for a rigid and robust platform design that can operate reliably in the drain environment. The critical failure scenario was identified as a head-on collision of Raptor with the drain walls. This event could impose severe stress on the structure and could potentially break the weakest link. Hence, this scenario was simulated with Finite Element Analysis and cross-checked for the maximum stress and strain of the structure. For this simulation, the CAD model was meshed with a tetrahedral mesh, as shown in Figure 5. Boundary conditions were applied with fixed constraints on all four wheels, and a static load of 10 N was applied on the front impact region of the vehicle. A static structural analysis was performed to analyse the structure based on displacements (refer to Figure 5b), maximum von Mises stress (refer to Figure 5c), and enduring strain (refer to Figure 5d).
The tensile strength of the nylon material is 60 MPa, and the results show that the maximum stress concentration is safely below the maximum allowable stress.
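The margin described above amounts to a simple safety-factor check against the 60 MPa tensile strength stated in the text; the peak FEA stress used below is an illustrative assumption, since the paper only states that it is safely below the limit.

```python
TENSILE_STRENGTH_MPA = 60.0  # nylon tensile strength, as stated in the text

def safety_factor(max_stress_mpa, strength_mpa=TENSILE_STRENGTH_MPA):
    """Ratio of the material strength to the peak stress from the FEA.
    A value above 1.0 means the structure is within the allowable stress."""
    return strength_mpa / max_stress_mpa

# Hypothetical peak von Mises stress from the simulation (not from the paper).
sf = safety_factor(12.0)
```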

2.2. Control Architecture

2.2.1. Electronics and Control

The electronics and control architecture of the robot is depicted in Figure 6. The locomotion module consists of four DC metal-gear motors (391:1 gear ratio, 20D × 46L mm). The motor pairs on each side, mounted to the chassis structure, are controlled by separate motor drivers (RoboClaw) with separate addresses through the Universal Serial Bus (USB) protocol. This communication facilitates four-wheel independent drive motor control, and the USB protocol allows us to publish data to the motors and subscribe to the feedback from the encoders. The leg provision in the chassis provides a secure mounting for the magnetic wheel encoders. Perception for Raptor is provided primarily by a 2D Lidar (RPLidar A1M8) over the USB protocol. The Lidar facilitates mapping of the environment and navigation in drains. A 4-cell Lithium-ion battery (14.4 V, 2800 mAh) powers the 5 V single-board computer (SBC) and the 12 V motor drivers through respective voltage regulators. The remaining sensors are powered either through USB or through Input/Output (I/O) circuit connections to the SBC. The SBC was chosen carefully based on multiple considerations. A Raspberry Pi, with a clock speed of 1.2 GHz, outperforms an Arduino microcontroller, which has a clock speed of 16 MHz. Furthermore, our application demands a board with a built-in CPU, RAM, graphics, and connector pins. In addition to performing the high-level ROS autonomy processing, the Raspberry Pi also enables direct communication with the motor drivers and the IMU through the USB and I/O communication protocols. Even though more powerful boards such as the Jetson Nano, with its high graphical processing capabilities, easily outperform the Raspberry Pi, the SBC should be chosen considering power consumption, cost, and processing requirements. Since the processing load is moderate, the Raspberry Pi Model B was chosen as the appropriate SBC.
A RealSense 435i camera unit is mounted to the Raspberry Pi, and the data are streamed directly to the ROS master unit on a high-performance personal computer (PC) with a Graphics Processing Unit (GPU) for the machine learning layer. The Raspberry Pi runs Ubuntu 18.04. All the sensors, the SBC, and the GUI control module are integrated through the ROS (Robot Operating System) framework. ROS operates with all the sensors (slave nodes) and the SBC (master node) acting as nodes communicating through ROS topics with customized message formats.

2.2.2. Autonomy Layer and Graphical User Interface (GUI)

Raptor is designed to be teleoperated by users through mobile or laptop applications connected to the same network as the ROS master. This multi-device-compatible Graphical User Interface (GUI) application is developed using Unity 2019.4, a cross-platform engine for developing graphical programs. Communication between the robot and the control devices happens via a Message Queuing Telemetry Transport (MQTT) bridge through ROS. In this communication protocol, ROS messages are serialized to JavaScript Object Notation (JSON) for MQTT communication and deserialized back into ROS messages. As shown in Figure 7, the GUI console has a provision for a video feed and control buttons that enable basic navigation operations based on the visual input from the robot’s camera feed. The user can also choose to navigate along a drain autonomously by activating the drain following functionality through the GUI. In that case, the proposed fuzzy logic controller (explained in Section 3) is activated, and the robot autonomously moves along the drain while maintaining its position in the middle. This autonomous behaviour reduces the operator’s workload, since teleoperation is required only for supervisory control actions such as pausing, stopping, and changing direction when returning, carefully inspecting critical locations, negotiating drain junctions, and handling unforeseen events. A 2D Lidar sensor is used for perceiving the environment to ensure obstacle detection and avoidance, and an additional safety layer is added to avoid unintentional obstacle collisions due to operator carelessness. Measurements from the wheel encoders in combination with the Inertial Measurement Unit (IMU) are used for position and heading angle estimation. This method provides reliable localization for Raptor.
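The serialize/deserialize step of the MQTT bridge described above can be sketched with Python's standard json module. The payload schema (topic name and field names) is an illustrative assumption, not the authors' actual message format.

```python
import json

def serialize_cmd(linear, angular):
    """Serialize a velocity command into a JSON payload for an MQTT bridge.
    The topic and field names here are hypothetical."""
    return json.dumps({"topic": "/raptor/cmd_vel",
                       "linear": linear,
                       "angular": angular})

def deserialize_cmd(payload):
    """Deserialize the JSON payload back into the (linear, angular) pair
    that would populate a ROS velocity message."""
    msg = json.loads(payload)
    return msg["linear"], msg["angular"]

# Round trip: a command published over MQTT is recovered unchanged.
payload = serialize_cmd(0.08, 0.0)
linear, angular = deserialize_cmd(payload)
```

In practice a bridge node subscribes to the ROS topic, serializes each message as above, and publishes the string to an MQTT broker; the reverse path deserializes and republishes into ROS.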

3. Navigating in the Middle of a Drain

The main purpose of the robot is drain inspection. The robot should be capable of navigating along the middle of the drain for safety and for better inspection of its surroundings. A robot operating in a drain is often subjected to many external disturbances due to the uneven nature of the terrain. Furthermore, space in a drain is limited, and the robot should maintain proper clearance from the side walls while moving. Thus, a fuzzy logic controller has been incorporated into the robot to keep it in the middle of a drain during navigation.
Fuzzy logic is considered an intelligent technique for coping with systems with unknown dynamics [30,31,32]. The technique can be applied to complex systems defined with linguistic expressions that have non-linear input-output relations [32,33,34]. According to the literature, fuzzy logic systems can be considered universal approximators [35]. The dynamic model of the drain inspection robot Raptor cannot be fully derived due to the friction and liquids in a drain, which can be regarded as environmental factors. Furthermore, the dynamics of the robot vary with the reconfiguration of the wheel assembly. One of the advantages of a fuzzy logic controller is that it can perform well in scenarios where little knowledge of the underlying dynamics of the robot or the environment is available [36,37,38,39].
The architecture of the proposed fuzzy controller is depicted in Figure 8. The aim of the controller is to maintain the robot in the middle of the drain for safe navigation and proper inspection coverage. The clearance of the robot from the side walls of a drain is measured based on Lidar readings. A window of 30° on each of the right and left sides of the robot, as shown in Figure 9, is considered to calculate the average distance on each side, $d_r$ and $d_l$. The position of the robot inside a drain is estimated from these clearance measurements. When the robot is perfectly in the middle of the drain, the clearances from the two sides are equal, i.e., $d_r = d_l$. The deviation of the robot from the middle of the drain for the current time step, $t$, can be calculated as in (2). The inputs of the fuzzy logic controller are the present deviation from the middle (defined as $e$) and the difference of the deviation between successive steps of the control loop, $\delta e$, calculated as in (3). These two parameters are chosen as the inputs since they reflect both the present deviation of the robot from the middle of the drain and its trend. The output of the fuzzy system is the angular velocity of the robot (i.e., $\dot{\theta}$), which steers the robot sideways. The angular velocities of the right and left wheels corresponding to the reference $\dot{\theta}$ are determined from the kinematic model of the robot.
$$e(t) = d_l(t) - d_r(t) \tag{2}$$

$$\delta e(t) = e(t) - e(t-1). \tag{3}$$
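The controller inputs can be derived from a raw Lidar scan by averaging the 30° side windows and applying (2)-(3). The sketch below assumes a full 360° scan starting at the robot's front, with the left side at +90° and the right side at 270°; these scan conventions are assumptions, not the paper's specification.

```python
import math

def side_clearances(ranges, angle_increment, window_deg=30.0):
    """Average distances in windows of `window_deg` degrees centred on the
    robot's left (90 deg) and right (270 deg) sides. Assumes `ranges` is a
    full 360-degree scan whose first beam points at the robot's front."""
    n = len(ranges)
    half = int(round(math.radians(window_deg / 2) / angle_increment))

    def window_mean(centre_idx):
        idx = [(centre_idx + k) % n for k in range(-half, half + 1)]
        vals = [ranges[i] for i in idx if math.isfinite(ranges[i])]
        return sum(vals) / len(vals)

    left_idx = int(round(math.radians(90) / angle_increment))
    right_idx = int(round(math.radians(270) / angle_increment))
    return window_mean(left_idx), window_mean(right_idx)

def deviation(d_l, d_r):
    """Eq. (2): deviation of the robot from the drain centre line."""
    return d_l - d_r

# Synthetic scan: walls 0.6 m to the left and 0.4 m to the right,
# i.e. the robot sits 0.1 m right of the centre of a 1 m wide drain.
inc = math.radians(1.0)
scan = [0.6 if 45 <= a <= 135 else (0.4 if 225 <= a <= 315 else 2.0)
        for a in range(360)]
d_l, d_r = side_clearances(scan, inc)
e = deviation(d_l, d_r)  # positive: robot is to the right of centre
```

The change of deviation $\delta e$ in (3) is then just the difference between the value of `e` on successive control-loop iterations.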
Fuzzification of the two inputs, $e$ and $\delta e$, is executed according to the input membership functions depicted in Figure 10a,b. $\mu_e$ and $\mu_{\delta e}$ are the fuzzified inputs corresponding to $e$ and $\delta e$, respectively. The fuzzy rules defined for the required control actions of the robot are given in Table 2. The rule base of the fuzzy logic system has been defined such that the robot’s actions counter its error from the middle of the drain. For example, if the robot is to the right of the middle, the robot’s angular velocity is set to turn it back toward the middle, and vice versa. When the deviation is small, the magnitude of the angular velocity used for turning is also reduced. The change of error is also considered in the control rules to minimize oscillations caused by overshooting. For example, if the change of error indicates a quick correction, the robot’s corrective action is reduced to avoid overshooting. The rule base of the fuzzy inference system has been defined based on expert knowledge to reflect this control behaviour. The input fuzzy sets are mapped to the output fuzzy sets using the rule base during inference. The output membership functions are depicted in Figure 10c. The minimum t-norm and maximum t-conorm are used in the fuzzy logic system. The firing strength of the $i$th rule, $f_i$, can be expressed as in (4). Mamdani’s minimum operation rule leads to the fuzzy consequent of the $i$th rule, $\mu_{\dot{\theta}}^i$, given in (5). The aggregated fuzzy consequent is obtained from (6), where $n$ is the number of rules. The aggregated output fuzzy consequent is defuzzified in the defuzzification layer, and the defuzzified crisp output, $\dot{\theta}^*$, is obtained from (7). The linear velocity is constant; however, if the angular velocity exceeds the threshold $\dot{\theta}_T$, the linear velocity changes according to (8). These values were set to satisfy the limits of the kinematic model.
The expected variation of the output of the fuzzy logic system with the inputs is shown in Figure 11.
$$f_i = \mu_e^i \wedge \mu_{\delta e}^i \tag{4}$$

$$\mu_{\dot{\theta}}^i = f_i \wedge \mu_{\dot{\theta}}^i \tag{5}$$

$$\mu_{\dot{\theta}} = \mu_{\dot{\theta}}^1 \vee \mu_{\dot{\theta}}^2 \vee \cdots \vee \mu_{\dot{\theta}}^n \tag{6}$$

$$\dot{\theta}^* = \frac{\int \dot{\theta}\,\mu_{\dot{\theta}}\,d\dot{\theta}}{\int \mu_{\dot{\theta}}\,d\dot{\theta}} \tag{7}$$

$$v = \begin{cases} k_v & \text{if } |\dot{\theta}| < \dot{\theta}_T \\ 0 & \text{otherwise.} \end{cases} \tag{8}$$
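A minimal Mamdani min-max inference sketch with centroid defuzzification over a discretised output domain is given below. For brevity it uses only the deviation $e$ as input (the paper's controller also uses $\delta e$), and the triangular membership functions and rule table are illustrative placeholders, not the paper's tuned values from Figure 10 and Table 2.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Illustrative fuzzy sets over e [m] and over theta_dot [rad/s].
E_SETS = {"neg": (-1.0, -0.5, 0.0), "zero": (-0.5, 0.0, 0.5),
          "pos": (0.0, 0.5, 1.0)}
OUT_SETS = {"left": (-0.6, -0.3, 0.0), "none": (-0.3, 0.0, 0.3),
            "right": (0.0, 0.3, 0.6)}
# Rule base: steer against the deviation (assumed sign convention).
RULES = {"neg": "left", "zero": "none", "pos": "right"}

def infer(e, steps=601):
    """Mamdani inference per Eqs. (4)-(7): min for firing strength and
    implication, max for aggregation, centroid for defuzzification."""
    lo, hi = -0.6, 0.6
    num = den = 0.0
    for k in range(steps):
        y = lo + (hi - lo) * k / (steps - 1)
        # Clip each rule's consequent by its firing strength (Eqs. 4-5),
        # then aggregate with max (Eq. 6).
        mu = max(min(tri(e, *E_SETS[name]), tri(y, *OUT_SETS[RULES[name]]))
                 for name in E_SETS)
        num += y * mu
        den += mu
    return num / den if den else 0.0  # centroid, Eq. (7)
```

With these symmetric sets, a zero deviation yields a zero steering command, and a positive deviation yields a positive (corrective) angular velocity.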

4. Experiments and Results

4.1. Experimental Setup

A prototype of the proposed robot design has been developed, and experiments were conducted to validate the proposed controller designed to maintain the robot in the middle of a drain during navigation. In this regard, four test scenarios were considered by deploying the robot in a drain, and two further test scenarios created in a laboratory setting were used as additional cases. The experiments were conducted by activating the autonomous drain following functionality through the GUI after placing the robot at the respective starting position. The proposed controller autonomously moved the robot during the test scenarios without user intervention other than starting and stopping. The robot’s parameters were configured as follows for all the test scenarios: the default linear velocity of the robot, $k_v$, was set to 0.08 m/s, and the threshold, $\dot{\theta}_T$, was set to 0.03 rad/s. The controller ran at 9.1 Hz. An explanatory video demonstrating the test scenarios is provided as a multimedia attachment in the Supplementary Materials.

4.2. Results

In test scenario ‘a’, the robot was placed in the drain with no heading error or offset from the middle, as shown in Figure 12a. Then, the navigation of the robot was initiated. The corresponding variations of the inputs and output of the controller in this scenario are given in Figure 13a. According to the plots, the error of the robot, e, remained small, with a few oscillations. The magnitude of these oscillations was negligible, leading to a small Root Mean Squared Error (RMSE = 3.8 cm). Thus, the performance of the controller was acceptable in this scenario. The traced path of the robot in this scenario is shown in Figure 14a. This path also confirms that the robot moved properly along the middle of the drain.
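The RMSE figures reported for each scenario can be computed from the logged deviation samples; the sample values below are illustrative, not the actual experiment data.

```python
import math

def rmse(samples):
    """Root Mean Squared Error of deviation-from-centre samples (metres)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# Hypothetical logged deviations e(t) in metres over one run.
log = [0.02, -0.04, 0.05, -0.03, 0.04]
error_cm = rmse(log) * 100  # report in cm, as in the paper
```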
Test scenario ‘b’ was designed to validate the ability of the controller to recover from a possible offset of the robot from the middle during an inspection process. In this regard, the robot was initially placed in the drain with an offset from the middle; however, the initial heading of the robot was parallel to the drain. The sequence of snapshots of the robot taken during this case is shown in Figure 12b. The variation of e along with the other inputs and outputs of the controller is given in Figure 13b. Initially, the robot had a high e. With the actions of the controller, e was reduced quickly. After that, a few oscillations of e similar to test scenario ‘a’ can be seen. The traced path of the robot shown in Figure 14b also agrees with these observations. The robot’s RMSE in this case was also low, at 5.9 cm, indicating the proposed controller’s ability to recover from a possible offset from the middle during operations.
There can be situations with heading errors but no offsets. Scenario ‘c’ was designed to evaluate the ability to recover from such situations. Thus, the robot was initially placed in the middle with a heading nonparallel to the drain, as shown in Figure 12c. A behaviour similar to scenario ‘b’ could be observed in the variations of e (see Figure 13c), where the robot successfully recovered from the situation and moved along the middle of the drain (see the robot path given in Figure 14c). These results confirm the ability of the controller to recover from incorrect heading scenarios.
In scenario ‘d’, the robot was initially placed in the drain with both an offset and a heading error, since the robot should be able to cope effectively with such situations (see Figure 12d). Similar to the earlier cases, the robot successfully recovered from the initial errors and then moved along the middle of the drain, as shown in Figure 14d. The RMSE in this case was 7.4 cm. These results confirm the proposed controller’s ability to maintain the robot in the middle of a drain even when it has to recover from both an offset and a heading error.
A drain segment with an angled direction was considered in scenario ‘e’ to evaluate the behaviour of the controller in such events. This scenario was conducted in a simulated drain setting constructed in a laboratory, as shown in Figure 12e. According to the obtained results, the robot was capable of turning toward the new direction when it entered the angled segment (see Figure 14f). Moreover, the robot had an offset and a heading error when it entered this segment, a situation similar to scenario ‘d’, with which the robot coped successfully. Thus, it can be considered that the proposed controller is capable of coping with angled drain directions.
Test scenario ‘f’ represented a situation where there is a sudden change in the width of a drain, as shown in Figure 12f. This scenario was also constructed within the laboratory setting. According to the plot shown in Figure 13f, a sudden change of e could be noted when the robot passed through the sudden width change. However, the robot could successfully manage the spikes of e without triggering undesired control actions and moved in the middle, as shown in Figure 14f.
In all test scenarios considered in the experiments, the proposed controller successfully navigated the robot along the middle of the drain. The considered test scenarios span most of the probable situations encountered by a robot intended for drain inspection. Therefore, it can be concluded that the proposed controller is effective in maintaining the robot in the middle of the drain when moving for inspection. Hence, the proposed robot design and controller would be helpful in improving the productivity of the robot-aided inspection of drains, which is crucial for public health and safety and for avoiding flash floods.

4.3. Discussion

Due to the rough nature of the drain walls, the range sensor information can include noise. The high-frequency variations of e observed in the plots are due to this noise. In particular, very large sudden variations of e could be observed in test scenario ‘f’ between 21–25 s, even though the robot moved in the middle (the robot’s movement in the middle without such variations can be confirmed from the map shown in Figure 14f). These variations were due to sensor noise caused by uncertain characteristics of the side walls, such as vertical slants, reflections, unevenness, and impurities. Even though the proposed controller received imprecise input information from the sensors, it was capable of adequately navigating the robot in the middle of the drain. These observations confirm the sensor-uncertainty handling ability of the proposed controller.
The fuzzy logic controller has no novelty from the perspective of fuzzy mathematics. However, this work uses a Mamdani-type fuzzy inference system as a framework for a novel application in which a compact drain inspection robot is autonomously navigated along a drain while maintaining its position in the middle. The proposed controller determines the control actions of the robot, i.e., its angular and linear velocities, based on the range sensor information, which consists of the side clearances. Moreover, this work introduces a novel control criterion for a drain inspection robot to improve its productivity. Very little work has been conducted on developing drain inspection robots, and no method with similar capabilities that could be used for a performance comparison was found. Therefore, a performance comparison between the proposed robot and existing work is not feasible. Movement of the robot along the drain with sufficient side clearance to avoid collisions was considered the criterion that defined the controller’s success in a given scenario. In addition, the RMSE was used as a performance indicator; the average RMSE over the test cases was 6.6 cm. This amount of average deviation from the middle of the drain is acceptable in terms of the dimensions of the robot and the drain. Therefore, the proposed robot with the controller is helpful for complementing drain inspections.
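The inference step described above, together with the rule base in Table 2, can be sketched as follows. This is a hedged reconstruction, not the authors' code: the triangular membership shapes, their centres on normalized [−1, 1] universes, the singleton output centres and the sign conventions are all assumptions, and the defuzzification is a simplified weighted-average (height) method rather than a full centroid over the aggregated Mamdani output set.

```python
# Triangular membership centres on a normalized [-1, 1] universe.
# The paper states the scales are normalized; the exact centres are assumed.
E_CENTERS = {"L": -1.0, "LL": -0.5, "M": 0.0, "LR": 0.5, "R": 1.0}
DE_CENTERS = {"NM": -1.0, "NS": -0.5, "Z": 0.0, "PS": 0.5, "PM": 1.0}
OUT_CENTERS = {"L": -1.0, "LL": -0.5, "M": 0.0, "LR": 0.5, "R": 1.0}

# Rule base transcribed from Table 2: rows are delta-e labels, columns are e labels.
RULES = {
    "NM": {"L": "R",  "LL": "R",  "M": "R",  "LR": "LR", "R": "M"},
    "NS": {"L": "R",  "LL": "R",  "M": "LR", "LR": "M",  "R": "LL"},
    "Z":  {"L": "R",  "LL": "LR", "M": "M",  "LR": "LL", "R": "L"},
    "PS": {"L": "LR", "LL": "M",  "M": "LL", "LR": "L",  "R": "L"},
    "PM": {"L": "M",  "LL": "LL", "M": "L",  "LR": "L",  "R": "L"},
}

def tri(x, center, half_width=0.5):
    """Triangular membership value of x for a triangle at `center`."""
    return max(0.0, 1.0 - abs(x - center) / half_width)

def infer(e, de):
    """Fire all 25 rules (min for AND) and defuzzify by weighted average."""
    num = den = 0.0
    for de_lbl, row in RULES.items():
        for e_lbl, out_lbl in row.items():
            w = min(tri(e, E_CENTERS[e_lbl]), tri(de, DE_CENTERS[de_lbl]))
            num += w * OUT_CENTERS[out_lbl]
            den += w
    return num / den if den else 0.0
```

With these assumptions, the sketch reproduces the qualitative behaviour of Table 2: a centred robot with no drift (e = 0, δe = 0) yields a zero steering command, and the antisymmetric rule base gives mirror-image commands for mirror-image inputs.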

5. Conclusions

Robots have been explored for drain inspection to resolve the shortcomings of human-labour-based approaches. A robot designed for drain inspection should cope with harsh environmental conditions such as rough terrain and confined spaces. Thus, the development of robots for drain inspection is challenging, and the existing robotic solutions require improvements.
This paper proposed a novel design of a drain inspection robot. The robot is equipped with a manually reconfigurable wheel axle mechanism that can be used to adapt the robot’s ground clearance height and length to the terrain conditions of a drain. A fuzzy controller is introduced to position the robot in the middle while moving along a drain to complement the inspection process.
Experiments have been conducted to validate the proposed robot design and the controller by deploying the robot in a drain setting. According to the experimental results, the proposed controller is effective in maintaining the robot in the middle while moving for inspections. Therefore, the findings of this work would be beneficial for improving drain inspection. Experiments in drain settings with a stream of water are planned for the future to evaluate the robot’s behaviour and determine the required adaptations. Furthermore, exploration of multirobot coordination for drain inspection is proposed as future work.

Supplementary Materials

The following are available online at https://www.mdpi.com/article/10.3390/s21175742/s1.

Author Contributions

Conceptualization, M.A.V.J.M., S.M.B.P.S. and M.R.E.; methodology, M.A.V.J.M., P.P. and S.M.B.P.S.; software, M.A.V.J.M. and P.P.; investigation, M.A.V.J.M. and P.P.; resources, D.N.T.; writing—original draft preparation, M.A.V.J.M., P.P., S.M.B.P.S. and S.G.A.P.; writing—review and editing, M.R.E.; visualization, M.A.V.J.M.; supervision, M.R.E.; funding acquisition, M.R.E. All authors have read and agreed to the published version of the manuscript.

Funding

This research is supported by the National Robotics Programme under its Robotics Enabling Capabilities and Technologies (Funding Agency Project No. 192 25 00051), National Robotics Programme under its Robot Domain Specific (Funding Agency Project No. 192 22 00058), National Robotics Programme under its Robotics Domain Specific (Funding Agency Project No. 192 22 00108), and administered by the Agency for Science, Technology and Research.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Luan, I.O.B. Singapore water management policies and practices. In Asian Perspectives on Water Policy; Routledge: Oxfordshire, UK, 2013; pp. 109–124. [Google Scholar]
  2. World Health Organisation. Water Sanitation Health—Hygiene. Available online: https://www.who.int/water_sanitation_health/hygiene/settings/hvchap5.pdf?ua=1 (accessed on 20 May 2021).
  3. Parkinson, J. Urban drainage in developing countries-challenges and opportunities. Waterlines 2002, 20, 2–5. [Google Scholar] [CrossRef]
  4. Singapore National Water Agency. Drain Cleansing Maintenance. 2020. Available online: https://www.pub.gov.sg/drainage/network/ (accessed on 18 May 2021).
  5. Seidahmed, O.M.E.; Lu, D.; Chong, C.S.; Ng, L.C.; Eltahir, E.A.B. Patterns of Urban Housing Shape Dengue Distribution in Singapore at Neighborhood and Country Scales. GeoHealth 2018, 2, 54–67. [Google Scholar] [CrossRef]
  6. Doshi, J.J.M. An Investigation of leaky Sewers As a Source of Fecal Contamination in the Stormwater Drainage System in Singapore. Ph.D. Thesis, Massachusetts Institute of Technology, Cambridge, MA, USA, 2012. [Google Scholar]
  7. Rea, P.; Ottaviano, E. Design and development of an Inspection Robotic System for indoor applications. Robot. Comput.-Integr. Manuf. 2018, 49, 143–151. [Google Scholar] [CrossRef]
  8. Muthugala, M.A.V.J.; Vega-Heredia, M.; Mohan, R.E.; Vishaal, S.R. Design and Control of a Wall Cleaning Robot with Adhesion-Awareness. Symmetry 2020, 12, 122. [Google Scholar] [CrossRef] [Green Version]
  9. Samarakoon, S.M.B.P.; Muthugala, M.A.V.J.; Le, A.V.; Elara, M.R. HTetro-infi: A reconfigurable floor cleaning robot with infinite morphologies. IEEE Access 2020, 8, 69816–69828. [Google Scholar] [CrossRef]
  10. Quenzel, J.; Nieuwenhuisen, M.; Droeschel, D.; Beul, M.; Houben, S.; Behnke, S. Autonomous MAV-based indoor chimney inspection with 3D laser localization and textured surface reconstruction. J. Intell. Robot. Syst. 2019, 93, 317–335. [Google Scholar] [CrossRef]
  11. Ko, H.; Yi, H.; Jeong, H.E. Wall and ceiling climbing quadruped robot with superior water repellency manufactured using 3D printing (UNIclimb). Int. J. Precis. Eng. Manuf.-Green Technol. 2017, 4, 273–280. [Google Scholar] [CrossRef]
  12. Ito, F.; Kawaguchi, T.; Kamata, M.; Yamada, Y.; Nakamura, T. Proposal of a Peristaltic Motion Type Duct Cleaning Robot for Traveling in a Flexible Pipe. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; pp. 6614–6621. [Google Scholar]
  13. Ogai, H.; Bhattacharya, B. Pipe Inspection Robots for Structural Health and Condition Monitoring; Springer: New Delhi, India, 2018. [Google Scholar]
  14. Yu, L.; Yang, E.; Ren, P.; Luo, C.; Dobie, G.; Gu, D.; Yan, X. Inspection Robots in Oil and Gas Industry: A Review of Current Solutions and Future Trends. In Proceedings of the 2019 25th International Conference on Automation and Computing (ICAC), Lancaster, UK, 5–7 September 2019; pp. 1–6. [Google Scholar] [CrossRef] [Green Version]
  15. Hayat, A.A.; Elangovan, K.; Rajesh Elara, M.; Teja, M.S. Tarantula: Design, Modeling, and Kinematic Identification of a Quadruped Wheeled Robot. Appl. Sci. 2019, 9, 94. [Google Scholar] [CrossRef] [Green Version]
  16. Kirkham, R.; Kearney, P.D.; Rogers, K.J.; Mashford, J. PIRAT—A system for quantitative sewer pipe assessment. Int. J. Robot. Res. 2000, 19, 1033–1053. [Google Scholar] [CrossRef]
  17. Kuntze, H.; Schmidt, D.; Haffner, H.; Loh, M. KARO-A flexible robot for smart sensor-based sewer inspection. In Proceedings of the International Conference on No Dig’95, Dresden, Germany, 19–22 September 1995; pp. 367–374. [Google Scholar]
  18. Nassiraei, A.A.; Kawamura, Y.; Ahrary, A.; Mikuriya, Y.; Ishii, K. Concept and design of a fully autonomous sewer pipe inspection mobile robot “kantaro”. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Rome, Italy, 10–14 April 2007; pp. 136–143. [Google Scholar]
  19. Kirchner, F.; Hertzberg, J. A prototype study of an autonomous robot platform for sewerage system maintenance. Auton. Robot. 1997, 4, 319–331. [Google Scholar] [CrossRef]
  20. Streich, H.; Adria, O. Software approach for the autonomous inspection robot MAKRO. In Proceedings of the IEEE International Conference on Robotics and Automation, ICRA’04, New Orleans, LA, USA, 26 April–1 May 2004; Volume 4, pp. 3411–3416. [Google Scholar]
  21. Parween, R.; Muthugala, M.; Heredia, M.V.; Elangovan, K.; Elara, M.R. Collision Avoidance and Stability Study of a Self-Reconfigurable Drainage Robot. Sensors 2021, 21, 3744. [Google Scholar] [CrossRef] [PubMed]
  22. Rayhana, R.; Jiao, Y.; Zaji, A.; Liu, Z. Automated Vision Systems for Condition Assessment of Sewer and Water Pipelines. IEEE Trans. Autom. Sci. Eng. 2020, 1–18. [Google Scholar] [CrossRef]
  23. Makantasis, K.; Protopapadakis, E.; Doulamis, A.; Doulamis, N.; Loupos, C. Deep Convolutional Neural Networks for efficient vision based tunnel inspection. In Proceedings of the 2015 IEEE International Conference on Intelligent Computer Communication and Processing (ICCP), Cluj-Napoca, Romania, 3–5 September 2015; pp. 335–342. [Google Scholar] [CrossRef]
  24. Cha, Y.; Choi, W.; Büyüköztürk, O. Deep Learning-Based Crack Damage Detection Using Convolutional Neural Networks. Comput. Aided Civ. Infrastruct. Eng. 2017, 32, 361–378. [Google Scholar] [CrossRef]
  25. Selvaraj, V. Autonomous Drain Inspection: A Deep Learning Approach. Master’s Thesis, National University of Singapore, Singapore, 2019. [Google Scholar]
  26. Broggi, A.; Caraffi, C.; Fedriga, R.; Grisleri, P. Obstacle Detection with Stereo Vision for Off-Road Vehicle Navigation. In Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05), San Diego, CA, USA, 21–23 September 2015; p. 65. [Google Scholar] [CrossRef]
  27. Al-Kaff, A.; Garcia, F.; Martín Gómez, D.; de la Escalera, A.; Armingol, J. Obstacle Detection and Avoidance System Based on Monocular Camera and Size Expansion Algorithm for UAVs. Sensors 2017, 17, 1061. [Google Scholar] [CrossRef] [PubMed]
  28. Zhou, F.; Xu, X.; Xu, H.; Chang, Y.; Wang, Q.; Chen, J. Implementation of a Reconfigurable Robot to Achieve Multimodal Locomotion Based on Three Rules of Configuration. Robotica 2020, 38, 1478–1494. [Google Scholar] [CrossRef]
  29. Samarakoon, S.; Muthugala, M.; Elara, M.R. Toward Pleomorphic Reconfigurable Robots for Optimum Coverage. Complexity 2021, 2021, 3705365. [Google Scholar] [CrossRef]
  30. De Silva, C.W. Intelligent Control: Fuzzy Logic Applications; CRC Press: Boca Raton, FL, USA, 2018. [Google Scholar]
  31. Samarakoon, S.B.P.; Muthugala, M.V.J.; Elara, M.R. Toward obstacle-specific morphology for a reconfigurable tiling robot. J. Ambient. Intell. Humaniz. Comput. 2021, 1–13. [Google Scholar] [CrossRef]
  32. Ross, T.J. Fuzzy Logic with Engineering Applications; John Wiley & Sons: West Sussex, UK, 2005. [Google Scholar]
  33. Ibarra, L.; Webb, C. Advantages of fuzzy control while dealing with complex/unknown model dynamics: A quadcopter example. In New Applications of Artificial Intelligence; InTech: Rijeka, Croatia, 2016; p. 93. [Google Scholar]
  34. Muthugala, M.V.J.; Samarakoon, S.B.P.; Elara, M.R. Tradeoff between area coverage and energy usage of a self-reconfigurable floor cleaning robot based on user preference. IEEE Access 2020, 8, 76267–76275. [Google Scholar] [CrossRef]
  35. Nguyen, H.T.; Walker, C.L.; Walker, E.A. A First Course in Fuzzy Logic; CRC Press: New York, NY, USA, 2018. [Google Scholar]
  36. Ren, Q.; Bigras, P. A highly accurate model-free motion control system with a Mamdani fuzzy feedback controller Combined with a TSK fuzzy feed-forward controller. J. Intell. Robot. Syst. 2017, 86, 367–379. [Google Scholar] [CrossRef]
  37. Masmoudi, M.S.; Krichen, N.; Masmoudi, M.; Derbel, N. Fuzzy logic controllers design for omnidirectional mobile robot navigation. Appl. Soft Comput. 2016, 49, 901–919. [Google Scholar] [CrossRef]
  38. Kumar, N.; Takács, M.; Vámossy, Z. Robot navigation in unknown environment using fuzzy logic. In Proceedings of the 2017 IEEE 15th International Symposium on Applied Machine Intelligence and Informatics (SAMI), Herl’any, Slovakia, 26–28 January 2017; pp. 000279–000284. [Google Scholar]
  39. Omrane, H.; Masmoudi, M.S.; Masmoudi, M. Fuzzy logic based control for autonomous mobile robot navigation. Comput. Intell. Neurosci. 2016, 2016, 9548482. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Mechanical components.
Figure 2. Steps of the reconfiguration. (a) Pressing the push button to open the provision to insert and hold the pin; (b) pulling the lock open when triggering the pull lever; (c) adjusting the wheel assemblies; (d) the new wheel assembly configuration.
Figure 3. Adaptability of the robot to various drain scenarios. (a) In a drain with a flat floor; (b) in a drain with a curved bottom; (c) in a cut-off drain.
Figure 4. Kinematics. (a) Top view; (b) side view.
Figure 5. Static structural analysis. (a) Tetrahedral mesh of the Raptor 3D model; (b) static displacement analysis; (c) strain analysis results; (d) von Mises stress in N/m².
Figure 6. Control Architecture.
Figure 7. Overview of the Raptor drain inspection system.
Figure 8. Architecture of the fuzzy logic system.
Figure 9. Principle of navigating in the middle of a drain.
Figure 10. Input and output membership functions of the fuzzy inference system. (a,b) The input membership functions. (c) The output membership function. The fuzzy labels are defined as R: Right, LR: Little Right, M: Middle, LL: Little Left, L: Left, NM: Negative Medium, NS: Negative Small, Z: Zero, PS: Positive Small, and PM: Positive Medium. It should be noted that the membership functions are given in normalized scales.
Figure 11. Decision surface of the fuzzy logic system.
Figure 12. Arrangement of the test scenarios and the sequence of the robot’s movement. (a) Scenario ‘a’, with no initial heading error or offset; (b) Scenario ‘b’, with an initial offset and no heading error; (c) Scenario ‘c’, with an initial heading error and no offset; (d) Scenario ‘d’, with an initial offset accompanied by a heading error; (e) Scenario ‘e’, a simulated angled drain; and (f) Scenario ‘f’, where the environment has a sudden change in width. It should be noted that Scenarios ‘a’ to ‘d’ were conducted in a drain, and Scenarios ‘e’ and ‘f’ in a mock drain environment constructed in a laboratory setting.
Figure 13. Variation of the inputs and outputs of the decision-making process for navigating in the middle of a drain in the considered test scenarios. (a) Scenario ‘a’, (b) Scenario ‘b’, (c) Scenario ‘c’, (d) Scenario ‘d’, (e) Scenario ‘e’, and (f) Scenario ‘f’.
Figure 14. Traced path of the robot during the test scenarios. (a) Scenario ‘a’, (b) Scenario ‘b’, (c) Scenario ‘c’, (d) Scenario ‘d’, (e) Scenario ‘e’, and (f) Scenario ‘f’.
Table 1. Comparison of the proposed design with existing drain inspection robots.
| Robot | a | b | c | d | e | f |
|---|---|---|---|---|---|---|
| PIRAT [16] | High | Low | Yes | High | Yes | Yes |
| KARO [17] | High | Low | Yes | High | Yes | Yes |
| KANTARO [18] | High | Low | Yes | High | Yes | Yes |
| KURT [19] | High | Low | Yes | High | Yes | Yes |
| MAKRO [20] | Medium | Low | No (lengthy) | Low | Yes | Yes |
| Tarantula [15] | Low | High | No (bulky) | Low | Yes | Yes |
| Raptor (proposed design) | High | High | Yes | High | Yes | Yes |
* The design requirements a–f are defined as follows: a: ability to manoeuvre in narrow drains, b: suitability to travel on various terrains, c: lightweight and compact, d: capability to make sharp turns, e: flexibility to mount sensors and actuators, and f: capability to carry a payload.
Table 2. Rule base of the fuzzy logic system.
| δe \ e | L | LL | M | LR | R |
|---|---|---|---|---|---|
| NM | R | R | R | LR | M |
| NS | R | R | LR | M | LL |
| Z | R | LR | M | LL | L |
| PS | LR | M | LL | L | L |
| PM | M | LL | L | L | L |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
