Article

Revolutionizing Urban Pest Management with Sensor Fusion and Precision Fumigation Robotics

by Sidharth Jeyabal, Charan Vikram, Prithvi Krishna Chittoor * and Mohan Rajesh Elara

Engineering Product Development Pillar, Singapore University of Technology and Design, Singapore 487372, Singapore

* Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(16), 7382; https://doi.org/10.3390/app14167382
Submission received: 24 July 2024 / Revised: 18 August 2024 / Accepted: 20 August 2024 / Published: 21 August 2024
(This article belongs to the Section Robotics and Automation)

Abstract
Effective pest management in urban areas is critically challenged by the rapid proliferation of mosquito breeding sites. Traditional fumigation methods expose human operators to harmful chemicals, posing significant health risks ranging from respiratory problems to long-term chronic conditions. To address these issues, a novel fumigation robot equipped with sensor fusion technology for optimal pest control in urban landscapes is proposed. The proposed robot utilizes light detection and ranging data, depth camera inputs processed through the You Only Look Once version 8 (YOLOv8) algorithm for precise object recognition, and inertial measurement unit data. These technologies allow the robot to accurately identify and localize mosquito breeding hotspots using YOLOv8, achieving a precision of 0.81 and a mean average precision of 0.74. The integration of these advanced sensor technologies allows for detailed and reliable mapping, enhancing the robot’s navigation through complex urban terrains and ensuring precise targeting of fumigation efforts. In a test case, the robot demonstrated a 62.5% increase in efficiency by significantly reducing chemical usage through targeted hotspot fumigation. By automating the detection and treatment of breeding sites, the proposed method boosts the efficiency and effectiveness of pest management operations and significantly diminishes the health risks associated with chemical exposure for human workers. This approach, featuring real-time object recognition and dynamic adaptation to environmental changes, represents a substantial advancement in urban pest management, offering a safer and more effective solution to a persistent public health issue.

1. Introduction

Mosquito-borne illnesses remain a significant cause of death worldwide, threatening public health [1]. These insects are carriers of viruses, transmitting illnesses such as dengue, chikungunya, dirofilariasis, malaria, and Zika [2,3,4]. Due to global warming, temperatures are rising, and heat spikes are occurring more often. Researchers have found that mosquitoes adapting to heat spikes have become more pesticide-resistant [5]. The National Environment Agency (NEA) of Singapore has identified various types of open and closed drains, discarded containers, clogged gutters, and roadside drainage grates as some of the mosquitoes’ most common breeding sites, as illustrated in Figure 1 [6,7,8]. As a method of population control, fumigation is typically used alongside other methods, such as mosquito traps and lab-grown mosquitoes, which mate with female mosquitoes and render them infertile [9]. Active and passive traps [10] combat increasing populations of mosquitoes. Active traps use gaseous chemicals such as carbon dioxide (CO2) and visual attractants such as light, while passive traps use sticky surfaces to trap mosquitoes. Examples of traps that have been used and studied include the Biogents Sentinel trap, which uses CO2 [11]; light traps [12]; ovitraps, which serve as a place for mosquitoes to deposit eggs into larvicide, effectively killing the hatched larvae [13]; and gravitraps, which function like ovitraps [14]. However, each of these traps has its limitations. The Biogents Sentinel trap is much more effective when paired with another attractant, such as the BG Lure [15]. Light traps are inefficient when mosquito densities are high [16]. Ovitraps are not as efficient as other traps, such as host-seeking female traps [17]. Lastly, gravitraps are not as efficient as suction fan traps and have the limitation of trapping male mosquitoes [18]. Fumigation, by contrast, has proven to be very effective in mosquito population control. Manual fumigation by humans is a laborious task, and there is a chance that humans may overlook specific breeding grounds. According to the NEA, hydrogen cyanide, methyl bromide, and hydrogen phosphide are common fumigants [19]. Overexposure to fumigants can impact the central nervous system [20,21]. Thus, robots can assist in automating mosquito breeding ground detection and fumigation.
In previous works, researchers have demonstrated an Unmanned Aerial Vehicle (UAV)-based fumigation robot that identifies possible mosquito breeding grounds [22]. However, it is not easy for UAVs to fumigate small areas on the ground, as they are better suited to widespread area coverage during spraying, as used in the agriculture industry. Due to the height from which the fumigants are sprayed, there is a chance that the fumigants may not reach the intended mosquito breeding ground. Environmental factors, such as wind, may also blow the fumigants away before they reach the hotspot, rendering them ineffective [23,24]. Ground robots are safer, as there is no risk of collisions with birds in the sky or of the robot failing mid-air and falling to the ground. Furthermore, a ground robot’s energy expenditure is better spent on locomotion. Researchers [25] employed deep learning models, specifically various YOLO-based architectures, to identify and target specific areas within tobacco fields for treatment. This approach optimizes the application of agrochemicals, reducing waste and environmental impact, and addresses challenges such as pressure fluctuations during spraying [25,26,27,28]. The study in [29] presents an electric sprayer with a crop perception system that calculates leaf density using a support vector machine (SVM). This system, tested with a dataset created for the community, achieved an accuracy between 80% and 85%, enhancing spraying accuracy and precision and emphasizing the effectiveness of integrating machine learning for precise chemical application. Researchers [30] enhanced a YOLOv5 model for precise plant detection, which significantly improved the accuracy and efficiency of a precision spraying robot. Integrating an attention mechanism and the C3-Ghost-bottleneck module boosted performance, increasing the mean average precision (mAP) by 3.2%. The work presented in [31] introduces a robotic weeding system that minimizes herbicide usage through precise application. It features a stereo camera, an inertial measurement unit, and spray nozzles controlled by a binary linear programming-based algorithm for optimal coverage. A study [32] presented a deep learning-based detection model that distinguished weeds from cotton seedlings with high accuracy by using a convolutional block attention module (CBAM), a Bidirectional Feature Pyramid Network (BiFPN) structure, and a bilinear interpolation algorithm.
Existing systems, while advanced, primarily focus on agricultural settings with structured environments and often lack the capability to navigate the complex, dynamic, and GPS-denied environments typical of urban landscapes. A comprehensive review of the existing literature (Table 1) shows a notable deficit in real-time autonomous dynamic hotspot mapping and fumigating robots. This absence underscores the novelty and importance of the proposed contribution, which introduces an innovative system specifically tailored for urban settings. The proposed fumigation robot aims to fill this gap by utilizing advanced sensor fusion and AI-based detection algorithms [33], ensuring precise and effective pest control in challenging urban terrains. Furthermore, integrating AI-based detection in precision fumigation enhances the robotic system’s ability to perform targeted actions effectively. The system ensures that interventions are precise and efficient by leveraging advanced AI algorithms for detection and identification. This approach is particularly beneficial in densely populated urban areas where mosquito control must be meticulously managed to maximize impact while minimizing chemical usage, highlighting the robot’s potential to transform urban pest management practices.
Based on this literature survey, the main contributions of this paper are as follows:
  • The development of a precision fumigation robot for urban landscapes: This is an original contribution, as no existing autonomous robots navigate urban environments for precision fumigation applications. This novel development addresses the need for precise, automated solutions in urban pest control.
  • The development of a LiDAR-Vision-IMU fusion algorithm: Inspired by existing research [34,35], this contribution enhances traditional sensor fusion techniques to fit an autonomous fumigation robot, improving its ability to identify and map mosquito hotspots in real time. This adaptation enables more effective data collection and targeting of potential breeding hotspots.

2. Development of the Fumigation Robot

The fumigation robot shown in Figure 2 is designed to identify and fumigate mosquito hotspots. The robot’s motion is based on differential drive, allowing it to maneuver easily during fumigation. The robot uses 2D and 3D LiDARs and an inertial measurement unit for autonomous navigation. The robot is also equipped with a spray gun and a chemical tank. The spray gun’s top and bottom ends are connected to one end of the linear actuator. The opposite end of the linear actuator is attached to the bottom end of the metal shaft. The system is on top of a stepper motor to allow for the panning motion of the gun. The linear actuator facilitates the tilt motion of the spray gun. The adjustable gun can rotate 360° and fumigate up to 4.5 m from the ground and 2.5 m to 6 m away from the robot. The specifications of the major components are listed in Table 2.
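To make the pan-tilt aiming concrete, the sketch below computes candidate pan and tilt angles for pointing the spray gun at a target expressed in the robot’s base frame. It is a minimal illustration only: the pivot height, function name, and envelope check are assumptions and not the authors’ control code.

```python
import math

GUN_PIVOT_HEIGHT_M = 0.8  # assumed height of the spray-gun pivot above the ground


def aim_spray_gun(target_x: float, target_y: float, target_z: float):
    """Return (pan, tilt) angles in degrees to point the spray gun at a target
    given in the robot's base frame (x forward, y left, z up)."""
    horizontal_range = math.hypot(target_x, target_y)
    # Respect the working envelope reported for the robot:
    # 2.5-6 m horizontal reach and up to 4.5 m spray height.
    if not (2.5 <= horizontal_range <= 6.0) or target_z > 4.5:
        raise ValueError("target outside the spray gun's working envelope")
    pan = math.degrees(math.atan2(target_y, target_x))  # stepper (panning) rotation
    tilt = math.degrees(math.atan2(target_z - GUN_PIVOT_HEIGHT_M, horizontal_range))  # actuator tilt
    return pan, tilt


# Example: a hotspot 3 m ahead, 1 m to the left, 1.5 m above the ground.
print(aim_spray_gun(3.0, 1.0, 1.5))
```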
In this paper, mosquito breeding grounds are referred to as hotspots as these are considered the “ground zero” for the growth of the mosquito population. The main aim of the fumigation robot is to identify possible mosquito hotspots and fumigate them. For the robot to work efficiently, daily remapping is necessary so that the robot can locate new hotspots and fumigate them while being able to identify when a hotspot is no longer active and halt fumigation for the area.

2.1. Setting Up the Navigation Stack

In the proposed system, fumigation is performed autonomously around an area, and hotspots are mapped in real time in the robot’s map database. Autonomous navigation involves several levels of execution. The first step is mapping the environment. Sensors commonly used for mapping include depth cameras, 3D Light Detection and Ranging (LiDAR), and 2D-LiDAR. In this research, a 3D-LiDAR is used to map the environment, as it is better for mapping than the passive mapping method using depth cameras [36]. Moreover, depth cameras are affected by illumination, and many details are lost depending on how well-lit the environment is. Two-dimensional LiDARs have a very limited field of view, as any object below or above the scan plane is undetected. Hence, considering these points, a 3D-LiDAR is used for this application. Different mapping algorithms are used in most 3D-LiDAR integrated systems, namely LiDAR Odometry and Mapping (LOAM), Cartographer by Google, High-Definition LiDAR (HDL) graph simultaneous localization and mapping (SLAM), and LiDAR Inertial Odometry via Smoothing and Mapping (LIO-SAM). Among these, LOAM was considered the best until a few years ago; it was even the top-ranked LiDAR-based method on the Karlsruhe Institute of Technology and Toyota Technological Institute (KITTI) dataset benchmark. However, it suffers from significant drift in large-scale tests. Hence, LIO-SAM [37] is used instead of LOAM in the proposed system; in the reported experimental results, LIO-SAM showed a translation error of 0.96%, compared to 47.31% for LOAM. Following the creation of the detailed map, navigation must be carried out. The history of indoor navigation goes back to the Dervish robot, designed to navigate office environments [38]. Early navigation stacks used finite state machines (FSMs), which suit less complex environments; hence, behavior trees were introduced into navigation stacks to make decisions on complex tasks. For example, in [39], shooting tactics in soccer games were decided using behavior trees, which would not have been possible with an FSM. Such trees are now used in navigation. The navigation stack by the Robot Operating System (ROS) is the most widely used, and its successor is Navigation2, built for ROS2, which uses the Data Distribution Service (DDS) for communication. DDS is an industrial communication protocol that offers secure data transmission between robots. The Navigation2 stack supports different robot motion types, namely differential, holonomic, legged, and Ackermann, across a wide spectrum of environments. In the proposed system, the Navigation2 stack is used for navigation.
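As a minimal illustration of how a fumigation waypoint could be dispatched to the Navigation2 stack, the following sketch uses the nav2_simple_commander helper. The frame name, goal coordinates, and overall flow are placeholders for illustration, not the authors’ navigation code.

```python
import rclpy
from geometry_msgs.msg import PoseStamped
from nav2_simple_commander.robot_navigator import BasicNavigator


def make_map_pose(navigator: BasicNavigator, x: float, y: float) -> PoseStamped:
    """Build a goal pose in the map frame (orientation left as identity for brevity)."""
    pose = PoseStamped()
    pose.header.frame_id = 'map'
    pose.header.stamp = navigator.get_clock().now().to_msg()
    pose.pose.position.x = x
    pose.pose.position.y = y
    pose.pose.orientation.w = 1.0
    return pose


rclpy.init()
navigator = BasicNavigator()
navigator.waitUntilNav2Active()                # block until the Navigation2 stack is ready

goal = make_map_pose(navigator, 4.0, 1.5)      # hypothetical hotspot coordinates
navigator.goToPose(goal)
while not navigator.isTaskComplete():
    pass                                       # fumigation triggering logic would go here
print(navigator.getResult())
rclpy.shutdown()
```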

2.2. Training Hotspots Using YOLOv8

Any autonomous robot that navigates must perceive the environment and obtain details to act accordingly. To serve this purpose, there are different detection algorithms, such as the Fast Region-based Convolutional Neural Network (Fast R-CNN) [40], Faster R-CNN with a VGG-16 backbone [41], the fastest deformable part model (Fastest DPM) [42], and YOLO [43]. In [43], experimental results on the Pascal Visual Object Classes dataset showed that YOLO achieved an mAP of 63.4%, while its lightweight variant, Fast YOLO, ran at 155 frames per second, roughly twice as fast as other detection models. There have been many versions of YOLO; for the proposed system, the YOLOv8 algorithm was used, as it is faster and lighter than its predecessors. The objects detected using YOLOv8 are placed on the map to identify the fumigation hotspots. YOLOv8 is a cutting-edge object detection model known for its exceptional speed and accuracy, making it ideal for real-time applications. Building on the strengths of its predecessors, YOLOv8 introduces advanced features, such as anchor-free detection, which simplifies the model and enhances its generalization capabilities. It leverages the latest advancements in deep learning architectures, including Cross-Stage Partial (CSP) connections and a Path Aggregation Network (PAN), for superior feature extraction and aggregation.
The architecture of YOLOv8, as shown in Figure 3, is designed for efficient and precise object detection, comprising a backbone, neck, and head. The backbone incorporates EfficientRep blocks inspired by MobileNet and EfficientNet, utilizing depthwise separable convolutions and squeeze-and-excitation modules to optimize performance. It also includes Reparameterized VGG (RepVGG) blocks, simplifying complex structures for improved inference efficiency. The neck features convolutional blocks enhanced with activation functions such as the leaky rectified linear unit, which introduce non-linearity and assist in primary feature extraction. RepBlocks, configured differently for training and inference, facilitate better learning while reducing computational load. The upsampling layers increase the spatial dimensions of the feature maps, which is crucial for detecting smaller objects.
The head of YOLOv8 features an Efficient Decoupled Head, which separates classification and localization tasks into distinct branches, leading to more accurate detections. This architecture ensures that the model identifies and classifies objects, making it a powerful tool for real-time detection applications. A database keeps track of the fumigation locations. The robot uses this information to generate a prioritized sequence order to navigate from one point to another and fumigate in the shortest time. All the operations on the robot side that perform navigation and fumigation are controlled using ROS. As the robot is deployed in semi-outdoor and outdoor scenarios, SLAM algorithms such as High-Definition LiDAR (HDL) Graph SLAM map the environment, and the robot is later localized within this map using HDL localization for high accuracy.
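The paper does not detail the prioritization scheme, but a simple way to order hotspot visits is a greedy nearest-neighbor heuristic, sketched below as an illustration only; the function and coordinates are hypothetical, and the actual scheduler may also weigh hotspot scores.

```python
import math


def order_hotspots(start, hotspots):
    """Greedy nearest-neighbor ordering of (x, y) hotspot coordinates.
    Illustrative only; not the authors' scheduling algorithm."""
    remaining = list(hotspots)
    route, current = [], start
    while remaining:
        nearest = min(remaining, key=lambda p: math.dist(current, p))
        route.append(nearest)
        remaining.remove(nearest)
        current = nearest
    return route


print(order_hotspots((0.0, 0.0), [(4.0, 1.0), (1.0, 0.5), (6.0, 2.0)]))
# -> [(1.0, 0.5), (4.0, 1.0), (6.0, 2.0)]
```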

3. Mapping and Hotspot Identification

3.1. Autonomous Exploration of the Robot

The fumigation robot is designed to find mosquito hotspots. In the proposed approach, the robot explores the environment, identifies hotspots, and saves their positions. Before identifying hotspots, a map of the environment is needed for the robot to navigate through it. The proposed system uses an occupancy grid-based 2D map. Different occupancy grid mapping techniques exist, such as Hector SLAM, gmapping, and Cartographer. The produced map consists of cells, each with a cost value signifying the presence or absence of an object: if an object is present, the cell is assigned a high cost value. Usually, black cells are considered obstacles, and grey areas are considered accessible, as shown in Figure 4. The occupancy grid map was made using a SICK TIM351 LiDAR with the gmapping ROS package, which takes laser-scan data from the sick_tim package and converts it into an occupancy grid map. The maps of the environment generated from the 2D and 3D LiDARs are illustrated in Figure 4.
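To make the cost-value idea concrete, the following sketch shows one way to test whether a world coordinate falls on an obstacle cell of a ROS OccupancyGrid message; the threshold value and the out-of-map handling are assumptions for illustration.

```python
from nav_msgs.msg import OccupancyGrid


def is_obstacle(grid: OccupancyGrid, wx: float, wy: float, threshold: int = 65) -> bool:
    """Return True if the world point (wx, wy) falls on a cell whose occupancy
    value exceeds the threshold (cell values range 0-100; -1 means unknown)."""
    info = grid.info
    mx = int((wx - info.origin.position.x) / info.resolution)
    my = int((wy - info.origin.position.y) / info.resolution)
    if not (0 <= mx < info.width and 0 <= my < info.height):
        return True  # treat out-of-map queries as non-traversable
    return grid.data[my * info.width + mx] >= threshold  # data is stored row-major
```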

3.2. Hotspot Identification Training

The main objective of the robot is to find hotspots using an object detection model. Several detection models exist, such as Faster R-CNN, ResNet-based detectors, RetinaNet, and the YOLO series. The YOLO series, specifically YOLOv8, was used in this research because these models perform well on video streams, and YOLOv8 is computationally light, making it the first choice for this purpose. A comparative analysis of YOLOv8 with YOLOv2, YOLOv3, YOLOv4, and YOLOv5 was carried out in [44]. According to [45], YOLOv8 closely follows YOLOv5, with some differences between them:
  • The C3 module is replaced with the C2f module,
  • Convolution layers 10 and 14 of the YOLOv5 configuration are removed,
  • The bottleneck layer is modified, with the first 1 × 1 convolution replaced by a 3 × 3 convolution, and a decoupled head is used instead of the objectness step [44].
YOLOv8 introduces several improvements over YOLOv5, enhancing its effectiveness in object-detection tasks. Replacing the C3 module with the C2f module enhances feature-processing capabilities, which increases accuracy. Removing specific convolution layers leads to a more streamlined architecture, reducing computational demands. Changes in the bottleneck layer, such as replacing 1 × 1 with 3 × 3 convolution layers, allow for better feature capture. Additionally, using a decoupled head instead of the objectness step improves the precision of both localization and classification tasks, making YOLOv8 a robust option for real-time applications. Table 3 lists the details of the IPC used for training. YOLOv8 performed comparatively better with a smaller dataset and lower computational requirements. YOLOv8 also differs from its predecessors in being an anchor-free model, which produces fewer bounding box predictions and results in faster Non-Maximum Suppression. YOLOv8 was trained on different hotspots, namely dustbins, coolers, drains, plants, pots, buckets, and toilets [46]. In total, 5000 images were annotated to create a diverse dataset. Image-level data augmentation, such as shear, rotation, cut-out, and noise, was carried out before training, with the training and validation images split in an 80:20 ratio.
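A minimal training sketch using the Ultralytics YOLOv8 API is shown below; the dataset YAML name, checkpoint choice, and augmentation values are placeholders standing in for the configuration described above, not the authors’ exact training script.

```python
from ultralytics import YOLO

# Start from a pretrained YOLOv8 checkpoint and fine-tune it on the hotspot classes.
model = YOLO("yolov8n.pt")

# "hotspots.yaml" is a placeholder dataset config listing the hotspot classes with
# an 80:20 train/validation split; the augmentation values echo those described above.
model.train(
    data="hotspots.yaml",
    epochs=100,
    imgsz=640,
    degrees=10.0,  # random rotation
    shear=2.0,     # shear augmentation
)

metrics = model.val()                 # precision, recall, and mAP on the validation split
results = model("drain_example.jpg")  # run detection on a single image
```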

3.3. Dynamic Map Update

The generated map contains many obstacles, which the robot must avoid during exploration. While doing so, the robot looks for hotspots to fumigate. In addition to saving each hotspot location, the hotspot is displayed as a marker inside the map in real time. Transformation plays a major role in updating the map: it is the process by which points expressed in one frame are converted into another frame. Here, all detected objects are published with respect to the camera frame; however, to place those objects inside the map, they must be published with respect to the map frame.
The system has two nodes running in parallel. One node, Detection.py, whose pseudocode is presented in Algorithm 1, is responsible for detecting objects and converting their locations into the map frame; another node, Marker.py, converts each object’s location into a visualization marker. Apart from localizing objects, the Detection.py node also scores them: if the same object is detected across multiple instances, its location is assigned a score signifying the importance of the hotspot. The overall workings of the proposed system are shown in Figure 5 and Figure 6.
Algorithm 1: Detection.py
Initialize parameters and variables
Define read_csv function
Define write_csv function
Main program execution:
Start RealSense pipeline
Initialize ROS node and publishers
Initialize variables for pose tracking and detection
Start main loop:
  Obtain color and depth frames
  Detect objects in color frame using YOLO
  Iterate over detected results:
   Extract bounding box coordinates
   Calculate object center
   Estimate object depth
   Transform object pose to map frame
   Read existing data from CSV
   Compare current pose with existing entries:
    If match found:
     Update score for object
    If no match found:
     Append new entry with object’s pose and prev_score
   Write updated data back to CSV
   Visualize detected objects and scores on color image
  Display color image with annotations
  Wait for user input to exit program
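A minimal sketch of the camera-to-map conversion performed inside Detection.py is shown below, using tf2 in ROS 2. The node and frame names are assumptions, and error handling (e.g., for unavailable transforms) is omitted for brevity; it is an illustration of the technique, not the authors’ implementation.

```python
import rclpy
from rclpy.node import Node
from rclpy.time import Time
import tf2_ros
from tf2_geometry_msgs import do_transform_point
from geometry_msgs.msg import PointStamped


class CameraToMap(Node):
    """Convert a detected object's position from the camera frame to the map frame."""

    def __init__(self):
        super().__init__('camera_to_map')
        self.tf_buffer = tf2_ros.Buffer()
        self.tf_listener = tf2_ros.TransformListener(self.tf_buffer, self)

    def to_map(self, x: float, y: float, z: float,
               camera_frame: str = 'camera_link') -> PointStamped:
        point = PointStamped()
        point.header.frame_id = camera_frame
        point.point.x, point.point.y, point.point.z = x, y, z
        # Look up the latest camera->map transform and apply it to the point.
        transform = self.tf_buffer.lookup_transform('map', camera_frame, Time())
        return do_transform_point(point, transform)
```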
Figure 7 and Figure 8 outline the process where the fumigation robot updates hotspot markers in real time by detecting potential mosquito breeding hotspots and recording their positions. Initially, the robot opens the test location map and initializes its sensors, including a 2D LiDAR, 3D LiDAR, and depth camera. These sensors scan the environment to identify potential hotspots and help avoid obstacles that are hard to detect [47]. The identified hotspots are then marked on the map. The robot employs the YOLOv8 algorithm to detect specific objects within these hotspots, extracting their x and y coordinates. LiDAR data are used to determine the z coordinate, creating a complete 3D localization of the detected objects. This information is used to update the map with the precise locations of the hotspots. The markers for these hotspots are then overlaid on the real-time map, with a weight assigned to each x, y, and z position to indicate whether it is a temporary or permanent hotspot. This system allows for dynamic 3D visualization and recording of hotspot areas, enhancing the robot’s ability to target fumigation efforts effectively.
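A corresponding sketch of the Marker.py role, publishing one RViz marker per hotspot with its size weighted by the hotspot score, is given below; the topic name and scaling rule are illustrative assumptions rather than the authors’ code.

```python
from rclpy.node import Node
from visualization_msgs.msg import Marker


class HotspotMarkerPublisher(Node):
    """Publish one sphere marker per detected hotspot so it appears on the RViz map."""

    def __init__(self):
        super().__init__('hotspot_markers')
        self.pub = self.create_publisher(Marker, 'hotspot_markers', 10)

    def publish_hotspot(self, marker_id: int, x: float, y: float, z: float, score: float):
        marker = Marker()
        marker.header.frame_id = 'map'
        marker.header.stamp = self.get_clock().now().to_msg()
        marker.id = marker_id
        marker.type = Marker.SPHERE
        marker.action = Marker.ADD
        marker.pose.position.x, marker.pose.position.y, marker.pose.position.z = x, y, z
        marker.pose.orientation.w = 1.0
        size = 0.2 + 0.05 * score           # scale the marker by the hotspot score
        marker.scale.x = marker.scale.y = marker.scale.z = size
        marker.color.r, marker.color.a = 1.0, 1.0
        self.pub.publish(marker)
```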

4. Results and Discussion

4.1. Performance Metric Analysis

The YOLOv8 detection model is trained for five hotspots: dustbin, cooler, drain, toilet, and pot. The performance of the detection model was analyzed using the most trusted parameters: F1 score, mAP, recall, and precision. A high F1 score means the detection model balances recall and precision well; mAP provides a comprehensive performance analysis across classes and localization accuracies. Recall measures the model’s ability to detect all objects, even at the risk of false positives, while precision reflects the model’s ability to avoid false positives. Each parameter is calculated from false positives (fp), false negatives (fn), true negatives (tn), and true positives (tp). The formulae for each parameter are given below.
$\mathrm{Precision}\ (P) = \dfrac{tp}{tp + fp}$

$\mathrm{Recall}\ (R) = \dfrac{tp}{tp + fn}$

$F\text{-}\mathrm{measure}\ (F1) = \dfrac{2 \times \mathrm{Precision} \times \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}$

$\mathrm{mAP} = \dfrac{1}{n}\sum_{i=1}^{n} AP_i$
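A short numerical illustration of these formulas is given below; the counts are invented for the example and are not the paper’s experimental counts.

```python
def detection_metrics(tp: int, fp: int, fn: int):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1


def mean_average_precision(per_class_ap):
    """mAP is simply the mean of the per-class average precisions."""
    return sum(per_class_ap) / len(per_class_ap)


# Invented counts: 81 true positives, 19 false positives, 36 false negatives.
p, r, f1 = detection_metrics(81, 19, 36)
print(round(p, 2), round(r, 2), round(f1, 2))   # 0.81 0.69 0.75
```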
It can be seen in Table 4 that the trained model has a high precision value of 0.81, which means there is less chance of false positives. Since the proposed work is not strict about identifying the exact nature of a hotspot, a slightly lower recall value is acceptable, which is evident from the F1 score. The detection for the current research was conducted on dustbins and pots. Although the dustbin class has a lower mAP value than the other classes, it performed adequately during the experiment.
A performance comparison test was carried out to check the effectiveness of the proposed model against other detection models. Since the proposed system takes video as the model’s input, the comparison focuses on the YOLO series of models, with Faster R-CNN included as a baseline. Each model was trained on the same dataset. Figure 9 shows the overall result of the trained model, and Figure 10 shows the confusion matrix and detection results from YOLOv8.
YOLOv8 achieves a Precision of 0.81, a Recall of 0.71, F1 score of 0.75, and a mAP of 0.74, outperforming the other models across all metrics. Specifically, compared to Faster RCNN (P: 0.564, R: 0.61, F1: 0.58, mAP: 0.57), YOLOv8 shows a 43.7% improvement in Precision, a 15% increase in Recall, a 27.3% boost in F1 score, and a 29.8% enhancement in mAP. When compared to YOLOv5 (P: 0.73, R: 0.60, F1: 0.65, mAP: 0.69), YOLOv8 demonstrates an 11% improvement in Precision, an 18.3% increase in Recall, a 15.4% boost in F1 score, and a 7.2% enhancement in mAP. Similarly, compared to YOLOv7 (P: 0.70, R: 0.58, F1: 0.63, mAP: 0.61), YOLOv8 shows a 15.7% improvement in Precision, a 22.4% increase in Recall, a 19% boost in F1 score, and a 21.3% enhancement in mAP. These quantitative improvements indicate that YOLOv8 is more reliable, offering higher precision and recall rates, translating to fewer fp and fn. The efficiency gains in terms of F1 score and mAP also suggest that YOLOv8 provides a more balanced and accurate detection.
It can be inferred from Table 5 that the selected YOLO model has good precision compared to other models, which means it is the best at avoiding fp. The mAP score (0.74) for YOLOv8 is better than that of the other models, which means it is better at localizing objects. Considering the above points, YOLOv8 was chosen as the object detection model.

4.2. Test Site Description

The site chosen for the experiment is a pathway in Level 6 of Building 2 at the Singapore University of Technology and Design. The pathway is well-lit with natural light. A small portion of the building was taken for experimentation, as shown in Figure 11. The objects identified along the pathway were dustbins and pots.

4.3. Hotspots Identification and Plotting on the Map

The potential breeding grounds for mosquitoes were identified during the robot’s exploration phase. As depicted in Figure 12, the robot navigates through the environment, utilizing its sensors to detect and mark hotspots on the map. During its traversal, the robot leverages its integrated LiDAR, depth camera, and YOLOv8 algorithm to precisely identify locations that are likely to serve as breeding grounds for mosquitoes. In this particular scenario, the chosen hotspots include plants and dustbins.
A comparative analysis (Table 6) of chemical usage in two fumigation scenarios using the precision fumigation robot was conducted, as illustrated in Figure 13. The robot, equipped with a 10 L capacity spray gun discharging at 330 mL per minute, was tested over a 10 m × 2 m area divided into 1 m2 unit cells. The first scenario involved continuous fumigation of the entire area, with the robot moving at a speed of 0.5 m/s. The total time required to fumigate the 20 m2 was 40 s, resulting in a chemical usage of 220 mL. In the second scenario, the robot fumigated only the identified hotspots within the same area. Three hotspots were identified, each sprayed for 5 s, so the total time spent fumigating was 15 s, leading to a chemical usage of 82.5 mL. The comparison demonstrates that precision hotspot fumigation significantly reduces chemical usage, achieving savings of approximately 62.5% compared to continuous fumigation. This efficiency is due to the targeted application of the fumigant, conserving resources and minimizing environmental impact.
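The arithmetic behind this comparison can be reproduced directly from the quantities quoted above, as the short sketch below shows.

```python
FLOW_RATE_ML_PER_S = 330 / 60                 # 330 mL/min discharge rate

# Scenario 1: continuous fumigation of the full 20 m2 strip at 0.5 m/s (40 s total).
continuous_ml = FLOW_RATE_ML_PER_S * 40       # 220 mL

# Scenario 2: precision fumigation of 3 hotspots at 5 s each (15 s total).
precision_ml = FLOW_RATE_ML_PER_S * 3 * 5     # 82.5 mL

savings_pct = 100 * (continuous_ml - precision_ml) / continuous_ml
print(continuous_ml, precision_ml, round(savings_pct, 1))   # 220.0 82.5 62.5
```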

4.4. Comparison of Existing Robots with the Proposed Robot

A comparison between the fumigation operation of existing precision spray robots and the proposed precision fumigation robot is illustrated in Table 7. Most precision fumigation robots developed [29,30,31,32] so far are tailored for agricultural settings, focusing on optimizing pesticide application in controlled crop environments. These systems, such as those utilizing YOLOv5 and other AI techniques, excel in targeted spraying but lack the adaptability for complex urban landscapes. They are not designed to navigate dynamic urban environments, where the challenges of real-time mapping, precise hotspot detection, and autonomous navigation are significantly different. The proposed fumigation robot addresses this gap by integrating advanced sensor fusion and YOLOv8 for dynamic hotspot localization, designed explicitly for urban pest control. It offers precise real-time mapping, sophisticated spatial analysis, and adaptability to the complexities of urban navigation, making it a crucial innovation for effective mosquito management in urban settings.

4.5. Challenges, Possible Solutions, and Future Research Directions

Many challenges were faced during the development of the robot and the detection algorithm. Some of these challenges can be taken up as future research directions. A few of the challenges and their potential solutions are as follows:
  • Navigation in complex environments: The differential drive robot is built for navigating urban environments, where the surface is even and there is space for robot movements. Tracked platforms are effective at handling multiple terrains; however, they consume more energy because they are heavy and bulky, requiring frequent recharging. A modular robot that can change its locomotion system according to the environment is another future research direction.
  • Battery life is a major limitation restricting the robot’s operational time, necessitating frequent recharging. One possible solution is to have frequent, preferably wireless charging stations that allow full autonomy.
  • Renewable energy sources for charging robots: Researching the use of renewable energy sources, such as solar power, to extend the operational time of robots and reduce their environmental impact, as well as designing more energy-efficient robots to improve operational longevity and reduce the need for frequent recharging.
  • Varied environments: Different environments may require different fumigation strategies, such as discrete fumigating motion, continuous fumigation motion, and 360° fumigating motion. Using advanced machine learning techniques and identifying the environment, a decision can be made to opt for the required strategy.
  • Navigating multiple floors in a building: The proposed challenge can be solved by integrating the robot’s control with the building’s management system, such as access to calling lifts and doors. However, this requires planning from the initial stage to make the building’s infrastructure robot-friendly.

4.6. Limitations

The limitations of the proposed work with the integration of YOLOv8 in the precision fumigation robot are as follows:
  • The trained model may underperform in scenarios significantly different from its training environment, restricting its effectiveness in unfamiliar urban landscapes.
  • YOLOv8 needs robust computational resources, which could limit the deployment of the robot in settings with limited processing capabilities.
  • Optimizing for real-time performance compromises the detection accuracy, which is critical for precise localization of mosquito breeding sites.
  • The model’s effectiveness decreases with smaller objects, which could be crucial in identifying less conspicuous breeding grounds.
  • Vibrations during locomotion affect detection accuracy, leading to blurred images being sent for object recognition.

5. Conclusions

The development of a fumigation robot utilizing LiDAR data, depth camera data with YOLOv8 for object recognition, and IMU data represents a significant advancement in targeting mosquito breeding hotspots. Fusing these sensor inputs enables precise localization and comprehensive data collection, ensuring that potential breeding sites are accurately identified and treated. This technology enhances the efficiency of fumigation efforts and significantly reduces human exposure to harmful chemicals, thereby improving safety and operational outcomes. Integrating advanced sensors and algorithms within this robotic system marks a critical step forward in urban pest management, offering a robust solution to a pervasive public health challenge. Further research and development can optimize these systems, making them more adaptable to diverse environments and improving their efficacy in real-world applications.

Author Contributions

Conceptualization, S.J. and P.K.C.; methodology, S.J. and P.K.C.; software, S.J. and C.V.; Writing, S.J., C.V. and P.K.C.; resources, M.R.E.; supervision, M.R.E.; project administration, M.R.E.; funding acquisition M.R.E. All authors have read and agreed to the published version of the manuscript.

Funding

This research is supported by the National Robotics Programme under its National Robotics Programme (NRP) BAU, Ermine III: Deployable Reconfigurable Robots, Award No. M22NBK0054 and also supported by SUTD Growth Plan (SGP) Grant, Grant Ref. No. PIE-SGP-DZ-2023-01.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Dahmana, H.; Mediannikov, O. Mosquito-borne diseases emergence/resurgence and how to effectively control it biologically. Pathogens 2020, 9, 310. [Google Scholar] [CrossRef] [PubMed]
  2. Franklinos, L.H.; Jones, K.E.; Redding, D.W.; Abubakar, I. The effect of global change on mosquito-borne disease. Lancet Infect. Dis. 2019, 19, e302–e312. [Google Scholar] [CrossRef] [PubMed]
  3. Huang, Y.; Higgs, S.; Vanlandingham, D. Arbovirus-mosquito vector-host interactions and the impact on transmission and disease pathogenesis of arboviruses. Front Microbiol. 2019, 10, 22. [Google Scholar] [CrossRef] [PubMed]
  4. Nguyen-Tien, T.; Lundkvist, Å.; Lindahl, J. Urban transmission of mosquito-borne flaviviruses–a review of the risk for humans in Vietnam. Infect. Ecol. Epidemiol. 2019, 9, 1660129. [Google Scholar] [CrossRef] [PubMed]
  5. Meng, S.; Delnat, V.; Stoks, R. Mosquito larvae that survive a heat spike are less sensitive to subsequent exposure to the pesticide chlorpyrifos. Environ. Pollut. 2020, 265, 114824. [Google Scholar] [CrossRef] [PubMed]
  6. Singapore Government. National Environment Agency. Available online: https://www.nea.gov.sg/ (accessed on 2 January 2022).
  7. Dom, N.C.; Ahmad, A.H.; Ismail, R. Habitat characterization of Aedes sp. breeding in urban hotspot area. Procedia-Soc. Behav. Sci. 2013, 85, 100–109. [Google Scholar] [CrossRef]
  8. Wilke, A.B.; Vasquez, C.; Carvajal, A.; Moreno, M.; Fuller, D.O.; Cardenas, G.; Petrie, W.D.; Beier, J.C. Urbanization favors the proliferation of Aedes aegypti and Culex quinquefasciatus in urban areas of Miami-Dade County, Florida. Sci. Rep. 2021, 11, 22989. [Google Scholar] [CrossRef]
  9. Liew, C.; Soh, L.T.; Chen, I.; Ng, L.C. Public sentiments towards the use of Wolbachia-Aedes technology in Singapore. BMC Public Health 2021, 21, 1417. [Google Scholar] [CrossRef]
  10. Ahmed, T.; Hyder, M.Z.; Liaqat, I.; Scholz, M. Climatic conditions: Conventional and nanotechnology-based methods for the control of mosquito vectors causing human health issues. Int. J. Environ. Res. Public Health 2019, 16, 3165. [Google Scholar] [CrossRef] [PubMed]
  11. Wilke, A.B.; Vasquez, C.; Carvajal, A.; Moreno, M.; Petrie, W.D.; Beier, J.C. Evaluation of the effectiveness of BG-Sentinel and CDC light traps in assessing the abundance, richness, and community composition of mosquitoes in rural and natural areas. Parasites Vectors 2022, 15, 51. [Google Scholar] [CrossRef] [PubMed]
  12. Jhaiaun, P.; Panthawong, A.; Saeung, M.; Sumarnrote, A.; Kongmee, M.; Ngoen-Klan, R.; Chareonviriyaphap, T. Comparing Light—Emitting—Diodes light traps for catching anopheles mosquitoes in a forest setting, Western Thailand. Insects 2021, 12, 1076. [Google Scholar] [CrossRef]
  13. Barrera, R. New tools for Aedes control: Mass trapping. Curr. Opin. Insect Sci. 2022, 52, 100942. [Google Scholar] [CrossRef] [PubMed]
  14. Ong, J.; Chong, C.-S.; Yap, G.; Lee, C.; Abdul Razak, M.A.; Chiang, S.; Ng, L.-C. Gravitrap deployment for adult Aedes aegypti surveillance and its impact on dengue cases. PLoS Negl. Trop. Dis. 2020, 14, e0008528. [Google Scholar] [CrossRef]
  15. Bertola, M.; Fornasiero, D.; Sgubin, S.; Mazzon, L.; Pombi, M.; Montarsi, F. Comparative efficacy of BG-Sentinel 2 and CDC-like mosquito traps for monitoring potential malaria vectors in Europe. Parasites Vectors 2022, 15, 160. [Google Scholar] [CrossRef] [PubMed]
  16. Namango, I.H.; Marshall, C.; Saddler, A.; Ross, A.; Kaftan, D.; Tenywa, F.; Makungwa, N.; Odufuwa, O.G.; Ligema, G.; Ngonyani, H. The Centres for Disease Control light trap (CDC-LT) and the human decoy trap (HDT) compared to the human landing catch (HLC) for measuring Anopheles biting in rural Tanzania. Malar. J. 2022, 21, 181. [Google Scholar] [CrossRef] [PubMed]
  17. Jaffal, A.; Fite, J.; Baldet, T.; Delaunay, P.; Jourdain, F.; Mora-Castillo, R.; Olive, M.-M.; Roiz, D. Current evidences of the efficacy of mosquito mass-trapping interventions to reduce Aedes aegypti and Aedes albopictus populations and Aedes-borne virus transmission. PLoS Negl. Trop. Dis. 2023, 17, e0011153. [Google Scholar] [CrossRef]
  18. Pan, C.-Y.; Cheng, L.; Liu, W.-L.; Su, M.P.; Ho, H.-P.; Liao, C.-H.; Chang, J.-H.; Yang, Y.-C.; Hsu, C.-C.; Huang, J.-J. Comparison of fan-traps and gravitraps for aedes mosquito surveillance in Taiwan. Front. Public Health 2022, 10, 778736. [Google Scholar] [CrossRef]
  19. Singapore Government. National Environment Agency. Available online: https://www.nea.gov.sg/our-services/pest-control/fumigation (accessed on 21 July 2024).
  20. Park, M.-G.; Choi, J.; Hong, Y.-S.; Park, C.G.; Kim, B.-G.; Lee, S.-Y.; Lim, H.-J.; Mo, H.-h.; Lim, E.; Cha, W. Negative effect of methyl bromide fumigation work on the central nervous system. PLoS ONE 2020, 15, e0236694. [Google Scholar] [CrossRef]
  21. Nelsen, J.A.; Yee, D.A. Mosquito larvicides disrupt behavior and survival rates of aquatic insect predators. Hydrobiologia 2022, 849, 4823–4835. [Google Scholar] [CrossRef]
  22. Bravo, D.T.; Lima, G.A.; Alves, W.A.L.; Colombo, V.P.; Djogbenou, L.; Pamboukian, S.V.D.; Quaresma, C.C.; de Araujo, S.A. Automatic detection of potential mosquito breeding sites from aerial images acquired by unmanned aerial vehicles. Comput. Environ. Urban Syst. 2021, 90, 101692. [Google Scholar] [CrossRef]
  23. Hanif, A.S.; Han, X.; Yu, S.-H. Independent control spraying system for UAV-based precise variable sprayer: A review. Drones 2022, 6, 383. [Google Scholar] [CrossRef]
  24. Oğuz-Ekim, P. TDOA based localization and its application to the initialization of LiDAR based autonomous robots. Robot. Auton. Syst. 2020, 131, 103590. [Google Scholar] [CrossRef]
  25. Nasir, F.E.; Tufail, M.; Haris, M.; Iqbal, J.; Khan, S.; Khan, M.T. Precision agricultural robotic sprayer with real-time Tobacco recognition and spraying system based on deep learning. PLoS ONE 2023, 18, e0283801. [Google Scholar] [CrossRef]
  26. Sun, Q.; Chen, J.; Zhou, L.; Ding, S.; Han, S. A study on ice resistance prediction based on deep learning data generation method. Ocean Eng. 2024, 301, 117467. [Google Scholar] [CrossRef]
  27. Preethi, P.; Mamatha, H.R. Region-based convolutional neural network for segmenting text in epigraphical images. Artif. Intell. Appl. 2023, 1, 119–127. [Google Scholar] [CrossRef]
  28. Akande, T.O.; Alabi, O.O.; Ajagbe, S.A. A deep learning-based CAE approach for simulating 3D vehicle wheels under real-world conditions. Artif. Intell. Appl. 2022, 1–11. [Google Scholar] [CrossRef]
  29. Baltazar, A.R.; Santos, F.N.d.; Moreira, A.P.; Valente, A.; Cunha, J.B. Smarter robotic sprayer system for precision agriculture. Electronics 2021, 10, 2061. [Google Scholar] [CrossRef]
  30. Wang, B.; Yan, Y.; Lan, Y.; Wang, M.; Bian, Z. Accurate detection and precision spraying of corn and weeds using the improved YOLOv5 model. IEEE Access 2023, 11, 29868–29882. [Google Scholar] [CrossRef]
  31. Hu, C.; Xie, S.; Song, D.; Thomasson, J.A.; Hardin IV, R.G.; Bagavathiannan, M. Algorithm and system development for robotic micro-volume herbicide spray towards precision weed management. IEEE Robot. Autom. Lett. 2022, 7, 11633–11640. [Google Scholar]
  32. Fan, X.; Chai, X.; Zhou, J.; Sun, T. Deep learning based weed detection and target spraying robot system at seedling stage of cotton field. Comput. Electron. Agric. 2023, 214, 108317. [Google Scholar]
  33. Hassan, M.U.; Ullah, M.; Iqbal, J. Towards autonomy in agriculture: Design and prototyping of a robotic vehicle with seed selector. In Proceedings of the 2016 2nd International Conference on Robotics and Artificial Intelligence (ICRAI), Rawalpindi, Pakistan, 1–2 November 2016; pp. 37–44. [Google Scholar]
  34. Zhang, C.; Lei, L.; Ma, X.; Zhou, R.; Shi, Z.; Guo, Z. Map Construction Based on LiDAR Vision Inertial Multi-Sensor Fusion. World Electr. Veh. J. 2021, 12, 261. [Google Scholar] [CrossRef]
  35. Liu, Z.; Li, Z.; Liu, A.; Shao, K.; Guo, Q.; Wang, C. LVI-Fusion: A Robust Lidar-Visual-Inertial SLAM Scheme. Remote Sens. 2024, 16, 1524. [Google Scholar] [CrossRef]
  36. Lee, J.; Hwang, S.; Kim, W.J.; Lee, S. SAM-Net: LiDAR depth inpainting for 3D static map generation. IEEE Trans. Intell. Transp. Syst. 2021, 23, 12213–12228. [Google Scholar] [CrossRef]
  37. Shan, T.; Englot, B.; Meyers, D.; Wang, W.; Ratti, C.; Rus, D. Lio-sam: Tightly-coupled lidar inertial odometry via smoothing and mapping. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020–24 January 2021; pp. 5135–5142. [Google Scholar]
  38. Nourbakhsh, I.; Powers, R.; Birchfield, S. DERVISH an office-navigating robot. AI Mag. 1995, 16, 53. [Google Scholar]
  39. Abiyev, R.H.; Günsel, I.; Akkaya, N.; Aytac, E.; Çağman, A.; Abizada, S. Robot soccer control using behaviour trees and fuzzy logic. Procedia Comput. Sci. 2016, 102, 477–484. [Google Scholar] [CrossRef]
  40. Girshick, R. Fast r-cnn. In Proceedings of the 2015 IEEE International Conference on Computer Vision, Santiago, Chile, 7–13 December 2015; pp. 1440–1448. [Google Scholar]
  41. Ren, S.; He, K.; Girshick, R.; Sun, J. Faster r-cnn: Towards real-time object detection with region proposal networks. IEEE Trans. Patt. Analy. Mach. Intell. 2016, 39, 1137–1149. [Google Scholar] [CrossRef] [PubMed]
  42. Yan, J.; Lei, Z.; Wen, L.; Li, S.Z. The fastest deformable part model for object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 2497–2504. [Google Scholar]
  43. Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788. [Google Scholar]
  44. Talaat, F.M.; ZainEldin, H. An improved fire detection approach based on YOLO-v8 for smart cities. Neural. Comput. Appl. 2023, 35, 20939–20954. [Google Scholar] [CrossRef]
  45. What is YOLOv8? The Ultimate Guide. Available online: https://blog.roboflow.com/whats-new-in-yolov8/#what-is-yolov8 (accessed on 10 July 2024).
  46. Minakshi, M.; Bhuiyan, T.; Kariev, S.; Kaddumukasa, M.; Loum, D.; Stanley, N.B.; Chellappan, S.; Habomugisha, P.; Oguttu, D.W.; Jacob, B.G. High-accuracy detection of malaria mosquito habitats using drone-based multispectral imagery and Artificial Intelligence (AI) algorithms in an agro-village peri-urban pastureland intervention site (Akonyibedo) in Unyama SubCounty, Gulu District, Northern Uganda. J. Public Health Epidemiol. 2020, 12, 202–217. [Google Scholar]
  47. Jeyabal, S.; Sachinthana, W.; Bhagya, S.; Samarakoon, P.; Elara, M.R.; Sheu, B.J. Hard-to-Detect Obstacle Mapping by Fusing LIDAR and Depth Camera. IEEE Sens. J. 2024, 24, 24690–24698. [Google Scholar] [CrossRef]
Figure 1. Common mosquito breeding sites in urban landscapes.
Figure 2. Fumigation robot and its major components.
Figure 3. YOLOv8 architecture for detecting hotspots.
Figure 4. Comparative maps of the test environment using 2D and 3D LiDAR scans. The 2D scan provides a flat, top–down view, while the 3D scan offers a detailed, multi-dimensional representation, capturing height and depth for enhanced spatial awareness and navigation.
Figure 5. Overview of the proposed system’s workflow, from robot initialization and path assignment to executing detection and marker scripts, navigation control, map generation, and robot visualization via RViz.
Figure 6. Flowchart illustrating the process of hotspot identification and dynamic updating over time. The system initializes, captures image data, detects and localizes hotspots, and updates the map by comparing new and previous locations, ensuring accurate and current hotspot mapping for targeted interventions.
Figure 7. The process flow of updating hotspot markers on the real-time map.
Figure 8. Fusion of YOLOv8, LiDAR frame, and depth camera frame for precision identification and localization of fumigation hotspot.
Figure 9. Training and validation metrics for YOLOv8, showing the progression of losses (box, classification, and objectness) and performance metrics (precision, recall, mAP@0.5, and mAP@0.5:0.95) over 100 epochs. The graphs illustrate the model’s improvement in accuracy and reduction in errors as training progresses, with consistent convergence observed in both training and validation phases.
Figure 10. (a) Confusion matrix for YOLOv8 on the given dataset, illustrating the model’s performance across different classes such as cooler, drain, toilet, dustbin, and pot. (b) Object detection results using YOLOv8, identifying trees and drains in various urban environments. The model successfully detects and labels these objects across different lighting conditions and angles, demonstrating its robustness and accuracy in real-world scenarios.
Figure 11. (a) Two-dimensional map of the test site and (b) the placement of trained objects for the robot to detect potential hotspots.
Figure 12. Sequential updates of the hotspot map during the robot’s exploration. In panel (A), no hotspots are detected. Panel (B) shows the detection and mapping of “Plant-1”, marked on the map and visually confirmed in the environment. Panel (C) adds “Plant-2” to the map, illustrating the robot’s ability to identify and log new hotspots continuously. Finally, panel (D) shows the detection of a “Dustbin”, further updating the map. Each detected hotspot is localized in the mapped environment and the corresponding real-world image, demonstrating the system’s effectiveness in real-time hotspot identification and mapping.
Figure 13. Comparison between continuous fumigation (left) and precision fumigation (right) using a precision fumigation robot. The continuous fumigation approach covers the entire area indiscriminately, whereas the precision fumigation approach targets specific hotspots identified within the area, such as potted plants and dustbins. The blue path indicates the robot’s movement, while the red-shaded areas represent the regions being fumigated.
Table 1. Comparison table contrasting the existing methods with the proposed work.

Aspect | Existing Methods | Proposed Work
Technology Used | Manual fumigation, breeding of infertile mosquitoes; Biogents sentinel trap, light traps, ovitraps, and gravitraps | Sensor fusion (LiDAR, depth camera with YOLOv8, IMU); 3D-LiDAR for mapping and navigation
Primary Limitations | Overexposure to chemicals affecting health; inefficiency in high mosquito densities; limited field of view (2D-LiDAR) | Requires sophisticated technology and initial setup
Efficiency | Varies significantly with manual fumigation and trap effectiveness | High efficiency due to automated, precise detection and fumigation of hotspots
Health Impact | Potential health risks due to chemical exposure | Reduced risk to human operators by automating chemical fumigation
Environmental Impact | Potential for chemical dispersal affecting non-target areas | Focused application of chemicals, reducing environmental footprint
Navigation and Mapping | Limited in non-open areas like indoor environments or dense urban settings | Advanced navigation using 3D-LiDAR and the LIO-SAM algorithm, improving accuracy in complex environments
Hotspot Identification | Relies heavily on manual inspection and stationary traps | Automated real-time identification and remapping, increasing responsiveness to changing conditions
Operational Strategy | Static, with periodic manual adjustments | Dynamic, with ongoing adjustments based on real-time data collection
Cost | Lower initial cost but higher overall due to labor and repeated interventions | Higher initial investment but lower ongoing costs due to automation
Adaptability | Limited adaptability to new breeding grounds without manual intervention | High adaptability with continuous learning and updating capabilities
Table 2. Major components of the proposed precision fumigation robot.

Product | Specifications
Oriental motors | BLHM450KC-30
IMU | Vectornav VN-100
Voltage regulator | DDR-480C-24, DDR-240C-24
2D LiDAR | SICK TiM581-2050101
3D LiDAR | Hesai QT128
Depth camera | Intel RealSense D435i
Industrial PC (IPC) | Nuvo-10108GC-RTX3080
Battery | 48 V, 25 Ah, Lithium Iron Phosphate
Fogging unit | 10 L tank, 50-micron droplet size, flow rate 330 mL/min
Table 3. Detailed specifications of the IPC used for training the hotspot dataset.

Component | Detailed Specification
CPU | Intel i7 12th-Gen Core 65 W LGA1700 CPU
RAM | 64 GB DDR5 4800 MHz
Graphics | NVIDIA RTX 4080 16 GB
SSD | NVMe SSD 2 TB Gen4 M.2 2280
Temperature | Rugged, −25 °C to 60 °C operation
DC Input | 3-pin + 4-pin pluggable terminal block for 8 V to 48 V DC input with ignition control; humidity: 10–90%, non-condensing
Vibration and Shock Absorption | MIL-STD-810H, Method 514.8, Category 4 (with damping bracket)
Table 4. Performance metrics of the YOLOv8 detection model used in the proposed precision fumigation robot.

Type | Precision | Recall | F1 Score | mAP@0.5
All | 0.81 | 0.69 | 0.74 | 0.74
Cooler | 0.82 | 0.84 | 0.83 | 0.93
Drain | 0.95 | 0.91 | 0.93 | 0.94
Toilet | 0.87 | 0.88 | 0.88 | 0.94
Dustbin | 0.75 | 0.51 | 0.61 | 0.62
Pot | 0.99 | 1.00 | 0.99 | 0.99
Table 5. Performance comparison with different YOLO versions and the Faster RCNN detection model for the dataset used in training the mosquito hotspots.

Model | Precision (P) | Recall (R) | F1 Score (F1) | mAP@0.5
Faster RCNN | 0.56 | 0.61 | 0.58 | 0.57
YOLOv5 | 0.73 | 0.60 | 0.65 | 0.69
YOLOv7 | 0.70 | 0.58 | 0.63 | 0.61
YOLOv8 (current model) | 0.81 | 0.71 | 0.75 | 0.74
Table 6. Comparative analysis of continuous fumigation versus precision fumigation using a precision fumigation robot. The table highlights key differences in area coverage, fumigation time, chemical usage, environmental impact, and overall efficiency.

Aspect | Continuous Fumigation | Precision Fumigation
Area Covered | Entire area (20 m2) | Identified hotspots (3 hotspots)
Robot Speed | 0.5 m/s | 0.5 m/s
Fumigation Time | 2 s per m2 | 5 s per hotspot
Total Fumigation Time | 40 s | 15 s
Chemical Usage Rate | 330 mL/min | 330 mL/min
Total Chemical Used | 220 mL | 82.5 mL
Chemical Savings | N/A | 62.5% (137.5 mL less than continuous)
Environmental Impact | Higher due to full area coverage | Lower due to targeted application
Efficiency | Lower, as it treats the entire area | Higher, with focused treatment of hotspots
Table 7. Comparison between the fumigation operation of existing precision spray robots and the proposed precision fumigation robot.

Aspect | Ref. [29] | Ref. [30] | Ref. [31] | Ref. [32] | Proposed Robot
Robot developed | (photograph) | (photograph) | (photograph) | (photograph) | (photograph)
AI Integration | SVM classifier for leaf density | Improved YOLOv5 model with attention mechanisms | Novel scene representation and motion planning | Deep learning model with CBAM and BiFPN | YOLOv8 for dynamic hotspot localization
Urban Navigation (Complex Environment) | No (focused on agricultural fields) | No (focused on agricultural fields) | No (designed for early-stage crops) | No (agricultural fields only) | Yes (specifically designed for urban landscapes)
Real-Time Operation | Yes, but limited to agricultural fields | Yes, real-time performance with 30 ms/frame detection speed | Yes, sub-centimeter precision in spraying | Yes, real-time detection and spraying | Yes, with real-time mapping and fumigation
Mapping Capabilities | No | No | Limited | No | Dynamic mapping
Precision Targeting | High precision in spraying based on leaf density | High precision for corn and weed identification | High precision for micro-volume spraying | High precision | High precision for mosquito hotspots