1. Introduction
Early real-time pest detection and identification can improve pest management, reducing both crop damage and pesticide costs. Multispectral machine vision, as a component of an integrated pest-management (IPM) system, can provide a robust and flexible solution for scouting a wide range of invertebrate pests (e.g., the pink bollworm). Thermal imaging, as a component of the same machine-vision system, can be used to detect and identify diverse vertebrates (e.g., wild boars and rodents). This demands real-time pest scouting, which is a tedious and non-trivial task for a human pest scouter, who relies on conventional methods such as sweep nets, traps, or beat sheets to sample pests in the field. Therefore, pest and vertebrate scouting with an autonomous aerial vehicle, e.g., a quad-copter equipped with a multispectral and thermal imaging system, can be used to scout the cotton crop by day or night. Real-time autonomous scouting also provides information on the population size and location of pests and vertebrates. In turn, specific pesticides can be sprayed over the small affected area instead of applying a broad-spectrum pesticide to the whole field, and appropriate measures can be taken against vertebrates. A modern IPM and precision-agriculture system using multispectral and thermal machine vision can thus improve crop production with minimal damage and pesticide cost. Furthermore, the use of pesticides also kills beneficial species such as bees and the predators of pests [1].
Liu et al. [2] achieved good pest-detection results by using Near-Infrared (NIR) images in the 700–1500 nm range and soft X-ray images between 0.1 nm and 10 nm to detect invertebrates. They found that the selection of weights was crucial, and that more errors can be introduced if the weights are set to either extreme. Boissard et al. [3] proposed an early pest-detection algorithm using a cognitive vision approach; their work is limited by the use of a sensor that captures only still images. Ye et al. [4] explored the possibility of using a ground-based hyperspectral imaging system; their work is limited to small captured images of the scene acquired by the sensors, which prevents application to wide-scale vegetation. Feng et al. [5] used an RGB imaging system for plant-disease diagnosis in the visible color range, which often yielded impaired disease-detection results; such a system could be improved with multispectral imaging. Sanchez et al. [6] proposed a machine-vision technique for scouting whiteflies in greenhouse settings; the efficiency of their system could be increased by including different types of pests and repeating the experiments both in the lab and in the field.
In this article, we propose a machine-vision-based multispectral pest-detection algorithm that does not require supervised network training. Multispectral images are used in this study because they cope with dynamic environmental conditions (e.g., sunlight variation and partial occlusion) better than RGB images. The UV band between 100 nm and 400 nm has not yet been explored for invertebrate detection; therefore, our approach evaluates the UV spectrum of green leaves on nine invertebrate species for their detection and identification.
2. Materials and Methods
Multispectral images with UV, RGB, and NIR wavebands are used for the detection of pests. Color balancing of the input images is the first step in estimating the contribution of each wavelength image to the identification of invertebrates on plants, as shown in Figure 1.
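The color-balance step can be illustrated with a minimal sketch, assuming the three waveband images are available as 8-bit OpenCV arrays; the gray-world gain used here and the file and function names (balance_channels, rgb.png, uv.png, nir.png) are illustrative assumptions, not the exact procedure of the original study.

```python
# Minimal color-balance sketch, assuming 8-bit inputs loaded with OpenCV.
import cv2
import numpy as np

def balance_channels(img):
    """Gray-world balance: scale each channel so its mean matches the global mean."""
    img = img.astype(np.float32)
    if img.ndim == 2:                       # single-band image (UV or NIR)
        return cv2.normalize(img, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    means = img.reshape(-1, img.shape[2]).mean(axis=0)
    gain = means.mean() / (means + 1e-6)    # per-channel gain towards the gray mean
    return np.clip(img * gain, 0, 255).astype(np.uint8)

rgb = balance_channels(cv2.imread("rgb.png"))                       # file names are placeholders
uv  = balance_channels(cv2.imread("uv.png", cv2.IMREAD_GRAYSCALE))
nir = balance_channels(cv2.imread("nir.png", cv2.IMREAD_GRAYSCALE))
```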
The next step was to align the coordinates of the RGB and NIR images with those of the UV images using the Speeded-Up Robust Features (SURF) algorithm, which identified interest points on the RGB, UV, and NIR images simultaneously. An image transform model was then estimated using the least-median-of-squares regression (LMEDS) approach, and an affine transform was used to counter-check the displacement error after fusing the RGB and NIR coordinates onto the UV coordinates. After this fusion, in the final step, three indices, UV, the Normalized Difference Vegetation Index (NDVI), and the hue channel of the Hue, Saturation, Value (HSV) color space, were used to identify invertebrates on host plants. To evaluate the contribution of these indices, probability maps, i.e., P_UV, P_HUE, and P_NDVI, were drawn from their intensity values, and the maps were combined to increase the reliability of the resulting probability. By selecting a proper threshold value T, a pixel can then be classified as belonging to an invertebrate or to the plant.
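As a rough sketch of the alignment and fusion steps described above, the snippet below aligns the RGB and NIR images onto the UV frame with SURF matches and an LMEDS-estimated affine transform, builds the three index-based probability maps, and thresholds their weighted combination. It assumes the balanced rgb, uv, and nir arrays from the previous snippet and an opencv-contrib build with the non-free SURF module enabled; the weights and the threshold T are illustrative values, not those used in the original study.

```python
# Sketch of SURF/LMEDS alignment and index fusion; weights and T are assumptions.
import cv2
import numpy as np

def align_to_uv(src_gray, uv_gray, src_img):
    """Warp src_img onto the UV coordinate frame using SURF matches + LMEDS."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)   # needs non-free contrib build
    kp1, des1 = surf.detectAndCompute(src_gray, None)
    kp2, des2 = surf.detectAndCompute(uv_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(des1, des2)
    src_pts = np.float32([kp1[m.queryIdx].pt for m in matches])
    dst_pts = np.float32([kp2[m.trainIdx].pt for m in matches])
    # Affine transform estimated with the least-median-of-squares (LMEDS) method
    M, _ = cv2.estimateAffine2D(src_pts, dst_pts, method=cv2.LMEDS)
    return cv2.warpAffine(src_img, M, (uv_gray.shape[1], uv_gray.shape[0]))

rgb_al = align_to_uv(cv2.cvtColor(rgb, cv2.COLOR_BGR2GRAY), uv, rgb)
nir_al = align_to_uv(nir, uv, nir)

# Index images: UV intensity, hue from HSV, and NDVI from the NIR and red bands
hue  = cv2.cvtColor(rgb_al, cv2.COLOR_BGR2HSV)[:, :, 0].astype(np.float32)
red  = rgb_al[:, :, 2].astype(np.float32)
ndvi = (nir_al.astype(np.float32) - red) / (nir_al.astype(np.float32) + red + 1e-6)

def to_prob(x):
    """Rescale an index image to [0, 1] so it can be read as a probability map."""
    x = x.astype(np.float32)
    return (x - x.min()) / (x.max() - x.min() + 1e-6)

P_uv, P_hue, P_ndvi = to_prob(uv), to_prob(hue), to_prob(ndvi)

# Weighted combination of the three probability maps; w_uv plays the role of the
# UV weightage discussed in the Results section (the values here are made up).
w_uv, w_hue, w_ndvi = 0.067, 0.467, 0.466
P = w_uv * P_uv + w_hue * P_hue + w_ndvi * P_ndvi

T = 0.5                                      # illustrative invertebrate/plant threshold
invertebrate_mask = (P > T).astype(np.uint8)
```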
3. Results
The performance of the proposed pest-detection algorithm was evaluated using the multispectral pest images taken from [2]. Nine sets of images, with three images in each set (RGB, UV, and NIR), were obtained to identify invertebrates on the leaves and stems of different plants.
Figure 2, as an example, shows the multispectral images of Theba pisana invertebrates, along with a ground-truth image. The computed probability maps are shown in Figure 3.
Figure 4 shows how efficiently the proposed algorithm predicted the invertebrate. The contribution of the UV image was also determined for different UV weights. The effect of the UV spectral weight on the prediction accuracy of the proposed system, in the form of Type I, Type II, and total errors, is shown in Figure 5. The results of the proposed algorithm surpass the state-of-the-art in terms of Type I, Type II, and total errors, which were reduced significantly with the proposed algorithm.
Mean Type I, Type II, and total errors were calculated for UV weightages of W_UV = 0, 6.672, and 100%. A comprehensive comparison between a previously published paper [2] and the proposed technique is provided in Table 1. A Type I error indicates an area of the image identified as an insect when it actually belongs to the leaf, so a reduction in the Type I error means an increase in the probability of correctly identifying insects. The average Type I error was reduced from 22 to 3.75 with 0% UV involvement, and to 1.62 at a 6.672% UV weightage. The average total error was reduced from 19 to 4.78 with 0% UV involvement; introducing a 6.672% and a 33% UV weightage reduced the average total error to 3.26 and 5.51, respectively. Type II errors showed an increase due to the insufficient information exhibited by the multispectral images. Each input image was cropped from [2].
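For reference, the Type I, Type II, and total errors reported above could be scored against a binary ground-truth mask roughly as follows; the percentage-based definitions and the ground_truth_mask variable are assumptions for illustration rather than the exact formulas used in the study.

```python
# Illustrative error scoring against a binary ground-truth mask (assumed definitions).
import numpy as np

def error_rates(pred, truth):
    """pred, truth: binary masks where 1 = invertebrate pixel, 0 = plant pixel."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    type1 = 100.0 * np.sum(pred & ~truth) / max(np.sum(~truth), 1)   # plant pixels flagged as insect
    type2 = 100.0 * np.sum(~pred & truth) / max(np.sum(truth), 1)    # insect pixels missed
    total = 100.0 * np.sum(pred != truth) / pred.size                # overall misclassification
    return type1, type2, total

t1, t2, tot = error_rates(invertebrate_mask, ground_truth_mask)      # ground_truth_mask is assumed
print(f"Type I: {t1:.2f}%  Type II: {t2:.2f}%  Total: {tot:.2f}%")
```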
4. Conclusions
In this paper, a multispectral pest-detection algorithm was proposed. Images from three spectra (RGB, UV, and NIR) were used to identify nine different pest species. The proposed approach performed exceptionally well, attaining the lowest Type I, Type II, and total errors, with values of 1.62, 40.27, and 3.26, respectively. As future work, the performance of the proposed algorithm can be further validated on a wider range of pests.