Article

Multi-Temporal Site-Specific Weed Control of Cirsium arvense (L.) Scop. and Rumex crispus L. in Maize and Sugar Beet Using Unmanned Aerial Vehicle Based Mapping

Robin Mink, Avishek Dutta, Gerassimos G. Peteinatos, Markus Sökefeld, Johannes Joachim Engels, Michael Hahn and Roland Gerhards
1 Department of Weed Science, Institute of Phytomedicine, University of Hohenheim, 70599 Stuttgart, Germany
2 Faculty Geomatics, Computer Science and Mathematics, University of Applied Sciences Stuttgart, 70174 Stuttgart, Germany
* Author to whom correspondence should be addressed.
Agriculture 2018, 8(5), 65; https://doi.org/10.3390/agriculture8050065
Submission received: 23 March 2018 / Revised: 23 April 2018 / Accepted: 25 April 2018 / Published: 29 April 2018
(This article belongs to the Special Issue Weed Management)

Abstract

Sensor-based weed mapping in arable fields is a key element of site-specific herbicide management strategies. In this study, we investigated the generation of application maps based on Unmanned Aerial Vehicle (UAV) imagery and present a site-specific herbicide application using those maps. Field trials for site-specific herbicide applications and multi-temporal image flights were carried out in maize (Zea mays L.) and sugar beet (Beta vulgaris L.) in southern Germany. Real-time kinematic Global Positioning System precision planting information provided the input for determining plant rows in the geocoded aerial images. Vegetation indices combined with generated plant height data were used to detect patches containing creeping thistle (Cirsium arvense (L.) Scop.) and curled dock (Rumex crispus L.). The computed weed maps showed the presence or absence of these weeds in the fields, clustered to 9 m × 9 m grid cells. The overall accuracy of the classification varied from 96% in maize to 80% in the last sugar beet treatment. The computational underestimation of manually mapped C. arvense and R. crispus patches varied from 1% to 10%. Overall, the developed algorithm performed well, identifying tall perennial weeds for the computation of large-scale herbicide application maps.

1. Introduction

One of the major milestones in weed remote sensing research has been the implementation of Unmanned Aerial Vehicles (UAVs) as sensor carriers. Rasmussen et al. [1] presented an estimation of plant soil cover from small and inexpensive aircraft systems, evaluating the efficacy of mechanical weed harrowing in barley (Hordeum vulgare L.) and of chemical weed control in oilseed rape (Brassica napus L. subsp. napus). Early season site-specific weed management in sunflowers based on UAV imagery is described in Torres-Sánchez et al. [2]. Both studies conclude that UAVs are useful for mapping weed pressure for site-specific weed management. Several UAV imaging sensors (e.g., Red, Green and Blue (RGB) and multispectral cameras), spatial resolutions and data analysis algorithms (e.g., Object-Based Image Analysis (OBIA)) are discussed in the literature [1,2,3,4,5,6,7,8,9].
Airborne imagery enables vegetation detection using vegetation indices, as reviewed in Salamí et al. [10]. Based on the Structure from Motion technique, 3D models and Digital Elevation Models (DEMs) can be created from UAV imagery [10,11]. Combined with such height information, UAV images can provide biomass estimations in barley [12] or, together with multispectral images, yield predictions in maize [13]. Furthermore, weed identification has been performed by analysing vegetation height differences in a maize field using ground-based ultrasonic sensors [14].
Christensen et al. [15] discussed the complexity of various weed detection procedures, along with the generally low economic weed threshold levels (e.g., <5 weeds m−2 or <0.1% weed cover). The authors remarked that the field area to be treated with herbicide depends strongly on the economic weed thresholds used, as also presented in Longchamps et al., Keller et al. and Hamouz et al. [16,17,18]. Information about local weed infestations becomes even more important in less competitive crops (e.g., sugar beet) or for perennial weeds whose herbicide susceptibility depends on the development stage. Thus, high-resolution weed detection and multi-temporal mapping can play a major role in weed management.
Considering these possibilities and the well-known heterogeneous nature of weed distributions [19,20], site-specific herbicide applications, as reported by Gerhards et al. [21] and Gerhards and Oebel [22], should already be state of the art in today's farming practice. Yet, only a few site-specific herbicide application techniques have been used commercially [23]. Even fewer systems support application maps based on UAV weed mapping.
It is crucial to fuse the information from the different sensor systems and computing capabilities used in modern precision farming, and to crosslink all gathered field data for weed classification. Nevertheless, weed differentiation systems that work in multiple crops by combining comprehensive field information have so far not been described in the literature.
Consequently, our hypothesis was that UAV mapping of herbicide-relevant weeds such as creeping thistle (Cirsium arvense (L.) Scop.) can be accomplished by combining multiple sensor-based sources of precision farming information. Information on vegetation coverage and height derived from UAV imagery was connected to crop planting information (e.g., geo-coordinates of crop row locations and row spacing). By combining these data inputs, we propose a new methodology for improving UAV weed mapping in arable fields. It is of particular interest to separate perennials such as C. arvense and curled dock (Rumex crispus L.) from the rest of the fields' vegetation cover.
The objectives of the present study were (a) to develop an algorithm for the computation of herbicide application maps based on UAV weed mapping, (b) to use these maps for weed spot spraying and (c) to realise the above objectives in row crops, namely maize at the three-leaf stage and sugar beet between the cotyledon and the five-leaf stage. The computed application maps were transferred to a multiple-tank spot sprayer for the site-specific treatment of C. arvense and R. crispus. This additional herbicide treatment was applied along with a uniform herbicide application against annual grasses and broadleaf weed species.

2. Materials and Methods

2.1. Trial Sites and Precision Sowing

For the current study, four experiments were conducted at the Ihinger Hof research station (48.74° N, 8.92° E, 478 m a.s.l.) of the University of Hohenheim in southwest Germany during 2016 and 2017. The 30-year average annual temperature and precipitation are 8.4 °C and 738 mm, respectively. The fields were rotated within the two years of this study and differed in their slope, as shown in Table 1. The soil types were Terra fusca, brown soil and Pseudogley.
Sowing was carried out using a Real-Time Kinematic (RTK)-corrected, Global Navigation Satellite System (GNSS)-assisted steering system on the tractor, with a stationary reference station located at the research station. The RTK-GPS guidance enabled the pneumatic precision planter (KUHN MAXIMA 2TI, KUHN Maschinen-Vertrieb GmbH, Schopsdorf, Germany) to follow previously defined geographical coordinates for seeding with a precision of ≤2 cm.

2.2. Ground-Truth Weed Mapping

The weed plants of each species were counted at the centroid of each 9 m × 9 m grid square overlaid on the trial field. In total there were 173 grid squares in maize (2017) and in sugar beet (2016), 201 in maize (2016) and 220 in sugar beet (2017). Manual weed counting was performed one to three days before the site-specific weed control measures. The collected weed scouting data of 0.4 m2 per grid square were entered directly into a database, along with weed and crop cover and a photograph of the respective counting area. To this end, individuals of each weed species were counted in a 0.1 m2 counting frame with four repetitions around each grid square centroid, as tested and discussed in the literature [24,25,26]. These counting results were used to obtain a general overview of the field weed infestation. The database directly computed the plant density per m2 for each species from the single mapping repetitions per counting frame, as illustrated below. The presence or absence of the later UAV-mapped C. arvense and R. crispus patches was additionally mapped outside the predefined counting points over the entire mapping area.
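As a minimal sketch of this density computation (not the authors' database code; the counts below are hypothetical), the conversion from four 0.1 m2 frame counts to plants per m2 is:

```r
# Plant density per m2 for one species at one grid centroid:
# four 0.1 m2 counting frames, i.e., 0.4 m2 scouted per grid square
frame_counts <- c(3, 1, 0, 2)                                     # hypothetical counts per frame
density_m2   <- sum(frame_counts) / (length(frame_counts) * 0.1)  # 6 plants / 0.4 m2 = 15 plants m-2
```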

2.3. UAV and Sensor Setup

The weed mapping flights were performed using a hexacopter, model XR6 (geo-konzept, Adelschlag, Germany). The hexacopter was equipped with a Sony α 6000 (Sony Corporation, Minato, Japan) RGB camera and a Tetracam µMCA 4 Snap (Tetracam Inc., Chatsworth, CA, USA) multispectral camera including an electronic Incident Light Sensor (eILS).
The RGB camera had a lens focal length of 19 mm and an image resolution of 6000 × 4000 pixels. Therefore, a Ground Sample Distance (GSD) of approximately 0.3 cm was achieved at a flight altitude of 15 m. The shutter speed was set to 1/1250 s and the ISO to between 200 and 1600, depending on the lighting conditions prior to the flight, resulting in an automatically selected aperture between f/7 and f/10.
Multispectral images were taken at four narrowband spectral wavelengths of 670, 700, 740 and 780 nm using optical band-pass filters, each mounted in front of one of the camera's four image sensors. In addition, the upward-looking eILS was mounted on the UAV and captured the downwelling radiation at the same wavelengths as the image sensors below. The camera was configured with a fixed focal length (9.6 mm), a fixed aperture (f/3.2) and automatic exposure time. The multispectral image sensors gathered 8-bit RAW images with a resolution of 1280 × 1024 pixels, resulting in a GSD of approximately 1.51 cm at a flight altitude of 30 m. This flight altitude was chosen due to sensor limitations (e.g., image interval, battery power), to avoid splitting flight plans into more than two parts.
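The reported GSD values follow from the usual pinhole relation GSD = flight altitude × pixel pitch / focal length. A short R sketch; the pixel pitches (about 3.9 µm for a 23.5 mm wide APS-C sensor at 6000 pixels and about 4.8 µm for the multispectral sensors) are assumptions used only to reproduce the reported numbers:

```r
# GSD in cm from flight altitude (m), sensor pixel pitch (um) and focal length (mm)
gsd_cm <- function(altitude_m, pixel_pitch_um, focal_length_mm) {
  (altitude_m * 100) * (pixel_pitch_um * 1e-3) / focal_length_mm
}

gsd_cm(15, 23.5 / 6000 * 1000, 19)  # RGB camera:           ~0.31 cm at 15 m
gsd_cm(30, 4.8, 9.6)                # multispectral camera: ~1.50 cm at 30 m
```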
Image triggering was software-controlled via a USB connection to the copter's flight control unit; at predefined points, image triggering was set as a flight plan action. Images were captured with a minimum overlap of 60% along- and across-track to enable photogrammetric calculations. The copter's flight altitude was set to 15 m for the RGB sensor and to 30 m for the multispectral sensor. If necessary, the flight plan was split into two parts for changing the two 24 V, 3700 mAh flight batteries. To avoid a decrease in ground resolution over sloping terrain, flight plans were adjusted to the ground morphology using a digital terrain model (DTM) computed from data of a previous flight; this DTM was then used to create the adjusted flight paths for the following flights.
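As a small illustration of the overlap requirement (assuming, for this sketch only, that the 4000-pixel image side is oriented along-track), the image footprint and the resulting trigger spacing can be estimated from the GSD:

```r
# Along-track footprint of one image and trigger spacing for a given forward overlap
footprint_m       <- function(gsd_cm, pixels) (gsd_cm / 100) * pixels
trigger_spacing_m <- function(fp_m, overlap) fp_m * (1 - overlap)

fp <- footprint_m(0.3, 4000)    # ~12 m along-track footprint of the RGB image at 15 m
trigger_spacing_m(fp, 0.60)     # ~4.8 m between image triggers for 60% overlap
```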

2.4. Image Processing

The weed sampling flights were carried out at different crop growth stages and flight dates, before and after the herbicide treatments. Weed mapping and application dates, along with the crop development stages, are given in Table 2. Prior to image alignment for the orthomosaic creation, the multispectral images passed through a pre-processing step. The alignment of the four single-channel images to the same field of view was carried out using the software PixelWrench2 (Version 1.2.3.1, Tetracam Inc., Chatsworth, CA, USA). In the same step, the images were corrected for the downwelling radiation according to the data gathered by the eILS.
For generating georeferenced DEMs and orthomosaic images, the 3D reconstruction software PhotoScan Professional Edition (Version 1.3.4, Agisoft LLC, St. Petersburg, Russia) was used. Besides the airborne images, the copter's flight information and the positions of the RTK-GPS-referenced Ground Control Points (GCPs), located inside the trial and around its borders [2,13], were passed to the software.

2.5. Generating Weed Control Maps Based on UAV Imagery

To compute site-specific herbicide application maps for large areas, a robust and computationally fast algorithm was designed. The targeted output was an application map providing the respective geo-coordinates and spraying decisions for one tank of a three-tank site-specific sprayer. For further processing, the multispectral and RGB orthomosaic images were loaded as raster objects into the statistical computing software R [27], using the packages raster (version 2.6-7) [28] and rgdal (version 1.2-18) [29].
The crop row locations were known from the RTK-GNSS-assisted seeding, and a predefined AB-line was therefore available from the beginning of the cultivation. Based on this AB-line, all crop row positions were calculated and used to generate a polygon layer. The crop rows were removed from the orthomosaic images by overlaying this crop row polygon layer, whose width was adjusted to the actual crop growth stage (see the sketch below).
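A minimal R sketch of this masking step (not the authors' original implementation), assuming for illustration that the rows run parallel to the y-axis of the projected coordinate system; the file name, the easting of the first row and the polygon width are hypothetical placeholders:

```r
library(raster)
library(sp)

rgb_ortho <- stack("ortho_rgb.tif")             # hypothetical orthomosaic file
ext <- extent(rgb_ortho)

row_spacing <- 0.50                             # m, sugar beet row distance (Table 1)
row_width   <- 0.10                             # m, assumed polygon width at this growth stage
e0          <- xmin(ext) + 0.25                 # assumed easting of the first row (from the AB-line)

eastings <- seq(e0, xmax(ext), by = row_spacing)
polys <- lapply(seq_along(eastings), function(i) {
  e <- eastings[i]
  coords <- matrix(c(e - row_width / 2, ymin(ext),
                     e + row_width / 2, ymin(ext),
                     e + row_width / 2, ymax(ext),
                     e - row_width / 2, ymax(ext),
                     e - row_width / 2, ymin(ext)),
                   ncol = 2, byrow = TRUE)
  Polygons(list(Polygon(coords)), ID = as.character(i))
})
row_polys <- SpatialPolygons(polys, proj4string = crs(rgb_ortho))

# Remove the crop rows: cells covered by the row polygons are set to NA
interrow <- mask(rgb_ortho, row_polys, inverse = TRUE)
```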
For further discrimination between weeds and soil, the Normalised Difference Vegetation Index (NDVI) was calculated from the multispectral images, adapted from Rouse et al. [30] using the wavelengths 670 and 780 nm, as shown in Equation (1). The resulting NDVI raster object was used as a mask layer, as proposed by Liebisch et al. [31] and demonstrated by Roth and Streit [32], on the relative RGB image. All parts of the raster object with an NDVI > 0.2 were considered vegetation. If multispectral data were not available, weed maps were calculated using the RGB images only. The Excess Green minus Excess Red index (ExGR, Equation (2)) was calculated from the three layers of the processed RGB orthomosaic image (Ortho), representing the crop field's inter-row space. The ExGR combines two colour indices: the Excess Red index (OrthoExR, Equation (3)), as shown in Meyer et al. [33], and the Excess Green index (OrthoExG, Equation (4)), as shown in Woebbecke et al. [34]. The result derived from the subtraction of these two indices is shown in Equation (2). Thereby the background noise in the Excess Green grayscale image was reduced and a zero threshold could be applied for the creation of a binary image. The binary ExGR orthomosaic image (OrthoExGR) was generated using the ExGR zero-threshold method introduced by Meyer et al. [35] and suggested by Hamuda et al. [36]. Prior to the index calculation, all single layers were normalised by the maximum value of the respective RGB channel.
\( \mathrm{NDVI} = \dfrac{R_{780\,\mathrm{nm}} - R_{670\,\mathrm{nm}}}{R_{780\,\mathrm{nm}} + R_{670\,\mathrm{nm}}} \)   (1)
\( \mathrm{Ortho}_{\mathrm{ExGR}} = \mathrm{Ortho}_{\mathrm{ExG}} - \mathrm{Ortho}_{\mathrm{ExR}} \)   (2)
\( \mathrm{Ortho}_{\mathrm{ExR}} = \dfrac{1.4 \times \mathrm{Ortho}_{\mathrm{red}} - \mathrm{Ortho}_{\mathrm{blue}}}{\mathrm{Ortho}_{\mathrm{red}} + \mathrm{Ortho}_{\mathrm{blue}}} \)   (3)
\( \mathrm{Ortho}_{\mathrm{ExG}} = \dfrac{2 \times \mathrm{Ortho}_{\mathrm{green}} - \mathrm{Ortho}_{\mathrm{red}} - \mathrm{Ortho}_{\mathrm{blue}}}{\mathrm{Ortho}_{\mathrm{green}} + \mathrm{Ortho}_{\mathrm{red}} + \mathrm{Ortho}_{\mathrm{blue}}} \)   (4)
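A minimal R sketch of Equations (1)–(4) and of the zero-threshold binarisation described above, continuing from the inter-row raster of the previous sketch; the file name and the band order of the multispectral stack (670, 700, 740, 780 nm) are assumptions:

```r
library(raster)

ms <- stack("ortho_multispectral.tif")   # hypothetical file; bands assumed ordered 670, 700, 740, 780 nm

# Equation (1): NDVI from the 780 nm and 670 nm bands, used as a vegetation mask (NDVI > 0.2)
ndvi     <- (ms[[4]] - ms[[1]]) / (ms[[4]] + ms[[1]])
veg_mask <- ndvi > 0.2

# Normalise each RGB channel of the inter-row orthomosaic by its maximum value
r <- interrow[[1]] / cellStats(interrow[[1]], max)
g <- interrow[[2]] / cellStats(interrow[[2]], max)
b <- interrow[[3]] / cellStats(interrow[[3]], max)

# Equations (3) and (4): Excess Red and Excess Green
exr <- (1.4 * r - b) / (r + b)
exg <- (2 * g - r - b) / (g + r + b)

# Equation (2) and the zero-threshold binary image of Equations (5) and (6) below;
# the sign convention follows Equation (6) as printed (vegetation where ExGR <= 0)
exgr     <- exg - exr
x_binary <- exgr <= 0
```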
We considered that each pixel of the classified OrthoExGR raster object can be attributed either to vegetation or to background, such as soil or organic matter, based on the result of the binary map, as shown in Equation (5). Each classified pixel has the geographic coordinates north (n) and east (e).
\( f : X(n,e) = \begin{cases} 1, & \mathrm{Ortho}_{\mathrm{ExGR}}(n,e) \in \text{vegetation} \\ 0, & \mathrm{Ortho}_{\mathrm{ExGR}}(n,e) \notin \text{vegetation} \end{cases} \)   (5)
Since the classification was conducted with 0 as the threshold value, the OrthoExGR image was binarized as shown in Equation (6). The result is the classified Orthoimage of Equation (5).
\( f : X(n,e) = \begin{cases} 1, & \mathrm{Ortho}_{\mathrm{ExGR}}(n,e) \in (-\infty, 0] \\ 0, & \mathrm{Ortho}_{\mathrm{ExGR}}(n,e) \in (0, +\infty) \end{cases} \)   (6)
Apart from the vegetation information, height information was also used for the discrimination algorithm. The separation of the inter-row vegetation into lower and taller weeds was carried out based on the Canopy Height Model (CHM). The CHM was calculated by subtracting the Digital Terrain Model (DTM) raster object from the Digital Surface Model (DSM) raster object [11,13], as shown in Equation (7). The CHM provides the height above ground for every pixel of the corresponding raster object.
\( \mathrm{CHM} = \mathrm{DSM} - \mathrm{DTM} \)   (7)
The DTM was computed from UAV imagery gathered directly after seeding, whereas the DSM was calculated from the UAV images of the respective mapping dates. These digital elevation models were generated as an intermediate step of the orthomosaic computation from the 2D images. The CHM threshold for separating lower and taller vegetation in the inter-row space was set to a height of 6 cm above ground, as shown in Equation (8).
\( f : C(n,e) = \begin{cases} 1, & \mathrm{CHM}(n,e) \in [0.06, +\infty) \\ 0, & \mathrm{CHM}(n,e) \in (0, 0.06) \end{cases} \)   (8)
All vegetation pixels above this threshold were regrouped into a new class of the raster layer. To verify the CHM raster cells as vegetation, the crop-row-excluding ExGR raster was overlaid on the CHM raster (Equation (9)).
\( f : \mathrm{WHM}(n,e) = X \cap (X + C) \)   (9)
The resulting Weed Height Model (WHM) raster object therefore combined field and seeding precision farming data with models calculated from UAV-based airborne images. It provided information about the field weed infestation in two classes, and the extracted information was used in the on-board computer database of the site-specific herbicide sprayer.
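A minimal R sketch of Equations (7)–(9), continuing the sketches above; the file names are placeholders, and the cell-wise combination in the last line is only one possible reading of Equation (9), yielding the two infestation classes (1 = low vegetation, 2 = tall weeds):

```r
library(raster)

dsm <- raster("dsm_mapping_date.tif")     # hypothetical DSM from the mapping-date flight
dtm <- raster("dtm_after_seeding.tif")    # hypothetical DTM from the flight after seeding

# Equation (7): Canopy Height Model; resample the DTM to the DSM grid if resolutions differ
chm <- dsm - resample(dtm, dsm, method = "bilinear")

# Equation (8): binary raster of inter-row vegetation taller than 6 cm
c_binary <- chm >= 0.06

# Equation (9), as interpreted here: keep tall pixels only where the ExGR layer
# (x_binary, crop rows removed) also indicates vegetation
c_on_x <- resample(c_binary, x_binary, method = "ngb")
whm    <- x_binary * (x_binary + c_on_x)  # 0 = background, 1 = low vegetation, 2 = tall weeds
```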

2.6. Site-Specific Herbicide Application in Field Trials

Site-specific herbicide applications were realised with a trailed multiple-tank spot sprayer with three separate hydraulic systems. The sprayer was equipped with RTK-GPS and section control to adjust the application of different herbicides at specific field locations [22,26]. With three separate hydraulic systems on the sprayer, three different herbicide mixtures could be stored and applied wherever designated by the application maps. Each hydraulic circuit had its own tank and boom. Each boom had a total width of 21 m and was divided into seven sections of 3 m width.
The sprayer's section control switched the sections used for the site-specific application on and off via solenoid valves. The same terminal also regulated the herbicide doses of the two other hydraulic systems, which applied herbicides to the complete field.
The herbicide application maps of the sugar beet and maize trials were used to apply the products listed in Table 3 (sugar beet) and Table 4 (maize) for the respective mapped weed groups. The composition of the herbicide mixtures was the same in both study years. The herbicide selection for the post-emergence treatments of the mapped species was based on regional guidelines in Germany and is given in Table 3 and Table 4. As shown in Table 2, herbicide applications, weed mapping and flight dates were synchronised for each treatment within a three-day period. Inside each field, two of the grid squares were established as an untreated control by manually adjusting the herbicide application maps prior to spraying.
In 2016, a hexacopter with an RGB and a four-channel multispectral camera was used for weed mapping. The GSD of the images gathered during these flights was insufficient for computing reliable weed maps with the presented protocol. Therefore, the site-specific weed control application was based completely on manual weed maps in 2016. The complete workflow, including the different computation steps, is presented in the flowchart of Figure 1. The procedure described therein was applied to every XR6 flight date without changes to the main algorithm.
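To connect the WHM to the sprayer, the tall-weed class has to be aggregated to the 9 m × 9 m application grid and exported in a format the sprayer terminal can read. The paper does not specify the export format; the following hedged R sketch, continuing the sketches above, uses a shapefile and a hypothetical output name as one possibility:

```r
library(raster)
library(rgdal)

# Aggregate the tall-weed class (WHM == 2) to the 9 m x 9 m application grid;
# the aggregation factor follows from the raster resolution in metres
tall  <- whm == 2
fact  <- round(9 / res(tall)[1])
grid9 <- aggregate(tall, fact = fact, fun = max, na.rm = TRUE)  # 1 = spray this cell, 0 = section off

# Export the cells to be treated as polygons for the spot-sprayer terminal
spray_cells <- rasterToPolygons(grid9, fun = function(v) v == 1)
writeOGR(spray_cells, dsn = "application_map", layer = "cirsium_rumex_post",
         driver = "ESRI Shapefile", overwrite_layer = TRUE)
```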

3. Results

The automatic flight altitude adjustment following the digital elevation model resulted in a consistent GSD of the orthomosaic images and DEMs.
A binary raster was created using the workflow described in Figure 1, applied to the RGB and multispectral images gathered by the UAV. Figure 2 shows the different processing stages of the binary raster creation. The raster containing the height information is shown along with the raster containing the vegetation index and the crop row polygons, all overlaid on the original RGB raster, showing a C. arvense patch in sugar beet at the four-leaf crop development stage.
Although overlaying raster files from different flights (e.g., DTM and DSM for the CHM calculation) worked as expected, the image resolution decreased slightly during the resampling of the adjusted raster files.
In this study, the weed species distribution was heterogeneous within the experiments. In the 2017 maize trial, the main weed species were corn sowthistle (Sonchus arvensis L.) with a mean density of 15 plants m−2 and black-bindweed (Fallopia convolvulus (L.) Á. Löve) with a mean density of 12 plants m−2. In the maize trial of 2016, the main species were common lambsquarters (Chenopodium album L.) with 55 plants m−2 and annual bluegrass (Poa annua L.) with 24 plants m−2. In some distinct places of the 2016 maize trial, the weed density was very high, with up to 375 P. annua and 150 C. album plants m−2. In the sugar beet trials, the weed density also changed between the application dates. While the first herbicide application of 2017 had to treat densities of up to 450 black-grass (Alopecurus myosuroides Huds.) plants m−2, the number decreased to 280 plants m−2 at the second treatment. Another major weed in all trials was chamomile (Matricaria chamomilla L.), with plant densities between seven plants m−2 in the sugar beet trial of 2016 and 108 plants m−2 in the sugar beet trial of 2017. In general, the weed species composition was linked more closely to the trial field than to the planted crop. In 2017, patches containing C. arvense or R. crispus were manually mapped in only 16 distinct places in maize, whereas up to 114 grid points with these weeds were found in the sugar beet field of 2017. Accordingly, the herbicide savings concerning these patches were 90% in the maize POST I treatment, 45% in the sugar beet POST I and 43% in the sugar beet POST II treatment. The spatial distribution of the manually and UAV-classified weed patches is given in Figure 3.
The weed threshold for herbicide application against all mapped weeds was set at 0 plants m−2 in sugar beet and maize. Therefore, site-specific treatment of the UAV-classified weeds (C. arvense and R. crispus) was only possible using a GPS-controlled patch sprayer with multiple separate hydraulic circuits. With this sprayer, herbicide application against monocotyledonous and dicotyledonous weeds was realised using two of the three hydraulic systems, while the site-specific treatment of the UAV-mapped weeds (C. arvense and R. crispus) was applied with the third hydraulic system.
The UAV-categorised maps showed an accuracy of 96% in the maize POST I treatment, 90% in the sugar beet POST I and 80% in the sugar beet POST II treatment, compared with the manual weed maps. The overall manual and UAV weed patch identification is presented as confusion matrices in Table 5, Table 6 and Table 7.
Compared with the respective manual weed maps, the UAV identification overestimated weed patches by 3% in maize POST I and by 10% in sugar beet POST II, as shown in Table 5, Table 6 and Table 7. The number of UAV-underestimated patches was about 50% lower, with the sugar beet POST II treatment being the only exception.
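These figures follow directly from the confusion matrices; a short R check using the Table 5 counts (maize POST I) as reconstructed in this version, with rows denoting the UAV mapping and columns the manual mapping:

```r
# Accuracy and over-/underestimation from a 2 x 2 confusion matrix (Table 5, maize POST I)
cm <- matrix(c(16,   2,
                5, 150),
             nrow = 2, byrow = TRUE,
             dimnames = list(UAV = c("weed-free", "weed"),
                             manual = c("weed-free", "weed")))
n <- sum(cm)                                    # 173 grid squares
accuracy      <- sum(diag(cm)) / n              # (16 + 150) / 173 ~ 0.96
overestimate  <- cm["weed", "weed-free"] / n    # 5 / 173 ~ 0.03
underestimate <- cm["weed-free", "weed"] / n    # 2 / 173 ~ 0.01
```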

4. Discussion

The large-scale precision farming field experiments conducted in this study showed that the herbicide output can be reduced when UAV imagery and plant seeding information are combined to compute detailed application maps.
The presented workflow could be used in maize and sugar beet, crops with generally low weed threshold levels [22,37]. Thus, the sprayer was only turned off at locations where none of the respective weeds was found. For weed recognition, similar results were achieved with tractor-mounted, camera-based online systems [21,22,38]. Although all of these studies integrated camera-mapped weeds into site-specific management, the results of the present study clearly demonstrated the possibility of weed categorisation and herbicide savings, even though the UAV data were only computed for two weed species.
While the spatial resolution of the UAV images for weed recognition was lower than that of tractor-mounted cameras, UAV mapping has its advantages. UAV weed mapping can be conducted before the application and without soil compaction. Therefore, farmers can calculate the amount of herbicide needed on the field beforehand. In addition, a multi-temporal, weekly surveillance can be realised for monitoring the actual weed infestation in order to meet the economic threshold levels for herbicide applications.
Using polygons to exclude crop rows at a known position speeds up the data analysis and reduces the total data volume that needs to be computed, transferred and stored. Although the intra-row space was not yet investigated, there was no considerable loss of data concerning the investigated weed patches due to the crop row deletion. A generally equivalent weed coverage between inter-row and crop row areas was also reported by Longchamps et al. [39], although they found a partially higher weed infestation in the crop rows. Furthermore, the total size of the dataset was reduced by up to 30% before entering the main algorithm computation. Extracting plants from the RGB orthomosaic images with the ExGR vegetation index and the use of a fixed zero threshold for the binary image calculation was also possible for the UAV imagery. This complies with the proposals of Meyer et al. and Hamuda et al. [35,36]. A fixed threshold level for separating vegetation from soil is a key element towards the automation of image-based weed recognition. In particular, differences in soil and vegetation appearance (e.g., wet soil or dense vegetation) and changing light conditions between flights can cause deviations from predefined thresholds. Thus, the combination of the ExG and ExR vegetation indices in the ExGR provides an option to circumvent this problem.
Even though this robust and fast-to-compute vegetation index approach worked for C. arvense and R. crispus, the loss of information in our data set did not allow the analysis of small broad-leaved and grass weeds. To realise the separation of smaller weeds at the cotyledon stage, for example by using an OBIA algorithm, higher-resolution images will be needed. If this can be achieved, UAV weed mapping of different species and herbicide treatments at early weed growth stages, as needed in sugar beet, can be realised. This well-known interdependence between image sensor resolution, flight altitude and vegetation index thresholds is also discussed by López-Granados et al. and Mesas-Carrascosa et al. [6,40,41]. Again, the need for high-resolution image sensors becomes apparent. This was also evident in our multispectral data sets, which provided an almost four times coarser GSD. The NDVI orthoimages calculated from our multispectral images could be used for a pre-categorisation of our RGB orthoimages; however, single weed detection was not possible. On the other hand, the multispectral images could be corrected for the downwelling sunlight using the eILS data. Therefore, the result of the NDVI > 0.2 threshold [31] was relatively consistent.
Combining information by overlaying the RGB-calculated ExGR vegetation index and the CHM already provided information about the field weed infestation. It enabled the separation of single weeds into different herbicide-application-relevant categories. Compared to simple colour-based algorithms, a secondary information source was added. A comparable approach is also described by de Castro et al. [8] for weed detection in common sunflower (Helianthus annuus L.) and cotton (Gossypium spp.). With this additional pixel height information, plants can be identified not only by their colour but also by their height. In our case, this already provided enough information for separating C. arvense and R. crispus from the other crop and weed plants found in the crop's inter-row space.
The detection of C. arvense plants at 6 cm enables control at the plant's compensation point, where the acropetal allocation of carbohydrates turns into a basipetal movement of photosynthates [42]. At this point the plant is highly susceptible to control measures, since the root system has the lowest regenerative capacity [43]. The typical appearance of C. arvense in patches, caused by its mainly vegetative reproduction inside arable fields [44], can also provide information on patches likely to appear in the near future or in the next vegetation period.
From a practical point of view, it is better to overestimate the presence of C. arvense than to underestimate it. Even in this case, herbicide savings can be achieved, especially at low C. arvense infestation levels. This raster layer is not yet suitable for OBIA, since distinct shape structures of the weeds are no longer available, as shown in Figure 2.
The generation of orthomosaic images and CHMs with a constant spatial resolution is essential to follow the weed mapping workflow presented in this paper. The DTM-following flight plan was a useful adjustment in our fields, which had elevation differences of up to 10 m. The unintended increase in flight altitude above ground caused by the terrain would otherwise have led to a loss in spatial resolution. When flying the UAV at a low altitude of 15 m, an elevation difference of 5 m, as present in our study fields, would reduce the spatial resolution in the lowest parts of the field from a GSD of 0.3 cm to 0.4 cm if the flight altitude is not adjusted. Furthermore, the precise overlaying of rasters was supported by georeferencing the images with the RTK-GPS-corrected GCPs, which improved the accuracy of the coordinate system of the orthoimages.
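This follows from the linear scaling of the GSD with flying height above ground; a worked check of the reported numbers:

\( \mathrm{GSD}(h) \propto h \;\Rightarrow\; \mathrm{GSD}(15\,\mathrm{m} + 5\,\mathrm{m}) = 0.3\,\mathrm{cm} \times \dfrac{20\,\mathrm{m}}{15\,\mathrm{m}} = 0.4\,\mathrm{cm} \)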

5. Conclusions

(a) We conclude that the presented aerial imagery computation procedure is able to identify the presence or absence of weeds above a defined height threshold in row crops. The use of a field CHM in combination with the ExGR vegetation index and the crop row geo-coordinates enabled the separation of C. arvense and R. crispus from the rest of the vegetation by their height, with an overall accuracy of 96% in the maize POST I, 90% in the sugar beet POST I and 80% in the sugar beet POST II treatment. (b) The output data enable the realisation of site-specific weed management strategies, resulting in herbicide savings of 90% in the maize POST I treatment and 43% in the sugar beet POST II treatment. (c) The same algorithm was able to work in two row crops. The aggregation of different precision farming data sources also led to a reduction in data size and required computation power already in the early processing steps. This can be a key element on agricultural fields where fast Internet connections for transferring data to high-performance computing clouds are not yet guaranteed. In order to mainstream the processing chain into a simple fly-and-treat application, further investigations are needed, specifically regarding the georeferencing of orthomosaic images and the overlaying of multi-temporal UAV orthoimage rasters, which are still labour-intensive.

Author Contributions

R.M. and M.S. were responsible for the field experiments and data collection. R.M. and G.G.P. developed the main workflow and computing algorithm. R.M., A.D., G.G.P. and J.J.E. wrote and revised the manuscript. M.H. and R.G. supervised the experiment and revised the manuscript.

Funding

This project was supported by funds from the Federal Ministry of Food and Agriculture (BMEL) based on a decision of the Parliament of the Federal Republic of Germany via the Federal Office for Agriculture and Food (BLE) under the innovation support programme.

Acknowledgments

The authors would like to acknowledge the great work of the student Fernando Laureano Palacios Duarte.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Rasmussen, J.; Nielsen, J.; Garcia-Ruiz, F.; Christensen, S.; Streibig, J.C. Potential uses of small unmanned aircraft systems (UAS) in weed research. Weed Res. 2013, 53, 242–248.
2. Torres-Sánchez, J.; López-Granados, F.; De Castro, A.; Peña-Barragán, J. Configuration and Specifications of an Unmanned Aerial Vehicle (UAV) for Early Site Specific Weed Management. PLoS ONE 2013, 8, e58210.
3. Borra-Serrano, I.; Peña, J.; Torres-Sánchez, J.; Mesas-Carrascosa, F.; López-Granados, F. Spatial Quality Evaluation of Resampled Unmanned Aerial Vehicle-Imagery for Weed Mapping. Sensors 2015, 15, 19688–19708.
4. Pérez-Ortiz, M.; Peña, J.M.; Gutiérrez, P.A.; Torres-Sánchez, J.; Hervás-Martínez, C.; López-Granados, F. Selecting patterns and features for between- and within-crop-row weed mapping using UAV-imagery. Expert Syst. Appl. 2016, 47, 85–94.
5. Hung, C.; Xu, Z.; Sukkarieh, S. Feature Learning Based Approach for Weed Classification Using High Resolution Aerial Images from a Digital Camera Mounted on a UAV. Remote Sens. 2014, 6, 12037–12054.
6. López-Granados, F.; Torres-Sánchez, J.; Serrano-Pérez, A.; de Castro, A.I.; Mesas-Carrascosa, F.J.; Peña, J.M. Early season weed mapping in sunflower using UAV technology: Variability of herbicide treatment maps against weed thresholds. Precis. Agric. 2016, 17, 183–199.
7. López-Granados, F.; Torres-Sánchez, J.; De Castro, A.I.; Serrano-Pérez, A.; Mesas-Carrascosa, F.J.; Peña, J.M. Object-based early monitoring of a grass weed in a grass crop using high resolution UAV imagery. Agron. Sustain. Dev. 2016, 36, 1–12.
8. De Castro, A.; Torres-Sánchez, J.; Peña, J.; Jiménez-Brenes, F.; Csillik, O.; López-Granados, F. An Automatic Random Forest-OBIA Algorithm for Early Weed Mapping between and within Crop Rows Using UAV Imagery. Remote Sens. 2018, 10, 285.
9. Fernandez-Quintanilla, C.; Peña-Barragán, J.M.; Andújar, D.; Dorado, J.; Ribeiro, A.; López-Granados, F. Is the current state-of-the-art of weed monitoring suitable for site-specific weed management in arable crops? Weed Res. 2018.
10. Salamí, E.; Barrado, C.; Pastor, E. UAV Flight Experiments Applied to the Remote Sensing of Vegetated Areas. Remote Sens. 2014, 6, 11051–11081.
11. Turner, D.; Lucieer, A.; Watson, C. An Automated Technique for Generating Georectified Mosaics from Ultra-High Resolution Unmanned Aerial Vehicle (UAV) Imagery, Based on Structure from Motion (SfM) Point Clouds. Remote Sens. 2012, 4, 1392–1410.
12. Tilly, N.; Aasen, H.; Bareth, G. Fusion of Plant Height and Vegetation Indices for the Estimation of Barley Biomass. Remote Sens. 2015, 7, 11449–11480.
13. Geipel, J.; Link, J.; Claupein, W. Combined Spectral and Spatial Modeling of Corn Yield Based on Aerial Images and Crop Surface Models Acquired with an Unmanned Aircraft System. Remote Sens. 2014, 6, 10335–10355.
14. Andújar, D.; Escolà, A.; Dorado, J.; Fernández-Quintanilla, C. Weed discrimination using ultrasonic sensors. Weed Res. 2011, 51, 543–547.
15. Christensen, S.; Rasmussen, J.; Pedersen, S.M.; Dorado, J.; Fernandez-Quintanilla, C. Prospects for Site Specific Weed Management. In Proceedings of the International Conference on Robotics and Associated High-Technologies and Equipment for Agriculture and Forestry, Madrid, Spain, 21–23 May 2014; pp. 541–549.
16. Longchamps, L.; Panneton, B.; Simard, M.-J.; Leroux, G.D. An Imagery-Based Weed Cover Threshold Established Using Expert Knowledge. Weed Sci. 2014, 62, 177–185.
17. Keller, M.; Gutjahr, C.; Möhring, J.; Weis, M.; Sökefeld, M.; Gerhards, R. Estimating economic thresholds for site-specific weed control using manual weed counts and sensor technology: An example based on three winter wheat trials. Pest Manag. Sci. 2014, 70, 200–211.
18. Hamouz, P.; Hamouzová, K.; Holec, J.; Tyšer, L. Impact of site-specific weed management on herbicide savings and winter wheat yield. Plant Soil Environ. 2018, 59, 101–107.
19. Marshall, E.J.P. Field-scale estimates of grass weed populations in arable land. Weed Res. 1988, 28, 191–198.
20. Johnson, G.A.; Mortensen, D.A.; Martin, A.R. A simulation of herbicide use based on weed spatial distribution. Weed Res. 1995, 35, 197–205.
21. Gerhards, R.; Sökefeld, M.; Timmermann, C.; Kühbauch, W.; Williams, M.M., II. Site-specific weed control in maize, sugar beet, winter wheat, and winter barley. Precis. Agric. 2002, 3, 25–35.
22. Gerhards, R.; Oebel, H. Practical experiences with a system for site-specific weed control in arable crops using real-time image analysis and GPS-controlled patch spraying. Weed Res. 2006, 46, 185–193.
23. Christensen, S.; Søgaard, H.T.; Kudsk, P.; Nørremark, M.; Lund, I.; Nadimi, E.S.; Jørgensen, R. Site-specific weed control technologies. Weed Res. 2009, 49, 233–241.
24. Gerhards, R.; Sökefeld, M.; Knuf, D.; Kühbauch, W. Kartierung und geostatistische Analyse der Unkrautverteilung in Zuckerrübenschlägen als Grundlage für eine teilschlagspezifische Bekämpfung. J. Agron. Crop Sci. 1996, 176, 259–266.
25. Lamastus, F.E.; Shaw, D.R. Comparison of different sampling scales to estimate weed populations in three soybean fields. Precis. Agric. 2005, 6, 271–280.
26. Gutjahr, C.; Sökefeld, M.; Gerhards, R. Evaluation of two patch spraying systems in winter wheat and maize. Weed Res. 2012, 52, 510–519.
27. R Core Team. R: A Language and Environment for Statistical Computing; The R Foundation for Statistical Computing: Vienna, Austria, 2017.
28. Hijmans, R.J.; van Etten, J.; Cheng, J.; Mattiuzzi, M.; Sumner, M.; Greenberg, J.A.; Lamigueiro, O.P.; Bevan, A.; Racine, E.B.; Shortridge, A.; et al. Raster: Geographic Analysis and Modeling with Raster Data. Available online: https://cran.r-project.org/web/packages/raster/index.html (accessed on 23 March 2018).
29. Bivand, R.; Keitt, T.; Rowlingson, B.; Pebesma, E.; Sumner, M.; Hijmans, R.; Rouault, E. Rgdal: Bindings for the "Geospatial" Data Abstraction Library. Available online: https://cran.r-project.org/web/packages/rgdal/index.html (accessed on 23 March 2018).
30. Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring the Vernal Advancement and Retrogradation (Green Wave Effect) of Natural Vegetation; Technical Report; NASA: Washington, DC, USA, 1973; p. 112.
31. Liebisch, F.; Kirchgessner, N.; Schneider, D.; Walter, A.; Hund, A. Remote, aerial phenotyping of maize traits with a mobile multi-sensor approach. Plant Methods 2015, 11, 9.
32. Roth, L.; Streit, B. Predicting cover crop biomass by lightweight UAS-based RGB and NIR photography: An applied photogrammetric approach. Precis. Agric. 2018, 19, 93–114.
33. Meyer, G.E.; Hindman, T.W.; Laksmi, K. Machine Vision Detection Parameters for Plant Species Identification. In Proceedings of the Precision Agriculture and Biological Quality, Boston, MA, USA, 3–4 November 1999; Volume 3543, pp. 327–335.
34. Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D.A. Color indices for weed identification under various soil, residue, and lighting conditions. Trans. ASAE 1995, 38, 259–269.
35. Meyer, G.E.; Neto, J.C.; Jones, D.D.; Hindman, T.W. Intensified fuzzy clusters for classifying plant, soil, and residue regions of interest from color images. Comput. Electron. Agric. 2004, 42, 161–180.
36. Hamuda, E.; Glavin, M.; Jones, E. A survey of image processing techniques for plant extraction and segmentation in the field. Comput. Electron. Agric. 2016, 125, 184–199.
37. Williams, M.M., II; Gerhards, R.; Mortensen, D.A. Two-year weed seedling population responses to a post-emergent method of site-specific weed management. Precis. Agric. 2000, 2, 247–263.
38. Tian, L.; Reid, J.F.; Hummel, J.W. Development of a precision sprayer for site-specific weed management. Trans. ASAE 1999, 42, 893–900.
39. Longchamps, L.; Panneton, B.; Simard, M.-J.; Leroux, G.D. Could weed sensing in corn interrows result in efficient weed control? Weed Technol. 2012, 26, 649–656.
40. Mesas-Carrascosa, F.-J.; Torres-Sánchez, J.; Clavero-Rumbao, I.; García-Ferrer, A.; Peña, J.M.; Borra-Serrano, I.; López-Granados, F. Assessing optimal flight parameters for generating accurate multispectral orthomosaicks by UAV to support site-specific crop management. Remote Sens. 2015, 7, 12793–12814.
41. Mesas-Carrascosa, F.J.; Clavero Rumbao, I.; Torres-Sánchez, J.; García-Ferrer, A.; Peña, J.M.; López Granados, F. Accurate ortho-mosaicked six-band multispectral UAV images as affected by mission planning for precision agriculture proposes. Int. J. Remote Sens. 2017, 38, 2161–2176.
42. Nkurunziza, L.; Streibig, J.C. Carbohydrate dynamics in roots and rhizomes of Cirsium arvense and Tussilago farfara. Weed Res. 2011, 51, 461–468.
43. Welton, F.A.; Morris, V.H.; Hartzler, A.J. Organic Food Reserves in Relation to the Eradication of Canada Thistles; Ohio Agricultural Experiment Station: Wooster, OH, USA, 1929.
44. Håkansson, S. Weeds and Weed Management on Arable Land: An Ecological Approach; CABI: Wallingford, UK, 2003; ISBN 0-85199-651-5.
Figure 1. Data input and processing steps to compute UAV imagery-based weed maps for site-specific herbicide applications in a patch sprayer.
Figure 2. Overlay of multiple raster layers onto the original RGB image of a C. arvense patch in the sugar beet field prior to the POST II herbicide application at the four-leaf crop development stage. The crop row polygons are depicted as white stripes. Red parts indicate ExGR vegetation index approved plant parts of the CHM > 6 cm. Green pixels show non-ExGR-approved parts of the CHM > 6 cm.
Figure 3. Maps identifying manually (A,C,E) and UAV (B,D,F) mapped C. arvense and R. crispus from the vegetation period of 2017. Red squares indicate the presence of ≥1 C. arvense or R. crispus plant. The orthoimages and application maps were computed prior to the site-specific herbicide treatments in maize POST I (A,B), sugar beet POST I (C,D) and sugar beet POST II (E,F).
Table 1. Experimental details of the UAV weed mapping trials at Ihinger Hof, Germany.

Year | Crop | Trial Size (ha) | Terrain Slope (%) | Row Distance (cm) | Sowing Date | Seeds ha−1
2016 | sugar beet | 2.07 | 2 | 50 | 21 April 2016 | 107,000
2016 | maize | 1.72 | 5 | 75 | 18 May 2016 | 94,000
2017 | sugar beet | 2.09 | 3 | 50 | 13 April 2017 | 107,000
2017 | maize | 2.07 | 2 | 75 | 11 May 2017 | 89,000
Table 2. Plant development stages at weed mapping and herbicide application dates.

Application Date | Herbicide Application | Crop | Crop Development Stage | Mapping Date
20 May 2016 | POST I | sugar beet | cotyledon stage | 18 May 2016
10 June 2016 | POST II | sugar beet | 5 leaf stage | 9 June 2016
22 June 2016 | POST I | maize | 3 leaf stage | 20 June 2016
12 May 2017 | POST I | sugar beet | cotyledon stage | 11 May 2017
24 May 2017 | POST II | sugar beet | 4 leaf stage | 23 May 2017
1 June 2017 | POST I | maize | 3 leaf stage | 31 May 2017
Herbicide applications were performed after crop emergence with specific herbicide doses per weed group, as shown in Table 3 and Table 4 for each crop respectively; the mapping date includes the UAV flight and the manual weed mapping.
Table 3. Weed-specific herbicides applied with a three-tank spot sprayer in the sugar beet trials.

Weed Group | Herbicide a.i. | Product Name | Formulation Concentration | Product Rate | Herbicide Rate (g a.i. ha−1)
Dicotyledon | phenmedipham | Betasana SC® | 160 g L−1 | 0.52 L 100 L−1 | 208.0
 | ethofumesate | Ethosat 500® | 250 g L−1 | 0.18 L 100 L−1 | 112.0
 | metamitron | Goltix Gold® | 700 g L−1 | 0.72 L 100 L−1 | 1254.5
Cirsium arvense L., Rumex crispus L. | clopyralid | Lontrel 720 SG® ‡ | 720 g kg−1 | 32 g 100 L−1 | 79.9
Monocotyledon | fluazifop-P-butyl | Fusilade max® | 125 g L−1 | 0.16 L 100 L−1 | 50.8
All herbicides were applied at 250 L ha−1; ‡ Lontrel 720 SG was applied with 0.5 L ha−1 liquid paraffin.
Table 4. Weed-specific herbicides applied with a three-tank spot sprayer in the maize trials.

Weed Group | Herbicide a.i. | Product Name | Formulation Concentration | Product Rate | Herbicide Rate (g a.i. ha−1)
Dicotyledon | bromoxynil | Bromotril® | 225 g L−1 | 0.17 L 100 L−1 | 112.5
 | mesotrione | Callisto® | 100 g L−1 | 0.23 L 100 L−1 | 66.5
Monocotyledon | foramsulfuron | Maister® | 30 g L−1 | 0.20 L 100 L−1 | 17.7
 | + iodosulfuron-methyl-sodium | | 1 g L−1 | | 0.6
 | + isoxadifen-ethyl (safener) | | 30 g L−1 | | 17.7
Cirsium arvense L. | clopyralid | Efigo® | 267 g L−1 | 0.08 L 100 L−1 | 59.3
 | + picloram | | 67 g L−1 | | 14.9
All herbicides were applied at 290 L ha−1.
Table 5. UAV and manual mapping prior to the maize POST I treatment.

 | Manual Mapping: Weed-Free | Manual Mapping: Weed
UAV mapping: Weed-free | 16 | 2
UAV mapping: Weed | 5 | 150
Weed refers to C. arvense and R. crispus > 6 cm; n = 173.
Table 6. UAV and manual mapping prior to the sugar beet POST I treatment.

 | Manual Mapping: Weed-Free | Manual Mapping: Weed
UAV mapping: Weed-free | 144 | 7
UAV mapping: Weed | 14 | 85
Weed refers to C. arvense and R. crispus > 6 cm; n = 220.
Table 7. UAV and manual mapping prior to the sugar beet POST II treatment.

 | Manual Mapping: Weed-Free | Manual Mapping: Weed
UAV mapping: Weed-free | 103 | 23
UAV mapping: Weed | 21 | 73
Weed refers to C. arvense and R. crispus > 6 cm; n = 220.
