
Sensors and Robotic Systems for Agriculture Applications

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensors and Robotics".

Deadline for manuscript submissions: 30 October 2024 | Viewed by 8580

Special Issue Editors


Guest Editor
Dr. Roemi Fernandez
Centro de Automática y Robótica (CAR), CSIC-UPM, Madrid, Spain
Interests: field and service robotic systems; intelligent robotics; multisensory systems; nonlinear actuators and nonlinear controllers

Guest Editor
Dr. Cornelia Weltzien
1. Technische Universität Berlin, Straße des 17. Juni 144, 10623 Berlin, Germany
2. Leibniz Institute for Agricultural Engineering and Bioeconomy (ATB), Max-Eyth-Allee 100, 14469 Potsdam, Germany
Interests: smart sensors; automation; precision agriculture; system development; smart systems

Special Issue Information

Dear Colleagues,

To satisfy the growing demand for fruit and vegetables, the agricultural industry is undergoing a transformation that must allow it to double productivity in a sustainable and environmentally friendly way. The increasing difficulty of finding workers in the agricultural sector and the gradual rise in labour costs are also accelerating this change. There is therefore a clear worldwide consensus that the introduction and consolidation of precision agriculture is essential to achieve the required performance. Sensors and robotic systems are among the most promising technologies for helping farmers increase the sustainability, productivity, and profitability of their operations. However, a great research effort is still required, not only to develop faster, more efficient, and more autonomous systems, but also to endow them with the ability to adapt to the complexity and variability of agricultural scenarios.

Therefore, the objective of this Special Issue is to compile recent advances in sensors and robotic systems for agriculture applications. Topics of interest include, but are not limited to:

  • Sensor fusion for agricultural applications;
  • Precision phenotyping;
  • Sensory systems for detection of pests and diseases;
  • New robotics applications for precision agriculture;
  • AGVs and UAVs for soil/crop monitoring, prediction and decision making;
  • Robots for mowing, spraying and weed removal;
  • Robotic manipulators and end-effectors for harvesting and post-harvesting tasks;
  • AI for precision agriculture.

Original papers and survey papers are solicited for the Special Issue, covering research results as well as case studies and applications in related areas of interest.

Dr. Roemi Fernandez
Dr. Cornelia Weltzien
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • agricultural applications
  • sensor fusion
  • pest and disease detection
  • AGV, UAV, robotic manipulation
  • weed removal
  • artificial intelligence

Published Papers (4 papers)


Research

15 pages, 10274 KiB  
Article
Development of Location-Data-Based Orchard Passage Map Generation Method
by Joong-hee Han, Chi-ho Park and Young Yoon Jang
Sensors 2024, 24(3), 795; https://doi.org/10.3390/s24030795 - 25 Jan 2024
Viewed by 582
Abstract
Pest control work using speed sprayers currently results in a growing number of safety accidents, such as pesticide poisoning of workers and vehicle rollovers during work. To address this, there is growing interest in autonomous driving technology for speed sprayers. To commercialize and rapidly expand the use of self-driving speed sprayers, an economically efficient self-driving speed sprayer using a minimum number of sensors is essential. This study developed an orchard passage map, built from location data acquired by positioning sensors, to generate autonomous driving paths without installing additional sensors. The orchard passage map was created by merging paths built from location data obtained by manually driving the speed sprayer. In addition, to apply the orchard passage map during autonomous operation, a method is introduced for generating an autonomous driving path covering the movement to the work start point, the work path, and the return path.
(This article belongs to the Special Issue Sensors and Robotic Systems for Agriculture Applications)
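The paper's idea of merging location traces from manual driving runs into a passage map can be illustrated with a toy occupancy-grid sketch. This is purely illustrative: the grid representation, cell size, coordinates, and function names are assumptions for exposition, not taken from the paper.

```python
# Illustrative sketch: merge manually driven position traces into a
# grid-based "passage map" by marking every cell a trace passes through.

def trace_to_cells(trace, cell_size=1.0):
    """Map a list of (x, y) positions to the set of grid cells they occupy."""
    return {(int(x // cell_size), int(y // cell_size)) for x, y in trace}

def merge_traces(traces, cell_size=1.0):
    """Union the cells of several manual driving runs into one passage map."""
    passage = set()
    for trace in traces:
        passage |= trace_to_cells(trace, cell_size)
    return passage

# Two hypothetical manual runs along neighbouring orchard rows
run1 = [(0.5, 0.5), (1.5, 0.5), (2.5, 0.5)]
run2 = [(0.5, 2.5), (1.5, 2.5), (2.5, 2.5)]
passage_map = merge_traces([run1, run2])  # 6 traversable cells
```

An autonomous path planner could then restrict its search to the cells marked traversable, which is the role the orchard passage map plays in the abstract.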

13 pages, 3054 KiB  
Article
Estimation of Off-Target Dicamba Damage on Soybean Using UAV Imagery and Deep Learning
by Fengkai Tian, Caio Canella Vieira, Jing Zhou, Jianfeng Zhou and Pengyin Chen
Sensors 2023, 23(6), 3241; https://doi.org/10.3390/s23063241 - 19 Mar 2023
Cited by 5 | Viewed by 1801
Abstract
Weeds can cause significant yield losses and will continue to be a problem for agricultural production due to climate change. Dicamba is widely used to control weeds in monocot crops and in genetically engineered dicamba-tolerant (DT) dicot crops, such as soybean and cotton, which has resulted in severe off-target dicamba exposure and substantial yield losses in non-tolerant crops. There is a strong demand for non-genetically engineered DT soybeans obtained through conventional breeding selection. Public breeding programs have identified genetic resources that confer greater tolerance to off-target dicamba damage in soybeans. Efficient, high-throughput phenotyping tools can facilitate the collection of a large number of accurate crop traits and improve breeding efficiency. This study aimed to evaluate unmanned aerial vehicle (UAV) imagery and deep-learning-based data analytics for quantifying off-target dicamba damage in genetically diverse soybean genotypes. A total of 463 soybean genotypes were planted in five fields (with different soil types) with prolonged exposure to off-target dicamba in 2020 and 2021. Crop damage due to off-target dicamba was assessed by breeders on a 1–5 scale with 0.5 increments, which was further grouped into three classes: susceptible (≥3.5), moderate (2.0 to 3.0), and tolerant (≤1.5). A UAV platform equipped with a red–green–blue (RGB) camera was used to collect images on the same days. Collected images were stitched into an orthomosaic image for each field, and soybean plots were manually segmented from the orthomosaics. Deep learning models, including dense convolutional neural network-121 (DenseNet121), residual neural network-50 (ResNet50), visual geometry group-16 (VGG16), and depthwise separable convolutions (Xception), were developed to quantify crop damage levels. DenseNet121 performed best, classifying damage with an accuracy of 82%; the 95% binomial proportion confidence interval showed an accuracy range of 79% to 84% (p-value ≤ 0.01). In addition, no extreme misclassifications (i.e., between tolerant and susceptible soybeans) were observed. These results are promising, since soybean breeding programs typically aim to identify genotypes with 'extreme' phenotypes (e.g., the top 10% of highly tolerant genotypes). This study demonstrates that UAV imagery and deep learning have great potential for high-throughput quantification of soybean damage due to off-target dicamba, improving the efficiency of breeding programs in selecting soybean genotypes with desired traits.
(This article belongs to the Special Issue Sensors and Robotic Systems for Agriculture Applications)
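Two small pieces of the abstract's evaluation pipeline can be sketched directly: the mapping from the breeders' 1–5 damage score to the three classes, and a normal-approximation binomial proportion confidence interval like the 79–84% range reported for the 82% accuracy. The sample size below is a hypothetical stand-in, not the study's actual number of plots.

```python
import math

def damage_class(score):
    """Map the 1-5 visual damage score (0.5 increments) to the three
    classes used in the abstract: tolerant (<=1.5), moderate (2.0-3.0),
    susceptible (>=3.5)."""
    if score <= 1.5:
        return "tolerant"
    if score >= 3.5:
        return "susceptible"
    return "moderate"

def binomial_ci(p_hat, n, z=1.96):
    """95% normal-approximation binomial proportion confidence interval
    for an accuracy estimate p_hat over n classified plots."""
    half = z * math.sqrt(p_hat * (1.0 - p_hat) / n)
    return (p_hat - half, p_hat + half)

# Hypothetical: 82% accuracy over 900 plots (n is an assumption).
low, high = binomial_ci(0.82, 900)
```

With the assumed n, the interval comes out close to the paper's reported 79–84% band, which shows how such a range follows mechanically from the accuracy estimate and the sample size.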

13 pages, 17186 KiB  
Article
Direct Drive Brush-Shaped Tool with Torque Sensing Capability for Compliant Robotic Vine Suckering
by Ivo Vatavuk, Dario Stuhne, Goran Vasiljević and Zdenko Kovačić
Sensors 2023, 23(3), 1195; https://doi.org/10.3390/s23031195 - 20 Jan 2023
Cited by 2 | Viewed by 1552
Abstract
In this paper, we present a direct drive brush-shaped tool developed for use in robotic vine suckering. The direct drive design philosophy allows precise, high-bandwidth control of the torque exerted by the brush. Besides limiting the torque exerted on the plant, this design allows the brush to be used as a torque sensor. High-bandwidth torque feedback from the tool enables a position-controlled robot arm to perform the suckering task without knowing the exact position and shape of the vine trunk. An experiment was conducted to investigate how the applied torque depends on the overlap between the brush and the obstacle. The results indicate a quadratic relationship between torque and overlap. This quadratic function is estimated and used for compliant trunk shape following. A trunk shape following experiment demonstrates the utility of the presented tool as a sensor for compliant robot arm control. The shape of the trunk is estimated by tracking the motion of the robot arm during the experiment.
(This article belongs to the Special Issue Sensors and Robotic Systems for Agriculture Applications)
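The quadratic torque-overlap relationship described in the abstract can be sketched as a model fit and its inversion. The specific model form (torque = k · overlap²), the gain value, and the synthetic data below are assumptions for illustration; the paper's actual estimation procedure may differ.

```python
import math

def fit_quadratic_gain(samples):
    """Least-squares estimate of k in the model torque = k * overlap**2,
    given (overlap, torque) measurement pairs."""
    num = sum(d * d * tau for d, tau in samples)
    den = sum(d ** 4 for d, _ in samples)
    return num / den

def overlap_for_torque(k, torque):
    """Invert the model: overlap needed to exert a desired brush torque,
    which is what compliant trunk-shape following needs."""
    return math.sqrt(torque / k)

# Synthetic measurements generated from torque = 2.5 * overlap^2
samples = [(d, 2.5 * d * d) for d in (0.01, 0.02, 0.03, 0.04)]
k = fit_quadratic_gain(samples)
target_overlap = overlap_for_torque(k, 0.001)
```

Inverting the fitted curve turns a torque setpoint into an overlap setpoint, which is the mechanism by which a position-controlled arm can follow an unknown trunk shape compliantly.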

20 pages, 9638 KiB  
Article
Grass Cutting Robot for Inclined Surfaces in Hilly and Mountainous Areas
by Yuki Nishimura and Tomoyuki Yamaguchi
Sensors 2023, 23(1), 528; https://doi.org/10.3390/s23010528 - 3 Jan 2023
Cited by 2 | Viewed by 3897
Abstract
Grass cutting is necessary to prevent grass from diverting essential nutrients and water from crops. In hilly and mountainous areas, grass cutting is usually performed on steep slopes with inclination angles of up to 60° (an inclination gradient of 173%). Such grass cutting tasks are dangerous owing to the unstable footing of workers. For robots to perform these tasks, slipping and falling must be prevented on inclined surfaces. In this study, a robot based on stable propeller control and four-wheel steering was developed to provide stable locomotion during grass cutting tasks. The robot was evaluated in terms of locomotion for different steering methods, straight motion on steep slopes, climbing ability, and coverage area. The results revealed that the robot was capable of navigating uneven terrain with steep slope angles. Moreover, no slipping that could have affected the grass cutting operations was observed. We confirmed that the proposed robot is able to cover 99.95% and 98.45% of an area on a rubber slope and a grass slope, respectively. Finally, the robot was tested on slopes with different angles in hilly and mountainous areas and performed the grass cutting task as expected.
(This article belongs to the Special Issue Sensors and Robotic Systems for Agriculture Applications)
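The abstract's equivalence between a 60° inclination angle and a 173% gradient follows from gradient = tan(angle) × 100; a quick check (the function name is ours, not the paper's):

```python
import math

def inclination_gradient(angle_deg):
    """Convert a slope's inclination angle (degrees) to its gradient (%),
    i.e. vertical rise over horizontal run, times 100."""
    return math.tan(math.radians(angle_deg)) * 100.0

gradient = inclination_gradient(60.0)  # ~173.2%
```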
