Sensor-Based Precision Agriculture

A special issue of Agriculture (ISSN 2077-0472). This special issue belongs to the section "Digital Agriculture".

Deadline for manuscript submissions: 20 August 2024 | Viewed by 8003

Special Issue Editors


Dr. Xiongzhe Han
Guest Editor
Department of Biosystems Engineering, College of Agriculture and Life Sciences, Kangwon National University, Chuncheon 24341, Republic of Korea
Interests: UAV-based remote sensing; field robotics; artificial intelligence; automation control; Internet of Things

Dr. Tianyi Wang
Guest Editor
Agricultural Engineering, China Agricultural University, Beijing 100083, China
Interests: smart agricultural equipment; precision agriculture; grassland machinery; forage grass; pest and disease control

Special Issue Information

Dear Colleagues,

The growth of the world's population puts enormous pressure on traditional agriculture to meet the growing demand for food and fiber production while minimizing the environmental impact. As a result, the agricultural sector continues to explore innovative technological solutions to improve agricultural practices. One such solution is the integration of intelligent technologies, including next-generation sensors, communications, autonomous flight systems, artificial intelligence, robotics, and analytics.

This Special Issue is dedicated to investigating the research and development of solid-state sensors for collecting varied agricultural data. The aim is to monitor biochemical and environmental parameters, such as nutrient levels, humidity, temperature, light, and pH, in real time, as well as biological interactions, such as predation, parasitism, and competition. Sensors are used at different spatial and temporal scales to provide farmers with data-driven insights into crop and livestock growth and health, pests, pesticides, soil health, water, fruit quality, greenhouse gases, and volatile compounds. This Special Issue will also cover the utilization of low-power sensors, energy harvesting technologies, and high-throughput phenotyping using sensors.

We welcome original research, opinions, and reviews covering various specialized crops, including vegetable, ornamental, and field crops, as well as seeds from other managed ecosystems. With this Special Issue, we aim to provide valuable insight into the latest advancements in agricultural technology that can improve the sustainability and efficiency of agricultural practices.

Dr. Xiongzhe Han
Dr. Tianyi Wang
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, authors can proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Agriculture is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • precision agriculture
  • proximal soil sensing
  • crop canopy sensors
  • precision livestock management
  • sensor networks
  • multi-sensor
  • data fusion
  • decision support

Published Papers (5 papers)


Research


16 pages, 6777 KiB  
Article
Detection of Black Spot Disease on Kimchi Cabbage Using Hyperspectral Imaging and Machine Learning Techniques
by Lukas Wiku Kuswidiyanto, Dong Eok Kim, Teng Fu, Kyoung Su Kim and Xiongzhe Han
Agriculture 2023, 13(12), 2215; https://doi.org/10.3390/agriculture13122215 - 29 Nov 2023
Cited by 1 | Viewed by 1372
Abstract
The cultivation of kimchi cabbage in South Korea has always faced significant challenges due to the looming presence of Alternaria leaf spot (ALS), which is a fungal disease mainly caused by Alternaria alternata. The emergence of black spots resulting from Alternaria infection lowers the quality of the plant, rendering it inedible and unmarketable. The timely identification of this disease is crucial, as it provides essential data enabling swift intervention, thereby localizing the infection throughout the field. Hyperspectral imaging technologies excel in detecting subtle shifts in reflectance values induced by chemical differences within leaf tissues. However, research on the spectral correlation between Alternaria and kimchi cabbage remains relatively scarce. Therefore, this study aims to identify the spectral signature of Alternaria infection on kimchi cabbage and develop an automatic classifier for detecting Alternaria disease symptoms. Alternaria alternata was inoculated on various sizes of kimchi cabbage leaves and observed daily using a hyperspectral imaging system. Datasets were created based on captured hyperspectral images to train four classifier models, including support vector machine (SVM), random forest (RF), one-dimensional convolutional neural network (1D-CNN), and one-dimensional residual network (1D-ResNet). The results suggest that 1D-ResNet outperforms the other models with an overall accuracy of 0.91, whereas SVM, RF, and 1D-CNN achieved 0.80, 0.88, and 0.86, respectively. This study may lay the foundation for future research on high-throughput disease detection, frequently incorporating drones and aerial imagery.
(This article belongs to the Special Issue Sensor-Based Precision Agriculture)
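The abstract above describes classifying hyperspectral pixels by the spectral signature of infection. The paper's own models are SVM, RF, and 1D CNNs; as a minimal illustrative sketch of per-pixel spectral classification (not the authors' code), a spectral-angle-mapper baseline on synthetic two-band spectra might look like this:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle (radians) between a pixel spectrum and a reference spectrum."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def classify_pixels(cube, references):
    """Assign each pixel the class whose reference spectrum is closest in angle.

    cube:       (H, W, B) hyperspectral image with B bands
    references: (C, B) mean spectra for C classes (e.g., healthy vs. infected)
    """
    h, w, b = cube.shape
    flat = cube.reshape(-1, b)
    angles = np.array([[spectral_angle(p, r) for r in references] for p in flat])
    return angles.argmin(axis=1).reshape(h, w)

# Synthetic two-band example: class 0 peaks in band 0, class 1 in band 1.
refs = np.array([[1.0, 0.2], [0.2, 1.0]])
cube = np.array([[[0.9, 0.1], [0.1, 0.8]]])  # one image row with two pixels
labels = classify_pixels(cube, refs)
```

Real pipelines operate on hundreds of bands and learn the class boundaries from labeled pixels, but the per-pixel structure of the problem is the same.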

15 pages, 8074 KiB  
Article
Study on Agricultural Machinery-Load-Testing Technology and Equipment Based on Six-Dimensional Force Sensor
by Wei Chen, Guangqiao Cao, Dong Yuan, Yan Ding, Jiping Zhu and Xiaobing Chen
Agriculture 2023, 13(9), 1649; https://doi.org/10.3390/agriculture13091649 - 22 Aug 2023
Viewed by 1325
Abstract
Tractor traction power consumption is one of the main causes of energy consumption in agricultural production. Scientific and accurate control of tractor traction power consumption can save energy and reduce consumption considerably. In view of the backward load-testing technology and low measurement accuracy in field work, this study designed array-type test equipment that forms a measurement matrix based on a six-dimensional force sensor to accurately measure the tractor's hauled load, which could provide a reference signal for intelligent operation. In this paper, a static calibration test was carried out on the six-dimensional force sensor, and its linearity, sensitivity, and zero drift were analyzed. The static characteristics of the test unit meet the measurement requirements. A static decoupling model was established. The decoupling errors of the six channels were stable at 0.02%FS, 0.02%FS, 0.8%FS, 0.36%FS, 0.018%FS, and 0.06%FS, respectively. Finally, a whole-equipment hanging test was carried out, with errors of 1.24% and −1.2%, respectively, to verify the measurement accuracy of the sensor device under different working conditions.
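Static decoupling of a multi-axis force sensor, as described in the abstract above, is commonly fitted by least squares from calibration data. The following is a hedged sketch with synthetic numbers, not the paper's calibration data: known loads F are applied, raw channel outputs V are recorded, and a linear decoupling matrix C is fitted so that F ≈ V @ C.

```python
import numpy as np

rng = np.random.default_rng(0)
true_C = np.eye(6) + 0.05 * rng.standard_normal((6, 6))  # coupled channels
F = rng.uniform(-100, 100, size=(50, 6))                 # applied calibration loads
V = F @ np.linalg.inv(true_C)                            # raw sensor outputs

C, *_ = np.linalg.lstsq(V, F, rcond=None)                # fitted decoupling matrix
F_hat = V @ C                                            # decoupled force readings
err = np.max(np.abs(F_hat - F))                          # residual coupling error
```

On real hardware the residuals stay finite (the abstract reports per-channel errors between 0.018%FS and 0.8%FS); here the synthetic data is exactly linear, so the fit is essentially exact.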

22 pages, 14890 KiB  
Article
Plastic Contaminant Detection in Aerial Imagery of Cotton Fields Using Deep Learning
by Pappu Kumar Yadav, J. Alex Thomasson, Robert Hardin, Stephen W. Searcy, Ulisses Braga-Neto, Sorin C. Popescu, Roberto Rodriguez III, Daniel E. Martin, Juan Enciso, Karem Meza and Emma L. White
Agriculture 2023, 13(7), 1365; https://doi.org/10.3390/agriculture13071365 - 9 Jul 2023
Viewed by 1400
Abstract
Plastic shopping bags are often discarded as litter, and can be carried away from roadsides and become tangled on cotton plants in farm fields. If not removed before harvest, this plastic can end up in the cotton at the gin. These bags may not only cause problems in the ginning process but may also become embedded in the cotton fibers, reducing their quality and marketable value. Therefore, detecting, locating, and removing the bags before the cotton is harvested is required. Manually detecting and locating these bags in cotton fields is a tedious, time-consuming, and costly process. To solve this, this paper shows the application of YOLOv5 to detect white and brown plastic bags tangled at three different heights in cotton plants (bottom, middle, top) using Unmanned Aircraft System (UAS)-acquired red, green, blue (RGB) images. White and brown bags could be detected with average accuracies of 92.35% and 77.87%, respectively, and a mean average precision (mAP) of 87.68%. Similarly, the trained YOLOv5 model, on average, could detect 94.25% of the top, 49.58% of the middle, and only 5% of the bottom bags. It was also found that both the color of the bags (p < 0.001) and their height on the cotton plants (p < 0.0001) had a significant effect on detection accuracy. The findings reported in this paper can help in the autonomous detection of plastic contaminants in cotton fields and potentially speed up mitigation efforts, thereby reducing the amount of contaminants in cotton gins.
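The mAP figure quoted above is the mean over classes of average precision (AP), the area under the precision–recall curve of ranked detections. As a simplified sketch, using rectangular integration over the raw curve rather than the exact YOLOv5 evaluation protocol:

```python
def average_precision(scores, labels):
    """AP from detection confidence scores and 0/1 ground-truth labels.

    Detections are ranked by score; precision is accumulated over the
    recall gained at each true positive (rectangular integration).
    """
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    tp = fp = 0
    total_pos = sum(labels)
    ap, prev_recall = 0.0, 0.0
    for i in order:
        if labels[i]:
            tp += 1
        else:
            fp += 1
        recall = tp / total_pos
        precision = tp / (tp + fp)
        ap += precision * (recall - prev_recall)
        prev_recall = recall
    return ap

# Three detections; the two highest-scoring are true positives,
# so every positive is ranked above the false positive and AP is 1.0.
ap = average_precision([0.9, 0.8, 0.3], [1, 1, 0])
```

Evaluation toolkits additionally interpolate the precision envelope and average over IoU thresholds, but the ranking-based core is the same.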

18 pages, 4107 KiB  
Article
YOLO-Sp: A Novel Transformer-Based Deep Learning Model for Achnatherum splendens Detection
by Yuzhuo Zhang, Tianyi Wang, Yong You, Decheng Wang, Dongyan Zhang, Yuchan Lv, Mengyuan Lu and Xingshan Zhang
Agriculture 2023, 13(6), 1197; https://doi.org/10.3390/agriculture13061197 - 4 Jun 2023
Viewed by 2091
Abstract
The growth of Achnatherum splendens (A. splendens) inhibits the growth of dominant grassland herbaceous species, resulting in a loss of grassland biomass and a worsening of the grassland ecological environment. Therefore, it is crucial to adequately identify the dynamic development of A. splendens. This study offers a Transformer-based A. splendens detection model named YOLO-Sp, using ground-based visible-spectrum proximal sensing images. YOLO-Sp achieved AP values of 98.4% and 95.4% in object detection and image segmentation of A. splendens, respectively, outperforming previous SOTA algorithms. The research indicated that the Transformer architecture has great potential for monitoring A. splendens. Under identical training settings, the AP value of YOLO-Sp was more than 5% higher than that of YOLOv5. The model’s average accuracy was 98.6% in trials conducted at genuine test sites. The experiments revealed that factors such as the amount of light, the degree of grass growth, and the camera resolution affect detection accuracy. This study could contribute to monitoring and assessing grass plant biomass in grasslands.

Other


6 pages, 243 KiB  
Opinion
BeeOpen—An Open Data Sharing Ecosystem for Apiculture
by Shreyas M. Guruprasad and Benjamin Leiding
Agriculture 2024, 14(3), 470; https://doi.org/10.3390/agriculture14030470 - 14 Mar 2024
Viewed by 729
Abstract
The digital transformation of apiculture initially encompasses Internet of Things (IoT) systems, incorporating sensor technologies to capture and transmit bee-centric data. Subsequently, data analysis assumes a vital role by establishing correlations between the collected data and the biological conditions of beehives, often leveraging artificial intelligence (AI) approaches. The field of precision bee monitoring has witnessed a surge in the collection of large volumes of diverse data, ranging from the hive weight and temperature to health status, queen bee presence, pests, and overall hive activity. Further, these datasets’ heterogeneous nature and lack of standardization present challenges in applying machine learning techniques directly to extract valuable insights. To address this issue, the envisioned ecosystem serves as an open and collaborative information platform, facilitating the exchange and utilization of bee monitoring datasets. The data storage architecture can process a large variety of data at high frequency, e.g., images, videos, audio, and time series data. The platform serves as a repository, providing crucial information about the condition of beehives, health assessments, pest attacks, swarming patterns, and other relevant data. Notably, this information portal is managed through a citizen scientist initiative. By consolidating data from various sources, including beekeepers, researchers, and monitoring systems, the platform offers a holistic view of the bee population’s status in any given area.
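One way to picture the standardization problem described above is a shared record schema onto which heterogeneous monitoring payloads are normalized before analysis. The field names below are hypothetical illustrations, not taken from the BeeOpen platform:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class HiveObservation:
    """Common record for heterogeneous bee-monitoring data (hypothetical schema)."""
    hive_id: str
    timestamp: datetime
    metric: str      # e.g. "weight_kg", "temperature_c", "audio_level_db"
    value: float
    source: str      # beekeeper, sensor node, research station, ...

def normalize(raw: dict) -> HiveObservation:
    """Map one raw sensor payload onto the shared schema."""
    return HiveObservation(
        hive_id=str(raw["hive"]),
        timestamp=datetime.fromtimestamp(raw["ts"], tz=timezone.utc),
        metric=raw["metric"],
        value=float(raw["value"]),
        source=raw.get("source", "unknown"),
    )

obs = normalize({"hive": "A-17", "ts": 1700000000,
                 "metric": "weight_kg", "value": 42.3})
```

Once every contributor's data lands in one schema, cross-source queries and machine learning become straightforward, which is the core promise of such an open ecosystem.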

Planned Papers

The list below includes only planned manuscripts. Some of these manuscripts have not yet been received by the Editorial Office. Papers submitted to MDPI journals are subject to peer review.

Title: An impurity rate estimation method of post-harvest sugarcane based on rotated bounding box and binocular vision
Authors: Zhiheng Lu; Shusheng Yu; Kai Huang; Wang Yang
Affiliation: School of Mechanical Engineering, Guangxi University
Abstract: Sugarcane is an important economic crop. After machine harvesting, the impurity rate of sugarcane is an important metric, which affects the sugar output rate. In order to obtain the impurity rate and detect primary impurities, this paper proposes an impurity rate estimation method for post-harvest sugarcane based on rotated bounding box and binocular vision. Firstly, the sugarcane mixture image, including sugarcane segments, sugarcane tips, and sugarcane leaves, was captured by a binocular camera. Secondly, the YOLOv5-obb algorithm is used to obtain rotated bounding boxes for the sugarcane mixture. Next, based on binocular vision, the actual dimensions of the sugarcane segments are calculated, and the segments are fitted as cylinders, enabling the calculation of their volume and mass. Finally, the impurity rate of post-harvest sugarcane is calculated based on the mass of sugarcane segments and the total mass of the mixture. Experimental results demonstrate that rotated bounding boxes can fit the shape of each target accurately, with a mean average precision (mAP) of 93.5%. The model also performs well in detecting occluded and overlapped targets. The average detection time per image is 0.02 s, and the average time for impurity rate estimation per image is 0.19 s. For 830 test images, the average mass error of sugarcane segments is 10.88%, the total mass error is 2.58%, and the total impurity rate error is 10.16%.
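The volume, mass, and impurity-rate arithmetic described in the abstract can be sketched as follows; the density and segment dimensions are hypothetical placeholders, not the authors' measurements:

```python
import math

def cylinder_mass(diameter_m, length_m, density_kg_m3):
    """Mass of a sugarcane segment fitted as a cylinder: m = rho * pi * r^2 * L."""
    radius = diameter_m / 2.0
    volume = math.pi * radius ** 2 * length_m
    return density_kg_m3 * volume

def impurity_rate(segment_masses, total_mass):
    """Impurity rate: fraction of the mixture mass that is not cane segments."""
    return 1.0 - sum(segment_masses) / total_mass

# Hypothetical numbers: two segments, density assumed 1000 kg/m^3.
masses = [cylinder_mass(0.03, 0.30, 1000.0),
          cylinder_mass(0.028, 0.25, 1000.0)]
rate = impurity_rate(masses, total_mass=0.40)  # leaves/tips make up the rest
```

In the paper, the diameter and length come from the rotated bounding boxes and binocular depth, and the total mass of the mixture is weighed, so the whole estimate reduces to this kind of geometry-plus-density calculation per detected segment.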
