Article

Application of a Real-Time Field-Programmable Gate Array-Based Image-Processing System for Crop Monitoring in Precision Agriculture

by Sabiha Shahid Antora 1, Mohammad Ashik Alahe 2, Young K. Chang 1,2,*, Tri Nguyen-Quang 1 and Brandon Heung 3

1 Department of Engineering, Faculty of Agriculture, Dalhousie University, Truro, NS B2N 5E3, Canada
2 Department of Agricultural and Biosystems Engineering, South Dakota State University, Brookings, SD 57007, USA
3 Department of Plant, Food, and Environmental Sciences, Faculty of Agriculture, Dalhousie University, Truro, NS B2N 5E3, Canada
* Author to whom correspondence should be addressed.
AgriEngineering 2024, 6(3), 3345-3361; https://doi.org/10.3390/agriengineering6030191
Submission received: 26 June 2024 / Revised: 10 September 2024 / Accepted: 11 September 2024 / Published: 14 September 2024

Abstract
Precision agriculture (PA) technologies combined with remote sensors, GPS, and GIS are transforming the agricultural industry while promoting sustainable farming practices with the ability to optimize resource utilization and minimize environmental impact. However, their implementation faces challenges such as high computational costs, complexity, low image resolution, and limited GPS accuracy. These issues hinder the timely delivery of prescription maps and impede farmers’ ability to make effective, on-the-spot decisions regarding farm management, especially in stress-sensitive crops. Therefore, this study proposes field-programmable gate array (FPGA)-based hardware solutions and real-time kinematic GPS (RTK-GPS) to develop a real-time crop-monitoring system that can address the limitations of current PA technologies. Our proposed system uses high-accuracy RTK and real-time FPGA-based image-processing (RFIP) devices for data collection, geotagging real-time field data via Python and a web camera. The acquired images are processed to extract metadata, which is then visualized as a heat map on Google Maps, indicating green-area intensity based on romaine lettuce leafage. The RFIP system showed a strong correlation (R² = 0.9566) with a reference system and performed well in field tests, yielding a Lin’s concordance correlation coefficient (CCC) of 0.8292. This study demonstrates the potential of the developed system to address current PA limitations by providing real-time, accurate data for immediate decision making. In the future, the proposed system will be integrated with autonomous farm equipment to further enhance sustainable farming practices, including real-time crop health monitoring, yield assessment, and crop disease detection.

1. Introduction

Food insecurity is a growing concern due to the rising global population, which places significant pressure on the agricultural sector to produce enough food to sustain the expanding population. The latest report from the Food and Agriculture Organization (FAO) on food security and nutrition has shown an increasing trend in the number of people affected by hunger worldwide since 2014 [1]. Therefore, it has become essential to implement sustainable crop production techniques to reduce crop losses and enhance productivity. These techniques may include integrated weed management, variable-rate agrochemical applications, irrigation management, crop vegetation growth monitoring, and yield estimation.
One critical tool for ensuring future food security is the emergence of precision agriculture (PA) technologies, which integrate Global Positioning System (GPS), Geographic Information System (GIS), and remote sensing technologies. By managing, analyzing, and processing large amounts of real-time agricultural data, PA provides farm management services that assist farmers in making on-the-spot decisions. PA technologies have been used for a wide array of applications, such as delineating management zones for variable-rate (VR) operations and monitoring crop health [2]. Other examples demonstrate that, with real-time remote sensing imagery, a crop health monitoring framework can be instrumental in identifying damage caused by insects, weeds, fungal infestations, or weather [3]. Further implementations of PA technologies include using digital color cameras to determine plant leaf areas for spot application of fungicide [4] or detecting fruit orangeness when estimating yields within citrus orchards [5]. Rehman et al. [6] developed a Graphical User Interface (GUI) system to acquire imagery for controlling the spray nozzles of a VR sprayer, resulting in agrochemical savings when used for spot application.
To provide PA-based, farm management solutions to farmers, it is crucial to consider not only the collection of spatiotemporal data but also the processing, analysis, management, and storage of that data [7]. Guo et al. [8] demonstrated that monitoring crop growth status is possible by tracking vegetation indices along with state-of-the-art GPS location information derived from the crop stress maps. Moreover, real-time remote sensing data can significantly enhance decision making in PA if high spatial and spectral resolutions are ensured [9].
Farmers demand faster processing and real-time, actionable information for analyzing the status of their fields using image processing [10]. However, processing speed and cost-effectiveness remain significant obstacles for existing PA image-processing techniques operating in real time and at field scale [11]. Although real-time control over farmland has been achieved using the latest high-performance multiprocessor data-computing systems, such as clusters of networked central processing units (CPUs) [12], this approach is cost-prohibitive. Field-programmable gate arrays (FPGAs), however, may present an ideal alternative for image-processing applications [13] due to their high processing speed, ability to run multiple tasks simultaneously and in parallel, and cost-effectiveness [13,14,15,16,17].
For site-specific crop management decisions, Ruß and Brenning [18] established spatial relationships among geographical data, soil parameters, and crop properties. However, georeferencing with a GPS positional accuracy of better than 1 m is needed for monitoring plant canopy status, yield mapping, weed mapping, and weed detection techniques to construct the most accurate maps for VR and PA applications [19,20]. To address this issue, real-time kinematic (RTK)-GPS can achieve positional accuracies at the few-centimeter scale; for example, Sun et al. [21] proved its feasibility through the automated mapping of transplanted row crops.
To meet the need for high positional accuracy in crop monitoring, this study integrated RTK-GPS with FPGA-based real-time image acquisition and processing as the foundation for real-time crop monitoring. The study aims to address current limitations in PA related to computational complexity and develop cost-effective and efficient solutions for field-scale farm management applications. The specific research goals of this study were (1) to develop a real-time crop-monitoring system that includes a custom-developed real-time FPGA-based image-processing (RFIP) system and RTK-GPS for monitoring crop growth and/or for other related purposes; (2) to create a real-time data collection and post-processing system; and (3) to establish a geo-visualization layer to support crop management decisions.

2. Materials and Methods

To achieve the research goals, an RFIP, i.e., a real-time FPGA-based image-processing system, was developed and evaluated in a laboratory environment using colored objects. Subsequently, outdoor testing was conducted on Romaine lettuce (Lactuca sativa L. var. longifolia) plants [22], and image processing was carried out on real-time imagery. An overview of the proposed system and the performance evaluation techniques is given in the following subsections.

2.1. Overview of the Real-Time Crop-Monitoring System

The overall functionality of the real-time crop-monitoring system is depicted in Figure 1. Firstly, an RTK high-accuracy solution using the u-blox NEO-M8P-2 module and the C94-M8P application board (u-blox Inc.; Thalwil, Switzerland) was configured to achieve an accuracy of 8 cm. The RFIP and RTK-GPS served as real-time data collection devices and are collectively referred to as the real-time data collection unit (RDCU). Here, a personal computer (PC) connected to the output ports of the RDCU acted as the data-storing unit (DSU). Python (Python Software Foundation Inc.; Wilmington, DE, USA) was installed on the DSU to read the RDCU output ports and geotag the real-time field data using reference images from an AVerMedia Live Streamer CAM 313 (AVerMedia Inc.; New Taipei City, Taiwan) web camera. Due to the weight and cost of a digital single-lens reflex (DSLR) camera, it could not be used for this study. However, the web camera showed a good correlation with the DSLR camera in a previous study [22]; therefore, we selected the web camera as an alternative reference camera system.
The collected real-time georeferenced images were then processed by the post-processing unit (PPU) to extract metadata and save it in a specified format for use in the web-based visualization layer (WVL). The PPU used the same PC and programming tool as the DSU; however, the processing did not require real-time data acquisition. The WVL involves two steps: creating and serving local files on a localhost web server and displaying a heat map layer on a real-time Google Map using Maps JavaScript API on the webpage.
The final product is a heat map that displays recorded geographic locations along with real-time field data. The color scheme on the marked locations represents the intensity of the green areas, which is the ratio of the detected Romaine lettuce (Lactuca sativa L. var. longifolia) plant leaf areas to the ground resolution of the RFIP system at corresponding locations.

2.2. Real-Time Data Collection Unit

One of the major parts of this research was the RDCU, a prototype of an agile, real-time, FPGA-based, lightweight, and cost-effective crop-monitoring system. For development and testing, the necessary hardware and devices were installed on a horizontal, T-shaped, wooden frame. After installation, the custom-built wooden frame was affixed to a tripod using a metal screw. The tripod had adjustable legs that could be extended to a length of 5 feet and could rotate both left and right.
The tripod setup (Figure 2) includes the following components: (1) one of two identical RTK boards, configured as a rover, with two antennas for the global navigation satellite system (GNSS) and UHF; (2) a liquid crystal display (LCD) monitor connected to the VGA output port of the FPGA development board, displaying output images based on specific switching logic [22]; (3) the RFIP system, consisting of the FPGA board and D8M camera board, which acquires and processes images and transfers the detected pixel area in real time; (4) a web camera used for geotagging the GPS location and the detected pixel area; (5) a PC that supplies power to the USB web camera and the RTK rover, downloads the Quartus Prime (Intel Inc.; Santa Clara, CA, USA) program configuration file to the FPGA board over USB, and connects to the RS232 serial output port of the RFIP system; (6) a battery source with a 400 W power inverter to supply power to the FPGA device and the LCD monitor; and (7) a blue-painted wooden frame measuring 30.0 cm × 22.5 cm, designed to maintain a consistent camera projection at a fixed ground resolution over the selected spots.

2.2.1. RFIP Sensing System

The RFIP sensing system consisted of three distinct units: the image acquisition, image-processing, and data transfer units [22]. In this setup, the same hardware and software configuration as in [22] was used to instruct the camera to capture images. The analog gain, digital gain (i.e., the red, green, and blue channel gains), and exposure gain were optimized through several experiments and adjustments to acquire imagery at a resolution of 800 × 600 px and a rate of 40 frames per second. These images were then processed onboard using the G-ratio formula: (255 × G)/(R + G + B) [22]. An intensity threshold of 90 was selected for each of the color ratio filters to produce a binary image, in which the detected area appeared white by setting the processed G output color components to the maximum intensity of 255. The resulting binary image was displayed on the LCD monitor attached to the RDCU. Due to the automatic exposure control limitation of the RFIP system, field data collection was scheduled during clear, mainly sunny, or slightly cloudy weather conditions. To avoid potential effects of occasional cloudiness on plant intensity and to maintain consistent brightness during data collection, an umbrella was positioned as a shade over the camera’s ground view at all sample points.
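For reference, the following minimal NumPy sketch reproduces the G-ratio thresholding in software. It illustrates the filter’s logic only; it is not the FPGA implementation itself, and the function name and the division-by-zero guard are our own additions.

```python
import numpy as np

def g_ratio_binary(rgb: np.ndarray, threshold: int = 90) -> np.ndarray:
    """Software analogue of the RFIP G-ratio filter on an RGB frame.

    rgb: H x W x 3 uint8 array in R, G, B channel order.
    Returns a uint8 mask in which detected pixels are white (255).
    """
    f = rgb.astype(np.float32)
    r, g, b = f[..., 0], f[..., 1], f[..., 2]
    g_ratio = 255.0 * g / np.maximum(r + g + b, 1.0)  # (255 x G) / (R + G + B)
    return np.where(g_ratio > threshold, 255, 0).astype(np.uint8)

# The detected pixel area is then the count of white pixels in the 800 x 600 frame:
# area = int(np.count_nonzero(g_ratio_binary(frame)))
```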

2.2.2. RTK-GPS System

Geolocational information, specifically latitude and longitude, is a crucial factor in the decision support system that assists end-users in applying site-specific farm management solutions in the field. A review of current variable-rate PA solutions revealed that GPS accuracy in the meter range is sometimes inadequate for generating practical ground-truth prescription maps [19]. Therefore, another significant aspect of this study was our focus on real-time GPS accuracy in the centimeter range. To achieve this, two identical RTK boards based on u-blox’s M8 high-precision positioning module were configured to serve as the RTK rover and the RTK base station. Each board required three basic connections (Figure 3): a GNSS patch antenna that receives the radio signals from GNSS satellites to compute the position, a UHF whip antenna for the high-frequency radio link over which correction data are exchanged, and a micro-USB connection that supplies both the 5 V power and the configuration setup.
During field data collection with the RTK base station running in TIME mode, the RTK rover took a couple of minutes to go into RTK FLOAT mode and eventually RTK FIXED mode after receiving Radio Technical Commission for Maritime Services (RTCM) corrections and resolving carrier ambiguities. Throughout the data collection period, it was crucial for the RTK rover to be in FIXED mode to provide accurate latitude and longitude. The functionality of the RTK-GPS is depicted in Figure 4.

2.3. Data Storing Unit

The data-storing unit (DSU) is a crucial component of the real-time crop-monitoring system, enabling software communication between the different hardware units through the physical ports of a PC (Figure 5). The DSU consists of a small PC or laptop with an Intel Core i5-8250U CPU (1.60–1.80 GHz, 64-bit) running the Windows 10 64-bit operating system. Python serves as the core programming tool for designing and operating the DSU, which collects real-time data from the serial outputs of the RFIP hardware and the RTK rover and geotags them with corresponding reference images at a 1920 × 1080 pixel resolution via the USB connection to the web camera. The piexif, PIL, cv2, pynmea2, serial, and fractions Python packages were employed. Spyder, the freely available integrated development environment (IDE) from the Anaconda Navigator desktop GUI, was used to write and run the Python scripts for data collection and storage.
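A condensed sketch of how these packages might be tied together in a DSU script is shown below. The serial port names, baud rates, output file name, and the choice of the Exif ImageDescription tag for storing the detected area are illustrative assumptions, not the exact script used in this study.

```python
import cv2, piexif, pynmea2, serial
from fractions import Fraction

def to_dms_rational(deg: float):
    """Convert decimal degrees to the (deg, min, sec) rationals Exif expects."""
    deg = abs(deg)
    d = int(deg)
    m = int((deg - d) * 60)
    s = Fraction((deg - d) * 60 - m).limit_denominator(10000) * 60
    return ((d, 1), (m, 1), (s.numerator, s.denominator))

gps_port = serial.Serial("COM3", 9600, timeout=1)     # RTK rover NMEA output (port names assumed)
rfip_port = serial.Serial("COM4", 115200, timeout=1)  # RFIP RS232 detected-area output
cam = cv2.VideoCapture(0)                             # 1920 x 1080 web camera

line = gps_port.readline().decode("ascii", errors="ignore")
if line.startswith("$GPGGA") or line.startswith("$GNGGA"):
    fix = pynmea2.parse(line)                         # signed decimal lat/lon
    area = rfip_port.readline().decode().strip()      # detected pixel area as a string
    ok, frame = cam.read()
    if ok:
        cv2.imwrite("point.jpg", frame)
        gps_ifd = {
            piexif.GPSIFD.GPSLatitudeRef: b"N" if fix.latitude >= 0 else b"S",
            piexif.GPSIFD.GPSLatitude: to_dms_rational(fix.latitude),
            piexif.GPSIFD.GPSLongitudeRef: b"E" if fix.longitude >= 0 else b"W",
            piexif.GPSIFD.GPSLongitude: to_dms_rational(fix.longitude),
        }
        exif = {"GPS": gps_ifd,
                "0th": {piexif.ImageIFD.ImageDescription: area.encode()}}
        piexif.insert(piexif.dump(exif), "point.jpg")  # geotag the reference image
```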

2.4. Post-Processing Unit

In many current real-time crop-monitoring services, analyzing field data is often time-consuming. However, in this research, the RFIP prototype was developed not only to acquire field images but also to process them and deliver results in real time. As a result, the PPU only extracts the image metadata from the georeferenced field images and stores the detected area with the corresponding GPS location in a specified format for use by the WVL.
The PPU uses the same PC and programming tool as the DSU but employs different Python packages. The software for the PPU is simpler than that of the DSU, utilizing only two Python packages: piexif and PIL. Furthermore, it is straightforward, without the need for multiple functions or layers. The Exif metadata of the saved images is read and saved as a comma-separated values (CSV) file in a specified location on the PC.
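A minimal sketch of this extraction step follows. It assumes the Exif layout from the data collection sketch above; the folder and file names are placeholders.

```python
import csv, glob
import piexif

def dms_to_decimal(dms, ref):
    """Convert Exif ((d,1),(m,1),(s_num,s_den)) rationals to signed degrees."""
    d, m, s = (num / den for num, den in dms)
    deg = d + m / 60 + s / 3600
    return -deg if ref in (b"S", b"W") else deg

with open("field_data.csv", "w", newline="") as f:   # output name is illustrative
    writer = csv.writer(f)
    writer.writerow(["image", "latitude", "longitude", "detected_area"])
    for path in sorted(glob.glob("images/*.jpg")):
        exif = piexif.load(path)
        gps = exif["GPS"]
        lat = dms_to_decimal(gps[piexif.GPSIFD.GPSLatitude],
                             gps[piexif.GPSIFD.GPSLatitudeRef])
        lon = dms_to_decimal(gps[piexif.GPSIFD.GPSLongitude],
                             gps[piexif.GPSIFD.GPSLongitudeRef])
        area = exif["0th"].get(piexif.ImageIFD.ImageDescription, b"").decode()
        writer.writerow([path, lat, lon, area])
```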

2.5. Web-Based Visualization Layer

In pursuit of a cost-free map visualization technique for limited field data, a simple and cost-effective service was identified (Figure 6). This approach involves setting up a localhost web server, using the PC as a local server on port 8000. This setup enables local files, including Hypertext Markup Language (HTML), Cascading Style Sheets (CSS), and JavaScript (JS) files, to be served on the local port. By accessing Google Maps services and following the heat map layer example, it was straightforward to represent the intensities and corresponding GPS locations as a heat map layer on top of the real-time Google map.

2.5.1. Webpage Creation

The decision was made to use the same Python programming tool as the DSU and the PPU, which was installed on the PC. To establish a simple web server with Python, the ‘hello Handler’ class was created and port 8000 was defined to serve indefinitely, utilizing the HTTPServer and BaseHTTPRequestHandler classes from Python’s http.server module. This made it easy to serve local files in the specified folder directory directly at the localhost:8000 address using the command prompt terminal.
In this setup, the PC was designated as a local server and connected to the web browser through port 8000, as defined in the Python script. The command written in the specified directory inside the command prompt terminal was python -m http.server 8000 (Figure 7). Finally, by typing the http://localhost:8000/ web address into the browser’s address bar, the files within the specified folder directory on the PC became visible as directory listings. After creating an HTML file with the intended content within the folder directory, viewing the webpage from the browser became straightforward.
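A minimal sketch of such a server is given below. Note that BaseHTTPRequestHandler alone does not serve files; the standard library’s SimpleHTTPRequestHandler (the same handler behind python -m http.server) provides that behavior, so this sketch subclasses it. The class name mirrors the ‘hello Handler’ class mentioned above.

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

class HelloHandler(SimpleHTTPRequestHandler):
    """Serves files (HTML, CSS, JS) from the current directory on port 8000."""
    pass

if __name__ == "__main__":
    # Equivalent to running: python -m http.server 8000
    server = HTTPServer(("localhost", 8000), HelloHandler)
    print("Serving http://localhost:8000/ ...")
    server.serve_forever()
```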

2.5.2. Heat Map Generation on Real-Time Google Maps

The first step in generating a heat map is to display the Google map on the created webpage. This requires a Google account, which is used to create a new project in the Google Cloud Platform along with a billing account. The next step involves obtaining an application programming interface (API) key by creating credentials for the Maps JavaScript API and Places API. This API key allowed 100 data points to be uploaded and 1000 webpage visits free of charge during the 90-day free trial.
For simplicity, the example scripts were not extensively modified except for the JS, which includes the latitude and longitude of the map, the gradient scheme, and the radius and opacity that change the color appearance, as well as the data points. The data points were replaced with the latitude and longitude values collected during real-time crop monitoring. An important modification was made to the weight field, which influences the visual color scheme of the heat map. The weight field values were generated as intensity values from the pixel area detected during real-time crop monitoring, using Equation (1). We chose shades of green for the heat map color scheme to represent higher or lower percentages of the area detected as plant leafage.
Intensity = ((Total Pixel Area Considered − Detected Pixel Area) / (Total Pixel Area Considered)) × 100  (1)
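As a worked check of Equation (1), assuming the “total pixel area considered” is the full 800 × 600 RFIP frame (480,000 px) — an assumption the reported tables are consistent with — the sketch below reproduces a Table 2 intensity from its Table 1 detected area.

```python
TOTAL_PIXELS = 800 * 600  # assumed ROI: the full RFIP frame (480,000 px)

def intensity(detected_area: float, total: int = TOTAL_PIXELS) -> float:
    """Equation (1): percentage of the ROI not detected as plant leafage."""
    return (total - detected_area) / total * 100

# Object 2 in Table 1 (94,850.57 detected px) gives about 80.2, matching the
# intensity of 80 reported for sample 2 in Table 2.
print(round(intensity(94_850.57), 1))  # -> 80.2
```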

2.6. Testing of the RTK-GPS

This test had two primary objectives: to assess the accuracy of the RTK-GPS system at the centimeter level, and to collect real-time field data, including the geographic location and the detected pixel area from the region of interest (ROI). Although the entire system needed to be transported and positioned at selected points in a field, the data were recorded in stationary mode to minimize the potential for human error.

Data Collection Using RTK-GPS

To evaluate the accuracy of the RTK rover, four spots were selected (Spot 1: a straight line with an intermediate distance of 50 cm; Spot 2: a square with a side length of 20 cm; Spot 3: a straight line with an intermediate distance of 20 cm; and Spot 4: a square with a side length of 30 cm) (Figure 8). These spots were located within a 50 m radius of the RTK base station. The GPS data of the RTK rover were recorded using the u-center 21.05 (u-blox Inc.; Thalwil, Switzerland) GNSS evaluation software, ensuring the rover’s FIXED status. Subsequently, Google KML files were generated from the four recorded data files using the u-center software to visualize the GPS data on a real-time Google map through Google Earth Pro 7.3 (Google Inc.; Mountain View, CA, USA).

2.7. Testing of the Real-Time Crop-Monitoring System

To test the real-time crop-monitoring system, we selected Dalhousie University’s Demonstration Garden (45.3755° N, 63.2631° W), located within a 100 m range of the RTK base station. We specifically chose a portion of three plant rows within a Romaine lettuce (Lactuca sativa L. var. longifolia) crop area, consisting of 21 plants (7 in each row; Figure 9). The system was repositioned and adjusted by changing the lengths and angles of the tripod’s three legs to maintain a consistent ground resolution of 30 cm × 22.5 cm at an image resolution of 800 × 600 pixels for each of the 21 data points. At each data point, we collected a total of 7 samples, resulting in a comprehensive dataset of 147 samples.
The field data were collected from two serial outputs: the latitude and longitude from the RTK rover in FIXED mode, and the pixel area detected as a plant. These data were geotagged as image metadata onto the web camera reference image of the corresponding data point, maintained at 1920 × 1080 pixels. The reference images were then cropped and resized to match the FPGA imagery. The cropped and resized reference images were processed using the NumPy and cv2 packages in Python. A Python script was written to perform mathematical operations pixel by pixel, following the G-ratio formula [22]. This resulted in a list of 147 reference pixel areas against which to compare the performance of the RFIP system. Simultaneously, the 147 GPS data points were extracted from the reference images’ Exif metadata and saved as a CSV file. The latitude and longitude from the CSV file were used in Google Earth Pro to visualize the data points on the Google map for validation.
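The following sketch illustrates this reference pipeline under stated assumptions: the crop coordinates are hypothetical placeholders for the blue frame’s position in the 1920 × 1080 image, and the same G-ratio threshold of 90 is reused from the RFIP configuration.

```python
import cv2
import numpy as np

# Hypothetical crop bounds for the blue frame within the 1920 x 1080 image.
x0, y0, x1, y1 = 560, 240, 1360, 840

img = cv2.imread("point.jpg")                     # OpenCV loads images as BGR
roi = cv2.resize(img[y0:y1, x0:x1], (800, 600))   # match the FPGA resolution
b, g, r = cv2.split(roi.astype(np.float32))       # note the BGR channel order
g_ratio = 255.0 * g / np.maximum(r + g + b, 1.0)  # same G-ratio formula [22]
reference_area = int(np.count_nonzero(g_ratio > 90))  # pixels detected as plant
print(reference_area)
```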
Finally, to visualize the real-time field data as a heat map on a localhost webpage, an average was computed over the 7 samples at each of the 21 data points, resulting in averaged latitude, longitude, and detected-area values. To display the color scheme on the heat map based on the detected area, an additional step was required to prepare the dataset: the percentage area calculation (Equation (1)) was performed for the 21 points to assess the intensity at the corresponding geographic locations. The latitude, longitude, and intensity values of the 21 data points were then included in the JS of the local server webpage to create a heat map layer on top of the Google map.

2.8. Performance Evaluation

Since this research aimed to provide a cost-effective, faster, and reliable real-time image-processing alternative, the web camera imagery was used as the gold standard against which to compare and evaluate the performance of the RFIP system. Web cameras have been widely utilized in real-time image-processing systems in recent years [6,23]. Statistical analyses of the collected data were conducted using Minitab 19 (Minitab Inc., State College, PA, USA). Basic statistical measures, including the mean and standard deviation (SD) of the detected pixel area, formed the basis of our comparisons.
For the field evaluation of the RFIP system, there were 7 samples for each of the 21 plants, totaling 147 samples from both the FPGA output and the web camera reference system. The FPGA and web camera data were averaged over the 7 samples per plant to produce 21 samples for each system. These data were then analyzed and compared using the G-ratio algorithm for real-time detection in the field environment. The detected pixel area predicted by the RFIP system was correlated with the area detected by the web camera using regression analyses. The coefficient of determination (R²) was calculated to compare the RFIP system’s performance with the web camera’s image acquisition.
Since the RFIP system integrates image acquisition and image processing, the reference images were processed pixel by pixel using Python, applying the same algorithm used in the RFIP system’s image-processing unit. To further assess the degree of agreement between the RFIP data and the reference data, Lin’s concordance correlation coefficient (CCC) was computed from the field test findings [24]. According to Lin [24], when carrying out hypothesis testing it makes more sense to test whether the CCC is larger than a threshold value, CCC₀, rather than simply testing whether the CCC is zero. Equation (2) was used to calculate the threshold, where ρ² is the R-squared achieved when the RFIP data were regressed on the reference data; χₐ is the measure of accuracy, determined by Equation (3); υ and ω are functions of the means and standard deviations; and d is the tolerable percentage loss in precision [24]:
CCC₀ = χₐ √(ρ^(2d))  (2)
χₐ = 2 / (υ² + ω + 1/ω)  (3)
The null and alternative hypotheses are H₀: CCC ≤ CCC₀ (indicating no significant concordance between the RFIP data and the reference data) and H₁: CCC > CCC₀ (indicating significant concordance between the RFIP data and the reference data). If CCC > CCC₀, the null hypothesis is rejected, establishing the concordance of the new test procedure.
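For reference, the Python sketch below implements Lin’s CCC and the threshold of Equations (2) and (3). The function names and the moment-based (population SD) forms are illustrative assumptions; this is a sketch of the statistic, not the exact Minitab analysis used in the study.

```python
import numpy as np

def lins_ccc(x: np.ndarray, y: np.ndarray):
    """Lin's concordance correlation coefficient and its components [24]."""
    mx, my = x.mean(), y.mean()
    sx, sy = x.std(), y.std()                    # population SDs (moment form)
    sxy = ((x - mx) * (y - my)).mean()
    rho = sxy / (sx * sy)                        # precision (Pearson r)
    upsilon = (mx - my) / np.sqrt(sx * sy)       # location shift
    omega = sx / sy                              # scale shift
    chi_a = 2 / (upsilon**2 + omega + 1 / omega) # Equation (3): accuracy
    ccc = 2 * sxy / (sx**2 + sy**2 + (mx - my)**2)  # equals rho * chi_a
    return ccc, rho, chi_a

def ccc_threshold(rho: float, chi_a: float, d: float = 0.05) -> float:
    """Equation (2): CCC0 = chi_a * sqrt(rho**(2*d)), with tolerable loss d."""
    return chi_a * np.sqrt(rho ** (2 * d))

# Usage: ccc, rho, chi_a = lins_ccc(rfip_areas, webcam_areas)
#        reject H0 if ccc > ccc_threshold(rho, chi_a, d=0.05)
```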
In addition, Google Earth Pro was used to evaluate the GPS data. As this research was conducted within a limited geographical range, the available data points (latitude and longitude values) were insufficient to generate a prominent visualization on the Google map. Therefore, instead of using full-screen images from Google Earth Pro, small map areas containing a few data points were cropped to provide the minimum visualization for this prototype.

3. Results

The RTK-GPS data collected from the four selected spots during the experiment were visualized using Google Earth Pro to assess the RTK-GPS performance at the 8 cm scale. The spots were arranged as the lines and squares, with spacings of 20–50 cm, described in Section 2.6 (Figure 10). In the Google Earth Pro visualization presented in Figure 10, the selected lines and squares from the experimental setup are easily identifiable, demonstrating the strong performance of the configured RTK-GPS in accurately locating geographic positions.
The RFIP system’s output from the field experiment, i.e., the number of pixels detected as Romaine lettuce (Lactuca sativa L. var. longifolia) leaf area by the green color detection algorithm, was compared with the reference images captured by the web camera under the same experimental setup. The complete data, including all resulting values, are listed in Table 1.
From the numerical data analysis, the variability (SD) of the 21 objects [Romaine lettuce (Lactuca sativa L. var. longifolia) plants] was found to be considerably low, ranging from 0.027% to 8.32% of the ROI. During the field trials, the RFIP system performed well in detecting plant leaf area. The results are further represented as a bar graph showing the areas detected by the RFIP system and the reference system, with the SD values as error bars on top of the corresponding data columns (Figure 11). The bar graph illustrates that the RFIP system performed considerably well when evaluated against the reference system.
The performance of the developed system was compared to that of the web camera system using the G-ratio algorithm. The area detected by the RFIP system showed a strong correlation with the web camera reference system (RFIP = 1.4714 × Web Camera; R² = 0.9566; p < 0.05), implying that the developed system could explain 95.66% of the variability in the area detected by the web camera, with substantial accuracy (CCC = 0.8292). A regression model with a zero y-intercept was generated to visualize the performance of the RFIP system and compare it with an ideal system (Figure 12). Additionally, a Lin’s CCC of 0.8292 with a 95% confidence interval of (0.7804, 0.8656) and a CCC₀ of 0.8468 at a tolerable 5% loss of precision were obtained from the hypothesis testing based on Equations (2) and (3). As the lower limit of CCC > CCC₀, the test is statistically significant, and the null hypothesis is rejected. Therefore, the field experiment result shows that the proposed RFIP system can be utilized to determine the area that the web camera detects.
Google Earth Pro software was used to visualize the real-time geographic locations of the RTK rover using the 147 GPS locations from the georeferenced web camera images. The best possible view from the ground-level map is shown in Figure 13.
For the final product, the 147 RTK data points and the 147 FPGA intensity values were averaged over the 7 samples per plant to obtain 21 records, and the data were then formatted into a CSV file with the corresponding latitude, longitude, and intensity values. This final dataset was supplied to the webpage’s JS file, which runs on localhost port 8000 and serves the local files. A numerical representation of the complete data is presented in Table 2. Due to the limited number of data points, the heat map visualization is not very prominent on the Google map (Figure 14).

4. Discussion

In accordance with the primary prerequisites for evaluating the RFIP system in an outdoor environment [22], several important factors were taken into account during field data collection. Firstly, a mainly sunny day was chosen, with a temperature of 4 °C, a NW wind of 14 km/h, and a maximum wind gust of 30 km/h. Secondly, a large 7.5-foot outdoor umbrella and a small rain umbrella were used to mitigate the effects of passing clouds over the plants and maintain consistent illumination. However, some unavoidable challenges during the field experiment could affect precise data collection, such as the wind’s effect on plant leaves, the shadowing of leaves on plants with multiple leaves, the color cast from the umbrella, and the significant influence of ambient sunlight and clouds in the open field. Despite these challenges, maintaining a consistent experimental setup improved the system’s performance, as the same real-time crop growth monitoring setup remained stationary while recording the data.
In addition, the RFIP system demonstrated better performance in the outdoor evaluation (99.56% correlation) than in the real crop field evaluation (95.66% correlation) [22]. This difference may be attributed to the reduced impact of wind, the more pronounced shadow over the ground ROI, and the greater number of replications for each object (10 samples per object for outdoor evaluation and 7 samples per object for field evaluation). The RTK-GPS accuracy was precise enough to locate the field data points on the real-time Google map. Although the RFIP system did not show superior performance in real-time crop growth monitoring, it remains useful in assisting the decision support system by providing an accurate ground truth map to farmers for on-the-spot farm management decisions.

5. Conclusions

This study aimed to address the current limitations of high-resolution imagery and to achieve GPS accuracy of better than 1 m for budget-friendly, real-time monitoring of crop growth and/or germination in PA applications. Here, we demonstrated the RFIP system’s significant potential by applying it under field conditions. When validated, the RFIP system achieved a 95.66% correlation with the reference data during field testing. The proposed system has the ability to overcome the challenges in agricultural imaging associated with the computational complexity, image resolution, time, and cost involved in deploying photographic technology, thereby providing real-time actionable insights for field management in smart agriculture.
This proposed system primarily focused on the development and initial evaluation of real-time crop monitoring with enhanced accuracy using cost-effective technology. However, the project faced several constraints that impacted its implementation and data collection process. The RFIP system’s automatic exposure control limitations required scheduling field data collection during specific weather conditions and using an umbrella to maintain consistent lighting. Additionally, the RTK GPS required several minutes to achieve the crucial FIXED mode for accurate positioning, potentially delaying data collection. Weight and cost constraints also prevented the use of a high-quality DSLR camera, leading to the adoption of a web camera as an alternative reference system. While this camera correlated with DSLR performance in [22], it introduced some compromises in image quality.
In the future, we aim to develop an advanced automatic exposure control system for the RFIP system that can adapt to varying weather conditions, reducing reliance on specific environmental factors. We will also investigate methods to reduce RTK GPS initialization time and improve stability in FIXED mode. Additionally, we plan to integrate the system with other mobile platforms, such as unmanned ground and aerial vehicles, for real-time crop health monitoring, yield assessment, and crop disease detection.

Author Contributions

Conceptualization, Y.K.C.; Data curation, S.S.A., T.N.-Q. and M.A.A.; Formal analysis, S.S.A., M.A.A. and Y.K.C.; Funding acquisition, Y.K.C.; Investigation, S.S.A., M.A.A. and Y.K.C.; Methodology, S.S.A., M.A.A. and Y.K.C.; Project administration, Y.K.C.; Resources, Y.K.C.; Software, S.S.A.; Supervision, Y.K.C.; Validation, S.S.A. and Y.K.C.; Writing—original draft, S.S.A. and M.A.A.; Writing—review and editing, Y.K.C., M.A.A., T.N.-Q. and B.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Natural Science and Engineering Research Council of Canada (NSERC) Discovery Grants Program (RGPIN-2017-05815), an internship program from the MITACS Accelerate program (IT20902), and the USDA National Institute of Food and Agriculture Hatch (SD00H777-23) and Hatch-Multistate (SD00R730-23).

Data Availability Statement

Data are contained within the article.

Acknowledgments

The authors would like to acknowledge the administrative and technical support of Travis Esau and Ahmad Al-Mallahi.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. FAO. Hunger and Food Insecurity; Food and Agriculture Organization of the United Nations: Rome, Italy, 2023. [Google Scholar]
  2. Zhang, Q. Precision Agriculture Technology for Crop Farming; CRC Press: Boca Raton, FL, USA, 2015. [Google Scholar] [CrossRef]
  3. Al-Gaadi, K.A.; Hassaballa, A.A.; Tola, E.K.; Kayad, A.G.; Madugundu, R.; Alblewi, B.; Assiri, F. Prediction of Potato Crop Yield Using Precision Agriculture Techniques. PLoS ONE 2016, 11, e0162219. [Google Scholar] [CrossRef] [PubMed]
  4. Esau, T.J.; Zaman, Q.U.; Chang, Y.K.; Schumann, A.W.; Percival, D.C.; Farooque, A.A. Spot-application of fungicide for wild blueberry using an automated prototype variable rate sprayer. Precis. Agric. 2014, 15, 147–161. [Google Scholar] [CrossRef]
  5. Dorj, U.-O.; Lee, M.; Yun, S. An yield estimation in citrus orchards via fruit detection and counting using image processing. Comput. Electron. Agric. 2017, 140, 103–112. [Google Scholar] [CrossRef]
  6. Rehman, T.U.; Zaman, Q.U.; Chang, Y.K.; Schumann, A.W.; Corscadden, K.W.; Esau, T.J. Optimising the parameters influencing performance and weed (goldenrod) identification accuracy of colour co-occurrence matrices. Biosyst. Eng. 2018, 170, 85–95. [Google Scholar] [CrossRef]
  7. Zhang, N.; Wang, M.; Wang, N. Precision agriculture—A worldwide overview. Comput. Electron. Agric. 2002, 36, 113–132. [Google Scholar] [CrossRef]
  8. Guo, T.; Kujirai, T.; Watanabe, T. Mapping Crop Status from an Unmanned Aerial Vehicle for Precision Agriculture Applications. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 39, 485–490. [Google Scholar] [CrossRef]
  9. Mulla, D.J. Twenty five years of remote sensing in precision agriculture: Key advances and remaining knowledge gaps. Biosyst. Eng. 2013, 114, 358–371. [Google Scholar] [CrossRef]
  10. Myers, S. 6 Ways Data Analytics Leads to Better Decisions for Farmers. Available online: https://blogs.sas.com/content/sascom/2020/10/12/6-ways-data-analytics-leads-to-better-decisions-for-farmer (accessed on 22 March 2024).
  11. Burgos-Artizzu, X.P.; Ribeiro, A.; Guijarro, M.; Pajares, G. Real-time image processing for crop/weed discrimination in maize fields. Comput. Electron. Agric. 2011, 75, 337–346. [Google Scholar] [CrossRef]
  12. Caballero, D.; Calvini, R.; Amigo, J.M. Hyperspectral imaging in crop fields: Precision agriculture. In Data Handling in Science and Technology; Elsevier: Amsterdam, The Netherlands, 2019; pp. 453–473. [Google Scholar] [CrossRef]
  13. Ramirez-Cortes, J.M.; Gomez-Gil, P.; Alarcon-Aquino, V.; Martinez-Carballido, J.; Morales-Flores, E. FPGA-based educational platform for real-time image processing experiments. Comput. Appl. Eng. Educ. 2013, 21, 193–201. [Google Scholar] [CrossRef]
  14. Price, A.; Pyke, J.; Ashiri, D.; Cornall, T. Real Time Object Detection for an Unmanned Aerial Vehicle Using an FPGA BASED Vision System. In Proceedings of the 2006 IEEE International Conference on Robotics and Automation, ICRA 2006, Orlando, FL, USA, 15–19 May 2006; IEEE: New York, NY, USA, 2006; pp. 2854–2859. [Google Scholar] [CrossRef]
  15. Johnston, C.T.; Gribbon, K.T.; Bailey, D.G. Implementing Image Processing Algorithms on FPGAs. In Proceedings of the Eleventh Electronics New Zealand Conference, ENZCon’04, Palmerston North, New Zealand, 15 November 2004; pp. 118–123. [Google Scholar]
  16. AlAli, M.I.; Mhaidat, K.M.; Aljarrah, I.A. Implementing image processing algorithms in FPGA hardware. In Proceedings of the 2013 IEEE Jordan Conference on Applied Electrical Engineering and Computing Technologies (AEECT), Amman, Jordan, 3–5 December 2013; IEEE: New York, NY, USA, 2013; pp. 1–5. [Google Scholar] [CrossRef]
  17. Zhai, X.; Bensaali, F.; Ramalingam, S. Real-time license plate localisation on FPGA. In Proceedings of the CVPR 2011 WORKSHOPS, Colorado Springs, CO, USA, 20–25 June 2011; IEEE: New York, NY, USA, 2011; pp. 14–19. [Google Scholar] [CrossRef]
  18. Ruß, G.; Brenning, A. Data Mining in Precision Agriculture: Management of Spatial Information. In Computational Intelligence for Knowledge-Based Systems Design: 13th International Conference on Information Processing and Management of Uncertainty, IPMU 2010, Dortmund, Germany, 28 June–2 July 2010; Springer: Berlin/Heidelberg, Germany, 2010; pp. 350–359. [Google Scholar] [CrossRef]
  19. Lamb, D.W.; Brown, R.B. PA—Precision Agriculture. J. Agric. Eng. Res. 2001, 78, 117–125. [Google Scholar] [CrossRef]
  20. Zude-Sasse, M.; Fountas, S.; Gemtos, T.A.; Abu-Khalaf, N. Applications of precision agriculture in horticultural crops. Eur. J. Hortic. Sci. 2016, 81, 78–90. [Google Scholar] [CrossRef]
  21. Sun, H.; Slaughter, D.C.; Ruiz, M.P.; Gliever, C.; Upadhyaya, S.K.; Smith, R.F. RTK GPS mapping of transplanted row crops. Comput. Electron. Agric. 2010, 71, 32–37. [Google Scholar] [CrossRef]
  22. Antora, S.S.; Chang, Y.K.; Nguyen-Quang, T.; Heung, B. Development and Assessment of a Field-Programmable Gate Array (FPGA)-Based Image Processing (FIP) System for Agricultural Field Monitoring Applications. AgriEngineering 2023, 5, 886–904. [Google Scholar] [CrossRef]
  23. Das, A.K. Development of an Automated Debris Detection System for Wild Blueberry Harvesters using a Convolutional Neural Network to Improve Food Quality. Master’s Thesis, Dalhousie University, Truro, Canada, 2020. [Google Scholar]
  24. Lin, L.I.-K. Assay Validation Using the Concordance Correlation Coefficient. Biometrics 1992, 48, 599–604. [Google Scholar] [CrossRef]
Figure 1. Block diagram of the real-time crop-monitoring system.
Figure 2. Introduction to the required devices, hardware, and connections of the real-time crop-monitoring system [(1) a rover RTK board with two antennas for global navigation satellite system (GNSS) and UHF; (2) a liquid crystal display (LCD) monitor connected; (3) the RFIP system, consisting of the FPGA board and D8M camera board; (4) a web camera used for geotagging the GPS location and the detected pixel area; (5) one PC that supplies power to the USB web camera and to the RTK rover, connected with the USB output and links with the RS232 serial output port of the RFIP system; (6) a battery source with a 400 W power inverter to supply power to the FPGA device and the LCD monitor; and (7) a blue-painted wooden frame measuring 30.0 cm × 22.5 cm].
Figure 3. C94-M8P application board with necessary connections.
Figure 4. Communication among the RTK base station, RTK rover, and the satellites.
Figure 5. Data-storing PC with necessary connections.
Figure 6. Steps of the web-based visualization layer.
Figure 7. Required instructions to be written in the computer terminal for running the local server.
Figure 8. RTK rover’s centimeter level accuracy test spots with scale to assist measurements (Spot 1: a straight line with an intermediate distance of 50 cm; Spot 2: a square with a side length of 20 cm; Spot 3: a straight line with an intermediate distance of 20 cm; Spot 4: a square with a side length of 30 cm).
Figure 9. Selected site for field data collection inside the N.S.A.C. demonstration garden.
Figure 10. GPS data from four spots were recorded and utilized using Google Earth Pro, allowing for a precise visual representation of the spots, which are positioned within a practical accuracy range of centimeters.
Figure 11. Comparison between pixels detected as green (the plant leafage area) by RFIP system and reference imaging system.
Figure 12. Correlation between actual detected area by web camera and predicted area by RFIP (R² = 0.9566; p < 0.05).
Figure 13. The 147 GPS locations recorded using the RTK rover with 8 cm accuracy.
Figure 14. Heat map visualization of field data collected using the real-time crop-monitoring system.
Table 1. Number of pixels detected from the ROI area, with SD, using the G-ratio filter.

Object | FPGA_Area (Green Ratio) | WebCam_Area (Green Ratio)
1 | 55,914.86 ± 2356.63 | 112,202.14 ± 3355.28
2 | 94,850.57 ± 829.61 | 150,261.14 ± 2090.49
3 | 111,186.71 ± 1534.93 | 168,667.57 ± 3165.22
4 | 60,150.57 ± 413.34 | 99,203.14 ± 3668.95
5 | 162,434.57 ± 751.32 | 211,898.00 ± 4453.95
6 | 101,310.00 ± 2940.06 | 168,733.29 ± 5122.88
7 | 53,131.86 ± 581.51 | 102,009.71 ± 2662.57
8 | 87,633.14 ± 542.81 | 147,850.29 ± 2586.95
9 | 52,860.29 ± 946.86 | 109,408.14 ± 6531.86
10 | 35,273.71 ± 785.19 | 95,566.57 ± 2390.41
11 | 73,615.50 ± 28,527.63 | 130,999.00 ± 45,093.08
12 | 129,493.00 ± 39,938.18 | 220,778.71 ± 49,464.38
13 | 38,720.83 ± 129.88 | 110,204.83 ± 1358.75
14 | 87,487.71 ± 656.52 | 175,589.29 ± 2104.00
15 | 67,868.14 ± 972.58 | 133,883.14 ± 2291.39
16 | 43,895.14 ± 166.41 | 112,648.71 ± 3370.65
17 | 43,451.29 ± 1888.89 | 116,729.57 ± 2869.97
18 | 27,292.71 ± 5389.58 | 116,292.14 ± 3398.89
19 | 93,550.86 ± 398.02 | 149,379.43 ± 2268.15
20 | 80,209.71 ± 246.35 | 115,012.29 ± 1774.68
21 | 127,274.86 ± 200.63 | 189,066.29 ± 885.47
Table 2. The final dataset of 21 objects used to generate the heat map.

Sample No. | Lat_Avg | Lon_Avg | Intensity
1 | 45.375647° | −63.262926° | 89
2 | 45.37564133° | −63.26292433° | 80
3 | 45.37563592° | −63.262923° | 77
4 | 45.37562833° | −63.26292067° | 88
5 | 45.37562209° | −63.26291933° | 66
6 | 45.37561783° | −63.26291709° | 80
7 | 45.37561333° | −63.2629165° | 89
8 | 45.375622° | −63.2628955° | 82
9 | 45.37562617° | −63.262898° | 89
10 | 45.37563075° | −63.26289909° | 93
11 | 45.37563283° | −63.26290167° | 87
12 | 45.37563959° | −63.2629° | 70
13 | 45.37564167° | −63.2629015° | 81
14 | 45.37564825° | −63.262903° | 82
15 | 45.37565609° | −63.26288642° | 86
16 | 45.3756505° | −63.26288467° | 91
17 | 45.37564592° | −63.26288159° | 91
18 | 45.37563975° | −63.262882° | 95
19 | 45.37563542° | −63.26287959° | 81
20 | 45.37562867° | −63.26288242° | 83
21 | 45.37562483° | −63.26287617° | 74

