Article

Farm Monitoring System with Drones and Optical Camera Communication

1 Department of Electrical Engineering and Computer Science, Tokyo University of Agriculture and Technology, Tokyo 184-8588, Japan
2 Department of Opto-Electronics System Engineering, Chitose Institute of Science and Technology, Chitose 066-8655, Japan
* Author to whom correspondence should be addressed.
Sensors 2024, 24(18), 6146; https://doi.org/10.3390/s24186146
Submission received: 20 August 2024 / Revised: 18 September 2024 / Accepted: 20 September 2024 / Published: 23 September 2024
(This article belongs to the Section Internet of Things)

Abstract

Drones have been attracting significant attention in the field of agriculture. They can be used for various tasks such as spraying pesticides, monitoring pests, and assessing crop growth. Sensors are also widely used in agriculture to monitor environmental parameters such as soil moisture and temperature. However, due to the high cost of communication infrastructure and radio-wave modules, the adoption of high-density sensing systems in agriculture is limited. To address this issue, we propose an agricultural sensor network system using drones and Optical Camera Communication (OCC). The idea is to transmit sensor data from LED panels mounted on sensor nodes and receive the data using a drone-mounted camera. This enables high-density sensing at low cost and can be deployed in areas with underdeveloped infrastructure or restricted radio use. We propose a trajectory control algorithm for the receiving drone to efficiently collect the sensor data. From computer simulations, we confirmed that the proposed algorithm reduces total flight time by 30% compared to a shortest-path algorithm. We also conducted a preliminary experiment at a leaf mustard farm in Kamitonda-cho, Wakayama, Japan, to demonstrate the effectiveness of the proposed system. We collected 5178 images of LED panels with a drone-mounted camera to train YOLOv5 for object detection. With simple On–Off Keying (OOK) modulation, we achieved sufficiently low bit error rates (BERs) below 10⁻³ in the real-world environment. The experimental results show that the proposed system is applicable for drone-based sensor data collection in agriculture.

1. Introduction

Agriculture is the cornerstone of human civilization, not only as the primary source of food but also as a provider of essential raw materials for numerous industries. It plays a crucial role in the economic development of nations, contributing significantly to their growth and stability [1,2]. Despite this, a significant proportion of farmers worldwide continue to rely on traditional farming methods that often result in suboptimal yields. To address these challenges, the concept of smart agriculture, which aims to enhance productivity and efficiency through the automation of agricultural practices, has gained significant attention [3].
Smart agriculture refers to the integration of advanced technologies, such as robotics, artificial intelligence (AI), and the Internet of Things (IoT), into farming practices. These technologies enable the optimization of various agricultural processes, leading to enhanced efficiency and productivity. Key elements of smart agriculture include the deployment of autonomous ground vehicles, drones, and agricultural sensors that provide real-time data for precision farming.
In recent years, the use of drones in smart farming systems has become increasingly widespread [4,5]. Agricultural drones can follow preprogrammed flight paths autonomously, reducing the need for manual intervention. These drones are capable of performing critical agricultural tasks both during the day and at night, thereby alleviating the workload of farmers [6]. Drones are used for a variety of agricultural applications, including the spraying of pesticides and fertilizers as well as pest monitoring. One of the most valuable functions of drones is their ability to capture data through multispectral cameras, which can assess crop health, detect early signs of disease, and predict optimal harvest times [7,8]. This technology allows farmers to remotely monitor the condition of their crops and take timely action to address potential issues. The continuous development of additional functionalities for drones holds great potential for further enhancing agricultural practices.
Environmental monitoring using sensors to measure parameters such as temperature, humidity, soil moisture, and light intensity is a critical component of smart agriculture. These sensors enable precise management of crop growth, allowing farmers to make data-driven decisions to optimize agricultural productivity. The collection of sensor data is typically achieved through wireless communication, with protocols such as Low-Power Wide-Area Networks (LPWANs) being widely adopted due to their efficiency in transmitting data over long distances. However, despite the relatively low cost of sensors, the communication modules necessary for radio-based data transmission are often prohibitively expensive. This presents a significant challenge for the deployment of high-density sensor networks, where a large number of sensors are required to comprehensively monitor agricultural environments. The high cost of such infrastructure limits the scalability of sensor-based systems, particularly in large or resource-constrained farming operations. In addition to the cost barrier, geographical and infrastructural challenges also hinder the adoption of wireless sensor networks. In mountainous or remote regions, where communication infrastructure is sparse or nonexistent, radio-based transmission may fall outside the coverage area, further complicating the implementation of sensor networks in smart agriculture. Addressing these infrastructural gaps is essential to fully realize the potential of sensor-driven technologies in agricultural management.
To address this issue, we propose an agricultural sensor network system that leverages Optical Camera Communication (OCC) technology. OCC is a communication method that uses visible light to transmit data between a light source and a camera. The idea behind the proposed system is to transmit sensor data from LED panels mounted on sensor nodes and receive the data using a drone-mounted camera. The proposed system offers high-density sensing at a low cost due to the use of LED modules, which significantly reduces the cost of wireless communication. Thanks to the use of visible light, OCC does not require a license for operation, making it a cost-effective alternative to traditional radio-based communication systems. To efficiently collect densely distributed sensor data, we propose a trajectory control algorithm for the receiving drone. The proposed algorithm calculates the flight route and speed based on the locations and sizes of the light sources and the specifications of the receiver camera.
The primary contribution of this paper is the development of a low-cost, high-density sensing system utilizing existing agricultural drones. The proposed system leverages OCC technology to enable efficient data collection from sensor nodes distributed across farmland. This system is particularly suitable for deployment in areas with limited infrastructure or radio communication constraints. We present preliminary experimental results from a leaf mustard farm in Kamitonda-cho, Wakayama, Japan, demonstrating the feasibility of the proposed system. We implemented a system where drones automatically collect data from sensor nodes distributed across farmland. An Arduino UNO microcontroller was used to manage the sensor nodes, and a DJI Mavic 2 Pro drone served as the data receiver. The experimental results confirmed that OCC-based data transmission was successful, with minimal errors, allowing for effective collection of sensor data.
The rest of this paper is organized as follows. Section 2 summarizes related research relevant to this study. Section 3 introduces the proposed system, explaining its overall flow and the flight route calculation method of the proposed algorithm. Section 4 then evaluates the performance of the proposed algorithm through computer simulations. Experimental results are presented in Section 5, where we implemented a program for the drone to move autonomously and evaluated the feasibility of the proposed system. Finally, Section 6 concludes this paper.

2. Related Work

2.1. Smart Agriculture with Drones

Recent advancements in drone technology have enabled applications across many industries. In agriculture, drones have significantly improved efficiency and productivity [9]. Their high mobility and ability to collect large amounts of data make them essential tools for monitoring and data collection in farming [10,11]. Research on the use of drones in agriculture continues to grow [12]. Pasquale et al. [13] highlighted the innovative role of drones in agriculture. They demonstrated that drones can collect and analyze data with high precision, improving efficiency and productivity. Spoorthi et al. [14] proposed the use of drones for spraying pesticides and fertilizers. They found that drone-based spraying provides more uniform coverage and greater efficiency than traditional methods. Paulina et al. [15] studied crop condition monitoring using drones equipped with multispectral and temperature sensors. They calculated the Normalized Difference Vegetation Index (NDVI) through field tests. Namani et al. [16] developed a crop management system using real-time drone data, IoT, and cloud computing. This system optimizes irrigation by analyzing sensor data to provide water at the right time and in the required amounts. Jawad et al. [17] tackled the problem of limited drone flight time by developing wireless power transfer (WPT) technologies. Shidrokh et al. [18] proposed a routing algorithm for efficient and accurate data collection. This algorithm considers both flight time and speed. Ciciouglu et al. [19] created a drone-based system for monitoring large cornfields. Their drones, equipped with high-resolution cameras and sensors, allowed for real-time monitoring and efficient management of the fields. Moribe et al. [20] introduced a communication protocol for drones to monitor leaf temperature data using infrared thermometers. Their system combines wireless sensor networks with drones. This setup allows for real-time monitoring of large areas and rapid data collection. Drones in this system extend the communication range of sensor networks and improve crop yield monitoring.

2.2. Sensor-Based Monitoring in Agriculture

Sensor-based monitoring is considered one of the most important technologies for smart agriculture [21]. Ramson et al. [22] developed a real-time soil health monitoring system based on a Long-Range Wide-Area Network (LoRaWAN). In this system, data collected from soil sensors are transmitted to a cloud platform via the LoRaWAN. Chen et al. [23] proposed a real-time temperature monitoring and frost prevention control system. This system, based on Z-BEE wireless sensor networks, automatically monitors temperature to prevent frost damage to tea plants. Khalid et al. [24] introduced an energy-efficient and secure IoT-based wireless sensor network framework for smart agriculture. Their framework provided stable network performance between agricultural sensors and ensured secure data transmission. Valecce et al. [25] evaluated the performance of data collection and monitoring systems in agriculture using Narrowband Internet of Things (NB-IoT) technology. They analyzed performance metrics such as data accuracy, communication latency, and battery consumption. Tsai et al. [26] proposed an environmental monitoring system for microfarms. In this system, environmental data collected from sensors are transmitted to a cloud server via a Wi-Fi network.

2.3. Optical Camera Communication (OCC)

OCC is a technology that utilizes a camera and a light source for data transmission. Light sources such as LED panels, displays, and digital signage are used as transmitters to send digital data in the form of optical signals [27]. A receiver camera captures the transmitted signals using a complementary metal-oxide-semiconductor (CMOS) sensor. OCC is particularly suited for point-to-multipoint (P2MP) communication, as it can use multiple light sources in a given area with minimal interference between transmitters. The multiple light signals captured in the image can be demodulated by cropping the pixels corresponding to each light source [28]. OCC falls under the category of visible light communication (VLC), where a light source serves as the transmitter and a camera functions as the receiver.
The primary advantages of OCC include cost efficiency, unlicensed operation, and low power consumption. Since most modern agricultural drones are equipped with cameras [8], they can serve as effective receivers. This makes OCC a promising technology for remote sensing applications in agriculture. In [29], the performance of OCC was evaluated based on parameters such as camera sampling rate, exposure time, focal length, pixel edge length, transmitter configurations, and optical flicker rate. The study defined the signal-to-interference-plus-noise ratio (SINR) and analyzed different modulation schemes. Ambient light interference is a major challenge for visible light communication, including OCC. Some studies have proposed solutions to mitigate this issue. Islim et al. [30] examined the impact of sunlight on VLC. They clarified the specific effects of sunlight and proposed a system to counteract them, achieving data rates exceeding 1 Gb/s even under strong sunlight. Takai et al. [31] developed a vehicle-to-vehicle communication system using OCC. They achieved a reception rate of 13.0 frames per second (fps) under real-world driving and outdoor lighting conditions. Elizabeth et al. [32] conducted experiments evaluating a 400-m communication link using OCC in outdoor environments. Their results demonstrated stable communication performance under various weather and lighting conditions. Drone-based OCC systems have also been studied in various applications. Takano et al. [33] investigated OCC systems using drones equipped with RGB-LEDs and a high-speed camera to achieve communication over a distance of 300 m. Their experimental results demonstrated that efficient and accurate communication is possible by employing object detection techniques. Li et al. [34] focused on the trajectory control of drones to maintain line-of-sight (LoS) communication links in OCC systems. They proposed a distributed trajectory control algorithm that allows drones to avoid inter-light interference during communication. However, the previous studies did not consider dense sensor data collection in agriculture using drones and OCC.
Furthermore, to the best of our knowledge, there has been no previous work on drone-based agricultural sensor networks using OCC. This paper proposes a novel system that combines drones and OCC for high-density sensor data collection in agriculture. The proposed system leverages the advantages of OCC to enable efficient data transmission from sensor nodes to a drone-mounted camera. The system is designed to be cost-effective, scalable, and suitable for deployment in areas with limited infrastructure. The details of the proposed system are presented in the following section.

3. Proposed Scheme

3.1. Concept

We propose a drone-based sensor network system using OCC. The conceptual system architecture is illustrated in Figure 1, while the block diagram of the system is detailed in Figure 2. Many sensor nodes are deployed in a target area. A sensor node is equipped with sensors and a light source. The light source can be an LED light, an LED panel, or a display. The sensor data are encoded and modulated as optical signals to be transmitted from the light source. A receiver drone equipped with a camera moves around the target area to film the optical signals from the sensor nodes. The received signals are demodulated at either the drone or a cloud/edge server in real time.
We propose a trajectory control algorithm for the receiver drone to efficiently collect the sensor data. The proposed algorithm calculates the flight route and speed based on the locations and sizes of the light sources and the specifications of the receiver camera. The proposed system can accommodate a large number of sensor nodes without time- and frequency-domain interference thanks to OCC. It can be deployed in radio quiet or infrastructure-underdeveloped areas since it does not use any radio waves.

3.2. Variable Definition

3.2.1. Coordinate Systems

In this section, we describe the coordinate systems used in the proposed model. We define the global coordinate system as $(x, y, z)$ and the camera coordinate system as $(X, Y, Z)$. The $Y$-axis is aligned with the center line of the image. Note that the origin of the camera coordinate system is set to the receiver camera at $(x_c(t), y_c(t), z_c(t))$. The elevation angle of the camera is defined as $\theta$.

3.2.2. Variables

The variables used in the proposed model are summarized in Table 1. Let $I$ denote the set of sensor nodes and $i, j$ the identifiers of its elements. The position of the $i$th sensor node in the global coordinate system is denoted by $(x_i, y_i, z_i)$. For simplicity, a sensor node is approximated as a sphere with radius $r_i$. Let $(u_i, v_i)$ denote the coordinates of the center of the $i$th sensor node in the image plane. The horizontal and vertical resolutions of the image are $l_h$ and $l_v$, respectively. The horizontal and vertical angles of view of the camera are denoted by $\phi_h$ and $\phi_v$. The focal length of the camera is denoted by $f$, and the size of the image sensor by $\rho$.

3.3. System Model

Based on the defined variables, we present the system model that describes how the sensor data are collected by the receiver drone.

3.3.1. Coordinate Transformation

The positions of the sensor nodes in the image plane are computed with the coordinate transformation. Since we assume a moving receiver drone, the relative position between $(x_i, y_i, z_i)$ and the origin $(x_c(t), y_c(t), z_c(t))$ is computed. The position of the $i$th sensor node in the camera coordinate system is formulated as

$$\begin{bmatrix} X_i(t) \\ Y_i(t) \\ Z_i(t) \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & \sin\theta \\ 0 & -\sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x_i - x_c(t) \\ y_i - y_c(t) \\ z_i - z_c(t) \end{bmatrix}, \qquad (1)$$

which consists of the parallel displacement and the rotation by the elevation angle.
Then, the position of the $i$th sensor node in the image plane is computed with the perspective transformation as

$$\begin{pmatrix} u_i(t) \\ v_i(t) \end{pmatrix} = \begin{pmatrix} \dfrac{l_h}{2\tan(\phi_h/2)} & 0 \\ 0 & \dfrac{l_v}{2\tan(\phi_v/2)} \end{pmatrix} \begin{pmatrix} X_i(t)/Y_i(t) \\ Z_i(t)/Y_i(t) \end{pmatrix} + \begin{pmatrix} l_h/2 \\ l_v/2 \end{pmatrix}. \qquad (2)$$
The perspective transformation is depicted in Figure 3. Let $p_i$ denote the size of a sensor node in the image plane. It is calculated from the radius of the projected circle as

$$p_i = \frac{r_i f}{\rho\, Y_i(t)}. \qquad (3)$$
The conditional expression that the $i$th sensor node is filmed by the receiver drone is formulated as

$$0 \leq u_i(t) - p_i, \quad u_i(t) + p_i \leq l_h, \qquad 0 \leq v_i(t) - p_i, \quad v_i(t) + p_i \leq l_v. \qquad (4)$$
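For clarity, the following Python sketch evaluates the coordinate transformation, perspective projection, node size, and filming condition above for a single sensor node. The function names and the exact sign convention of the rotation matrix are our own assumptions; this is a minimal illustration of the model, not code from the paper.

```python
import numpy as np

def project_to_image(node_xyz, cam_xyz, theta, phi_h, phi_v, l_h, l_v):
    """Camera-coordinate transformation followed by perspective projection."""
    # Parallel displacement to the camera origin, then rotation by the elevation angle theta.
    rot = np.array([[1.0, 0.0, 0.0],
                    [0.0, np.cos(theta), np.sin(theta)],
                    [0.0, -np.sin(theta), np.cos(theta)]])
    X, Y, Z = rot @ (np.asarray(node_xyz, dtype=float) - np.asarray(cam_xyz, dtype=float))
    # Perspective transformation onto the image plane (pixel coordinates).
    u = l_h / (2.0 * np.tan(phi_h / 2.0)) * (X / Y) + l_h / 2.0
    v = l_v / (2.0 * np.tan(phi_v / 2.0)) * (Z / Y) + l_v / 2.0
    return u, v, Y

def node_size_in_image(r_i, f, rho, Y):
    """Apparent size p_i of a node of radius r_i at camera-axis distance Y."""
    return r_i * f / (rho * Y)

def is_filmed(u, v, p_i, l_h, l_v):
    """Filming condition: the whole projected node must lie inside the image."""
    return (u - p_i >= 0) and (u + p_i <= l_h) and (v - p_i >= 0) and (v + p_i <= l_v)
```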

3.3.2. Ground Coverage

The ground coverage of the drone-mounted camera is a trapezoid $A$, as shown in Figure 4. The dimensions of this trapezoid are given by

$$a = \frac{2h\cos\frac{\phi_v}{2}\tan\frac{\phi_h}{2}}{\cos\left(\theta - \frac{\phi_v}{2}\right)}, \qquad b = \frac{4h\tan\frac{\phi_h}{2}}{\cos\theta} - a, \qquad c = h\left(\tan\left(\theta + \frac{\phi_v}{2}\right) - \tan\left(\theta - \frac{\phi_v}{2}\right)\right), \qquad (5)$$

where $a$ is the top length, $b$ is the bottom length, $c$ is the height of the trapezoid $A$, and $h$ is the altitude of the receiver camera. The receiver can simultaneously receive the optical signals from all sensor nodes within this trapezoid. These sensor nodes are grouped as a single virtual sensor node to improve the efficiency of data collection. Since the size of the trapezoid grows with the altitude $h$, it is efficient to maximize $h$.
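A minimal sketch of the trapezoid dimensions is given below, following the reconstruction in (5); the angle convention (how the elevation angle relates to the optical axis) is our assumption and should be checked against Figure 4 before reuse.

```python
import numpy as np

def ground_coverage(h, theta, phi_h, phi_v):
    """Top length a, bottom length b, and height c of the ground-coverage trapezoid.
    Angle convention (our assumption): the optical axis meets the ground at a
    horizontal offset of h * tan(theta) from the point below the camera."""
    a = 2 * h * np.cos(phi_v / 2) * np.tan(phi_h / 2) / np.cos(theta - phi_v / 2)
    b = 4 * h * np.tan(phi_h / 2) / np.cos(theta) - a
    c = h * (np.tan(theta + phi_v / 2) - np.tan(theta - phi_v / 2))
    return a, b, c

# Example with the experimental parameters: h = 5 m, phi_h = 77 deg, phi_v = 40 deg, theta = 20 deg.
print(ground_coverage(5.0, np.radians(20), np.radians(77), np.radians(40)))
```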

3.3.3. Altitude Limit

As the altitude of the drone-mounted camera increases, the size of a sensor node in the image plane decreases, and the accuracy of signal demodulation degrades. The altitude of the receiver camera is therefore limited to establish a stable link with a ground sensor node. The constraint is described as

$$p_i \geq p_{th}, \qquad (6)$$

where $p_{th}$ denotes the threshold for the size of a sensor node in the image plane. When (6) is satisfied, the optical signal transmitted from the sensor node can be correctly demodulated; in other words, a light source must appear sufficiently large in the image. From (3) and (6), the altitude limit of the receiver camera is determined as

$$h \leq \frac{f r_i}{\rho\, p_{th}} \cos\theta. \qquad (7)$$
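The limit in (7) is a one-line computation; the helper below is a sketch of our own, assuming the units of $f$, $\rho$, and $p_{th}$ are consistent with the node-size expression in (3).

```python
import math

def altitude_limit(f, r_i, rho, p_th, theta):
    """Highest camera altitude that keeps a node of radius r_i at apparent size >= p_th."""
    return f * r_i * math.cos(theta) / (rho * p_th)
```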

3.3.4. Transmission Time

Here we formulate the data transmission time from a sensor node. The transmission rate of an OCC link is determined by the modulation order and the symbol rate. Optical spatial modulation and color-shift keying (CSK) can be employed to increase the data rate: optical spatial modulation uses multiple light sources, while CSK exploits the three-color (RGB) structure of LED luminaires. Optical signals are modulated by modifying the light intensity to generate predefined constellation symbols [35]. The range of the symbol rate is constrained by the frame rate of the receiver camera and the image processing speed.
The transmission rate is formulated as

$$R_i = S_i D \log_2 N_i, \qquad (8)$$

where $R_i$ is the data rate, $S_i$ is the optical spatial multiplicity, $D$ is the symbol rate, and $N_i$ is the number of constellation symbols. The maximum transmission time is computed as

$$T_i = \frac{M_i}{R_i}, \qquad (9)$$

where $M_i$ denotes the maximum data size. This transmission time directly affects the trajectory of the drone, since the drone must keep each sensor node in view long enough to complete the data reception.

3.3.5. Trajectory Requirement

The receiver drone is required to film each sensor node for a sufficient duration to receive the transmitted data. Let $\tau_i$ denote the length of time during which (5) is satisfied. To ensure reception of the maximum data size from the $i$th sensor node, $\tau_i$ must satisfy

$$\tau_i \geq T_i. \qquad (10)$$

The time required to receive the maximum data size from a single virtual sensor node can therefore be formulated as

$$\tau_i \geq \frac{M_i}{S_i D \log_2 N_i}. \qquad (11)$$
The trajectory of the receiver drone must be determined to satisfy (11).
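As a quick numeric illustration of this requirement, the following Python sketch computes the minimum filming time from the transmission-rate formula (8); the parameter values are hypothetical and only chosen to be plausible for the OOK setup described later.

```python
import math

def required_filming_time(M_i, S_i, D, N_i):
    """Minimum filming time tau_i for M_i bits at rate R_i = S_i * D * log2(N_i)."""
    R_i = S_i * D * math.log2(N_i)
    return M_i / R_i

# Hypothetical example: a single LED panel (S_i = 1), OOK (N_i = 2), a symbol rate of
# 10 symbols/s bounded by the camera frame rate, and a 64-bit sensor payload.
print(required_filming_time(M_i=64, S_i=1, D=10, N_i=2))  # 6.4 s
```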

3.4. Algorithm

The goal of the proposed algorithm is to achieve the approximately shortest trajectory ensuring data transmission from all sensor nodes. The concept of the trajectory control algorithm is shown in Figure 5. The proposed algorithm is summarized in Algorithm 1. The proposed algorithm consists of the following steps.

3.4.1. Node Clustering

The sensor nodes are organized into clusters to form virtual sensor nodes. Neighboring sensor nodes are grouped so as to satisfy (5). $V$ denotes the set of groups obtained by applying an appropriate clustering algorithm to $I$. It is then verified that all sensor nodes in the $v$th group, $V_v$, are situated within the trapezoid $A$, thereby satisfying (5). If any sensor node within $V_v$ does not reside within $A$, the group is re-clustered recursively. This guarantees that every group satisfies (5). The clustering algorithm is selected based on the distribution of sensor nodes. The altitude of the receiver drone, $h$, is set to the highest value that satisfies (7). Minimizing the number of virtual sensor nodes reduces the receiver drone's flight time.
Algorithm 1 Trajectory control algorithm.
Input: I
Output: Trajectory, Times

# Node clustering
function IsInTrapezoid(V_v)
    for i ∈ V_v do
        if (x_i, y_i, z_i) ∉ A then
            return False
        end if
    end for
    return True
end function

function ClusterNodes(I)
    V ← AnyClusteringAlgorithm(I)
    clusters ← ∅
    for v in range(|V|) do
        if IsInTrapezoid(V_v) = TRUE then
            clusters.append(V_v)
        else
            clusters.append(ClusterNodes(V_v))
        end if
    end for
    return clusters
end function

function CalculateFilmingTime(clusters)
    times ← ∅
    for c ∈ clusters do
        τ_max ← 0
        for i ∈ c do
            if τ_i > τ_max then
                τ_max ← τ_i
            end if
        end for
        times.append(τ_max)
    end for
    return times
end function

clusters ← ClusterNodes(I)
# Graph generation
G ← GenerateGraph(clusters)
# Trajectory determination
Trajectory ← TSP(G)
Times ← CalculateFilmingTime(clusters)

3.4.2. Graph Generation

Following the clustering of the sensor nodes, a graph representing the virtual sensor nodes is constructed. This graph, denoted by $G = (V, E)$, comprises the set of virtual sensor nodes $V$ and the set of edges $E$. The graph enables the computation of an optimized trajectory using a traveling salesman problem (TSP)-based approach.

3.4.3. Trajectory Determination

The trajectory is determined by solving a TSP on graph G . The time required for filming each virtual node is calculated in accordance with the formula presented in (9). To ensure reliable data transmission, the drone allocates sufficient time to film each sensor node. This duration is calculated using (11), which determines the required filming time based on sensor data size and transmission rate. Meeting this condition guarantees successful reception of all data from each sensor node. Consequently, the receiver drone can efficiently collect data from all sensor nodes in the farm.
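The sketch below ties the steps of Algorithm 1 together in Python under strong simplifications that we state explicitly: the trapezoid membership test is replaced by a bounding-box check, the clustering algorithm by a naive two-seed split, and the TSP solver by a nearest-neighbour heuristic. All function names are ours; the sketch only illustrates the control flow of cluster, build graph, then order the visits.

```python
import numpy as np

def fits_coverage(points, width, depth):
    """Simplified stand-in for the trapezoid test: a cluster is accepted when its
    bounding box fits inside a width-by-depth camera footprint."""
    span = points.max(axis=0) - points.min(axis=0)
    return span[0] <= width and span[1] <= depth

def cluster_nodes(points, width, depth):
    """Recursively split nodes until every cluster fits inside the footprint.
    The two-seed split stands in for k-means, group average, or Ward's method."""
    idx = np.arange(len(points))
    if fits_coverage(points, width, depth):
        return [idx]
    axis = int(np.argmax(points.max(axis=0) - points.min(axis=0)))
    seeds = points[[points[:, axis].argmin(), points[:, axis].argmax()]]
    labels = np.argmin(np.linalg.norm(points[:, None, :] - seeds[None, :, :], axis=2), axis=1)
    clusters = []
    for lab in (0, 1):
        sub = idx[labels == lab]
        if len(sub) == 0:
            continue
        for c in cluster_nodes(points[sub], width, depth):
            clusters.append(sub[c])
    return clusters

def nearest_neighbour_tour(centroids):
    """Greedy TSP heuristic over the virtual-node centroids."""
    unvisited = list(range(1, len(centroids)))
    tour = [0]
    while unvisited:
        last = centroids[tour[-1]]
        nxt = min(unvisited, key=lambda j: np.linalg.norm(centroids[j] - last))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

# Toy usage: 20 random nodes on a 50 m x 50 m field and a roughly 8 m x 10 m footprint.
rng = np.random.default_rng(0)
nodes = rng.uniform(0.0, 50.0, size=(20, 2))
clusters = cluster_nodes(nodes, width=8.0, depth=10.0)
centroids = np.array([nodes[c].mean(axis=0) for c in clusters])
trajectory = [clusters[v] for v in nearest_neighbour_tour(centroids)]
```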

4. Computer Simulation

The performance of the proposed algorithm was verified via computer simulation.

4.1. Simulation Conditions

Table 2 summarizes the parameters used in the simulation. To evaluate the performance of the proposed algorithm under realistic conditions, the parameters were the same as those used in the preliminary experiments in Section 5. These values are based on the specifications of the drone and the sensor nodes. The altitude of the receiver camera was set to 5 m, and the speed of the drone was 3 m/s. The drone stops to film each sensor node for a certain amount of time depending on the size of the transmitted data; the filming time ranged from 0 to 3 s. The horizontal and vertical angles of view were 77° and 40°, respectively, and the elevation angle of the camera was set to 20°.
The sensor nodes were distributed randomly in a square area with a density of 4 nodes per 10 m × 10 m. The size of the study area ranged from 50 × 50 m to 200 × 200 m. The simulation was iterated 1000 times for each condition with different random seeds. The performance of the proposed trajectory control algorithm was compared with that of a shortest-path algorithm. For node clustering, we employed k-means, the group average method, and Ward's method.
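As a small illustration of this setup, the snippet below generates one random node deployment per field size at the stated density; the helper name deploy_nodes is ours, and the paper's evaluation averages 1000 such realizations per condition.

```python
import numpy as np

def deploy_nodes(field_size_m, nodes_per_100_m2=4, seed=0):
    """Randomly place sensor nodes at 4 nodes per 10 m x 10 m over a square field."""
    rng = np.random.default_rng(seed)
    n_nodes = int(nodes_per_100_m2 * (field_size_m / 10.0) ** 2)
    return rng.uniform(0.0, field_size_m, size=(n_nodes, 2))

# One realization per field size.
for size in (50, 100, 150, 200):
    print(size, "m:", len(deploy_nodes(size, seed=1)), "nodes")
```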

4.2. Simulation Results

4.2.1. Total Trajectory Length

Figure 6 shows the total trajectory length of the drone to collect the data from all of the deployed sensor nodes. The points represent the average. The error bars show the maximum and minimum values. The trajectory length increased according to the field size. As the field size increased beyond 120 m, a clear separation in performance became more apparent. The shortest path consistently resulted in the longest trajectory across all field sizes, whereas the proposed algorithm demonstrated slightly more efficient trajectory lengths. The proposed algorithm using k-means tended to yield the shortest trajectory, closely followed by the group average method and Ward’s method. This suggests that the proposed method is more efficient in optimizing the total trajectory length in larger fields.

4.2.2. Total Travel Time

Figure 7 shows the total travel time to receive all the data from the sensor nodes where the field size was 50 × 50 m. The points and error bars represent the average and the maximum/minimum values. The total travel time increased as the filming time, i.e., data size of the sensor nodes, increased. As the filming time increased from 0 to 3 s, the total travel time for all algorithms increased proportionally. The shortest path consistently produced the highest total travel time over the entire range of filming times. On the other hand, the proposed algorithm achieved significantly lower total travel times. This is because the sensor nodes were clustered so that the receiver drone simultaneously received data from multiple sensor nodes. Among the clustering algorithms, the k-means algorithm typically showed the lowest total travel time, closely followed by the group average and Ward’s methods. We confirmed that the proposed algorithm outperformed the shortest path algorithm regardless of the conditions and clustering algorithms. Although the expected performance was almost the same with the clustering algorithms employed, the minimum value of the total travel time was slightly shorter with the group average method.

5. Experimental Results

This section presents the experimental results of the proposed scheme. The feasibility of the proposed system and the accuracy of data transmission in the real environment were confirmed using a drone and an implemented sensor node.

5.1. Experimental Condition

5.1.1. Overview

The experiments consisted of three steps. First, the ground coverage of the drone-mounted camera was confirmed in the real environment, and the theoretical equations were verified under different elevation angles of the camera. Second, an object detection model was trained using YOLOv5 on images taken in the real-world environment and evaluated for accuracy; this model is used in OCC to detect the LED panels in the images. Third, the accuracy of data transmission with OCC in the real-world environment was evaluated based on the bit error rate (BER) under different daylight conditions. Figure 8 shows the experimental setup; the deployed sensor nodes, which transmit the sensor data, are marked with circles.

5.1.2. Experimental Setup

We employed a Mavic 2 Pro drone manufactured by DJI Co., Ltd. (Shenzhen, China). The resolution of the receiving camera was 1080 × 1920 (approximately 2 megapixels), and the frame rate was 30 fps. The focal length was f = 28 mm, and the zoom magnification was set to M = 1.
Figure 9 shows a sensor node. A sensor node was equipped with five sensors and an LED panel. It collected temperature, humidity, illumination, soil water content, and infrared sensor data. We employed WS2812B serial LED panels with 64 LED lights arranged in a square. An Arduino UNO microcontroller was connected to the sensors and the LED panel to control them.
A battery supplied power to the sensors, the Arduino UNO microcontroller, and the LED panel.
In this experiment, On–Off Keying (OOK) was used as the modulation scheme. The Arduino UNO microcontroller encoded and modulated the sensor data as optical signals to be transmitted by the LED panel. To reduce energy consumption, a sensor node transmitted optical signals only when it detected the overflying drone with its infrared sensor. An example image of a sensor node taken by the drone is shown in Figure 10.
The study area was a leaf mustard farm in Kamitonda-cho, Wakayama, Japan. We deployed eight sensor nodes in an area of 50 m × 40 m, as shown in Figure 11. The flight trajectory of the drone was calculated using the proposed algorithm.

5.1.3. Coding and Modulation

In this experiment, the sensor data collected at the sensor nodes were encoded and modulated as optical signals using On–Off Keying (OOK). More specifically, the sensor data were first converted to binary, and the bit sequence was then encoded into a data signal using 3b4b coding. The LED panels mounted on the sensor nodes were used as transmitters. Each symbol sent from an LED panel carried one bit with OOK; that is, each bit was modulated as the on–off state of the blue LEDs.
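To illustrate the transmitter-side processing, the following Python sketch models the encoding chain (binary conversion, 3b4b coding, and OOK over the blue LEDs). It is a model of the logic only, not the Arduino firmware, and the 3b4b code table shown is a placeholder of our own because the paper does not specify the table that was used.

```python
# Placeholder 3b4b table for illustration only: the actual code table used on the
# sensor nodes is not given in the paper.
THREE_B_FOUR_B = {
    "000": "0011", "001": "0101", "010": "0110", "011": "1001",
    "100": "1010", "101": "1100", "110": "0111", "111": "1011",
}

def encode_ook(value, n_bits=16):
    """Convert a sensor reading to a 3b4b-coded on/off sequence (1 = blue LEDs on)."""
    bits = format(value, "0{}b".format(n_bits))
    bits += "0" * (-len(bits) % 3)  # pad to a multiple of 3 before the table lookup
    coded = "".join(THREE_B_FOUR_B[bits[i:i + 3]] for i in range(0, len(bits), 3))
    return [int(b) for b in coded]

# Example: a temperature reading of 23.5 degC scaled to the integer 235.
frame_sequence = encode_ook(235)
```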

5.1.4. Demodulation

The transmitters mounted on the sensor nodes continuously emitted optical signals, which were captured by the drone-mounted camera. The captured video was sent to an edge server and divided into a series of static images. The edge server used these images to demodulate the optical signals from the sensor nodes. Specifically, it demodulated the optical signals by mapping the RGB values of the LED panels to the (x, y) color space. The color coordinates of each bit were determined by pilot signals, and the optical signals were demodulated based on these color coordinates.
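A minimal receiver-side sketch is given below, assuming one symbol per captured frame and one detector bounding box per frame. It reduces the paper's (x, y) color-space mapping to a scalar blue-intensity comparison, which is only sufficient for the OOK case; the function names and reference-level calibration are our assumptions.

```python
import numpy as np

def demodulate_frame(frame_rgb, box, on_level, off_level):
    """Decide one OOK bit from a single video frame.
    frame_rgb: HxWx3 array; box: (x1, y1, x2, y2) from the LED-panel detector;
    on_level / off_level: mean blue intensities learned from the pilot signal."""
    x1, y1, x2, y2 = box
    roi = frame_rgb[y1:y2, x1:x2]
    blue = float(roi[:, :, 2].mean())
    # Nearest reference level wins; a chromaticity (x, y) mapping as in the paper
    # would replace this scalar comparison for CSK.
    return int(abs(blue - on_level) < abs(blue - off_level))

def demodulate_sequence(frames, boxes, on_level, off_level):
    """Demodulate a sequence of frames (one frame per symbol) into a bit list."""
    return [demodulate_frame(f, b, on_level, off_level) for f, b in zip(frames, boxes)]
```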

5.2. Results

5.2.1. Model Confirmation

First, we confirmed the mathematical model based on the coordinate transformation. We measured the ground coverage of the drone-mounted camera in the real environment and compared the measured results with the theoretical model formulated in (5). Table 3 summarizes the differences between the measured and theoretical values. From this result, we confirmed the validity of the mathematical model.

5.2.2. Light Source Detection

We employed YOLOv5, a widely used CNN-based object detection model, to detect the LED panels in the optical signal images. To ensure the recognition accuracy of the LED panels mounted on the sensor nodes, we collected 5178 images as training data and annotated the positions of the LED panels. The resolution of the images was changed to 1920 × 1080. The results of the machine learning experiments with different parameters are presented in Table 4. YOLOv5l was selected as the model weight due to its superior detection accuracy and relatively short training time. We evaluated each parameter configuration by comparing the final loss values and the mean Average Precision (mAP). After comparing all configurations, we adopted the parameter set with a batch size of 16, an image size of 640, and 300 epochs, which achieved a final loss of 0.0030712 and an mAP of 0.71541 (the first row of Table 4). The PC environment was Ubuntu 20.04, and the GPU was an NVIDIA GeForce RTX 3090. The software environment consisted of CUDA 11.3, cuDNN 8.8, and Python 3.8. The images were taken at different locations and times of day to improve recognition accuracy under varying conditions, because ambient light changes the visibility of the LED panels.
The results for the detection accuracy of the LED panels using YOLOv5 are shown in Figure 12. The accuracy of the object detection model was evaluated using the loss and the mAP. The object loss is a measure of the probability that the detection target is within the region of interest; the lower its value, the higher the accuracy. Figure 12a shows the evolution of the loss per epoch. Before epoch 50, both loss values decreased rapidly; after epoch 50, the decrease gradually slowed down. The mAP is the mean of the average precision over all classes; in this paper, mAP and AP are equal because there is only one class. The higher the mAP value, the more accurate the network. Figure 12b shows the evolution of the mAP per epoch. The mAP_0.5 reached 0.98, and the mAP_0.5:0.95 reached 0.71. We confirmed that the object detection model successfully detected the LED panels.

5.2.3. Signal Reception

First, we confirmed that the drone successfully filmed all the sensor nodes for a sufficient duration. The drone autonomously moved along the calculated trajectory to film the optical signals from the LED panels, and the sensor nodes detected the drone and transmitted their sensor data. Second, we measured the bit error rate (BER) to evaluate the accuracy of data transmission in the real environment. To evaluate the performance under different outdoor lighting conditions, the BER was measured for each hour of filming. The calculated BER is shown in Figure 13. The average BER measured in this experiment was 0.00043338. Assuming the use of 7% hard-decision forward error correction (HD-FEC), the BER threshold was set to 1 × 10⁻³. The receiver drone thus received the transmitted data with sufficiently few errors. As a result, it was confirmed that the BER was low enough to recover the sensor data regardless of the observation time.
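The BER figures above follow from a direct comparison of the transmitted and received bit sequences. The short sketch below shows the computation with hypothetical counts chosen only to land near the measured average; it is not the evaluation script used in the experiment.

```python
def bit_error_rate(tx_bits, rx_bits):
    """Fraction of received bits that differ from the transmitted reference bits."""
    errors = sum(t != r for t, r in zip(tx_bits, rx_bits))
    return errors / len(tx_bits)

# Hypothetical counts: 3 bit errors in 6922 received bits (~4.3e-4) stays below
# the 1e-3 HD-FEC threshold.
assert bit_error_rate([0] * 6922, [1] * 3 + [0] * 6919) < 1e-3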

6. Conclusions

In this study, we introduced a sensor network system for farm monitoring that combines drones and OCC. The proposed system allows for high-density sensing at a reduced cost. By using OCC, the system collects data from multiple sensor nodes spread across agricultural fields. This approach minimizes the need for expensive communication infrastructure and addresses the challenge of deploying high-density sensor networks in agriculture. We developed a trajectory control algorithm to optimize data collection. This algorithm improves communication efficiency between drones and sensor nodes. It utilizes OCC’s strength in point-to-multipoint communication by reducing interference between multiple light sources. From computer simulations, we confirmed that the proposed algorithm reduces total flight time by 30% compared to a shortest-path algorithm. Our preliminary experiments at a leaf mustard farm in Kamitonda-cho, Wakayama, Japan, demonstrated the feasibility of the system. We collected 5178 images of LED panels with a drone-mounted camera to train YOLOv5 for object detection. With simple On–Off Keying (OOK) modulation, we achieved sufficiently low bit error rates (BERs) below 10⁻³ in the real-world environment. These results validate the system’s potential to enhance agricultural practices with a cost-effective and scalable sensor network.
The proposed system is a significant step toward promoting intelligent agriculture. It offers a practical solution for overcoming the challenges of cost and infrastructure. The system is highly adaptable and can be deployed in areas with limited infrastructure or without radio-based communication, making it suitable for diverse agricultural environments. A major benefit of the proposed system is its compatibility with existing agricultural drones, such as those used for spraying pesticides. These drones can be easily repurposed as receivers within the OCC communication framework. Additionally, the cameras used for OCC can provide visual crop monitoring, offering dual functionality for data collection and crop observation. Future work will focus on improving the system’s capabilities, particularly in integrating self-localization features to enhance drone navigation and control.

Author Contributions

Validation, S.K.; Investigation, S.K.; Writing—original draft, S.K.; Writing—review & editing, Y.N.; Supervision, Y.N.; Project administration, N.Y. and Y.N. All authors have read and agreed to the published version of the manuscript.

Funding

A part of this work was supported by JSPS KAKENHI Grant Number JP20H04178 and JST, Presto Grant Number JPMJPR2137, and GMO Foundation, Japan.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

The authors would like to express their gratitude to Idomoto Farm at Kamitonda, Wakayama, Japan, and Sasaki Farm, Chitose, Japan, for their cooperation and valuable discussions throughout the experiments.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Scott, M.S.; Frank, L.; G Philip, R.; Stephen, K.H. Ecosystem services and agriculture: Cultivating agricultural ecosystems for diverse benefits. Ecol. Econ. 2007, 64, 245–252. [Google Scholar]
  2. Luc, C.; Lionel, D.; Jesper, K. The (evolving) role of agriculture in poverty reduction—An empirical perspective. J. Dev. Econ. 2011, 96, 239–254. [Google Scholar]
  3. Navulur, S.; Prasad, M.G. Agricultural management through wireless sensors and internet of things. Int. J. Electr. Comput. Eng. 2017, 7, 3492. [Google Scholar] [CrossRef]
  4. Del Cerro, J.; Cruz Ulloa, C.; Barrientos, A.; de León Rivas, J. Unmanned aerial vehicles in agriculture: A survey. Agronomy 2021, 11, 203. [Google Scholar] [CrossRef]
  5. Devi, G.; Sowmiya, N.; Yasoda, K.; Muthulakshmi, K.; Kishore, B. Review on Application of Drones for Crop Health Monitoring and Spraying Pesticides and Fertilizer. J. Crit. Rev. 2020, 7, 667–672. [Google Scholar]
  6. Shubhangi, G.R.; Mrunal, S.T.; Chaitali, V.W.; Manish, D.M. A Review on Agricultural Drone Used in Smart Farming. Int. Res. J. Eng. Technol. 2021, 8, 313–316. [Google Scholar]
  7. Primicerio, J.; Di Gennaro, S.F.; Fiorillo, E.; Genesio, L.; Lugato, E.; Matese, A.; Vaccari, F.P. A flexible unmanned aerial vehicle for precision agriculture. Precis. Agric. 2012, 13, 517–523. [Google Scholar] [CrossRef]
  8. Ahirwar, S.; Swarnkar, R.; Bhukya, S.; Namwade, G. Application of Drone in Agriculture. Int. J. Curr. Microbiol. Appl. Sci. 2019, 8, 2500–2505. [Google Scholar] [CrossRef]
  9. Hafeez, A.; Husain, M.A.; Singh, S.P.; Chauhan, A.; Khan, M.T.; Kumar, N.; Soni, S.K. Implementation of drone technology for farm monitoring and pesticide spraying: A review. Inf. Process. Agric. 2023, 10, 192–203. [Google Scholar] [CrossRef]
  10. Kurkute, S.; Deore, B.; Kasar, P.; Bhamare, M.; Sahane, M. Drones for smart agriculture: A technical report. Int. J. Res. Appl. Sci. Eng. Technol. 2018, 6, 341–346. [Google Scholar] [CrossRef]
  11. Nikki, J.S. Drones: The Newest Technology for Precision Agriculture. Nat. Sci. Educ. 2015, 44, 89–91. [Google Scholar]
  12. Abderahman, R.; Alireza, A.; Karim, R.; Horst, T. Drones in agriculture: A review and bibliometric analysis. Comput. Electron. Agric. 2022, 198, 107017. [Google Scholar]
  13. Pasquale, D.; Luca, D.V.; Luigi, G.; Luigi, I.; Davide, L.; Francesco, P.; Giuseppe, S. A review on the use of drones for precision agriculture. In IOP Conference Series: Earth and Environmental Science; IOP Publishing: Bristol, UK, 2019; Volume 275, p. 012022. [Google Scholar]
  14. Spoorthi, S.; Shadaksharappa, B.; Suraj, S.; Manasa, V.K. Freyr drone: Pesticide/fertilizers spraying drone—An agricultural approach. In Proceedings of the 2017 2nd International Conference on Computing and Communications Technologies (ICCCT), Chennai, India, 23–24 February 2017; pp. 252–255. [Google Scholar] [CrossRef]
  15. Paulina, L.R.; Jaroslav, S.; Adam, D. Monitoring of crop fields using multispectral and thermal imagery from UAV. Eur. J. Remote Sens. 2019, 52, 192–201. [Google Scholar]
  16. Namani, S.; Gonen, B. Smart agriculture based on IoT and cloud computing. In Proceedings of the 2020 3rd International Conference on Information and Computer Technologies (ICICT), San Jose, CA, USA, 9–12 March 2020; pp. 553–556. [Google Scholar]
  17. Jawad, A.M.; Jawad, H.M.; Nordin, R.; Gharghan, S.K.; Abdullah, N.F.; Abu-Alshaeer, M.J. Wireless power transfer with magnetic resonator coupling and sleep/active strategy for a drone charging station in smart agriculture. IEEE Access 2019, 7, 139839–139851. [Google Scholar] [CrossRef]
  18. Goudarzi, S.; Kama, N.; Anisi, M.H.; Zeadally, S.; Mumtaz, S. Data collection using unmanned aerial vehicles for Internet of Things platforms. Comput. Electr. Eng. 2019, 75, 1–15. [Google Scholar] [CrossRef]
  19. Cicioğlu, M.; Çalhan, A. Smart agriculture with Internet of Things in cornfields. Comput. Electr. Eng. 2021, 90, 106982. [Google Scholar] [CrossRef]
  20. Moribe, T.; Okada, H.; Kobayashl, K.; Katayama, M. Combination of a wireless sensor network and drone using infrared thermometers for smart agriculture. In Proceedings of the 2018 15th IEEE Annual Consumer Communications & Networking Conference (CCNC), Las Vegas, NV, USA, 12–15 January 2018; pp. 1–2. [Google Scholar]
  21. Rajendra, P.S.; Ram, L.R.; Sudhir, K.S. Applications of Remote Sensing in Precision Agriculture: A Review. Remote Sens. 2020, 12, 3136. [Google Scholar] [CrossRef]
  22. Ramson, S.R.J.; León-Salas, W.D.; Brecheisen, Z.; Foster, E.J.; Johnston, C.T.; Schulze, D.G.; Filley, T.; Rahimi, R.; Soto, M.J.C.V.; Bolivar, J.A.L.; et al. A Self-Powered, Real-Time, LoRaWAN IoT-Based Soil Health Monitoring System. IEEE Internet Things J. 2021, 8, 9278–9293. [Google Scholar] [CrossRef]
  23. Chen, H.; Zhang, S. A temperature auto-monitoring and frost prevention real time control system based on a Z-BEE networks for the tea farm. In Proceedings of the 2012 IEEE Symposium on Electrical & Electronics Engineering (EEESYM), Kuala Lumpur, Malaysia, 24–27 June 2012; pp. 644–647. [Google Scholar] [CrossRef]
  24. Haseeb, K.; Ud Din, I.; Almogren, A.; Islam, N. An Energy Efficient and Secure IoT-Based WSN Framework: An Application to Smart Agriculture. Sensors 2020, 20, 2081. [Google Scholar] [CrossRef]
  25. Valecce, G.; Petruzzi, P.; Strazzella, S.; Grieco, L.A. NB-IoT for Smart Agriculture: Experiments from the Field. In Proceedings of the 2020 7th International Conference on Control, Decision and Information Technologies (CoDIT), Prague, Czech Republic, 29 June–2 July 2020; Volume 1, pp. 71–75. [Google Scholar] [CrossRef]
  26. Tsai, C.F.; Liang, T.W. Application of IoT Technology in The Simple Micro-farming Environmental Monitoring. In Proceedings of the 2018 IEEE International Conference on Advanced Manufacturing (ICAM), Yunlin, Taiwan, 16–18 November 2018; pp. 170–172. [Google Scholar] [CrossRef]
  27. Le, N.T.; Hossain, M.A.; Jang, Y.M. A survey of design and implementation for optical camera communication. Signal Process. Image Commun. 2017, 53, 95–109. [Google Scholar] [CrossRef]
  28. Onodera, Y.; Takano, H.; Hisano, D.; Nakayama, Y. Avoiding Inter-Light Sources Interference in Optical Camera Communication. In Proceedings of the IEEE Global Telecommunications Conference (GLOBECOM), Madrid, Spain, 7–11 December 2021. [Google Scholar]
  29. Moh, K.H.; Chowdhury, M.Z.; Md, S.; Nguyen, V.T.; Jang, Y.M. Performance Analysis and Improvement of Optical Camera Communication. Appl. Sci. 2018, 8, 2527. [Google Scholar] [CrossRef]
  30. Islim, M.S.; Videv, S.; Safari, M.; Xie, E.; McKendry, J.J.D.; Gu, E.; Dawson, M.D.; Haas, H. The Impact of Solar Irradiance on Visible Light Communications. J. Light. Technol. 2018, 36, 2376–2386. [Google Scholar] [CrossRef]
  31. Takai, I.; Harada, T.; Andoh, M.; Yasutomi, K.; Kagawa, K.; Kawahito, S. Optical Vehicle-to-Vehicle Communication System Using LED Transmitter and Camera Receiver. IEEE Photonics J. 2014, 6, 1–14. [Google Scholar] [CrossRef]
  32. Elizabeth, E.; Shivani, T.; Navid, B.H.; Stanislav, V.; Zabih, G.; Stanislav, Z. 400 m rolling-shutter-based optical camera communications link. Opt. Lett. 2020, 45, 1059–1062. [Google Scholar]
  33. Takano, H.; Nakahara, M.; Suzuoki, K.; Nakayama, Y.; Hisano, D. 300-Meter Long-Range Optical Camera Communication on RGB-LED-Equipped Drone and Object-Detecting Camera. IEEE Access 2022, 10, 55073–55080. [Google Scholar] [CrossRef]
  34. Li, T.; Onodera, Y.; Hisano, D.; Nakayama, Y. Drone Trajectory Control for Line-of-Sight Optical Camera Communication. In Proceedings of the ICC 2022—IEEE International Conference on Communications, Seoul, Republic of Korea, 16–20 May 2022; pp. 3808–3813. [Google Scholar] [CrossRef]
  35. Onodera, Y.; Takano, H.; Hisano, D.; Nakayama, Y. Adaptive N+1 Color Shift Keying for Optical Camera Communication. In Proceedings of the 2021 IEEE 94th Vehicular Technology Conference (VTC2021-Fall), Virtual Conference, 27–28 September 2021; pp. 1–5. [Google Scholar]
Figure 1. Conceptual system architecture (orange arrows: drone trajectory).
Figure 2. Block diagram of proposed system.
Figure 3. Perspective transformation to image plane.
Figure 4. Ground coverage of drone-mounted camera (a: top length, b: bottom length, c: height of trapezoid).
Figure 5. Concept of trajectory control algorithm.
Figure 6. Total trajectory length.
Figure 7. Total travel time.
Figure 8. Experimental setup.
Figure 9. Sensor node.
Figure 10. Sensor node taken from drone.
Figure 11. Sensor node placement (red point: sensor node, orange arrow: drone trajectory).
Figure 12. Recognition accuracy of the LED panels.
Figure 13. Bit error rate with threshold of 1 × 10⁻³.
Table 1. Variables.

Variable | Definition
I | Set of sensor nodes
i, j | Sensor node identifiers in I
r_i | Radius of ith sensor node
(x_i, y_i, z_i) | Position of ith sensor node in global coordinate system
(x_c, y_c, z_c) | Position of camera in global coordinate system
(u_i, v_i) | Position of ith sensor node in image plane
l_h | Horizontal resolution of image plane
l_v | Vertical resolution of image plane
φ_h | Horizontal angle of view
φ_v | Vertical angle of view
θ | Elevation angle of camera
f | Focal length of camera
ρ | Image sensor size
R_i | Data rate
S_i | Spatial multiplicity
D | Symbol rate
N_i | Symbol number for CSK
M_i | Maximum data size
T_i | Transmission time
a | Top length of camera coverage trapezoid
b | Bottom length of camera coverage trapezoid
c | Height of camera coverage trapezoid
h | Altitude of receiver camera
A | Ground coverage of receiver camera
Table 2. Simulation parameters.

Parameter | Value
Altitude of receiver camera h | 5 m
Speed of drone | 3 m/s
Horizontal angle of view φ_h | 77°
Vertical angle of view φ_v | 40°
Elevation angle of camera θ | 20°
Table 3. Difference between measured and theoretical values [m].

θ [deg] | a | b | c
0 | 0.070 | 0.070 | 0.012
30 | 0.080 | 0.044 | 0.015
45 | 0.009 | 0.010 | 0.018
Table 4. Detection accuracy per parameter.

Batch | Img-Size | Weight | Epochs | Loss | mAP_0.5:0.95
16 | 640 | YOLOv5l | 300 | 0.0030712 | 0.71541
16 | 320 | YOLOv5l | 300 | 0.0034124 | 0.70959
8 | 640 | YOLOv5l | 300 | 0.0033053 | 0.68308
8 | 320 | YOLOv5l | 300 | 0.0036273 | 0.68849
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
