Article

Research on Positioning and Navigation System of Greenhouse Mobile Robot Based on Multi-Sensor Fusion

1 College of Water Resources and Civil Engineering, China Agricultural University, Beijing 100083, China
2 College of Horticulture, Henan Agricultural University, Zhengzhou 450002, China
3 National Engineering Research Center for Information Technology in Agriculture, Beijing 100097, China
* Authors to whom correspondence should be addressed.
Sensors 2024, 24(15), 4998; https://doi.org/10.3390/s24154998
Submission received: 2 April 2024 / Revised: 13 July 2024 / Accepted: 31 July 2024 / Published: 2 August 2024
(This article belongs to the Section Navigation and Positioning)

Abstract

The labor shortage and rising costs in the greenhouse industry have driven the development of automation, with the core of autonomous operations being positioning and navigation technology. However, precise positioning in complex greenhouse environments and narrow aisles poses challenges to localization technologies. This study proposes a multi-sensor fusion positioning and navigation robot based on ultra-wideband (UWB), an inertial measurement unit (IMU), odometry (ODOM), and a laser rangefinder (RF). The system introduces a confidence optimization algorithm based on weakening non-line-of-sight (NLOS) for UWB positioning, obtaining calibrated UWB positioning results, which are then used as a baseline to correct the positioning errors generated by the IMU and ODOM. The extended Kalman filter (EKF) algorithm is employed to fuse multi-sensor data. To validate the feasibility of the system, experiments were conducted in a Chinese solar greenhouse. The results show that the proposed NLOS confidence optimization algorithm significantly improves UWB positioning accuracy by 60.05%. At a speed of 0.1 m/s, the root mean square error (RMSE) for lateral deviation is 0.038 m and for course deviation is 4.030°. This study provides a new approach for greenhouse positioning and navigation technology, achieving precise positioning and navigation in complex commercial greenhouse environments and narrow aisles, thereby laying a foundation for the intelligent development of greenhouses.

1. Introduction

Greenhouses provide crops with suitable microclimates to mitigate the impact of external natural climates on agriculture. Greenhouse cultivation enables off-season vegetable production, enhances crops’ resistance to pests and diseases, and improves both the quality and yield of crops to meet the growing demand for vegetables [1,2]. However, challenges such as labor shortages and rising costs restrict the development of greenhouses [3]. To reduce reliance on manual labor and achieve greenhouse automation [4], fully autonomous operation has become the primary solution. Precise positioning and navigation, as the core technology for autonomous operation, have become critical challenges that urgently need to be addressed for the intelligent development of greenhouses.
A greenhouse is characterized by a large steel framework and densely grown crops inside, resulting in significant differences between the greenhouse environment and the outdoor environment during navigation and positioning [5,6]. Autonomous navigation technologies for greenhouses mainly include fixed rail, simultaneous localization and mapping (SLAM), and track-following systems [7]. Fixed rail navigation, including ground-based and suspended types, requires the installation of tracks within the greenhouse, which will affect other production processes in the greenhouse [8]. SLAM is primarily divided into visual SLAM [9,10] and laser SLAM [11,12], which involve map construction followed by path planning to achieve navigation [13,14,15,16]. Path planning includes global path planning and local path planning [17]. Global planning, a form of static planning, calculates the optimal path from start to end points based on the known layout of the environment, often utilizing algorithms such as A* [18,19] and ant colony [20,21] optimization. Local planning, a form of dynamic planning, updates map information in real time based on sensor data to generate a local optimal path from the current node to the next sub-goal node, commonly employing techniques like artificial potential fields [22] and neural network algorithms [23]. Due to the high similarity in texture and structure of greenhouse environments, as well as the dense foliage obstructing visibility, feature extraction based on visual and laser methods becomes challenging [24,25], rendering such navigation technologies less applicable for greenhouses. Tracked navigation includes wired and wireless tracking methods. The former relies on magnetic strips [26], QR code tags, or wires for guidance, offering limited flexibility. Wireless track navigation utilizes technologies like ultra-wideband (UWB) positioning to determine navigation paths. 
UWB technology, with its advantages of a high bandwidth, short pulses, and strong penetration capabilities, significantly reduces the impact of obstacles on pulse signals compared to other wireless communication technologies. Therefore, it is more suitable for navigation and positioning applications within greenhouses [27,28,29]. A single sensor’s signal penetration capability is affected by complex channels, making it difficult to accurately perceive the movement characteristics of mobile objects [30]. Similarly, positioning accuracy relying solely on UWB is also limited under conditions with numerous non-line-of-sight (NLOS) environments indoors in greenhouses [31,32,33].
The main factor affecting UWB positioning accuracy is the NLOS effect. Naheem et al. [34] proposed an improved loosely coupled Kalman filtering algorithm that integrates UWB and an IMU into a positioning system. This system not only eliminates the IMU’s cumulative error drift but also filters out the NLOS error effects of UWB; compared to traditional data fusion positioning systems, the positioning accuracy is improved by 27%. Liu et al. [35] utilized altitude information provided by a barometer to mitigate NLOS effects on UWB; by employing extended Kalman filtering (EKF) to fuse UWB and IMU information, the positioning accuracy increased by 45.71%. Sun et al. [36] employed a method combining power gain and probability statistics to identify NLOS signals, using an inertial navigation system to compute UWB position information and achieving indoor positioning in NLOS environments. However, the impact of NLOS effects in greenhouse environments remains unknown.
Compared with a single sensor, the fusion of multi-sensor information can provide a more comprehensive and accurate perception of the motion characteristics of moving objects [37,38,39]. Based on UWB technology, integrating other sensors for auxiliary positioning has become one of the important methods for solving greenhouse autonomous navigation. Zhang [40] proposed a positioning method that integrates UWB and laser LiDAR, with an average error that is reduced from 32 cm with UWB alone to 7 cm after fusion, demonstrating that multi-sensor fusion can effectively enhance navigation accuracy. Bi et al. [41] designed a fusion method using an extended Kalman filter for UWB ranging correction values and inertial measurement unit (IMU) data, achieving a positioning error of 11.95 cm under NLOS communication conditions. Long et al. [42] introduced a multi-sensor fusion positioning method combining UWB, an IMU, wheel odometry (ODOM), and laser LiDAR, achieving an error of 7.9 cm in positioning. However, greenhouse environments impose stringent requirements on navigation accuracy, particularly in narrow aisle planting greenhouses, where existing research struggles to meet navigation and positioning demands.
To enhance the accuracy of greenhouse positioning and navigation, this study proposes a navigation robot that fuses UWB, an IMU, ODOM, and a rangefinder (RF) to overcome the limitations of individual sensors. On this basis, an algorithm is proposed to weaken NLOS effects on UWB confidence, tailored specifically for greenhouse environments, thereby improving positioning and navigation accuracy. To verify the feasibility and precision of this navigation robot, experiments were conducted in a Chinese solar greenhouse located in Beijing, China to assess the positioning effectiveness and navigation accuracy during the robot’s movement.

2. Positioning System of Mobile Robot Based on Multi-Sensor Fusion

2.1. Multi-Sensor Fusion Positioning Method

The multi-sensor fusion positioning system comprises UWB, an IMU, an odometer, and an RF, all of which are installed on the mobile robot. The UWB system calculates the position of the mobile robot from the distances between the tag and each positioning base station; this is combined with the heading angle output by the IMU and the linear velocities of the vehicle body in the X and Y directions obtained from the odometer. These data are fused using the EKF algorithm and further combined with the rangefinder measurements. The principle is shown in Figure 1.

2.2. UWB Positioning Model Optimization

According to the principles of positioning in three-dimensional space, obtaining precise coordinate information typically requires at least four positioning base stations. Due to the high planting density inside greenhouses, navigation systems encounter extensive NLOS conditions. Therefore, the number of base stations in the UWB positioning system is increased to six, and the positioning area is divided into two equally sized regions (as shown in Figure 2). Each region forms a calibration system composed of four positioning base stations and one mobile tag. In the basic UWB positioning algorithm, the ranging information from each base station carries equal weight, and the computed position in unobstructed environments can be considered the actual position. However, in greenhouse environments, each base station is affected by NLOS to varying degrees. Therefore, it is necessary to assess the extent of NLOS’s impact on the ranging values of each base station. Based on these assessments, different weights are assigned to the ranging values of affected base stations: those less affected are given greater weights, while those more affected are given lower weights. This approach aims to improve the accuracy of position calculations from the UWB positioning system in greenhouse conditions.

2.2.1. Determining NLOS Environments

The method proposed in this study to identify the NLOS environment compares the abscissa of the mobile robot at a given moment, calculated from the laser rangefinder measurement, with the abscissa calculated by the UWB positioning system, thereby determining whether the mobile robot is affected by NLOS. A laser rangefinder is therefore introduced into the original positioning system; the rangefinder selected in this study has a measurement accuracy of ±1.5 cm in short-distance mode. We use the distances between the mobile robot and the ridges on both sides, measured by the laser rangefinder, to calculate the abscissa Cm of the mobile robot in the navigation coordinate system at time k:
$C_m^{(k)} = l + L + x_0 - \frac{h\cos\alpha_0}{\sin\alpha_0}$ (1)
It can be seen from Figure 2 that the mobile robot forms a “U-shaped” trajectory within the radiation range of the base station. Therefore, when the mobile robot travels to an aisle far away from the base station A0, the abscissa calculated by Formula (1) needs to be increased by a fixed value, which is as follows:
$C_m^{(k)} = l + L + x_0 - \frac{h\cos\alpha_0}{\sin\alpha_0} + 1.8$ (2)
In the formula, l is the distance between the laser rangefinder and the vehicle’s UWB tag; L is the distance from the bottom of the ridge to the base station A0; x0 is the distance measured by the rangefinder; α0 is the bottom angle of the ridge; h is the height of the laser rangefinder above the ground; and 1.8 m is the sum of the ridge and aisle widths.
This value is then compared with the abscissa CUWB obtained by the UWB system, and the difference at time k is calculated as follows:
$|\Delta C^{(k)}| = |C_{UWB}^{(k)} - C_m^{(k)}|$ (3)
An appropriate judgment threshold γ is then selected to determine whether each base station is in a non-line-of-sight environment relative to the mobile robot.
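As a sketch, the detection procedure above can be expressed in Python; the geometric parameters l, L, h, and α0 are illustrative defaults rather than the calibrated values used on the actual robot:

```python
import math

def rangefinder_abscissa(x0, l=0.30, L=0.50, h=1.5, alpha0=math.radians(60),
                         far_aisle=False):
    """Abscissa C_m of the robot from the rangefinder reading x0 (Formulas (1)-(2)).

    l, L, h, alpha0 are illustrative placeholders for the geometry described
    in the text (rangefinder-to-tag offset, ridge-to-base-station distance,
    sensor height, ridge bottom angle); a real system would use calibrated values.
    """
    c = l + L + x0 - h * math.cos(alpha0) / math.sin(alpha0)
    if far_aisle:  # aisle far from base station A0: add ridge + aisle width
        c += 1.8
    return c

def is_nlos(c_uwb, c_m, gamma):
    """Flag NLOS when |C_UWB - C_m| >= 3*gamma (Formula (3) plus threshold test)."""
    return abs(c_uwb - c_m) >= 3 * gamma
```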

2.2.2. Weakening the Impact of NLOS

If |ΔC(k)| < 3γ, the mobile robot is determined to be in a line-of-sight (LOS) environment at this time. That is, the coordinate position obtained by the UWB positioning system is reliable, there is no need to weaken the influence of NLOS, and the driving program continues to execute.
If |ΔC(k)| ≥ 3γ, the position information calculated by the UWB positioning system is affected by NLOS, so the fused positioning information is inaccurate and the influence of NLOS must be weakened. Firstly, the operating area of the mobile robot is determined according to the coordinate position of the tag under test calculated by the UWB system. As Figure 2 shows, when the mobile robot is driving in the aisle, at least two base stations are in the LOS environment regardless of the robot’s position; it is therefore only necessary to weaken the influence of NLOS on the other two base stations. Secondly, the base station affected by NLOS is combined with the two base stations in the LOS environment, and we calculate the difference between the lateral position of the tag under test measured by two different base station combinations and the lateral position calculated by the rangefinder. Based on this difference, the confidence βi of the ranging value of a base station in the NLOS environment is determined as follows:
$\beta_i = \frac{1}{|D_{Li}|}$ (4)
In the formula, DLi is the lateral position difference of the tag to be tested determined by different base station combinations.
Then, the laser rangefinder installed on the vehicle at the same height as the UWB tag is used to measure the height h of the tag above the ground, and the height difference Δh between the UWB base station and the tag is calculated. The horizontal distance di between the tag under test and each base station is then derived from the ranging value dmi between the tag and each base station:
$d_i = \sqrt{d_{mi}^2 - \Delta h^2}$ (5)
Finally, we incorporate the confidence levels and use the least squares method, weighted by βi, to solve for the estimated coordinates of the tag under test at this time.
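A minimal sketch of this confidence-weighted least squares solution, assuming a standard linearization of the range equations against the last anchor (the text does not spell out the linearization used), is:

```python
import numpy as np

def weighted_ls_position(anchors, dists, betas):
    """Confidence-weighted least-squares tag position (2-D sketch).

    anchors: (n, 2) anchor coordinates; dists: horizontal distances d_i
    (Formula (5)); betas: confidence weights (beta_i = 1/|D_Li| for NLOS
    anchors, larger values for LOS anchors). Linearizes the range equations
    against the last anchor and solves the weighted normal equations.
    """
    anchors = np.asarray(anchors, float)
    d = np.asarray(dists, float)
    xn, yn = anchors[-1]
    # Each row: 2*(x_i - x_n), 2*(y_i - y_n)
    A = 2.0 * (anchors[:-1] - anchors[-1])
    b = (d[-1] ** 2 - d[:-1] ** 2
         + np.sum(anchors[:-1] ** 2, axis=1) - xn ** 2 - yn ** 2)
    W = np.diag(np.asarray(betas, float)[:-1])
    return np.linalg.solve(A.T @ W @ A, A.T @ W @ b)
```

With exact distances and equal weights this recovers the true position; down-weighting an NLOS-corrupted anchor reduces its pull on the solution.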
The UWB abscissa obtained after weakening the influence of NLOS is compared again with the abscissa of the mobile robot calculated from the ranging information. If |ΔC(k)| < 3γ, the positioning information after weakening the influence of NLOS meets the requirements, and the driving program continues to execute. If |ΔC(k)| ≥ 3γ, the abscissa in the navigation coordinate system obtained by the weakening algorithm cannot meet the requirements; in this case, the abscissa calculated by the rangefinder and the ordinate calculated by the UWB positioning system are combined to form the coordinate information of the mobile robot.
Although the positioning accuracy of the UWB system in the greenhouse is lower than that of the laser rangefinder, the uneven greenhouse terrain means that if the mobile robot used the abscissa calculated by the rangefinder throughout its movement, its trajectory would oscillate. Therefore, the rangefinder positioning data are fused only when the accuracy of the other sensors is degraded, ensuring that the mobile robot can travel in a straight line.

2.3. IMU Error Judgment and Compensation

When the IMU operates for a long time, it readily accumulates errors, affecting the accuracy of the proposed positioning system. This study employs the following method to assess and compensate for its cumulative errors.
After the above calibration process, the UWB system can obtain more accurate coordinate information. At this time, the cumulative error of the IMU can be determined and compensated for based on the UWB positioning results. The coordinate value calculated by the UWB is compared with the coordinate value calculated by the IMU device to obtain |δP| as follows:
$|\delta P| = |P_{UWB} - P_{IMU}|$ (6)
If |δP| exceeds a preset threshold while |ΔC(k)| < 3γ, the UWB positioning result is more accurate and the IMU positioning result is less so. In this case, the positioning result of the UWB system can be used to compensate for the IMU error, and the specific value δR(k) to be compensated at time k is calculated as follows:
$\delta R^{(k)} = |P_{UWB}^{(k)} - P_{IMU}^{(k)}|$ (7)
Then the position information calculated by the IMU system at the subsequent k + 1 time can be calibrated with this value:
$P_{IMU,adj}^{(k+1)} = P_{IMU}^{(k+1)} - \delta R^{(k)}$ (8)
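The compensation logic of this subsection can be sketched as follows; the IMU/UWB disagreement threshold is a free parameter, since its value is not stated in the text:

```python
def imu_drift_compensation(p_uwb, p_imu, p_imu_next, delta_p_threshold, dc, gamma):
    """Sketch of the IMU drift correction: when the UWB fix is trusted
    (|dC| < 3*gamma) but disagrees with the IMU by more than a set threshold
    (delta_p_threshold, an assumed free parameter), use the UWB result to
    remove the IMU's accumulated drift at the next time step.

    p_* are (x, y) tuples; returns the adjusted IMU position at k+1.
    """
    drift = (p_imu[0] - p_uwb[0], p_imu[1] - p_uwb[1])   # signed drift at time k
    magnitude = (drift[0] ** 2 + drift[1] ** 2) ** 0.5   # |delta_P|
    if magnitude > delta_p_threshold and abs(dc) < 3 * gamma:
        # Subtract the observed drift from the next IMU estimate.
        return (p_imu_next[0] - drift[0], p_imu_next[1] - drift[1])
    return p_imu_next
```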

2.4. Odometer Positioning Analysis

2.4.1. Establish the Kinematics Model

The greenhouse mobile robot adopts a crawler chassis. For the convenience of our calculations, the kinematics model can be simplified as a two-wheel differential movement model, and only the movement in two-dimensional space is considered. The kinematics model of the mobile robot in the aisle is shown in Figure 3.
According to the wheel speeds measured by the encoders of the two driving wheels and the constructed kinematics model, the linear and angular velocities of the whole vehicle can be calculated. Over adjacent time intervals, the mobile robot can be regarded as moving along a circle centered at O’ with rotation angle θ. Assuming the two-wheel differential chassis experiences no lateral slippage, the linear and angular velocities of the mobile robot in the carrier coordinate system are expressed as follows:
$\begin{bmatrix} V_y \\ \omega \end{bmatrix} = \begin{bmatrix} \frac{1}{2} & \frac{1}{2} \\ -\frac{1}{d} & \frac{1}{d} \end{bmatrix} \begin{bmatrix} V_L \\ V_R \end{bmatrix}$ (9)
In the formula, Vy is the linear velocity of the mobile robot along the Yd axis; VL and VR correspond to the linear velocity of the left and right wheels of the mobile robot, respectively; ω is the rotational angular velocity of the mobile robot in a two-dimensional plane; and d is the wheelbase of the left and right wheels. We can obtain the motion equation of the mobile robot in the global coordinate system as follows:
$\begin{bmatrix} \dot{X} \\ \dot{Y} \\ \dot{\theta} \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} V_x \\ V_y \\ \omega \end{bmatrix}$ (10)
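These two kinematic relations translate directly into code:

```python
import math

def body_velocities(v_left, v_right, d):
    """Two-wheel differential model (Formula (9)): body-frame forward
    velocity and yaw rate from left/right track speeds; d is the spacing
    between the left and right wheels."""
    v_y = 0.5 * (v_left + v_right)
    omega = (v_right - v_left) / d
    return v_y, omega

def global_rates(v_x, v_y, omega, theta):
    """Rotate body-frame velocities into the global frame (Formula (10))."""
    x_dot = v_x * math.cos(theta) - v_y * math.sin(theta)
    y_dot = v_x * math.sin(theta) + v_y * math.cos(theta)
    return x_dot, y_dot, omega
```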

2.4.2. Odometer Error Correction

The mobile robot uses two photoelectric encoders to form an odometer positioning system. However, the driving wheels will slip to a certain extent when turning in place in the solar greenhouse on non-hardened roads, resulting in inaccurate attitude angles calculated through the odometer. The pose error calculated in Formulas (9) and (10) will accumulate with time, so the positioning accuracy obtained by this method will decrease [43].
This study employs orientation sensing provided by an IMU to assist in the attitude detection during the steering process of the mobile robot. Angular velocity is utilized as a system variable to compare the angular velocity computed from odometry with that sensed by the IMU, thereby determining the slipping condition of the robot’s two tracks. In cases of slipping, the heading angle information provided by the IMU is used as observed angular data to steer the mobile robot.
To address the cumulative positioning errors of odometry, this study adopts a feature point correction method. When odometry accumulates errors, it leads to deviations in the position of the mobile robot. This position becomes a feature point, which is corrected using fused information from UWB and rangefinders to address the cumulative errors in odometry. The longitudinal position information of the mobile robot provided by UWB and the lateral position information obtained from rangefinders are integrated to establish the initial position of the odometer, enabling the continuation of navigation and positioning tasks in the subsequent stages.

2.5. Multi-Sensor Fusion Algorithm

2.5.1. Fusion of UWB/IMU/ODOM Data Based on Extended Kalman Filter

This study integrates UWB, an IMU, and ODOM under NLOS conditions, all of which are nonlinear. The extended Kalman filter (EKF) effectively handles nonlinear models using linearization techniques, while maintaining a relatively low level of computational complexity, making it suitable for real-time positioning in resource-constrained embedded systems [44,45]. Therefore, this study employs the EKF to fuse data from multiple sensors.
We assume the operating environment of the mobile robot is a two-dimensional plane in an ideal state. The pose of the mobile robot is used as the state vector in the fusion algorithm, and the data information provided by the UWB and IMU is used for measurement updates. We establish the state model and measurement model of the combined positioning system as follows:
$\begin{cases} x_{k+1} = f(x_k, u_{k+1}) + w_{k+1} \\ z_{k+1} = h(x_{k+1}) + v_{k+1} \end{cases}$ (11)
In the formula, xk+1 is the state variable at time k + 1; uk+1 is the control variable at time k + 1; f(·) is the functional relationship between the state variables xk+1 and xk; wk+1 is the motion noise at time k + 1; zk+1 is the system observation value at time k + 1; and vk+1 is the observation noise at time k + 1. The pose information uk+1 = [VO,X, VO,Y, ωO] calculated by the odometer is used as the control input in the prediction stage, where VO,X and VO,Y represent the movement speeds of the mobile robot along the X-axis and Y-axis of the navigation coordinate system, and ωO is its angular velocity.
According to the established kinematic model of the mobile robot, the pose at k + 1 time can be expressed as follows:
$x_{k+1} = x_k + \begin{bmatrix} \cos\theta_k & -\sin\theta_k & 0 \\ \sin\theta_k & \cos\theta_k & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} V_{O,X} \\ V_{O,Y} \\ \omega_O \end{bmatrix} dt$ (12)
The covariance matrix of the system state quantity in the prediction stage at time k + 1 is as follows:
$P_{k+1} = f_x P_k f_x^T + f_w Q_{k+1} f_w^T$ (13)
In the formula, Pk+1 is the covariance matrix of the state quantity xk+1; and Qk+1 is the covariance matrix of the motion noise of the wheel odometer.
The Jacobian matrix of the constructed kinematic model and the Jacobian matrix of motion noise are, respectively, as follows:
$f_x = \begin{bmatrix} 1 & 0 & -(V_X\sin\theta_k + V_Y\cos\theta_k)\,dt \\ 0 & 1 & (V_X\cos\theta_k - V_Y\sin\theta_k)\,dt \\ 0 & 0 & 1 \end{bmatrix}$ (14)
$f_w = \begin{bmatrix} \cos\theta_k\,dt & -\sin\theta_k\,dt & 0 \\ \sin\theta_k\,dt & \cos\theta_k\,dt & 0 \\ 0 & 0 & 1 \end{bmatrix}$ (15)
The differences between the calibrated position calculated by the IMU and the coordinate position provided by UWB at time k + 1, together with the heading angle θk+1 provided by the IMU, are input into the system as the observation value, yielding the following observation model:
$z_{k+1} = \begin{bmatrix} P_{IMU,adj}^{(k+1),x} - P_{UWB}^{(k+1),x} \\ P_{IMU,adj}^{(k+1),y} - P_{UWB}^{(k+1),y} \\ \theta_{k+1} \end{bmatrix} + v_{k+1}$ (16)
The calculated Kalman gain coefficient is as follows:
$K_{k+1} = P_{k+1} H_x^T (H_x P_{k+1} H_x^T + R_{k+1})^{-1}$ (17)
In the formula, Hx is the Jacobian matrix of the observation model; Rk+1 is the covariance matrix of the observation noise; and the noise estimate determined according to the sensor accuracy given by the manufacturer is as follows:
$R_{k+1} = \begin{bmatrix} \sigma_X & 0 & 0 \\ 0 & \sigma_Y & 0 \\ 0 & 0 & \sigma_\theta \end{bmatrix}$ (18)
In the formula, σX is the observation noise variance in the UWB output in the X-axis direction; σY is the observation noise variance in the UWB output in the Y-axis direction; and σθ is the observation noise variance in the attitude angle output by the IMU.
The posterior estimation of the state vector is as follows:
$\hat{x}_{k+1} = x_{k+1} + K_{k+1}\left(z_{k+1} - h(x_{k+1})\right)$ (19)
Substituting the posterior estimate of the state vector into the covariance matrix of the prediction stage yields a covariance matrix with posterior estimates:
$P_{k+1} = (I_3 - K_{k+1} H_x) P_{k+1}$ (20)
In the formula, I3 is the third-order identity matrix.
In summary, the fusion of UWB, IMU, and wheel odometer data has been achieved. However, among the three sensors, UWB is susceptible to the influence of NLOS, making it difficult to meet high-precision requirements on its own, while both the IMU and the wheel odometer suffer from error accumulation.
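A condensed sketch of one predict/update cycle is given below. For simplicity it treats the observation as a direct pose measurement (H_x = I3), whereas the observation vector above combines IMU/UWB position differences with the IMU heading:

```python
import numpy as np

def ekf_step(x, P, u, z, Q, R, dt):
    """One EKF predict/update cycle of the UWB/IMU/ODOM fusion (simplified
    sketch: the observation is a direct pose measurement, so H_x = I_3).

    x: [X, Y, theta] pose; P: 3x3 covariance; u: [V_OX, V_OY, omega_O]
    odometer control; z: observed pose; Q, R: motion/observation noise.
    """
    vx, vy, w = u
    th = x[2]
    c, s = np.cos(th), np.sin(th)
    # Predict: propagate the pose with the odometry control input.
    x_pred = x + np.array([(vx * c - vy * s) * dt,
                           (vx * s + vy * c) * dt,
                           w * dt])
    f_x = np.array([[1, 0, -(vx * s + vy * c) * dt],
                    [0, 1,  (vx * c - vy * s) * dt],
                    [0, 0, 1]])
    f_w = np.array([[c * dt, -s * dt, 0],
                    [s * dt,  c * dt, 0],
                    [0, 0, 1]])
    P_pred = f_x @ P @ f_x.T + f_w @ Q @ f_w.T
    # Update with H_x = I_3.
    K = P_pred @ np.linalg.inv(P_pred + R)
    x_new = x_pred + K @ (z - x_pred)
    P_new = (np.eye(3) - K) @ P_pred
    return x_new, P_new
```

When the prior covariance is small relative to R the filter trusts the odometry prediction; when it is large the filter pulls the state toward the measurement.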

2.5.2. Correcting the Position of the Mobile Robot Based on the Ranging Value

To compensate for the cumulative errors in UWB, the IMU, and wheel odometry, rangefinders are employed as auxiliary sensors. The lateral coordinates of the mobile robot are calculated using the ranging methods described in Formulas (1) and (2), and these coordinates are utilized to compensate for positioning errors caused by external influences and inherent inaccuracies in the other three sensors.
The mobile robot drives along the aisles in the greenhouse, forming a “U-shaped” trajectory. When the mobile robot reaches a set target point, it stops moving and then executes commands to go to the next target point. When the robot starts, the timestamp and the distances to both sides sensed by the rangefinder are recorded as reference data for handling subsequent large lateral deviations of the mobile robot.
When UWB is affected by NLOS and the IMU [46] and the odometer deviate significantly because of accumulated errors, producing a sizeable lateral offset of the mobile robot, the system automatically cuts off the sampling updates of the IMU and the odometer. The lateral position provided by the rangefinder is then compared with the recorded starting position, and the deviation between the two is used as the input to a PID controller that drives the mobile robot back to the established aisle, after which the position information calculated by UWB, the IMU, and the odometer is re-integrated into the system.
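A minimal PID controller of the kind described, with illustrative gains (the tuning used on the robot is not reported), might look like:

```python
class PID:
    """Minimal PID used to steer the robot back toward the aisle centerline;
    the gains passed in are illustrative assumptions, not values from the study."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = None

    def step(self, error, dt):
        """error: lateral deviation from the recorded start position (m);
        returns a steering command proportional to P, I, and D terms."""
        self.integral += error * dt
        deriv = 0.0 if self.prev_err is None else (error - self.prev_err) / dt
        self.prev_err = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv
```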

3. Experiments for Verification

3.1. Experimental Greenhouse and Mobile Robot Details

3.1.1. Experimental Greenhouse

To validate the feasibility of this positioning and navigating robot, experiments were conducted in a solar greenhouse located in Beijing, China. The experimental greenhouse measures 30 m in length (east to west) with a span of 5 m (north to south). The crops are planted in east–west rows. The planting ridges are 0.8 m wide with aisles that are 1 m wide. The actual conditions inside the greenhouse are depicted in Figure 4.

3.1.2. Mobile Robot Description

The mobile robot is mainly designed to meet the requirements of the confined environment of a Chinese solar greenhouse (CSG) with sufficient power and flexibility. The dimensions of the mobile robot are designed to be 1 m in length, 0.6 m in width, and 1.5 m in height (including the protective frame), with a track width of 0.1 m (as shown in Figure 5). The mobile chassis is equipped with a 48 V 80 Ah lithium battery and two 48 V 1700 W brushless DC reduction motors. The main control chip is STM32H750VBT6, powered separately by a 7.4 V 1500 mAh lithium battery. UWB positioning tags and IMU devices are installed on the top of the mobile robot’s cargo platform, 1.5 m above the ground, to minimize environmental signal interference. At the same height, an RF facing the ground is installed to assist the UWB positioning algorithm. The odometer is installed at the position of the drive wheel, coaxial with it, to record the changes in the distance moved relative to the ground and the directional angle of the wheel. Three RF modules are installed on both sides and directly in front of the mobile robot, working with other sensors to achieve precise positioning navigation and obstacle avoidance.

3.2. Experimental Study on Static Positioning Accuracy of UWB in NLOS Environments

To validate the feasibility of the NLOS optimization algorithm mentioned earlier, the aisle was partitioned into sections at 5 m intervals, and the midpoint of each aisle section at the end of the region served as a test point for UWB positioning accuracy. The robot was manually moved to each test point, where it stopped to collect positioning data before the operator moved it to the next test point.

3.3. Navigation Accuracy Experiment

The mobile robot departs from the starting point o at (1.25 m, 28 m), sequentially passing through the three target points: a at (1.25 m, 1.5 m), b at (4 m, 1.5 m), and c at (4 m, 28 m), as shown in Figure 2. Testing occurs at intervals of 2 m within the aisle and at intervals of 1 m along the path from a to b.
To assess the navigation accuracy of a mobile robot during its operation, laser measurement is employed. A laser emitter, facing the ground, is installed at the midpoint around each side of the navigation robot (as shown in Figure 6). Upon reaching the test point, markings from the four laser emitters are recorded. The intersection point formed by the cross of the four marked points is considered the center point. The perpendicular distance from this center point to the planned path represents the lateral deviation of the navigation robot (as shown in Figure 6). The angle between the longitudinal axis of the navigation robot and the planned path constitutes the course deviation (as shown in Figure 6). Lateral deviation and course deviation serve as quantitative indicators of navigation accuracy. The navigation robot is programmed to stop for 10 s and record deviations every 2 m along the planned path.
A comparison of lateral deviations between the sensor measurement method and the aforementioned laser measurement method is shown in Table 1. The root mean square error (RMSE) of lateral deviation by the sensor measurement method is 0.070 m, while for the laser measurement method, the RMSE of lateral deviation is 0.040 m. Therefore, this study utilizes the laser measurement method to assess the navigation accuracy of the mobile robot.

4. Results and Discussion

4.1. Analysis of UWB Static Localization Results in NLOS Environments

The coordinates of the 10 test points where UWB tags were statically positioned, together with the coordinate data before and after weakening the effects of NLOS and their root mean square errors, are presented in Table 2. In the static positioning test under the NLOS environment, the root mean square error before weakening ranges from a minimum of 0.244 m to a maximum of 0.497 m, with an average positioning error of 0.398 m. After the proposed algorithm weakens the influence of NLOS, the root mean square error ranges from a minimum of 0.115 m to a maximum of 0.226 m, with an average positioning error of 0.159 m. The UWB positioning system is clearly affected by NLOS, so its raw positioning accuracy cannot meet the requirements of precise positioning; the weakening algorithm proposed in this paper improves UWB positioning accuracy by 60.05%, effectively weakening the impact of NLOS.
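The 60.05% figure follows directly from the two average positioning errors:

```python
# Average positioning error before and after weakening NLOS (Table 2).
rmse_before, rmse_after = 0.398, 0.159
improvement = (rmse_before - rmse_after) / rmse_before * 100
assert round(improvement, 2) == 60.05  # matches the reported improvement
```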

4.2. Analysis of Navigation Accuracy

The navigation accuracy test was carried out according to the route planned in Figure 2, and the deviation data obtained at each test point are shown in Table 3. Deviations are defined as positive when the mobile robot faces left of the path or deviates to the left of the trajectory. As the test data in Table 3 show, at a speed of 0.1 m/s, the average lateral deviation of the mobile robot is 0.037 m with an RMSE of 0.038 m, and the average course deviation is 3.557° with an RMSE of 4.030°. However, the maximum lateral deviation between two adjacent test points is 0.095 m, and the maximum course deviation is 9.663°. The larger deviations appear near the initial movement positions of the north and south aisles and at the transition between areas A and B. This occurs because the vehicle body is not oriented correctly at the initial position and the system’s deviation correction lags, causing relatively large corrections. In addition, the transition between areas A and B is affected by NLOS, which makes the tag position calculated by the UWB positioning system inaccurate and makes it impossible to determine reliably which area the mobile robot is in. The IMU and the odometer also accumulate large errors after a long working period; at such times only the rangefinder can be used to correct the position of the mobile robot, which affects the accuracy of data fusion. However, after the mobile robot drives on for a certain distance, its direction is corrected, the sensor information is re-integrated into the system, and it runs smoothly and precisely.
To test the overall navigation performance, the mobile robot was driven continuously along the previously planned path, and the lateral deviation of the driving track was measured from the wheel marks on the driving path at intervals of twice the vehicle's length. The results are shown in Table 4: lateral deviations between 0.03 m and 0.04 m account for the largest share, and deviations between 0.02 m and 0.05 m account for as much as 86.67% of all samples. The fusion positioning and navigation system can therefore be used in solar greenhouses with complex environments and effectively improves positioning and navigation accuracy.
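The 86.67% figure is simply the sum of the three central bins of Table 4; a quick check (proportions transcribed from Table 4, with the sample count of 30 inferred from the 3.33%/6.67% steps):

```python
# Per-bin proportions (%) transcribed from Table 4; the bins are (0, 0.01],
# (0.01, 0.02], ..., (0.09, 0.10] of absolute lateral deviation in metres.
proportions = [0, 0, 26.67, 40, 20, 6.67, 3.33, 0, 3.33, 0]

# Share of track segments whose lateral deviation falls in (0.02 m, 0.05 m].
share_002_005 = proportions[2] + proportions[3] + proportions[4]
print(round(share_002_005, 2))     # 86.67
print(round(sum(proportions), 2))  # 100.0 (percentages of 30 segments, rounded)
```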

5. Conclusions and Prospects

A navigation robot utilizing fused positioning with UWB, an IMU, ODOM, and an RF was proposed, with an optimized UWB algorithm tailored for greenhouse conditions. Our experimental validation demonstrated the feasibility and accuracy of the navigation robot. The main conclusions of this study are as follows:
A weighted positioning algorithm is proposed to mitigate the effects of NLOS environments on UWB localization results. The results show that compared to before optimization, the proposed algorithm improves the robot’s positioning accuracy by 60.05%, demonstrating its ability to enhance the localization precision of navigation robots in greenhouse environments.
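The confidence-weighting idea can be illustrated generically. The sketch below is not the paper's algorithm; the fixes and confidence values are hypothetical, chosen only to show how down-weighting a suspected NLOS-corrupted fix pulls the estimate toward the trustworthy measurements:

```python
def weighted_fix(fixes, confidences):
    """Confidence-weighted mean of candidate (x, y) UWB fixes."""
    total = sum(confidences)
    wx = sum(c * x for (x, _), c in zip(fixes, confidences)) / total
    wy = sum(c * y for (_, y), c in zip(fixes, confidences)) / total
    return wx, wy

# Hypothetical fixes: the third is flagged as NLOS-corrupted and down-weighted.
fixes = [(1.25, 10.02), (1.27, 9.98), (1.60, 10.45)]
conf = [1.0, 1.0, 0.1]
x_hat, y_hat = weighted_fix(fixes, conf)
print(round(x_hat, 3), round(y_hat, 3))  # 1.276 10.021
```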
We built a greenhouse positioning and navigation system based on UWB, an IMU, an odometer, and a rangefinder. Because UWB, the IMU, and the odometer are affected by the greenhouse environment and by their cumulative errors, which degrade positioning accuracy, the ranging data were introduced into the multi-sensor fusion system, and our experiments verified the fused system. When the mobile robot drives at a speed of 0.1 m/s, the average lateral deviation is 0.037 m with an RMSE of 0.038 m, and the average course deviation is 3.557° with an RMSE of 4.030°, further improving navigation accuracy.
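In its simplest linear form, the EKF-based fusion described above reduces to an odometry-driven prediction corrected by calibrated UWB fixes. The following is a heavily simplified scalar sketch of that idea, not the paper's actual state model; the noise value Q is an assumption, and R is taken from the post-weakening average UWB error in Table 2:

```python
# Scalar Kalman-filter sketch of the fusion idea: the odometer drives the
# prediction step and a calibrated UWB fix drives the update step.
Q = 0.01        # process noise variance per step: odometry drift (assumption)
R = 0.159 ** 2  # UWB measurement noise variance (0.159 m average error, Table 2)

def predict(x, P, odom_delta):
    """Dead reckoning: integrate the displacement reported by the odometer."""
    return x + odom_delta, P + Q

def update(x, P, uwb_fix):
    """Correct the prediction with an NLOS-weakened UWB position fix."""
    K = P / (P + R)                           # Kalman gain
    return x + K * (uwb_fix - x), (1 - K) * P

x, P = 0.0, 1.0             # initial position (m) and variance
x, P = predict(x, P, 0.10)  # odometer reports 0.10 m of travel
x, P = update(x, P, 0.12)   # UWB fixes the position at 0.12 m
print(round(x, 3), round(P, 4))  # estimate sits between odometry and UWB
```

In the full system the state also carries heading, the IMU and rangefinder contribute additional corrections, and the nonlinear motion model requires the linearized (extended) form of the filter.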
The fusion technology of UWB, an IMU, ODOM, and an RF proposed in this study provides a new approach for greenhouse positioning and navigation technology, achieving precise positioning and navigation in complex commercial greenhouse environments and narrow aisles. This lays a foundation for the intelligent development of greenhouses. However, NLOS environments vary among different greenhouses and crops. Therefore, it is necessary to explore more signal processing methods under different NLOS conditions in further research to enhance the robustness and accuracy of UWB positioning.

Author Contributions

Conceptualization, W.S.; Methodology, X.L. and B.C.; Software, X.L.; Investigation, X.L., B.C., X.H. and N.Z.; Resources, X.H.; Data curation, X.L.; Writing—original draft, X.L.; Writing—review and editing, W.S., X.H. and B.C.; Supervision, H.W.; Project administration, W.S.; Funding acquisition, H.W. and W.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by China Agriculture Research System of MOF and MARA (CARS-23-D02).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

Figure 1. A fusion positioning framework based on UWB/IMU/ODOM/RF.
Figure 2. Layout of UWB positioning base station and planning trajectory.
Figure 3. Schematic diagram of mobile robot’s kinematics.
Figure 4. Greenhouse environment.
Figure 5. Greenhouse mobile robot ((a)—overall robot; (b)—inertial measurement unit; (c)—UWB tags; (d)—rangefinder; (e)—odometer).
Figure 6. Metrics for navigation accuracy measurement.
Table 1. Comparison of lateral deviation measured by two methods.
| Distance/m | Sensor Measurement Deviation/m | Laser Measurement Deviation/m |
|---|---|---|
| 2 | 0.074 | 0.028 |
| 4 | 0.082 | 0.049 |
| 6 | 0.088 | 0.037 |
| 8 | 0.089 | 0.036 |
| 10 | 0.094 | 0.031 |
| 12 | 0.028 | 0.052 |
| 14 | 0.047 | 0.036 |
| 16 | 0.070 | 0.047 |
| 18 | 0.074 | 0.038 |
| 20 | 0.085 | 0.034 |
| 22 | 0.047 | 0.039 |
| 24 | 0.025 | 0.046 |
| 26 | 0.056 | 0.043 |
Table 2. Static positioning results of UWB positioning system in NLOS environment.
| Actual Coordinates/m | Average Positioning Coordinates/m (Before Weakening) | Average Positioning Coordinates/m (After Weakening) | RMSE/m (Before Weakening) | RMSE/m (After Weakening) |
|---|---|---|---|---|
| (1.250, 25.000) | (1.283, 25.251) | (1.252, 25.231) | 0.244 | 0.126 |
| (1.250, 20.000) | (1.236, 20.430) | (1.240, 20.350) | 0.261 | 0.209 |
| (1.250, 15.000) | (1.334, 15.164) | (1.254, 15.160) | 0.481 | 0.115 |
| (1.250, 10.000) | (1.521, 10.473) | (1.252, 10.274) | 0.458 | 0.141 |
| (1.250, 5.000) | (1.334, 5.382) | (1.248, 5.152) | 0.329 | 0.119 |
| (4.000, 5.000) | (4.249, 5.596) | (4.016, 5.402) | 0.497 | 0.226 |
| (4.000, 10.000) | (4.138, 10.472) | (4.065, 10.302) | 0.400 | 0.201 |
| (4.000, 15.000) | (4.451, 15.292) | (3.987, 15.176) | 0.457 | 0.122 |
| (4.000, 20.000) | (4.143, 20.625) | (4.012, 20.196) | 0.407 | 0.140 |
| (4.000, 25.000) | (3.744, 25.477) | (4.009, 25.324) | 0.447 | 0.189 |
Table 3. Navigation accuracy test results.
| Coordinate Test Point/m | Lateral Deviation/m | Course Deviation/° |
|---|---|---|
| (1.25, 26.00) | −0.043 | 1.426 |
| (1.25, 24.00) | −0.046 | 2.103 |
| (1.25, 22.00) | 0.039 | −3.277 |
| (1.25, 20.00) | −0.034 | 1.325 |
| (1.25, 18.00) | −0.038 | 1.019 |
| (1.25, 16.00) | −0.047 | −1.715 |
| (1.25, 14.00) | 0.036 | −4.179 |
| (1.25, 12.00) | −0.052 | 3.332 |
| (1.25, 10.00) | 0.031 | 1.415 |
| (1.25, 8.00) | 0.036 | 2.927 |
| (1.25, 6.00) | −0.037 | −4.348 |
| (1.25, 4.00) | −0.049 | −2.246 |
| (1.25, 2.00) | 0.028 | −5.433 |
| (2.25, 1.50) | −0.024 | 3.586 |
| (3.25, 1.50) | 0.031 | −2.729 |
| (4.00, 2.00) | −0.052 | 2.331 |
| (4.00, 4.00) | 0.043 | −2.211 |
| (4.00, 6.00) | 0.049 | 5.226 |
| (4.00, 8.00) | 0.033 | −3.846 |
| (4.00, 10.00) | −0.043 | −4.233 |
| (4.00, 12.00) | −0.048 | −6.817 |
| (4.00, 14.00) | 0.022 | 1.215 |
| (4.00, 16.00) | −0.032 | −6.424 |
| (4.00, 18.00) | 0.027 | 3.239 |
| (4.00, 20.00) | 0.022 | −3.622 |
| (4.00, 22.00) | −0.031 | −7.391 |
| (4.00, 24.00) | −0.033 | −8.143 |
| (4.00, 26.00) | 0.035 | −3.835 |
Table 4. Continuous navigation test results: distribution of lateral deviations.
| Absolute Range of Lateral Deviation/m | Proportion/% |
|---|---|
| (0, 0.01] | 0 |
| (0.01, 0.02] | 0 |
| (0.02, 0.03] | 26.67 |
| (0.03, 0.04] | 40 |
| (0.04, 0.05] | 20 |
| (0.05, 0.06] | 6.67 |
| (0.06, 0.07] | 3.33 |
| (0.07, 0.08] | 0 |
| (0.08, 0.09] | 3.33 |
| (0.09, 0.10] | 0 |


Cheng, B.; He, X.; Li, X.; Zhang, N.; Song, W.; Wu, H. Research on Positioning and Navigation System of Greenhouse Mobile Robot Based on Multi-Sensor Fusion. Sensors 2024, 24, 4998. https://doi.org/10.3390/s24154998


