Review

An Extensive Review of Mobile Agricultural Robotics for Field Operations: Focus on Cotton Harvesting

1 College of Engineering and Department of Entomology, University of Georgia, Tifton, GA 31793, USA
2 Department of Crop and Soil Sciences, University of Georgia, Tifton, GA 31793, USA
3 Cotton Incorporated, Cary, NC 27513, USA
4 Department of Entomology, University of Georgia, Tifton, GA 31793, USA
* Author to whom correspondence should be addressed.
AgriEngineering 2020, 2(1), 150-174; https://doi.org/10.3390/agriengineering2010010
Submission received: 30 January 2020 / Revised: 19 February 2020 / Accepted: 2 March 2020 / Published: 4 March 2020

Abstract
In this review, we examine opportunities and challenges for 21st-century robotic agricultural cotton harvesting research and commercial development. The paper reviews opportunities in the agricultural robotics industry, and a detailed analysis is conducted for the cotton harvesting robot industry. The review is divided into four sections: (1) general agricultural robotic operations, where we survey current robotic technologies in agriculture; (2) opportunities and advances in related robotic harvesting fields, focused on robotic harvesting technologies; (3) status and progress in cotton harvesting robot research, which concentrates on current research and technology development in cotton harvesting robots; and (4) challenges in commercial deployment of agricultural robots, where obstacles to commercializing and using these robots are reviewed. Conclusions are drawn about cotton harvesting robot research and the potential of multipurpose robotic operations in general. The development of multipurpose robots that can perform multiple operations on different crops, thereby increasing their value, is also discussed. In each section except the conclusion, the analysis is divided into four robotic system categories: mobility and steering, sensing and localization, path planning, and robotic manipulation.

1. Introduction

The cotton industry holds an important position as a commercial crop worldwide, especially in the U.S., China, India, and Brazil, which are the leading producers of cotton [1]. However, the industry faces several challenges, particularly in cotton harvesting. Timely harvesting of quality cotton fiber is among the most pressing challenges in cotton production. The current practice of mechanical harvesting after defoliation has led to huge losses in the industry since its inception in the 1950s [2]. Open bolls can sit for 40 days waiting to be picked, since it is advised to pick the cotton only after at least 60% to 75% of the bolls have opened [3]. The cotton picker also needs defoliated plants to harvest, which adds expense for the farmer [3]. After defoliants are applied to the plants, farmers must wait 10 to 14 days before harvesting the crop, and defoliant activity can be compromised by rainfall. In addition, cotton is harvested at or below 12 percent moisture because wet cotton causes clogging problems in the picker and in the ginning process, which can add more waiting time during harvest [3]. This waiting time exposes the open bolls to harsh conditions that degrade their quality, so any solution that reduces cotton losses and improves quality would be welcomed by the industry. In most cases, the mechanical harvesting machines are very large and expensive (the current six-row cotton picker costs around $725,000). These expensive cotton pickers sit under a shed for more than nine months a year, waiting to harvest for only three months. The machines also weigh more than 33 tons, causing soil compaction, which reduces land productivity [4]. Maintenance of such machines is expensive and complicated, and breakdowns in the field can take days to repair, reducing operating efficiency and exposing bolls to further weather-related quality degradation [2].
Most cotton harvesting technologies are either “stripper” or “spindle” pickers [3]. The “stripper” pulls the lint from the plant along with some plant matter [3]; the heavier plant matter is then dropped while the lint is carried to the basket at the back of the machine. The “spindle” picker grabs the seed-cotton from the plant using barbed spindles that rotate at high velocity, and a counter-rotating doffer then strips the seed-cotton from the spindles [3]. One “stripper” or “spindle” picking unit is used per row, so a six-row picker carries six picking units. The current six-row picking technology travels at 5.5 mph in the field and covers around 8 to 10 acres per hour. [5] and [6] estimated that, for a two-bale-per-acre yield, 40-inch row spacing has 23.2 bolls per row-ft while 30-inch row spacing has 17.4 bolls per row-ft. A 40-inch row has 13,081 linear feet per acre, while a 30-inch row has 17,424 linear feet per acre. This gives an estimated 303,479 bolls per acre for 40-inch rows and 303,178 bolls per acre for 30-inch rows (Table 1). Small robotic rovers that collect at least 12,140 bolls per trip, once per day over 25 trips per harvest cycle, would cover around 303,500 freshly opened bolls that have experienced minimal degradation. A small rover moving at 3 mph, picking one boll every 3 s, and working 10 hours per day would finish harvesting one acre of 40-inch rows within 50 days (Table 1). Hence, at a cost of around $7,000 per robot, roughly 104 robots could be purchased for the price of one large machine costing more than $725,000 (Table 1).
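As a plausibility check on these estimates, the short Python sketch below reproduces the arithmetic using only the figures quoted above (boll densities, row lengths, and prices); it is illustrative and not a reproduction of Table 1 itself.

```python
# Boll-count and cost estimates recomputed from the figures quoted in the text.
bolls_per_row_ft = {40: 23.2, 30: 17.4}   # bolls per row-foot at two bales/acre
row_ft_per_acre = {40: 13081, 30: 17424}  # linear feet of row per acre

for spacing in (40, 30):
    bolls = bolls_per_row_ft[spacing] * row_ft_per_acre[spacing]
    print(f"{spacing}-inch rows: ~{bolls:,.0f} bolls per acre")
# -> ~303,479 bolls/acre (40-inch) and ~303,178 bolls/acre (30-inch)

bolls_per_day = 10 * 3600 / 3                  # one boll every 3 s, 10 h/day
print(f"One rover picks ~{bolls_per_day:,.0f} bolls per day")
print(f"Rovers per large picker: ~{round(725_000 / 7_000)}")  # ~104 rovers
```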
One potential advantage of robotics is using a single rover platform for multiple tasks via interchangeable attachments. By changing attachments and selecting the appropriate software application, robots can perform tasks like planting, weeding, spraying, and harvesting. These machines can also be reprogrammed to cover different tasks on a different crop, which can be a huge cost-saving measure for both small and large farmers.
The deployment of autonomous machines can improve fiber quality, since only the boll is picked. Autonomy means no human intervention, which reduces labor costs. Serving the environment by eliminating the use of non-degradable defoliants is also very important, as robots could pick mature bolls as they appear, without defoliation. The use of lightweight machines dramatically reduces soil compaction [7]. Operating costs can also be reduced, since autonomous robots may need less supervision and hence lower labor costs. In addition, electrical energy sources like solar can be introduced to reduce fuel costs, because robotic machines are light, electric, and can operate with limited energy consumption.
Unstructured environments such as agricultural fields require more advanced methods of machine learning [8]; agriculture demands an unstructured approach rather than a structured one [9]. Advancements in machine vision, machine learning (especially deep learning), sensing, and end-effector manipulation have fueled the application of robots in unstructured agricultural environments. However, most of these machines have been deployed only in horticultural crops [10].
The development of a cotton harvesting robot is feasible because there is an opportunity to use current advancements in machine vision, actuators, motors, and agricultural robotics to develop an autonomous platform that harvests cotton bolls and is adaptable to other cotton operations like planting, spraying, weeding, and scouting. To the best of our knowledge, no commercial cotton harvesting robot is available yet. Hence, we review current cotton harvesting robot research and the opportunities and challenges for future development.

2. Methodology

We discuss the most important issues concerning agricultural robots, harvesting robots, and finally, cotton harvesting robots. We start with general agricultural robots and then harvesting robots because we believe multipurpose agricultural robots have more value than single-purpose agricultural machines; it should therefore be feasible to adapt some commercially available machines to develop cotton harvesting robots at a lower cost. To achieve this, the literature was identified using the following keywords: “robot,” “agricultural robot,” “cotton harvesting robot,” “crop imaging,” “cotton harvesting,” “cotton robot,” “picking robot,” “robots in agriculture,” and “harvesting robots.” Relevant papers were retrieved from leading databases and indexing services such as Web of Science, Google Scholar, ScienceDirect, UGA Libraries, ProQuest, IEEE Xplore, and Scopus. The Google and Bing search engines were also used to retrieve commercial or non-academic material related to these keywords, as most commercial companies prefer to advertise products instead of writing scientific papers. Since the cotton harvesting robot is a new idea, very few materials covered the topic. Other sources of information like YouTube were also investigated to uncover any commercial or related work presented by companies or hobbyists. Approximately 74 peer-reviewed articles, 4 chapters, 3 books, 6 scientific reports, 24 refereed conference proceedings, and other sources from the websites of commercial agricultural robotics companies were selected and included in this review.
Each of the articles retrieved was analyzed according to the robot components in Figure 1. Robots usually consist of four components (Figure 1): sensing and localization (Sensing), path planning (Planning), mobility and steering (Mobility), and end-effector manipulation (Manipulation) [11]. The adoption and performance of agricultural robots rely on these four components [11], and it is difficult for a robot to succeed when any of them fails to meet expectations. Mobility and steering provide the movement ability the robot needs to reach the target and accomplish the mission; this can involve legs, wheels, tires, wings, undulation abilities, propellers, etc. Sensing and localization is the perception of the environment and the objects in it that may allow or hinder the operation of the robot. The robot needs to reason about which environmental characteristics are conducive to its operation, by detecting clear paths, obstacles, and targets, and by remembering current and past positions. Path planning is the optimized decision the robot makes to reach its targets: the robot identifies the target position and then plans its movement according to its capability and the sensed information, choosing the most optimized path to quickly accomplish the mission. When the robot reaches the target, the end effector and manipulators (manipulation) are executed to accomplish the mission. In agriculture, this can be picking fruit, spraying chemicals, measuring the size of the target, killing weeds, taking soil or leaf samples, planting seeds, plowing land, or irrigating plants.
Therefore, the robot’s operation can be summarized as sense, reason, plan, and act. Each of these steps may succeed or fail, and each takes time, so success rates and execution times can be measured for every operation. [12] proposed a useful method and set of parameters for measuring robot performance in the field (Table 2).
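To make the sense-reason-plan-act cycle and its measurable outcomes concrete, here is a minimal Python sketch that times each step and records success or failure; the component interface is an assumption for illustration and is not taken from [12].

```python
import time

def run_cycle(robot):
    """Run one sense-plan-act cycle, timing each step.
    `robot` is assumed (hypothetically) to expose sense(), plan(), and act()
    methods that return True on success and False on failure."""
    timings, outcomes = {}, {}
    for step in ("sense", "plan", "act"):
        start = time.perf_counter()
        outcomes[step] = getattr(robot, step)()
        timings[step] = time.perf_counter() - start
    return timings, outcomes

# Aggregating outcomes and timings over many cycles yields the per-operation
# success rates and execution times used to evaluate field performance.
```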
Finally, we discuss four main topics regarding cotton harvesting robots, organized around the four main operations in Figure 1. In the first topic, we discuss general agricultural robotic advances and opportunities that can be carried over to specific field operations. In the second, we discuss the main harvesting robots available and their similarity to other farm operations. In the third, we discuss the status of cotton harvesting robotics and compare it with other harvesting robots. Lastly, we conclude and frame the future work required for cotton harvesting robots.

3. Agricultural Robotics

General operations in agricultural robots are nearly identical for similar crops and differ only slightly for other types of crops, so it is worth discussing an agricultural robot framework that can be adapted across crops. In agriculture, various jobs such as plant phenotyping, sorting and packing, scouting, mowing, pruning, thinning, planting, spraying, weeding, harvesting, and picking could be automated using robots. This can be achieved by the same robot by changing attachments, selecting a different computer program for robotic perception, and fitting a different end effector for the new task. This kind of robot is called a multi-functional intelligent agricultural robot (MIAR).

3.1. Agricultural Robot Mobility and Steering

Most reported agricultural robots use wheels or legs (Table 3). Legs are advantageous for flexible movement in fields with high occlusion of stems and branches, but wheels provide faster and more convenient navigation. Some emerging technologies involve drones for operations such as spraying and scouting, but drones are unsuitable for operations such as crop harvesting and pruning. A more leveraged approach for operations like scouting is to combine a drone’s large-area sensing with a ground robotic system that is partially directed by analysis of the drone data [14]; the combined system achieves timely and efficient field operations [14]. Legged robots may be limited in speed, but they are advantageous for avoiding multiple obstacles and handling irregular terrain and crevices [15]. Over time, the deployment of wheeled robots has become more prevalent [15]. Comparing the two, wheeled robots achieve better execution times, while legged robots achieve good success rates and have the flexibility to maneuver over diverse terrains [15].
There are several types of mobility in agricultural robots, chosen according to the condition of the field or the robotic operation so as to be cost-effective and fast. For high-speed navigation, rail-mounted robots are useful, especially in phenotyping studies (Figure 2b). Legged robots like AgAnt (Figure 2a) and the Tarzan robot, which swings over the crops on a wire (Figure 2c), are preferred in wetlands and for close inspection of crops and animals. The rack-like weeding robot of [16] (Figure 2d) and the Fuji agricultural robot (Figure 2f) are preferred on slippery ground to reduce skidding. Dogtooth (www.dogtooth.tech), a strawberry harvesting robot, uses tracks in a typical strawberry growing system because this is a convenient method of navigation in greenhouses. Swinging robots like the Tarzan robot can be very good for high-throughput phenotyping tasks, as they can maneuver closer to the plants or animals than drones. However, the mobility needed also depends on the flexibility required on the farm. Four-wheel-steered robots like SwagBot (Figure 2i), Thorvald II (Figure 2e), or Agribot (Figure 2g) are required where wheel traction is difficult, such as in a feedlot or any muddy environment. In most cases, however, a two-wheel-steered robot is sufficient for normal farming operations.
Auat Cheein et al. [17], Ouadah et al. [18], and Cheein et al. [19] presented a simple model for a mobile robot that explicitly demonstrates how mobility is modeled for a car-like unmanned mobile robot. Xue et al. [20] reported a skid-steer robot that controlled the wheels on either side of the mobile robot by linking them.
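For readers unfamiliar with these two mobility models, the sketch below gives their standard kinematic equations: a bicycle (Ackermann) approximation for a car-like robot, as in [17,18,19], and a differential model for a skid-steer robot, as in [20]. The function signatures are our own illustration, not code from the cited works.

```python
import math

def car_like_step(x, y, theta, v, steer, wheelbase, dt):
    """Kinematic bicycle model for a car-like rover: v is forward speed,
    steer is the front-wheel angle, wheelbase the axle separation."""
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += (v / wheelbase) * math.tan(steer) * dt
    return x, y, theta

def skid_steer_step(x, y, theta, v_left, v_right, track, dt):
    """Skid-steer (differential) model: wheels on each side are driven
    together, so heading changes with the left-right speed difference."""
    v = 0.5 * (v_left + v_right)
    omega = (v_right - v_left) / track
    return (x + v * math.cos(theta) * dt,
            y + v * math.sin(theta) * dt,
            theta + omega * dt)
```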
The navigation of a robot in row-crop production (Figure 3) should be easy to track and recover during operation, allowing self-navigation to continue when some of the sensors (GPS, IMUs, or cameras) fail.

3.2. Agricultural Robot Sensing

Sensing is done to update the system on its environment so that the robot can navigate or pick fruit [2,11], discover disease, insects, or weeds, control spraying height above the canopy, and perform other tasks. Robust sensing systems are required for the robot to work well in dynamic environments with changing weather, vegetation variation, topographical changes, and unexpected obstacles. Most agricultural robots so far use imaging systems and Global Navigation Satellite Systems (GNSS) to achieve localization (Table 3). The advancement of imaging technologies has provided a great opportunity to sense and create 2D, 3D, and 4D (spatial + temporal) images of plants [24]. 2D, 3D, and 4D perception of agricultural environments has been achieved using the following sensors: visible light, near-infrared, thermal, fluorescence, spectroscopy, structural topography imaging, digital imaging (RGB), multispectral, color infrared, hyperspectral, spectroradiometer, spectrometer, 3D cameras, moisture, pH, light-reflective, light detection and ranging (LIDAR), sound navigation and ranging (SONAR), ground-penetrating radar, and electrical resistance tomography [24,25,26,27,28,29,30].
Other sensors, such as potentiometers, inertial, mechanical, ultrasonic, optical encoders, RF receivers, piezoelectric rate sensors, near-infrared (NIR), laser range finders (LRF), geomagnetic direction sensors (GDS), fiber optic gyroscopes (FOG), piezoelectric yaw, pitch, and roll rate sensors, acoustic sensors, and inertial measurement units (IMUs), have been used to provide robot heading and navigation feedback [7,31,32,33].
The choice of imaging sensor depends in part on how distinct the target is from the rest of the obstacle-dense environment. A normal digital camera may be used if the target can be visually identified in the field; identifying green citrus or green bell peppers among green foliage, by contrast, may require an alternative sensor or more complicated detection methods involving advanced machine learning [2,34,35,36,37,38]. Images may suffer from illumination changes, motion, cluttering, temperature swings, camera motion, wind-induced movement, deformation, and scene complexity, so image refinement algorithms may be required to enhance them [35,38]. Object recognition or feature extraction using pattern recognition and other machine vision algorithms can then be performed. Several methods of image rectification and enhancement have been reported: image smoothing and segmentation [2,36,39], morphological operations and filters [2,34,38], a fast-bilateral-filtering-based Retinex [35], illumination normalization [2,35], color space conversion [40], and normalized co-occurrence and gray-level co-occurrence matrices [41]. Feature extraction can be achieved using classical image processing techniques, such as color filtering and masking, or advanced machine learning techniques [2].
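As an illustration of the color filtering, masking, and morphological operations listed above, the hedged OpenCV sketch below segments bright, low-saturation regions in a field image. The HSV thresholds and file name are assumptions that would need tuning per camera and illumination; they are not values from the cited studies.

```python
import cv2
import numpy as np

# Hypothetical HSV band for bright, low-saturation (whitish) pixels.
LOW = np.array([0, 0, 180])
HIGH = np.array([180, 60, 255])

img = cv2.imread("field.jpg")                          # assumed input image
assert img is not None, "image not found"
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, LOW, HIGH)                     # color filtering/masking
kernel = np.ones((5, 5), np.uint8)
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # remove speckle noise
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x API
# Each sufficiently large contour is a candidate region for recognition.
```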
After sensing the surrounding environment, the robot needs to recognize and establish its position within that environment so it can make navigation decisions to reach its target. Machine vision and GPS have been used in agriculture to determine position and to help the robot move between or over crop rows and turn at the end of a row [42]. The robot needs to localize quickly so it can decide how to move; simultaneous localization and mapping (SLAM) algorithms are used to achieve this [42].
In a compact robot, it can be useful to use wireless sensors and the Robot Operating System (ROS) to transmit data between controllers and sensors. However, wireless transmission may be affected by several factors, such as the radio transmission standard used, data rate, nodes allowed per master controller, slave enumeration latency, the data type to be transmitted, transmission range, extendibility, sensor battery life, cost, and complexity [43].
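As a minimal example of moving sensor data through ROS, the rospy sketch below publishes a stubbed soil-moisture reading at 1 Hz; the node name, topic name, and read_sensor() stub are hypothetical placeholders.

```python
import rospy
from std_msgs.msg import Float32

def read_sensor():
    return 0.42  # stub value; replace with a real sensor driver

rospy.init_node("soil_sensor")                   # hypothetical node name
pub = rospy.Publisher("soil_moisture", Float32, queue_size=10)
rate = rospy.Rate(1)                             # publish at 1 Hz
while not rospy.is_shutdown():
    pub.publish(Float32(data=read_sensor()))
    rate.sleep()
```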

3.3. Agricultural Robot Path Planning

Path planning in agricultural fields refers to the decisions made by the robot to navigate the field safely without destroying the plants (Figure 3). Path planning also covers planning the movement of the manipulators to the target. In other words, path planning is the technique that uses the information provided by the robot’s sensing unit to decide on steering and manipulation to accomplish the mission.
Several path planning algorithms have been developed for robotic systems: grid-based search algorithms (every point/object is assumed to lie on a grid [44,45]), interval-based search algorithms (a paving is generated to cover the entire configuration space instead of a grid [44]), geometric algorithms (a safe path from start to goal is found first [46]), reward-based algorithms (the robot tries a path and is rewarded positively if successful and negatively otherwise [47]), artificial potential field algorithms (the robot is modeled as attracted to the goal and repelled by obstacles [48]), and sampling-based algorithms (a path is found from roadmaps sampled from the configuration space). Each of these algorithms has its uses, and some, like grid-based algorithms, are classic methods [49]. The most advanced are the sampling-based algorithms, as they attain considerably better performance in high-dimensional spaces with many degrees of freedom. Since many agricultural robots will work in swarms to accomplish tasks comparable to today’s big machines, real-time path and motion planning is required to control and restrict the motion of swarm agents [49].
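To ground the first category, the sketch below implements a minimal grid-based search (A* with a Manhattan heuristic) over a binary occupancy grid. It is a textbook illustration under our own assumptions, not a planner from the cited works.

```python
import heapq

def astar(grid, start, goal):
    """Grid-based A* search; grid[r][c] == 1 marks an obstacle.
    Returns the path as a list of (row, col) cells, or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan
    frontier = [(h(start), 0, start, [start])]
    seen = set()
    while frontier:
        _, cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0 and nxt not in seen):
                heapq.heappush(frontier,
                               (cost + 1 + h(nxt), cost + 1, nxt, path + [nxt]))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))  # detours around the obstacle row
```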
For plants like cotton, overlapping leaves prevent the robot from seeing clear rows along which to navigate and move the manipulator [7]. This was not the case for large plants like citrus, where the rows were clear enough for the robot to move between them and pick fruit on one or both sides of the row [50]. The robot also needs to plan how to move between rows without revisiting them, using simultaneous localization and mapping (SLAM), and how the arm will move without destroying branches [51].

3.4. Agricultural Robot Manipulation

Manipulators and end effectors are tools designed for the smooth operation of the robot on objects and the environment. End effectors consist of a gripper or tool that is impactive (physically grasping objects, like the citrus robot reported by [39]), ingressive (physically penetrating the surface of the object, like the soil sampling robot reported by [52]), astrictive or attractive (holding objects by suction or other external forces, like the tomato gripper reported by [53]), or contigutive (direct adhesion to the object) [54,55,56,57,58,59]. Some robots combine two or more end-effector techniques; for example, [53] used both astrictive and impactive grippers to improve success rates in tomato picking.
Table 3 summarizes other agricultural robots designed for non-harvesting tasks; N/A means the authors did not report information for that category. Most of the robots use GPS, cameras, and four-wheel platforms.
Manipulators can be characterized by their freedom of movement in space, known as degrees of freedom (DOF) (Figure 4): the body can translate up/down (heave), left/right (sway), and forward/backward (surge), and can rotate by yawing (about the normal axis), pitching (about the lateral axis), or rolling (about the longitudinal axis) [57]. In agriculture, robots have been designed with various levels of DOF, from three DOF (the strawberry robot of [58]) to seven DOF (the tomato robot of [53]). As DOF increases, flexibility increases, but the arm may become heavier and slower to respond [8,59]. In agriculture, actuators with high power-to-weight ratios are more suitable and are used effectively [42].
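To make the DOF notion concrete, the short sketch below gives the forward kinematics of a two-link planar arm, where each revolute joint contributes one rotational degree of freedom; adding joints increases reach and flexibility at the cost of weight and response time. The link lengths and angles are illustrative.

```python
import math

def planar_arm_fk(theta1, theta2, l1, l2):
    """End-effector position of a 2-DOF planar arm with link lengths
    l1, l2 and joint angles theta1, theta2 (radians)."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

print(planar_arm_fk(math.radians(30), math.radians(45), 0.4, 0.3))
```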

4. Agricultural Harvesting Robotics

We identified several harvesting robots that have been developed and reported and that could potentially serve as templates for a robotic system in cotton.

4.1. Agricultural Harvesting Robot Mobility and Steering

Most of the harvesting robots reported above are wheeled (Table 4), with an arm mounted on top of a vehicle that moves between or over the rows (Table 4). Most of the reported four-wheeled robots steer using the front tires (Ackermann steering model) (Table 4). Some robots deployed in greenhouses use rails, since greenhouses are semi-structured farms [78,79]. The Fraunhofer Institute for Production Systems and Design Technology IPK (www.ipk.fraunhofer.de) developed a semi-autonomous dual-arm prototype for cucumber harvesting that navigated on rails (Figure 2b).

4.2. Agricultural Harvesting Robot Sensing

Cotton bolls appear similar to flowers; hence, any potential flower harvesting robot could be adaptable. The 3D positions of flowers can be obtained using stereoscopic cameras [80]. [80] also reported that, with stereoscopic cameras, increasing the distance between the lenses reduces error, while increasing the distance between the lens and the object (flower) increases error. Harvesting is more complicated due to occlusion of the bolls: ripe fruit may be located inside the canopy, where access is limited.

4.3. Agricultural Harvesting Robot Path Planning

In harvesting, path planning depends on the manipulators, the end effectors, and the produce to be harvested. If the fruit to be gripped is very delicate, path planning becomes more complicated for impactive end effectors than for suction end effectors, to avoid collisions that may damage the fruit [81]. Fruit sold fresh to consumers is also more expensive to harvest, as the robot must match the care of human picking, unlike fruit harvested for juice or industrial processing. Heavy mechanical harvesting machines can still be used for fruit destined for industrial use, since such machines are fast compared to a robot.
If the plant branches are weak or the fruit is very delicate, path planning becomes expensive in order to preserve the plant, which must be left undamaged. Path planning is also expensive when an arm with many degrees of freedom (DOF) is used; most path planning methods are more effective and successful when the number of DOF is optimized to be just small enough to achieve the purpose [82]. In multi-arm robots, some machines use a prescription map to harvest many fruits at high speed [83]. Hohimer et al. [81] concluded that increasing the degrees of freedom let an apple-picking robot perform well, but at the slowest speed, because of the path prediction algorithms and the time the actuators took to reach the target. They advised using a lower degree of freedom to achieve the speed required to cover large fields like cotton farms.
Most of the robots reported use path tracking algorithms, GPS, and cameras to navigate on the farm (Table 4). Most robots used for greenhouse harvesting run on rails and hence need position control algorithms rather than navigation algorithms (Table 4). Most of the studies except [84] reported motion planning performed through arm trajectory motion, without search-based algorithms for path planning or obstacle avoidance. However, [85] introduced path planning in agriculture using advanced methods in neural networks (NN) and a genetic algorithm (GA) as early as 1997, and [86] developed a robot path planning system for limited end-of-row space using a depth-first search (DFS) algorithm.

4.4. Agricultural Harvesting Robot Manipulation

For manipulators, various degrees of freedom (DOF) have been studied, ranging from a three-DOF rectangular-coordinate manipulator to a nine-DOF manipulator (Table 4). The choice of end effector depends mainly on the type of farm product to be gripped and the degree of abrasion it can tolerate. Impactive, attractive, and contigutive end effectors are the most common, with most being attractive, impactive, or both, because fruits that warrant robotic harvesting require careful handling to avoid abrasion [87]. Ingressive end effectors are therefore uncommon in robotic fruit harvesting, as most fruit must remain pristine for the fresh market.
Manipulators are evaluated using success rates [12]. To be effective, cotton boll harvesting needs less than 3 s per boll. [88] reported a strawberry robot success rate of 53.9%, [89] achieved a success rate of 62.2% picking tomatoes, and [90] achieved a success rate of 84% for apple picking. All the researchers who reported execution time (Table 4) achieved more than 20 s per fruit. Hence, cotton harvesting cannot adopt the reported manipulation methods, at least not without modification to increase speed and success.
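A back-of-envelope calculation shows why execution time dominates feasibility here; the sketch reuses the roughly 303,500 bolls-per-acre estimate from Table 1 and a 10-hour working day, and the figures are illustrative only.

```python
BOLLS_PER_ACRE = 303_500          # Table 1 estimate
HOURS_PER_DAY = 10

for secs_per_boll in (3, 20):
    bolls_per_day = HOURS_PER_DAY * 3600 / secs_per_boll
    days = BOLLS_PER_ACRE / bolls_per_day
    print(f"{secs_per_boll} s/boll -> ~{days:.0f} picking-days/acre per arm")
# 3 s/boll -> ~25 days; 20 s/boll -> ~169 days: far too slow for one arm
```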

5. Cotton Harvesting Robot

Cotton bolls, as seen in Figure 5, do not require soft robotics like other fruit crops, which may demand very careful end-effector design and manipulation to avoid fruit damage. The plants grow close to each other because the cotton plant tends to fill out spaces as it grows [95], and most bolls begin opening from the bottom of the canopy [95].

5.1. Cotton Harvesting Robot Mobility and Steering

Most of the reported cotton harvesting robots use four-wheel vehicles (Figure 6, Figure 7 and Figure 8). Figure 7 shows a prototype developed in India by a startup owned by the Sambandam company: a four-wheel vehicle used on small farms in India, designed to be navigated by a human operator. A similar four-wheel rover approach, but with center articulation, was proposed by our group as well (Figure 6 and Figure 8) [2,7,96]. Currently, no other type of mobility, steering, or navigation algorithm or method for cotton harvesting has been reported.
Robotic cotton harvesting presents a complex environment for robot mobility (Figure 5). Since the plants are very close to each other and leave only a narrow path, accurate autonomous navigation that fuses sensors like IMUs, GPS, and machine vision becomes a vital requirement. Accurate path following without breaking branches will improve the precision and other metrics of the robotic system. Due to this complexity, [11] proposed keeping humans in the loop to operate semi-autonomous robots, increasing the productivity and quality of the operation rather than leaving the machines entirely alone. The technology for semi-autonomous or autosteering navigation is already available, so it can be more easily accepted and adopted. Commercial autonomous tractors, meanwhile, are highly desirable in precision farming because they are cost-effective, can reduce labor requirements, and are safe for humans if designed well. [7] proposed an algorithm for navigating the cotton field by detecting the rows from above: depth maps are acquired, transformed into binary depth maps, and rows are then detected using a sliding window algorithm that compares pixel depths to differentiate between canopy and ground.
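The sketch below is one plausible reconstruction of such a sliding-window scan over a top-down binary depth map (1 = canopy, 0 = ground); the window size and ground-coverage threshold are our assumptions, and the details of the algorithm in [7] may differ.

```python
import numpy as np

def detect_path_columns(binary_depth, window=40, ground_frac=0.2):
    """Slide a window across the columns of a binary depth map and flag
    windows that are mostly ground as candidate inter-row paths."""
    rows, cols = binary_depth.shape
    col_canopy = binary_depth.sum(axis=0)          # canopy pixels per column
    centers = []
    for start in range(0, cols - window + 1, window):
        win = col_canopy[start:start + window]
        if win.mean() < ground_frac * rows:        # window is mostly ground
            centers.append(start + int(np.argmin(win)))
    return centers  # column indices the rover could steer along
```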

5.2. Cotton Harvest Robot Sensing

Cotton is an indeterminate crop and continues to open bolls over a period of approximately 50 days [95]; hence, there is a need to harvest bolls as they open. The sensing system should be able to distinguish fully opened bolls from others and to detect open bolls located at the bottom of the plant canopy. However, lowering the camera into the canopy could readily destroy the lenses through impacts from plant branches.
Fortunately, cotton’s whitish color gives it a distinguishing feature that is easily detected by a color camera, although the recognition algorithm must also work well under direct sunlight. Several cotton recognition algorithms have been reported using machine vision techniques such as color segmentation [2], an optimized segmentation algorithm based on chromatic aberration, color subtraction with dynamic Freeman chain coding, region-based segmentation, deep learning methods, and ensemble methods [96,97,98,99,100,101]. All the methods described in these studies can be adopted to improve the current cotton harvesting prototypes. However, detecting individually occluded bolls using color segmentation remained a challenge [2,100].
Our group designed a cotton detection algorithm using a stereo camera that precisely located and tracked cotton bolls using deep learning [97]. A stereo camera (ZED) was used to estimate boll positions, and the mean error standard deviation was found to increase as the speed of the rover and its onboard camera increased to 0.64 km/h [2]. The robot performed well when stationary, with 9 mm RMSE and an average R2 of 99%, but once the vehicle moved at approximately 0.64 km/h, R2 dropped to 95% and RMSE increased to 34 mm [2]. This is the only study so far to demonstrate real-time detection and location estimation of cotton bolls under field conditions using an embedded system.
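For reference, the RMSE and R2 used in that evaluation can be computed as below; this is the standard definition shown as a sketch, not the exact evaluation code of [2].

```python
import numpy as np

def rmse_r2(estimated, truth):
    """Root-mean-square error and coefficient of determination (R^2)
    between estimated and ground-truth values (e.g., boll coordinates)."""
    estimated = np.asarray(estimated, dtype=float)
    truth = np.asarray(truth, dtype=float)
    err = estimated - truth
    rmse = np.sqrt(np.mean(err ** 2))
    r2 = 1.0 - np.sum(err ** 2) / np.sum((truth - truth.mean()) ** 2)
    return rmse, r2
```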

5.3. Cotton Harvest Robot Path Planning

With high cotton boll occlusion, path planning for navigation and for the manipulators becomes a crucial requirement for the successful deployment of a commercial agricultural robot [102]. We found no research describing manipulator path planning for a cotton harvesting robot, only navigation planning along the rows [7]. It seems researchers have not yet seen the need to develop a commercial product for this non-specialty crop, which nonetheless demands good path planning. Most path planning algorithms designed for agricultural robots use an IMU, a camera, and RTK-GPS (Table 3 and Table 4), so cotton harvesting systems may adopt this approach too. If small robots are adopted for cotton harvesting, navigation between the rows using LIDAR has been shown to be successful [103].
The cotton field environment is highly unpredictable due to varying plant canopy growth patterns: the cotton canopy grows to fully cover the space between the rows and can grow very tall [95]. Planting practices, especially plant spacing, require special recommendations for robotic harvesting. This could be achieved in cotton by modifying farm management practices or by manipulating crop genetics to allow robotic manipulators easy access to the bolls, as is common for specialty crops like apples, strawberries, and grapes, which have been bred to provide effective access to the fruit.

5.4. Cotton Harvest Robot Manipulation

Current gripper approaches are not effective for cotton plants because the boll fibers stick to the end effector, so grippers need to be strong enough to grab the boll effectively without destroying the plant. Harvesting bolls as they open adds the challenge of designing manipulators that begin with bolls at the bottom of the plant, which are highly occluded by the canopy. The reported cotton harvesting manipulators and end effectors picked the cotton boll, but they also broke plant branches, removed leaves, or knocked unharvested bolls to the ground [97]. Therefore, a well-designed astrictive or attractive method is desirable for cotton harvesting. Figure 6 shows a prototype cotton harvesting robot with a two-DOF Cartesian manipulator holding a vacuum suction end effector [97]. Figure 11 shows a Clemson-developed cotton harvesting prototype that uses a two-DOF Cartesian manipulator, and Figure 9 and Figure 10 present gRoboMac prototypes that use three-DOF and four-DOF manipulators, respectively. All reported cotton harvesting manipulators use astrictive or attractive grippers, since cotton lint does not require the careful handling other fruits do [97].
In 2019, a team in India designed a rigid vacuum-cleaner-style machine as an alternative cotton harvesting end effector (Figure 9 and Figure 10). The gRoboMac team did not report execution time, which is a very important parameter for effective cotton harvesting; [97] obtained a preliminary execution time of 17 s per boll. The two groups ([97] and gRoboMac) reported two-DOF and four-DOF manipulators, respectively. Simpler manipulators have lower execution times [81]: [81] reported that an eight-DOF apple harvesting robot was 10,000 times slower than a five-DOF manipulator robot, although the eight-DOF robot was flexible enough to reach most of the fruit and hence provided high success rates. A Clemson University team (Figure 11) proposed a similar approach but with a small rover riding between the rows. [97] reported a vacuum end effector with rotating tines to remove bolls (Figure 6), a tool widely used by human pickers in China and other developing countries, which [97] modified for robotic use.
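As a structural illustration of a two-DOF Cartesian pick cycle with a vacuum end effector, the sketch below strings the steps together; the Axes and Vacuum classes are stubs invented for illustration, not the hardware interface of [97] or gRoboMac.

```python
class Axes:      # stub Cartesian stage; replace with real motor drivers
    def move_to(self, x, y):
        print(f"move to ({x:.2f}, {y:.2f}) m")

class Vacuum:    # stub suction end effector
    def on(self):
        print("vacuum on")

    def off(self):
        print("vacuum off")

def pick_boll(axes, vacuum, boll_xy, dump_xy=(0.0, 0.0)):
    """One pick cycle: position the nozzle, grip by suction, retract, release."""
    axes.move_to(*boll_xy)    # place nozzle at the detected boll (x, y)
    vacuum.on()               # astrictive grip: suction captures the lint
    axes.move_to(*dump_xy)    # carry the boll to the collection point
    vacuum.off()              # release into the hopper

pick_boll(Axes(), Vacuum(), boll_xy=(0.35, 0.12))
```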

6. Challenges in Commercial Deployment of Agricultural Robots

The initial investment in row-crop robotic systems may be very large for an average farmer [10]: as much as USD 319,864 is required for an 850 ha farm to reach the maximum break-even point with intelligent machines [104]. Fortunately, [104,105] concluded that farming robots would be a profitable business for farmers, because robots can reduce scouting costs by 20% for cereals, weeding costs by 12% for sugar beet, and inter-row weeding costs by 24%. Small robots can also work as a swarm to accomplish farm operations at a very competitive cost compared to current machines [106]. A review of studies in databases such as GreenFILE, Business Source Complete, AgEcon Search, Food Science Source, Emerald, CAB Abstracts, and ScienceDirect found that non-horticultural crops like maize, soybean, barley, potato, wheat, and cotton have not been given priority in economic studies of robotic systems [10]. Fortunately, many of the challenges in agricultural robotics cut across different farming operations and crops.
There are five commercial parameters, at least one of which should be unlocked for agricultural robotics to succeed [11]. First, the cost of the new robot should be lower than that of current methods. Second, the introduction of robots should increase the capability, productivity, quality, and profitability of production. Third, robots should increase uniformity and quality in farm production while decreasing variability and uncertainty. Fourth, robots may enable farm management decisions that are faster than, or impossible with, current methods. Lastly, robotics should remove human beings from environmentally risky tasks, particularly those involving heavy machines and chemicals, reducing labor and labor-insurance costs. Other factors can be indirectly important for farmers, such as ease of use and maintenance compared to current methods and reduced soil compaction [11].
The design of the manipulators may also be a great challenge in the agricultural field, and a single-arm design may not be effective for large farms. The central challenge for agricultural robots with more than three DOF has been “sensing and moving” at rapid harvesting rates [102]. On-the-go field harvesting is difficult because the robotic arm moves the branches of the target, so camera feedback is necessary to determine the latest position of the target before the manipulator harvests it [102]. It was therefore concluded that research and development of commercial harvesting systems should concentrate on increasing the speed and accuracy of robots in harsh and varying environments.
The cotton harvesting robot our team is developing provides an MIAR prototype for cotton production. To our knowledge, no research has been conducted to develop robotic systems for other cotton operations, as seen in Table 3; an MIAR that could handle multiple farming tasks like sowing, spraying, weeding, scouting, and soil sampling would be useful. Cotton Incorporated has committed to funding robotic systems research in cotton and emphasizes the adoption of open-source robotics. Open-source systems have the advantages of open collaboration, multiple partners, and continuous updating; the Robot Operating System (ROS) is a good example of open-source adoption with continuous improvements and additions from its community of users [107]. Open source thus creates a harmonized, cost-effective environment for researchers that can speed up development efforts, and it encourages reuse of core libraries in robot development, reducing costs [107] and enabling more cost-effective commercialization of robotic platforms. The robotics industry is now a profitable industry to engage in: in 2019, the IDTechEx research company analyzed robotic market and technology growth and predicted that the agricultural robotics industry would be worth $12 billion worldwide by 2027 [108]. Economic models also show that net returns can increase by up to 22% compared to the current practice of using conventional machines in row-crop production [104].

7. Conclusion and Future Work

In this paper, we performed a literature review on robotics in agriculture. We examined the relationships and similarities among robotic systems in agriculture that can accelerate the development of cotton harvesting robots, along with aspects of mobility, sensing, path planning, and manipulator design. Our aim was to highlight the recent opportunities and challenges of agricultural robotic systems and the promising future of cotton harvesting robots.
Sensor development for machine vision is advancing quickly, and commercial products that support sensing have been realized. Despite this technological advancement, algorithms that allow smooth interpretation of visual sensing remain a challenge in agricultural fields [109]. Sensor sensitivity, aperture, and resolution are improving, and present deep learning technologies have surpassed human-eye accuracy in object classification and identification [110]. Machine learning, especially deep learning, has brought high accuracy to the identification of weeds and plant cultivars, fruit counting and estimation, land cover classification, and crop and fruit type classification [109,110,111]. However, most navigation and motion planning algorithms for row crops do not yet provide the fully autonomous capability available for tree crops [109,111]. Cotton needs color sensors to differentiate open bolls from semi-open bolls and flowers during harvesting.
Mobility in a cotton field may use four-wheel-drive systems to increase harvesting speed, as reported by some researchers, because cotton fields are large and require fast navigation over long distances. Rail-mounted robots cannot be used, since cotton is produced on outdoor farms, although they have shown good adoption in greenhouses. Path planning is needed in four-wheel-drive systems because the robot must pass over the rows carefully so as not to break branches or knock cotton bolls to the ground.
Manipulators have shown good performance when fewer degrees of freedom are used; more degrees of freedom are required only for the careful handling of fruits. This is not the case for cotton, for which the fruit is the lint: astrictive or attractive grippers can be used without destroying the lint. This is the main reason most cotton harvesting research has focused on two-DOF, three-DOF, and four-DOF manipulators.
Future designs of cotton harvesting robots need effective manipulators and sensing that can locate and pick cotton bolls at the bottom of the canopy. Designs that involve multiple manipulators will provide harvesting fast enough to match current machines, and manipulators with fewer degrees of freedom will provide the fast picking, one boll every 3 s, that is critical. Future design and development research should also include alternative energy sources to decrease energy costs. Studies of power requirements, footprint, and cost are necessary for robots intended for multipurpose functions and collaborative “swarms.”

Author Contributions

Conceptualization, K.G.F., and G.C.R.; Methodology, K.G.F., and G.C.R.; Writing—Original Draft Preparation, K.G.F.; Writing—Review and Editing, K.G.F., W.M.P., E.M.B. and G.C.R.; Project Administration, W.M.P., E.M.B., and G.C.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Cotton Incorporated, Agency Assignment #17-038.

Acknowledgments

The authors would like to thank the following: Ricky Fletcher for mechanical work, Logan Moran for field testing, William Hill and Tearston Adams for labeling images, and Jesse Austin Stringfellow for 3D design and printing.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. USDA/NASS. 2017 State Agriculture Overview for Georgia; USDA/NASS: Washington, DC, USA, 2018.
  2. Fue, K.G.; Porter, W.M.; Rains, G.C. Real-Time 3D Measurement of Cotton Boll Positions Using Machine Vision Under Field Conditions. In Proceedings of the 2018 Beltwide Cotton Conferences, NCC, San Antonio, TX, USA, 3–5 January 2018; pp. 43–54. [Google Scholar]
  3. UGA. Georgia Cotton Production Guide. In Ugacotton.Org; UGA Extension Team: Tifton, GA, USA, 2019. [Google Scholar]
  4. Antille, D.L.; Bennett, J.M.; Jensen, T.A. Soil compaction and controlled traffic considerations in Australian cotton-farming systems. Crop Pasture Sci. 2016, 67, 1–28. [Google Scholar] [CrossRef]
  5. Boman, R. Estimating Cotton Yield Using Boll Counting. In Cotton.Okstate.Edu; OSU Southwest Research and Extension Center: Altus, OK, USA, 2012. [Google Scholar]
  6. Prostko, E.; Lemon, R.; Cothren, T. Field Estimation of Cotton Yields. In The Texas A&M University System; College Station: Brazos, TX, USA, 2018; Available online: http://publications.tamu.edu/ (accessed on 8 August 2019).
  7. Fue, K.G.; Porter, W.M.; Barnes, E.M.; Rains, G.C. Visual Row Detection Using Pixel-Based Algorithm and Stereo Camera for Cotton Picking Robot. In Proceedings of the 2018 Beltwide Cotton Conferences, NCC, New Orleans, LA, USA, 8–10 January 2019. [Google Scholar]
  8. Bac, C.W.; van Henten, E.J.; Hemming, J.; Edan, Y. Harvesting robots for high-value crops: State-of-the-art review and challenges ahead. J. Field Robot. 2014, 31, 888–911. [Google Scholar] [CrossRef]
  9. Roldán, J.J.; del Cerro, J.; Garzón-Ramos, D.; Garcia-Aunon, P.; Garzón, M.; de León, J.; Barrientos, A. Robots in agriculture: State of art and practical experiences. Serv. Robot. 2018. [Google Scholar] [CrossRef] [Green Version]
  10. Lowenberg-DeBoer, J.; Huang, I.Y.; Grigoriadis, V.; Blackmore, S. Economics of robots and automation in field crop production. Precis. Agric. 2019. [Google Scholar] [CrossRef] [Green Version]
  11. Bechar, A.; Vigneault, C. Agricultural robots for field operations: Concepts and components. Biosyst. Eng. 2016, 149, 94–111. [Google Scholar] [CrossRef]
  12. Bechar, A.; Vigneault, C. Agricultural robots for field operations. Part 2: Operations and systems. Biosyst. Eng. 2017, 153, 110–128. [Google Scholar] [CrossRef]
  13. Powers, D.M.W. Evaluation: From precision, recall and F-measure to ROC, informedness, markedness & correlation. J. Mach. Learn. Technol. 2011, 2, 37–63. [Google Scholar]
  14. Burud, I.; Lange, G.; Lillemo, M.; Bleken, E.; Grimstad, L.; From, P.J. Exploring robots and UAVs as phenotyping tools in plant breeding. IFAC Pap. OnLine 2017, 50, 11479–11484. [Google Scholar] [CrossRef]
  15. Iida, M.; Kang, D.; Taniwaki, M.; Tanaka, M.; Umeda, M. Localization of CO2 source by a hexapod robot equipped with an anemoscope and a gas sensor. Comput. Electron. Agric. 2008, 63, 73–80. [Google Scholar] [CrossRef]
  16. Reiser, D.; Sehsah, E.-S.; Bumann, O.; Morhard, J.; Griepentrog, H.W. Development of an Autonomous Electric Robot Implement for Intra-Row Weeding in Vineyards. Agriculture 2019, 9, 18. [Google Scholar] [CrossRef] [Green Version]
  17. Auat Cheein, F.; Steiner, G.; Paina, G.P.; Carelli, R. Optimized EIF-SLAM algorithm for precision agriculture mapping based on stems detection. Comput. Electron. Agric. 2011, 78, 195–207. [Google Scholar] [CrossRef]
  18. Ouadah, N.; Ourak, L.; Boudjema, F. Car-Like Mobile Robot Oriented Positioning by Fuzzy Controllers. Int. J. Adv. Robot. Syst. 2008, 5, 25. [Google Scholar] [CrossRef]
  19. Cheein, F.A.A.; Carelli, R.; Cruz, C.D.l.; Bastos-Filho, T.F. SLAM-based turning strategy in restricted environments for car-like mobile robots. In Proceedings of the 2010 IEEE International Conference on Industrial Technology IEEE, Vina del Mar, Chile, 14–17 March 2010. [Google Scholar]
  20. Xue, J.; Zhang, L.; Grift, T.E. Variable field-of-view machine vision based row guidance of an agricultural robot. Comput. Electron. Agric. 2012, 84, 85–91. [Google Scholar] [CrossRef]
  21. Farzan, S.; Hu, A.-P.; Davies, E.; Rogers, J. Modeling and control of brachiating robots traversing flexible cables. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), IEEE, Brisbane, QLD, Australia, 21–25 May 2018. [Google Scholar]
  22. Davies, E.; Garlow, A.; Farzan, S.; Rogers, J.; Hu, A.-P. Tarzan: Design, Prototyping, and Testing of a Wire-Borne Brachiating Robot. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) IEEE, Madrid, Spain, 1–5 October 2018. [Google Scholar]
  23. Grimstad, L.; From, P.J. The Thorvald II agricultural robotic system. Robotics 2017, 6, 24. [Google Scholar] [CrossRef] [Green Version]
  24. Rahaman, M.M.; Chen, D.; Gillani, Z.; Klukas, C.; Chen, M. Advanced phenotyping and phenotype data analysis for the study of plant growth and development. Front. Plant Sci. 2015, 6, 619. [Google Scholar] [CrossRef] [Green Version]
  25. Sun, S.; Li, C.; Paterson, H.A. In-Field High-Throughput Phenotyping of Cotton Plant Height Using LiDAR. Remote Sens. 2017, 9, 377. [Google Scholar] [CrossRef] [Green Version]
  26. Sankaran, S.; Khot, L.R.; Espinoza, C.Z.; Jarolmasjed, S.; Sathuvalli, V.R.; Vandemark, G.J.; Miklas, P.N.; Carter, A.H.; Pumphrey, M.O.; Knowles, N.R.; et al. Low-altitude, high-resolution aerial imaging systems for row and field crop phenotyping: A review. Eur. J. Agron. 2015, 70, 112–123. [Google Scholar] [CrossRef]
  27. Safren, O.; Alchanatis, V.; Ostrovsky, V.; Levi, O. Detection of Green Apples in Hyperspectral Images of Apple-Tree Foliage Using Machine Vision. Trans. ASABE 2007, 50, 2303–2313. [Google Scholar] [CrossRef]
  28. Cubero, S.; Aleixos, N.; Moltó, E.; Gómez-Sanchis, J.; Blasco, J. Advances in Machine Vision Applications for Automatic Inspection and Quality Evaluation of Fruits and Vegetables. Food Bioprocess Technol. 2011, 4, 487–504. [Google Scholar] [CrossRef]
  29. Deery, D.; Jimenez-Berni, J.; Jones, H.; Sirault, X.; Furbank, R. Proximal Remote Sensing Buggies and Potential Applications for Field-Based Phenotyping. Agronomy 2014, 4, 349–379. [Google Scholar] [CrossRef] [Green Version]
  30. Dong, J.; Burnham, J.G.; Boots, B.; Rains, G.; Dellaert, F. 4D crop monitoring: Spatio-temporal reconstruction for agriculture. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017. [Google Scholar]
  31. Bak, T.; Jakobsen, H. Agricultural robotic platform with four wheel steering for weed detection. Biosyst. Eng. 2004, 87, 125–136. [Google Scholar] [CrossRef]
  32. Mousazadeh, H. A technical review on navigation systems of agricultural autonomous off-road vehicles. J. Terramechanics 2013, 50, 211–232. [Google Scholar] [CrossRef]
  33. Kim, G.; Kim, S.; Hong, Y.; Han, K.; Lee, S. A robot platform for unmanned weeding in a paddy field using sensor fusion. In Proceedings of the 2012 IEEE International Conference on Automation Science and Engineering (CASE) IEEE, Seoul, Korea, 20–24 August 2012. [Google Scholar]
  34. Sengupta, S.; Lee, W.S. Identification and determination of the number of immature green citrus fruit in a canopy under different ambient light conditions. Biosyst. Eng. 2014, 117, 51–61. [Google Scholar] [CrossRef]
  35. Wang, C.; Lee, W.S.; Zou, X.; Choi, D.; Gan, H.; Diamond, J. Detection and counting of immature green citrus fruit based on the Local Binary Patterns (LBP) feature using illumination-normalized images. Precis. Agric. 2018, 19, 1062–1083. [Google Scholar] [CrossRef]
  36. Moghimi, A.; Aghkhani, M.H.; Golzarian, M.R.; Rohani, A.; Yang, C. A Robo-vision Algorithm for Automatic Harvesting of Green Bell Pepper. In Proceedings of the 2015 ASABE Annual International Meeting ASABE, St. Joseph, MI, USA, 26–29 July 2015; p. 1. [Google Scholar]
  37. Qureshi, W.; Payne, A.; Walsh, K.; Linker, R.; Cohen, O.; Dailey, M. Machine vision for counting fruit on mango tree canopies. Precis. Agric. 2017, 18, 224–244. [Google Scholar] [CrossRef]
  38. Choi, D.; Lee, W.S.; Schueller, J.K.; Ehsani, R.; Roka, F.; Diamond, J. A performance comparison of RGB, NIR, and depth images in immature citrus detection using deep learning algorithms for yield prediction. In Proceedings of the 2017 ASABE Annual International Meeting ASABE, St. Joseph, MI, USA, 16–19 July 2017; p. 1. [Google Scholar]
  39. Hannan, M.W.; Burks, T.F.; Bulanon, D.M. A Real-time Machine Vision Algorithm for Robotic Citrus Harvesting. In Proceedings of the 2007 ASAE Annual Meeting, ASABE, St. Joseph, MI, USA, 17–20 June 2007. [Google Scholar]
  40. Tao, Y.; Heinemann, P.H.; Varghese, Z.; Morrow, C.T.; Sommer, H.J., III. Machine Vision for Color Inspection of Potatoes and Apples. Trans. ASAE 1995, 38, 1555–1561. [Google Scholar] [CrossRef]
  41. Chang, Y.; Zaman, Q.; Schumann, A.; Percival, D.; Esau, T.; Ayalew, G. Development of color co-occurrence matrix based machine vision algorithms for wild blueberry fields. Appl. Eng. Agric. 2012, 28, 315–323. [Google Scholar] [CrossRef]
  42. Bergerman, M.; Billingsley, J.; Reid, J.; van Henten, E. Robotics in Agriculture and Forestry. In Springer Handbook of Robotics; Siciliano, B., Khatib, O., Eds.; Springer: Cham, Switzerland, 2016; pp. 1065–1077. [Google Scholar]
  43. Wang, N.; Zhang, N.; Wang, M. Wireless sensors in agriculture and food industry—Recent development and future perspective. Comput. Electron. Agric. 2006, 50, 1–14. [Google Scholar] [CrossRef]
  44. Jaulin, L. Path planning using intervals and graphs. Reliab. Comput. 2001, 7, 1–159. [Google Scholar] [CrossRef]
  45. Jensen, M.A.F.; Bochtis, D.; Sørensen, C.G.; Blas, M.R.; Lykkegaard, K.L.J.C.; Engineering, I. In-field and inter-field path planning for agricultural transport units. Comput. Ind. Eng. 2012, 63, 1054–1061. [Google Scholar] [CrossRef]
  46. Grötschel, M.; Lovász, L.; Schrijver, A. Geometric Algorithms and Combinatorial Optimization; Springer Science & Business Media: Berlin, Germany, 2012; Volume 2. [Google Scholar]
  47. Zeng, J.; Ju, R.; Qin, L.; Hu, Y.; Yin, Q.; Hu, C. Navigation in Unknown Dynamic Environments Based on Deep Reinforcement Learning. Sensors 2019, 19, 3837. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  48. Qixin, C.; Yanwen, H.; Jingliang, Z. An evolutionary artificial potential field algorithm for dynamic path planning of mobile robot. In Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, IEEE, Beijing, China, 9–15 October 2006. [Google Scholar]
  49. Shvalb, N.; Moshe, B.B.; Medina, O. A real-time motion planning algorithm for a hyper-redundant set of mechanisms. Robotica 2013, 31, 1327–1335. [Google Scholar] [CrossRef] [Green Version]
  50. Subramanian, V.; Burks, T.F.; Arroyo, A.A. Development of machine vision and laser radar based autonomous vehicle guidance systems for citrus grove navigation. Comput. Electron. Agric. 2006, 53, 130–143. [Google Scholar] [CrossRef]
  51. ASABE. Coming soon to an orchard near you: The Global Unmanned Spray System (GUSS). In Resource Magazine; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2019; pp. 9–10. [Google Scholar]
  52. Cao, P.M.; Hall, E.L.; Zhang, E. Soil sampling sensor system on a mobile robot. In Intelligent Robots and Computer Vision XXI: Algorithms, Techniques, and Active Vision; International Society for Optics and Photonics: Providence, RI, USA, 2003. [Google Scholar]
  53. Monta, M.; Kondo, N.; Ting, K.C. End-Effectors for Tomato Harvesting Robot. In Artificial Intelligence for Biology and Agriculture; Panigrahi, S., Ting, K.C., Eds.; Springer: Dordrecht, The Netherlands, 1998; pp. 1–25. [Google Scholar]
  54. Tai, K.; El-Sayed, A.-R.; Shahriari, M.; Biglarbegian, M.; Mahmud, S. State of the Art Robotic Grippers and Applications. Robotics 2016, 5, 11. [Google Scholar] [CrossRef] [Green Version]
  55. Monkman, G.J. Robot Grippers for Use with Fibrous Materials. Int. J. Robot. Res. 1995, 14, 144–151. [Google Scholar] [CrossRef]
  56. Rodríguez, F.; Moreno, J.C.; Sánchez, J.A.; Berenguel, M. Grasping in Agriculture: State-of-the-Art and Main Characteristics. In Grasping in Robotics; Carbone, G., Ed.; Springer: London, UK, 2013; pp. 385–409. [Google Scholar]
  57. Paul, R.P. Robot Manipulators: Mathematics, Programming, and Control: The Computer Control of Robot Manipulators; The MIT Press: Cambridge, MA, USA; London, UK, 1981. [Google Scholar]
  58. Cho, S.I.; Chang, S.J.; Kim, Y.Y.; An, K.J. AE—Automation and Emerging Technologies: Development of a Three-degrees-of-freedom Robot for Harvesting Lettuce Using Machine Vision and Fuzzy Logic Control. Biosyst. Eng. 2002, 82, 143–149. [Google Scholar] [CrossRef]
  59. Kondo, N.; Ting, K.C. Robotics for Plant Production. Artif. Intell. Rev. 1998, 12, 227–243. [Google Scholar] [CrossRef]
  60. Bakker, T.; Bontsema, J.; Müller, J. Systematic design of an autonomous platform for robotic weeding. J. Terramechanics 2010, 47, 63–73. [Google Scholar] [CrossRef]
  61. Bakker, T.; van Asselt, K.; Bontsema, J.; Müller, J.; van Straten, G. An Autonomous Weeding Robot for Organic Farming. In Field and Service Robotics; Springer: Berlin/Heidelberg, Germany, 2006. [Google Scholar]
  62. Haruhisa, K.; Suguru, M.; Hideki, K.; Satoshi, U. Novel Climbing Method of Pruning Robot. In Proceedings of the 2008 SICE Annual Conference, IEEE, Tokyo, Japan, 20–22 August 2008. [Google Scholar]
  63. Devang, P.S.; Gokul, N.A.; Ranjana, M.; Swaminathan, S.; Binoy, B.N. Autonomous arecanut tree climbing and pruning robot. In Proceedings of the 2010 International Conference on Emerging Trends in Robotics and Communication Technologies, WikiCFP, Chennai, India, 3–5 December 2010. [Google Scholar]
  64. Botterill, T.; Paulin, S.; Green, R.; Williams, S.; Lin, J.; Saxton, V.; Mills, S.; Chen, X.; Corbett-Davies, S. A Robot System for Pruning Grape Vines. J. Field Robot. 2017, 34, 1100–1122. [Google Scholar] [CrossRef]
  65. Ueki, S.; Kawasaki, H.; Ishigure, Y.; Koganemaru, K.; Mori, Y. Development and experimental study of a novel pruning robot. Artif. Life Robot. 2011, 16, 86–89. [Google Scholar] [CrossRef]
  66. Fentanes, J.P.; Gould, I.; Duckett, T.; Pearson, S.; Cielniak, G. 3-D Soil Compaction Mapping Through Kriging-Based Exploration With a Mobile Robot. IEEE Robot. Autom. Lett. 2018, 3, 3066–3072. [Google Scholar] [CrossRef]
  67. Scholz, C.; Moeller, K.; Ruckelshausen, A.; Hinck, S.; Goettinger, M. Automatic soil penetrometer measurements and GIS based documentation with the autonomous field robot platform bonirob. In Proceedings of the 12th International Conference of Precision Agriculture, Sacramento, CA, USA, 20–23 July 2014. [Google Scholar]
  68. Kicherer, A.; Herzog, K.; Pflanz, M.; Wieland, M.; Rüger, P.; Kecke, S.; Kuhlmann, H.; Töpfer, R. An Automated Field Phenotyping Pipeline for Application in Grapevine Research. Sensors 2015, 15, 4823–4836. [Google Scholar] [CrossRef] [PubMed]
  69. Salas Fernandez, M.G.; Bao, Y.; Tang, L.; Schnable, P.S. A High-Throughput, Field-Based Phenotyping Technology for Tall Biomass Crops. Plant Physiol. 2017, 174, 2008–2022. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  70. Obregón, D.; Arnau, R.; Campo-Cossio, M.; Arroyo-Parras, J.G.; Pattinson, M.; Tiwari, S.; Lluvia, I.; Rey, O.; Verschoore, J.; Lenza, L.; et al. Precise Positioning and Heading for Autonomous Scouting Robots in a Harsh Environment. In From Bioinspired Systems and Biomedical Applications to Machine Learning; Springer International Publishing: Cham, Switzerland, 2019. [Google Scholar]
  71. Young, S.N.; Kayacan, E.; Peschel, J.M. Design and field evaluation of a ground robot for high-throughput phenotyping of energy sorghum. Precis. Agric. 2018, 20, 697–722. [Google Scholar] [CrossRef] [Green Version]
  72. Sammons, P.J.; Furukawa, T.; Bulgin, A. Autonomous pesticide spraying robot for use in a greenhouse. In Proceedings of the 2005 Australasian Conference on Robotics and Automation, ARAA, Sydney, Australia, 5–7 December 2005. [Google Scholar]
  73. Sharma, S.; Borse, R. Automatic Agriculture Spraying Robot with Smart Decision Making. In Intelligent Systems Technologies and Applications 2016; Springer International Publishing: Cham, Switzerland, 2016. [Google Scholar]
  74. Nakao, N.; Suzuki, H.; Kitajima, T.; Kuwahara, A.; Yasuno, T. Path planning and traveling control for pesticide-spraying robot in greenhouse. J. Signal Process. 2017, 21, 175–178. [Google Scholar]
  75. Cantelli, L.; Bonaccorso, F.; Longo, D.; Melita, C.D.; Schillaci, G.; Muscato, G. A Small Versatile Electrical Robot for Autonomous Spraying in Agriculture. AgriEngineering 2019, 1, 391–402. [Google Scholar] [CrossRef] [Green Version]
  76. Haibo, L.; Shuliang, D.; Zunmin, L.; Chuijie, Y. Study and Experiment on a Wheat Precision Seeding Robot. J. Robot. 2015, 2015, 1–9. [Google Scholar] [CrossRef]
  77. Srinivasan, N.; Prabhu, P.; Smruthi, S.S.; Sivaraman, N.V.; Gladwin, S.J.; Rajavel, R.; Natarajan, A.R. Design of an autonomous seed planting robot. In Proceedings of the 2016 IEEE Region 10 Humanitarian Technology Conference (R10-HTC), Agra, India, 21–23 December 2016. [Google Scholar]
  78. Bac, C.W.; Hemming, J.; van Tuijl, B.A.J.; Barth, R.; Wais, E.; van Henten, E.J. Performance Evaluation of a Harvesting Robot for Sweet Pepper. J. Field Robot. 2017, 34, 1123–1139. [Google Scholar] [CrossRef]
  79. Feng, Q.; Zou, W.; Fan, P.; Zhang, C.; Wang, X. Design and test of robotic harvesting system for cherry tomato. Int. J. Agric. Biol. Eng. 2018, 11, 96–100. [Google Scholar] [CrossRef]
  80. Kohan, A.; Borghaee, A.M.; Yazdi, M.; Minaei, S.; Sheykhdavudi, M.J. Robotic harvesting of Rosa damascena using stereoscopic machine vision. World Appl. Sci. J. 2011, 12, 231–237. [Google Scholar]
  81. Hohimer, C.J.; Wang, H.; Bhusal, S.; Miller, J.; Mo, C.; Karkee, M. Design and Field Evaluation of a Robotic Apple Harvesting System with a 3D-Printed Soft-Robotic End-Effector. Trans. ASABE 2019, 62, 405–414. [Google Scholar] [CrossRef]
  82. Faverjon, B.; Tournassoud, P. A local based approach for path planning of manipulators with a high number of degrees of freedom. In Proceedings of the 1987 IEEE International Conference on Robotics and Automation, Raleigh, NC, USA, 31 March–3 April 1987. [Google Scholar]
  83. Zion, B.; Mann, M.; Levin, D.; Shilo, A.; Rubinstein, D.; Shmulevich, I. Harvest-order planning for a multiarm robotic harvester. Comput. Electron. Agric. 2014, 103, 75–81. [Google Scholar] [CrossRef]
  84. Lili, W.; Bo, Z.; Jinwei, F.; Xiaoan, H.; Shu, W.; Yashuo, L.; Zhou, Q.; Chongfeng, W. Development of a tomato harvesting robot used in greenhouse. Int. J. Agric. Biol. Eng. 2017, 10, 140–149. [Google Scholar] [CrossRef]
  85. Noguchi, N.; Terao, H. Path planning of an agricultural mobile robot by neural network and genetic algorithm. Comput. Electron. Agric. 1997, 18, 187–204. [Google Scholar] [CrossRef]
  86. Zuo, G.; Zhang, P.; Qiao, J. Path planning algorithm based on sub-region for agricultural robot. In Proceedings of the 2nd International Asia Conference on Informatics in Control, Automation and Robotics, Wuhan, China, 6–7 March 2010. [Google Scholar]
  87. Hayashi, S.; Yamamoto, S.; Saito, S.; Ochiai, Y.; Kamata, J.; Kurita, M.; Yamamoto, K. Field operation of a movable strawberry-harvesting robot using a travel platform. Jpn. Agric. Res. Q. JARQ 2014, 48, 307–316. [Google Scholar] [CrossRef] [Green Version]
  88. Xiong, Y.; Peng, C.; Grimstad, L.; From, P.J.; Isler, V. Development and field evaluation of a strawberry harvesting robot with a cable-driven gripper. Comput. Electron. Agric. 2019, 157, 392–402. [Google Scholar] [CrossRef]
  89. Yaguchi, H.; Nagahama, K.; Hasegawa, T.; Inaba, M. Development of an autonomous tomato harvesting robot with rotational plucking gripper. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016. [Google Scholar]
  90. Silwal, A.; Davidson, J.R.; Karkee, M.; Mo, C.; Zhang, Q.; Lewis, K. Design, integration, and field evaluation of a robotic apple harvester. J. Field Robot. 2017, 34, 1140–1159. [Google Scholar] [CrossRef]
  91. Mu, L.; Liu, Y.; Cui, Y.; Liu, H.; Chen, L.; Fu, L.; Gejima, Y. Design of End-effector for Kiwifruit Harvesting Robot Experiment. In Proceedings of the 2017 ASABE Annual International Meeting, ASABE, St. Joseph, MI, USA, 16–19 July 2017; p. 1. [Google Scholar]
  92. Feng, Q.; Wang, X.; Wang, G.; Li, Z. Design and test of tomatoes harvesting robot. In Proceedings of the 2015 IEEE International Conference on Information and Automation, Lijiang, China, 8–10 August 2015. [Google Scholar]
  93. Chen, W.; Xu, T.; Liu, J.; Wang, M.; Zhao, D. Picking Robot Visual Servo Control Based on Modified Fuzzy Neural Network Sliding Mode Algorithms. Electronics 2019, 8, 605. [Google Scholar] [CrossRef] [Green Version]
  94. Zhao, Y.; Gong, L.; Liu, C.; Huang, Y. Dual-arm robot design and testing for harvesting tomato in greenhouse. IFAC-PapersOnLine 2016, 49, 161–165. [Google Scholar]
  95. Ritchie, G.L.; Bednarz, C.W.; Jost, P.H.; Brown, S.M. Cotton Growth and Development; University of Georgia: Athens, GA, USA, 2007. [Google Scholar]
  96. Rains, G.C.; Faircloth, A.G.; Thai, C.; Raper, R.L. Evaluation of a simple pure pursuit path-following algorithm for an autonomous, articulated-steer vehicle. Appl. Eng. Agric. 2014, 30, 367–374. [Google Scholar]
  97. Fue, K.G.; Porter, W.M.; Barnes, E.M.; Rains, G.C. Visual Inverse Kinematics for Cotton Picking Robot. In Proceedings of the 2019 Beltwide Cotton Conferences, NCC, New Orleans, LA, USA, 8–10 January 2019; pp. 730–742. [Google Scholar]
  98. Mulan, W.; Jieding, W.; Jianning, Y.; Kaiyun, X. A research for intelligent cotton picking robot based on machine vision. In Proceedings of the 2008 International Conference on Information and Automation IEEE, Changsha, China, 20–23 June 2008. [Google Scholar]
  99. Wang, Y.; Zhu, X.; Ji, C. Machine Vision Based Cotton Recognition for Cotton Harvesting Robot. In Computer and Computing Technologies in Agriculture; Springer: Boston, MA, USA, 2008; Volume II. [Google Scholar]
  100. Li, Y.; Cao, Z.; Lu, H.; Xiao, Y.; Zhu, Y.; Cremers, A.B. In-field cotton detection via region-based semantic image segmentation. Comput. Electron. Agric. 2016, 127, 475–486. [Google Scholar] [CrossRef]
  101. Fue, K.G.; Porter, W.M.; Rains, G.C. Deep Learning based Real-time GPU-accelerated Tracking and Counting of Cotton Bolls under Field Conditions using a Moving Camera. In Proceedings of the 2018 ASABE Annual International Meeting, ASABE, St. Joseph, MI, USA, 29 July–1 August 2018; p. 1. [Google Scholar]
  102. Ramin Shamshiri, R.; Weltzien, C.; Hameed, I.A.; Yule, I.J.; Grift, T.E.; Balasundram, S.K.; Pitonakova, L.; Ahmad, D.; Chowdhary, G. Research and development in agricultural robotics: A perspective of digital farming. Int. J. Agric. Biol. Eng. 2018, 11, 1–11. [Google Scholar] [CrossRef]
  103. Higuti, V.A.H.; Velasquez, A.E.B.; Magalhaes, D.V.; Becker, M.; Chowdhary, G. Under canopy light detection and ranging-based autonomous navigation. J. Field Robot. 2019, 36, 547–567. [Google Scholar] [CrossRef]
  104. Shockley, J.M.; Dillon, C.R. An economic feasibility assessment for adoption of autonomous field machinery in row crop production. In Proceedings of the 2018 International Conference on Precision Agriculture ICPA, Montreal, QC, Canada, 24–27 June 2018. [Google Scholar]
  105. Pedersen, S.M.; Fountas, S.; Blackmore, S. Agricultural robots—Applications and economic perspectives. In Service Robot Applications; IntechOpen: London, UK, 2008. [Google Scholar]
  106. Gaus, C.C.; Urso, L.-M.; Minßen, T.-F.; de Witte, T. Economics of mechanical weeding by a swarm of small field robots. In Proceedings of the 57th Annual Conference of German Association of Agricultural Economists (GEWISOLA), Weihenstephan, Germany, 13–15 September 2017. [Google Scholar]
  107. Koubâa, A. (Ed.) Robot Operating System (ROS), 1st ed.; Studies in Computational Intelligence; Springer: Berlin, Germany, 2017; p. 655. [Google Scholar]
  108. Ghaffarzadeh, K. Agricultural Robots and Drones 2018–2038: Technologies, Markets and Players; IDTechEx Research: Berlin, Germany, 2019. [Google Scholar]
  109. Kamilaris, A.; Prenafeta-Boldú, F. Deep learning in agriculture: A survey. Comput. Electron. Agric. 2018, 147, 70–90. [Google Scholar] [CrossRef] [Green Version]
  110. Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015. [Google Scholar]
  111. Liakos, G.K.; Busato, P.; Moshou, D.; Pearson, S.; Bochtis, D. Machine Learning in Agriculture: A Review. Sensors 2018, 18, 2674. [Google Scholar] [CrossRef] [PubMed] [Green Version]
Figure 1. Framework organization of this paper.
Figure 2. Some agricultural robots with different arrangements of components: (a) AgAnt (source: cleantechnica.com); (b) Fraunhofer Institute for Production Systems and Design Technology (IPK) dual-arm robot (source: agromarketing.mx); (c) Tarzan swing robot [21,22]; (d) weeding robot [16]; (e) Thorvald II agricultural robotic system modules [23]; (f) Fuji industry robot (source: fuji.co.uk); (g) RAL Space Agribot with robot arm weeding raspberries (source: autonomous.systems.stfc.ac.uk); (i) SwagBot, an omnidirectional electric ground vehicle (source: confluence.acfr.usyd.edu.au).
Figure 3. Tracking the robot using wheel odometry and the camera's Inertial Measurement Unit (IMU). The autonomous rover can be tracked while working on the farm by using visual SLAM and GPS: (a) the rover starts to navigate; (b) the rover is about to finish the field; (c) the rover generates the returning path from its navigation history; (d) blue is the predicted return path, while red is the path taken by the rover.
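The "history navigation" idea in Figure 3 reduces to logging fused pose estimates on the way out and replaying them in reverse on the way back. Below is a minimal Python sketch of that idea only; the function names and the waypoint-spacing threshold are illustrative assumptions, not the authors' implementation.

```python
# Sketch of return-path generation from logged waypoints (illustrative only):
# log a fused GPS/visual-SLAM pose whenever the rover has moved far enough,
# then reverse the log to obtain the predicted way back.

def log_waypoint(history, x, y, min_spacing=0.5):
    """Append an (x, y) pose if it is at least min_spacing from the last one."""
    if not history or (x - history[-1][0]) ** 2 + (y - history[-1][1]) ** 2 >= min_spacing ** 2:
        history.append((x, y))

def return_path(history):
    """The predicted return path is the logged outbound path, reversed."""
    return list(reversed(history))

history = []
for pose in [(0.0, 0.0), (0.2, 0.1), (1.0, 0.0), (2.0, 0.1), (3.0, 0.0)]:
    log_waypoint(history, *pose)
print(return_path(history))  # [(3.0, 0.0), (2.0, 0.1), (1.0, 0.0), (0.0, 0.0)]
```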
Figure 4. Possible movements for robot manipulators (sensing.honeywell.com).
Figure 5. The undefoliated cotton field at UGA farms.
Figure 6. The cotton robotic system proposed by our team [96,97].
Figure 7. Cotton picker robot prototype (source: www.kas32.com).
Figure 8. Cotton picking robot prototype proposed by Clemson University (source: www.agweb.com).
Figure 9. Green Robot Machinery (gRoboMac) manipulator reaching for a cotton boll (source: www.grobomac.com).
Figure 10. An older design of the Green Robot Machinery (gRoboMac) manipulator reaching for a cotton boll (source: thetechpanda.com).
Figure 11. Cotton robot testing at Clemson University (source: agweb.com).
Table 1. Comparison of the conventional machine and robot for cotton harvesting.

Parameters | Conventional Machine | Robot
Number of bolls per acre | 303,178 | 303,178
Times to harvest per acre (passes) | 1 | 25
Time to harvest an acre (hours) | 0.1 | 250
Unit cost | $725,000 | $7,000
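For intuition, the figures in Table 1 imply very different per-unit picking rates. Below is a hedged back-of-the-envelope sketch (assuming, as in the table, that the per-acre harvest time covers all passes; the numbers are those shown above, not new data).

```python
# Back-of-the-envelope comparison implied by Table 1 (illustrative only).
bolls_per_acre = 303_178

machine = {"passes": 1, "hours_per_acre": 0.1, "unit_cost": 725_000}
robot = {"passes": 25, "hours_per_acre": 250.0, "unit_cost": 7_000}

for name, m in [("conventional machine", machine), ("robot", robot)]:
    rate = bolls_per_acre / m["hours_per_acre"]  # bolls picked per hour
    print(f"{name}: {rate:,.0f} bolls/h over {m['passes']} pass(es), "
          f"unit cost ${m['unit_cost']:,}")
# One $725,000 picker versus many $7,000 robots working in parallel
# across the season as bolls open.
```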
Table 2. Methods and parameters to measure the performance of the robot in the field (adapted from [12]).

Measure | Description
CT: Cycle Time (s) | The average time required to finish a specific action in a task (e.g., harvesting a cotton boll, spraying herbicides, scouting with a camera).
OT: Operation Time under real-time conditions (s) | The average time required to finish an intended task under real-time conditions in an agricultural field. This can be the time taken from the start of robot planning, navigation, sensing, and manipulation.
OV: Operation Velocity under real-time conditions (inch s⁻¹) | The average velocity at which the robot finishes a mission (navigation can be very complex or simple depending on the farm-management task).
PR: Production Rate (lbs h⁻¹, ac h⁻¹, number of actions h⁻¹, etc.) | The number of successful actions or tasks (e.g., number of cotton bolls picked) completed per unit time.
CORT: Capability to Operate under Real-Time conditions (CORT+ or CORT-) | The ability of a robot to accomplish tasks under real-time conditions, presented in binary form: either it can operate under real-time conditions (CORT+) or it cannot (CORT-). This can be achieved if navigation, sensing, and manipulation are well designed.
DC: Detection Capability (DC+ or DC-) | The ability of the robot's sensors to detect the objects needed to accomplish a specific mission, presented in binary form: either the robot can detect an object (DC+) or it cannot (DC-).
DP: Detection Performance (%) | The performance of the robot in detecting objects for its mission. Detection results can be True Positives (TP), False Positives (FP), True Negatives (TN), and False Negatives (FN). DP is the sum of the True Positives and True Negatives over all the elements presented for detection. Other parameters such as accuracy, recall, precision, and F1 score can also be calculated [13].
ASR: Action Success Ratio (%) | The ratio of successful actions performed by the robot without damaging the plant to the total number of actions.
ADM: Appropriate Decision-Making (%) | The ratio of the number of correct decisions to all decisions made by the robot while accomplishing an agricultural task.
PEa: Position Error Average and PEsd: Position Error Standard Deviation (inch, etc.) | The average and standard deviation of the positioning error between the robot's true location and the location reported by its sensors.
Safety | The parameter describing safe robot behavior: the robot's actions while operating in an agricultural field must not threaten people or other objects around the farm.
Wholeness | The ability of the robot to execute tasks to full completion, as required or designed, using autonomous coordination of its actions.
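As a worked example of the DP definition in Table 2, the short Python sketch below computes DP, i.e., (TP + TN) over all presented elements, together with precision, recall, and F1; the confusion counts used in the usage line are hypothetical, for illustration only.

```python
# Detection performance measures from Table 2, computed from confusion counts.
# DP as defined above is (TP + TN) / (TP + TN + FP + FN), i.e., accuracy.

def detection_performance(tp, tn, fp, fn):
    total = tp + tn + fp + fn
    accuracy = (tp + tn) / total                    # DP in Table 2
    precision = tp / (tp + fp) if tp + fp else 0.0  # correct detections / all detections
    recall = tp / (tp + fn) if tp + fn else 0.0     # correct detections / all true objects
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, precision, recall, f1

# Hypothetical boll-detection counts:
acc, prec, rec, f1 = detection_performance(tp=90, tn=40, fp=10, fn=10)
print(f"DP={acc:.2f}, precision={prec:.2f}, recall={rec:.2f}, F1={f1:.2f}")
# DP=0.87, precision=0.90, recall=0.90, F1=0.90
```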
Table 3. Other agricultural robots for weeding, soil sampling, scouting/phenotyping, pruning, spraying, and sowing.

Activity | Reference | Mobility | Sensing | Path Planning | Manipulation
Weeding | [60,61] | Four-wheel vehicle | Camera, GPS, and angle sensors | Hough transform method for detection of rows (a minimal sketch follows this table) | N/A
Weeding | [31] | Four-wheel vehicle | Camera, GPS, gyroscope, magnetometer | Strategic planning (based on previous knowledge of the weed population), adaptive planning (for the unexpected occurrence of weeds), and path-tracking control | N/A
Weeding | [33] | Continuous-track vehicle | IMU and LRF | Path-tracking methods | Inter-row weeder made of three spiral-type cutters (three arms and three weeder plows) [2DOF]
Pruning | [62] | Four active wheels set at regular intervals around the tree | N/A | Climbing method (implementing rotation of the wheels along the vertical direction and the diameter of the trunk) | 2DOF (with cutting blade)
Pruning | [63] | Two active wheels | N/A | Climbing method (implementing rotation of the wheels along the vertical direction and the diameter of the trunk); arm trajectory motion planning with a search mechanism | 9DOF (with cutting blade)
Pruning | [64] | Four-wheel vehicle | 3D cameras | Randomized path planner [rapidly exploring random tree (RRT)-based planner, RRT-Connect] | 6DOF (cutting tool consists of a router mill-end attached to a high-speed motor)
Pruning | [65] | Four active wheels | 3D position measurement device and 3D orientation sensor | Innovative climbing strategy [grid-based] | 2DOF
Soil sampling | [52] | Two-wheel robot | GPS, encoder | GPS path tracking [adaptive grid-based navigation] | 2DOF (linear actuator and cone penetrometer)
Soil sampling | [66] | Four-wheel vehicle (Thorvald) | RTK-GPS, force sensor, measurement device, soil moisture sensor | GPS tracking method [grid-based] | 2DOF (penetrometer)
Soil sampling | [67] | Four-wheel vehicle (BoniRob) | RTK-GPS and soil moisture sensor | GPS tracking method [grid-based] | 2DOF (penetrometer)
Scouting or phenotyping | [68] | Four-wheel vehicle | RTK-GPS, NIR camera, and RGB multicamera system | GPS auto-steering methods | N/A
Scouting or phenotyping | [69] | Four-wheel tractor | RGB stereo camera, RTK-GPS | GPS auto-steering method | N/A
Scouting or phenotyping | [70] | Four-wheel tractor | GPS, RGB camera, inertial sensors, 3D LIDAR, 2D security lasers, IMU | Simultaneous Localization and Mapping (SLAM) | N/A
Scouting or phenotyping | [71] | Continuous track | RGB stereo cameras, single-chip ToF sensor, IR sensor, RTK-GPS, gyroscope, and optical encoders | Extended Kalman filter (EKF) and nonlinear model predictive control | N/A
Spraying | [72] | Rail-sliding vehicle | Induction sensors, IR sensors, bump sensors | N/A (the vehicle follows rails) | N/A
Spraying | [73] | Four-wheel vehicle | Camera; temperature, humidity, and soil moisture sensors; GSM modem | N/A | N/A
Spraying | [74] | Four-wheel vehicle | LRF sensor, GPS, and magnetic sensor | Path-tracking and self-positioning methods | N/A
Spraying | [75] | Four-wheel vehicle | LRF sensor, ultrasonic sensor, laser scanner, stereo camera, encoders, and GPS | Path tracking using a planned trajectory | N/A
Sowing | [76] | Four-wheel vehicle | Encoder, angle sensor, pressure sensor, IR sensor | Path-tracking methods | 2DOF (sowing device)
Sowing | [77] | Continuous track [caterpillar treads] | Magnetometer, ultrasonic sensor | Navigation by using sensor data to follow rows | 2DOF (sowing device)
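For the Hough-transform row detection cited for the weeding platforms in Table 3 ([60,61]), a minimal sketch is shown below. It assumes OpenCV is available, uses an excess-green mask to separate plants from soil, and applies the probabilistic Hough transform; all thresholds are illustrative assumptions, not parameters from the cited systems.

```python
# Minimal sketch of Hough-transform crop-row detection (illustrative only).
import cv2
import numpy as np

def detect_rows(bgr_image):
    """Return line segments that roughly align with crop rows."""
    # Excess-green index (2G - R - B) separates green plants from soil.
    b, g, r = cv2.split(bgr_image.astype(np.float32))
    exg = 2 * g - r - b
    mask = (exg > 20).astype(np.uint8) * 255  # threshold chosen arbitrarily

    # Probabilistic Hough transform finds dominant straight lines in the mask.
    lines = cv2.HoughLinesP(mask, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=100, maxLineGap=30)
    return [] if lines is None else [l[0] for l in lines]  # each is (x1, y1, x2, y2)

# Usage (hypothetical image file): rows = detect_rows(cv2.imread("field.jpg"))
```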
Table 4. Recent robotic systems developed for harvesting agricultural produce.

Reference/Crop | Mobility | Sensing | Path Planning | Manipulation
[78] sweet pepper | Railed vehicle robot platform | ToF camera, RGB cameras | Robot moves over rails; the manipulator used arm trajectory motion planning with a search mechanism | 9DOF; Fin Ray end-effector (scissors and fingers) and lip-type end-effector (knife and vacuum sensor)
[84] tomato | Four-wheel vehicle | Binocular stereo vision | PID control for Ackermann steering geometry; the manipulator used C-space and the A* search algorithm (see the A* sketch after this table) | 5DOF harvesting manipulator
[88] strawberry | Four-wheel vehicle [Thorvald II] | RGB-D camera, IR sensor | Vehicle controlled manually by a joystick; the manipulator used a motion sequence planning algorithm | 5DOF arm with a cable-driven gripper
[79] cherry tomato | Railed vehicle robot platform | RGB stereo camera, laser sensor | Arm trajectory motion planning for the manipulator | 6DOF with double-cutter end-effector
[91] kiwifruit | Four-wheel vehicle robot | Laser sensors, Hall position sensor, pressure sensor, optical fiber sensor | Arm trajectory motion planning without a search mechanism | 2DOF with 3D-printed bionic-finger end-effector
[83] melon | Four-wheel robot with a 2-m-wide rectangular frame that spans the melon bed | RTK-GPS, encoders, RGB stereo cameras | Arm trajectory motion planning without a search mechanism | 3DOF multiple Cartesian manipulators
[92] tomato | Railed vehicle robot platform | RGB cameras, wheel encoders, a gyroscope, and an ultra-wideband (UWB) indoor positioning system | Arm trajectory motion planning without a search mechanism | 6DOF manipulator with a 3D-printed gripper
[93] apple | Four-wheel vehicle | RGB cameras, wheel encoders | Visual servo algorithm based on fuzzy-neural-network adaptive sliding-mode control for the vehicle and manipulator | 5DOF manipulator
[94] tomato | Railed vehicle robot platform | RGB stereo camera | Inverse kinematics and arm trajectory motion planning for the manipulators; no navigation algorithm for the vehicle | Two 3DOF Cartesian-type robot manipulators with saw-cutting end-effectors
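Several systems in Table 4 plan with graph search, e.g., the A* algorithm used with a configuration-space representation in [84]. The following is a minimal, self-contained A* sketch over a 2-D occupancy grid, intended only to illustrate the search itself; the cited work plans in the manipulator's C-space rather than a flat grid, and nothing here is the authors' code.

```python
# Minimal A* over a 2-D occupancy grid (illustrative sketch).
import heapq

def astar(grid, start, goal):
    """grid[r][c] == 1 means blocked; returns a list of (row, col) cells or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), start)]   # priority queue of (f = g + h, cell)
    g_cost = {start: 0}
    parent = {start: None}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:              # reconstruct path by walking parents back
            path = []
            while cur is not None:
                path.append(cur)
                cur = parent[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):  # 4-connected moves
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and not grid[nxt[0]][nxt[1]]:
                ng = g_cost[cur] + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    parent[nxt] = cur
                    heapq.heappush(open_set, (ng + h(nxt), nxt))
    return None  # no path exists

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
# [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```

In practice, a harvesting manipulator would replace the grid cells with discretized joint configurations and the move cost with joint displacement, but the search skeleton is the same.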
