Next Article in Journal
SPCN: An Innovative Soybean Pod Counting Network Based on HDC Strategy and Attention Mechanism
Previous Article in Journal
Estimation of Fiber Fragility and Digestibility of Corn Silages and Cool Season Pastures
 
 
Font Type:
Arial Georgia Verdana
Font Size:
Aa Aa Aa
Line Spacing:
Column Width:
Background:
Review

A Review of Perception Technologies for Berry Fruit-Picking Robots: Advantages, Disadvantages, Challenges, and Prospects

1
Faculty of Modern Agricultural Engineering, Kunming University of Science and Technology, Kunming 650504, China
2
Foshan-Zhongke Innovation Research Institute of Intelligent Agriculture and Robotics, Guangzhou 528251, China
3
College of Intelligent Manufacturing and Modern Industry, Xinjiang University, Urumqi 830046, China
*
Author to whom correspondence should be addressed.
Agriculture 2024, 14(8), 1346; https://doi.org/10.3390/agriculture14081346
Submission received: 6 July 2024 / Revised: 3 August 2024 / Accepted: 9 August 2024 / Published: 12 August 2024

Abstract

:
Berries are nutritious and valuable, but their thin skin, soft flesh, and fragility make harvesting and picking challenging. Manual and traditional mechanical harvesting methods are commonly used, but they are costly in labor and can damage the fruit. To overcome these challenges, it may be worth exploring alternative harvesting methods. Using berry fruit-picking robots with perception technology is a viable option to improve the efficiency of berry harvesting. This review presents an overview of the mechanisms of berry fruit-picking robots, encompassing their underlying principles, the mechanics of picking and grasping, and an examination of their structural design. The importance of perception technology during the picking process is highlighted. Then, several perception techniques commonly used by berry fruit-picking robots are described, including visual perception, tactile perception, distance measurement, and switching sensors. The methods of these four perceptual techniques used by berry-picking robots are described, and their advantages and disadvantages are analyzed. In addition, the technical characteristics of perception technologies in practical applications are analyzed and summarized, and several advanced applications of berry fruit-picking robots are presented. Finally, the challenges that perception technologies need to overcome and the prospects for overcoming these challenges are discussed.

1. Introduction

With the development of the world economy and the change in people’s consumption concept, people have high requirements for the quality and nutritional composition of fruits. Berry fruits are rich in flavonoids, anthocyanins, resveratrol, etc. These substances make the berry fruits have high medicinal and nutritional value, which can improve blood circulation, prevent cardiovascular and cerebral vascular diseases, delay aging, prevent cancer, promote the function of the immune system, regulate the function of the endocrine system, promote tissue regeneration, are an anti-infective, and can protect the eyesight, and so on [1,2,3].
Berry means a fruit whose flesh is mostly berry-like when ripe and whose flesh has a high-water content. At the same time, the berry fruit is relatively soft and weak; the exocarp is a thin layer of the epidermis, while the flesh is soft and juicy, such as raspberries, grapes, blueberries, prickly pear, currants, kiwifruits, strawberries, etc. [4]. Therefore, due to the softer and more fragile berry fruit, manual picking may result in too much force in picking berry fruits, leading to berry rupture, which poses great difficulty for berry fruit picking.
Due to the high cost of repetitive and single manual work of picking berry fruits, which makes berry fruits expensive, to replace manual picking, nowadays people use traditional machinery to pick; most traditional machines used for harvesting berry fruits use both direct and vibratory harvesting to harvest the berries. As shown in Figure 1a, Guojun Du et al. harvested strawberries using a homemade double-crank mechanism-driven picking claw device. The device consists of a picking mechanism, a longitudinal transfer collection device, and a transmission mechanism, which is driven by a motor to drive the longitudinal transfer collection device and the picking mechanism at the same time, wherein a picking claw drives the picking mechanism to complete the process of strawberry picking by using the expansion and contraction characteristics of the double crank mechanism [5]. This traditional harvesting mechanism significantly improves picking efficiency compared to manual picking of berry fruit; the single reciprocating motion driven by a motor can lead to pulp damage, fruit drop, etc. As shown in Figure 1b, Panpan Yuan et al. designed a crankshaft-type vibratory threshing and harvesting device for wine grapes, which consists of 3. crankshaft, 1. an elastic gripping vibration mechanism, 2. a transmission system, 4. motor, 5. pitch adjustment lever, and 6. frequency converter. They use parallel left and right monoblocs to achieve in-phase, isotropic, and synchronous motion to achieve harvesting of wine grapes and also use similar elastic clamping vibration mechanisms and left and suitable monoblocs for the same purpose [6]. However, due to the amplitude, the fruit branches are broken at the end, and the excitation force cannot be accurately controlled, so it is impossible to perform precise picking.
Even though traditional machinery has reduced harvesting costs and accelerated harvesting efficiency, there are still problems of fruit damage, leakage, and mis-harvesting. To avoid the shortcomings of conventional mechanical harvesting and picking of berry fruits, manipulators instead of traditional fruit harvesting and picking machinery are necessary. The manipulator can imitate the human hand to pick berry fruits and achieve damage-free and precise picking. The end-effector is an essential part of the robotic manipulator. The end-effector is installed in the equipment for gripping, handling, and placing the workpiece manipulator end tool, end-effector through the electric, pneumatic, or other forms of power to drive the fingers of the opening and closing to achieve the workpiece grasping and releasing, to complete the workpiece grasping tasks [7].
However, the design and application of robotic manipulators and end-effectors for berry fruit picking face many challenges. Tasks that appear simple to humans, such as selecting and placing objects with widely varying shapes, sizes, materials, and surface properties, can be challenging for robots [8]. Moreover, because the agricultural environment is less structured, it is more difficult to employ robots to automate different agricultural processes [9]. While humans can usually pick crops by coordinating their various body parts to achieve precise and efficient picking without damage, robots can still not mimic the movements of human beings [10]. Moreover, because of the variability in fruits in terms of shape, size, weight, etc., the requirements for end-effector adaptation to the variability of fruits are very high [11], leading to the limitations of the existing end-effector, which cannot be used in a variety of fruits with different attributes to be picked, and this is also true for berry fruits. Since berries are soft, designing an end-effector specifically for picking berry fruits is necessary.
To make the operation of the end-effector more accurate, the application of perception technology is essential, so it is necessary to use sensors to obtain various types of information and accurately complete the operation requirements [12]. For the small size, soft texture, thin skin, and other characteristics of berry fruits, and to achieve accurate and efficient picking operations like human beings, it is even more necessary to use a variety of sensors on the end-effector and robotic manipulator to meet the requirements of picking [13].
The gripping requirements for agricultural products are higher than those for workpieces whose objects are metals, ceramics, plastics, and other materials in the industry. A variety of visual sensors, tactile sensors, force sensors, and bending sensors are mounted on the end-effector to make the gripping effect more reliable, flexible, and controllable [14]. Various perception technologies must be integrated to achieve the gripping needs by considering the characteristics of berry fruits and distinguishing them from other agricultural products. A list of visual perception systems applications in berry fruit picking is shown below in Figure 2a. Yucheng Jin et al. designed a vision sensor-based end-effector of a grape-picking robot, which achieves accurate positioning of the picking point through a vision perception system, and the gripper finger of the end-effector grips the stalks on top of the bunches of grapes to achieve the grape picking [15]. As shown in Figure 2b, Ma Li et al. devised a robotic system for harvesting kiwifruit. The system employs RGB-D sensors to identify and retrieve kiwifruit after accurately positioning them [16]. As shown in Figure 2c, Parsa S et al. designed an autonomous strawberry-picking robot equipped with four visual sensors on the end-effector and a finger gripping the stem above the grape bunch for picking. The robot also features a top RGB-D sensor and three RGB sensors at the bottom. The top RGB-D sensor captures the strawberry plant’s RGB image and 3D point cloud from a distance, while the bottom RGB sensors provide a close-up view [17].
Until now, the fruit-picking robot has had more applications for visual perception. However, for berry fruit picking, due to the characteristics of berry fruits, visual perception alone is not enough; it also needs to be paired with a variety of perceptual technologies such as tactile perception, distance measurement, switch sensors, and other perceptual technologies in the end-effector on the collaborative picking, i.e., multi-modal sensors deployed perceptual system. The wide deployment of multimodal sensors generates a large amount of data characterized by high capacity, wide diversity, and high integrity in various fields [18]. The integration of diverse sensor modalities and knowledge represents a pivotal aspect of precision agriculture in the forthcoming era [19]. A good end-effector that is reliable and tries to limit the redundant movements of the robotic manipulator in motion requires kinematic calculations and the use of sensors and other data needed for fruit picking, which can be used to localize the end-effector accurately and can enable the end-effector and the manipulator to avoid collisions with other objects during operation [20,21].
From the above examples, it is evident that perception technology plays a vital role in the application of berry fruit-picking robots, which is mainly realized by relying on sensors. Perception technology has become possible in almost all areas of life due to technological advances and size reduction, and perception measures physical properties and converts them into signals for the observer [22]. Combining perception, modeling, inference, and actuation can achieve better performance, improving the robot’s movement and awareness [23]. Sensors play a crucial role in agricultural robots by providing environment perception and localization information to enable autonomous navigation, crop detection, and map building [24].
The methodology employed for the retrieval of articles for this review involved utilizing a range of academic search platforms to identify the recent applications of perception technologies in berry fruit-picking robots over the past five years. Databases such as Web of Science, Google Scholar, Wiley Online Library, IEEE Xplore, and ScienceDirect were consulted. The search terms included “berry fruit picking robot”, “perception technology”, “sensors”, “deep learning”, and “precision agriculture”, as well as synonyms and near-synonyms, such as “picking robots”, “harvesting robots”, and “harvester”. An initial search on the Web of Science yielded approximately 500 articles related to fruit-picking robots within the specified timeframe. After excluding studies focused on non-berry fruits, over 200 articles remained. Subsequent evaluations emphasized the relevance of these articles to perception technologies and sensors, leading to data collection on various parameters, including perception technology type, mechanical structure, sensors, performance, control strategy, and cost. Ultimately, approximately 200 articles were excluded as they did not meet the criteria for further investigation in the literature review.
Therefore, this review elucidated the principles, picking and gripping mechanisms, and structures of berry fruit-picking robots. Building on this foundation, it examined the methods of perception technologies utilized in berry fruit-picking robots, analyzed and summarized the advantages and disadvantages of each technology, and discussed their technical characteristics. Finally, it addressed the challenges and opportunities berry fruit-picking robots encountered in the berry-picking process through applied perception technologies.

2. Mechanism of Berry Fruit-Picking Robots

2.1. Principle

The picking steps and standardized process of the berry fruit-picking robot are shown in Figure 3. The robot system, through the combination of various types of sensors composed of a perception system, captures and obtains all kinds of characteristics of berry fruit data and the data transmitted to the computer. The computer controls the robotic manipulator and end-effector to achieve the picking of berry fruits, ensuring standardization throughout the process. The steps involved in grasping can be summarized as follows [25,26]:
  • Approaching the object: The end-effector is open, and the robot system positions the end-effector close to the object.
  • Coming into contact: The end-effector is in contact with the gripping object.
  • Increasing gripping force: The end-effector outputs a certain amount of gripping force to ensure the object is gripped.
  • Securing the object: When the gripping force meets the object, the gripping force should be adjusted to stop increasing once it securely grips the object.
  • Lifting the object: The end-effector grips the object, moving it to the desired position, combined with the robot system moving the object to the desired position.
  • Release the object: The end-effector releases its grip, separating from the gripped object.
  • Monitoring the grasp: the gripping process is monitored by sensors to judge the end-effector’s contact with the gripped object and the effectiveness of the grip.
Due to the difficulty of gripping and harvesting berry fruits, there are still challenges in reducing fruit loss and damage in practical berry fruit harvesting [27]. In the harvesting of berry fruits using fruit-picking robots, end-effectors and berry picking are associated with the process of fruit gripping and picking, either by gripping the berries directly using the stalk and fruit or by separating the fruit from the plant by pulling, twisting the fruit, or cutting the stalk [28].

2.2. Picking and Gripping

The berry fruit gripping methods can be broadly categorized into three types: gripping the fruit stalk, adsorbing the fruit, and gripping the fruit. In clamping the fruit stalk, the principle is to connect the stalk of the berry through the end-effector without directly contacting the fruit itself. The end-effector uses suction-based methods, utilizing a suction head to adsorb the fruits. This is an effective way to make up for the error of the visual sensor and can reduce the positional error when gripping the fruits and cutting the stalks [29]. Direct clamping of the fruit is the most natural way. Berry fruits are very soft during such operations because they are very smooth and delicate, so the chances of bruising during the operation are higher. When subjected to compressive forces, the berry fruits suffer more damage [30].
The methods of pulling and twisting fruit require direct contact with the fruit during clamping. Kurpaska et al. designed a strawberry-harvesting robot based on pneumatic suction cups to experimentally investigate adsorption force, stress, and fruit damage. The effects of the structure, application position, and orientation of the suction cups on the picking effect were investigated [31].
Cutting the stalks and vibratory harvesting are harvesting methods that do not require direct contact with the fruit. For example, Xiong et al. designed a strawberry-harvesting robot based on a cable-driven end effector. The end effector is mounted on an industrial manipulator. The cable-driven gripper opens its finger to wrap around the strawberry fruit body. When the finger is closed, the cutter, which consists of two curved blades, rotates rapidly to cut the stalk. The cutter is hidden inside the finger to prevent damage to the strawberry and surrounding strawberries [32].

2.3. Structures

From a kinematic perspective, it could be argued that the hardware structure of berry fruit-picking robots is inextricably linked to three critical pieces of machinery: the mobile platform, the manipulator, and the end-effector. Collectively, these elements form the mobile unit, the operational unit, and the execution unit of the berry fruit-picking robots, intending to facilitate a specific picking action.

2.3.1. Mobile Platform

Mobile platforms are essential in practical applications as mobile units of fruit-picking robots. In recent years, there has been a notable surge in interest in deploying robotic platforms in the agricultural sector, where their role in replacing humans in the cultivation of expansive fields is of significant consequence [33].
The current robot mobile platforms utilized in agricultural applications can be classified into wheeled and tracked bases. This classification is illustrated in Figure 4. Wheeled chassis can be subdivided into differential drive bases: Ackermann steering base, steer-by-wire base, and Mecanum wheels base. The mobile platforms depicted in the figure are from Yuhesen Technology, Ltd. (Shenzhen, China) and PAL Robotics, Ltd. (Barcelona, Spain), two prominent companies that develop mobile platforms for robots. Further information can be found on their official websites: https://yuhesen.com/en/index.aspx (accessed on 1 May 2024) and https://pal-robotics.com/robots (accessed on 1 May 2024).
The differential drive base is a mechanism that enables direction and movement control by regulating wheel speed on opposing sides. It is a prevalent configuration in robotic systems, particularly in agricultural applications, where it offers a competitive advantage due to its simplicity and reliable maneuvering stability, even in challenging and complex environments [34]. However, differential drives are a conventional technology with limitations such as restricted torque and a lack of recommended test protocols. Furthermore, they present several technical challenges, including the necessity for steering performance to be coupled to all wheel speeds [35,36].
Compared to a differential drive base, Ackermann steering improves the base’s performance in steady turning and prolongs tire service life [37,38]. The Ackermann steering linkage demonstrates optimal synthesis, reducing design parameters and providing helpful information for engineers [39]. However, the traditional four-link mechanism has a limited capacity to track points, restricting Ackermann’s steering geometry [40]. Special equipment and techniques may be required to repair and adjust the Ackermann steering base in the event of failure.
The steering-by-wheel base is the steering wheel’s deflection angle that determines the direction of travel. This deflection results in a change to the trajectory in front, which enables steering. The principle provides comparable steering performance to conventional passenger cars without requiring dedicated steering devices. The innovative steer-by-wire concept, which employs in-wheel motors and a differential steering principle, offers steering performance comparable to that of traditional passenger cars, eliminating the necessity for dedicated steering devices [41]. However, using rudder wheels results in a higher cost, which increases the expense of steering-by-wheel bases.
A Mecanum wheel is a unique wheel that allows a base to move in any direction without steering. Mecanum wheels facilitate omnidirectional mobility in vehicles by arranging rolls around the wheel axis, thereby elucidating their underlying geometry and kinematics [42]. An omnidirectional robotic system equipped with Mecanum wheels can navigate confined areas with reduced time and space requirements for maneuvers. Furthermore, the system’s motion conversion process is relatively computationally efficient [43]. However, the inappropriate selection of free rollers in a Mecanum wheel construction can result in undesired vertical movement, which may lead to difficulties in positioning and the necessity for stabilization [44].
The tracked mobile-based application is a more widely utilized technology in agricultural robotics. It offers multidirectional mobility, traversing various terrains and conditions while preventing tipping when carrying external payloads on inclined surfaces [45]. Nevertheless, the issue of adjusting the ground clearance and degree of vehicle chassis remains unresolved [46].
Choosing the right mobile platform for robots that pick berry fruits is essential, considering the specific operating environment and conditions. Most modern agricultural robots use batteries due to their high energy density and long operational endurance. Some are also designed with solar panels to recharge their batteries, promoting eco-friendly operations. While some robots still use small internal combustion engines for mobile chassis, this is becoming less common. The choice of power source generally depends on the robot’s design requirements, operating environment, and cost factors.

2.3.2. Manipulator

As the operational unit of the berry-picking robots, the robotic manipulator plays a crucial role in improving efficiency and accuracy. To mimic the skill of the human manipulator when picking fruit, the manipulator needs sufficient reach and Degrees of Freedom (DoF) to reach the object with the correct orientation [47]. However, the equations of motion are always highly coupled and nonlinear when multiple DoFs are required to satisfy the flexibility of a robotic manipulator, which is a great challenge to solve [48].
The kinematic analysis of a robotic manipulator can be classified into two distinct categories: Forward Kinematics Problem (FKP) and Inverse Kinematics Problem (IKP). In robotics, FKP is a method that utilizes the parameters of a robot’s joints, such as the angle, position, and velocity, to determine the spatial position and attitude of the robot’s end-effector. Conversely, IKP is a process of repositioning expressive motion onto a mechanical system with feedback loops that enable precise control of the position and orientation of the end-effector and center of mass [49]. FKP is straightforward, and IKP is computationally expansive and challenging for real-time control [50]. In the present era, most IKPs, including trajectory planning and pose estimation, are addressed through integrating deep learning methodologies to enhance the operational efficacy of robotic systems [51,52,53].
Trajectory planning for manipulators is crucial for maximizing the mobility of the manipulator’s configuration path and optimizing time-minimum trajectories. Consequently, trajectory planning and analysis are essential for ensuring the accuracy of manipulator movements [54,55]. Pose control of a manipulator entails determining the appropriate input torques for the joints to achieve specified positions, velocities, and accelerations. Due to the nonlinear and highly coupled nature of manipulators, effective control presents significant challenges [56]. Therefore, it is necessary to perform kinematic analysis to plan the robot manipulator’s motion trajectory and pose control. This is necessary to achieve a smooth and precise motion during the execution of the task.
Figure 5 illustrates the kinematic analysis of the manipulator of the berry fruit-picking robot. FKP and IKP analysis determine the robotic manipulator’s trajectory planning and pose control. As a case study, the figure employs a 6 DoFs manipulator, a common component in the design of fruit-picking robots.
The initial step was to conduct an FKP analysis. To obtain the morphological parameters of the manipulator, it is necessary to establish the D-H coordinate system using the Denavit–Hartenberg (D-H) method. This involves defining a reference coordinate system designated as O 0 , and local coordinate systems for each joint, designated as O i ,   i = 1 , 2 6 . The end-effector coordinate system is O h . Subsequently, the D-H parameters of the robotic manipulator are obtained, comprising θ ,   α ,   d ,   and   a , where θ represents the joint angle, α represents the twist angle, d represents the offset of linkages, a represents the length of linkages.
Calculating the transformation matrix between two neighboring links can be calculated by obtaining the D-H parameters of the manipulator in Equation (1).
T i = cos θ i sin θ i cos α i sin θ i sin α i a i cos θ i sin θ i cos θ i cos α i cos θ i sin α i a i sin θ i 0 sin α i cos α i d i 0 0 0 1    
where T i represents the transformation matrix for each linkage.
Then, the transformation matrices of all joints are multiplied to obtain the total transformation matrix T . The spatial parameters are derived from the transformation matrix, thereby facilitating the acquisition of spatial information about the movement of the robotic manipulator.
Subsequently, an IKP analysis is conducted. By employing deep learning, spatial data associated with manipulator movements is fed into a trained neural network model. This model then predicts the joint angles, which are subsequently mapped to their corresponding joint configurations based on the object position. This process facilitates pose control and trajectory planning.
In addition to kinematic analysis, it is essential to understand the manipulator’s role as the operational unit of the fruit-picking robot. To achieve this, the robotic manipulator must be integrated with various perception technologies, including real-time measurement of joint angles, dynamic obstacle avoidance in conjunction with visual perception, and other capabilities. Visual perception plays a crucial role in controlling robotic arm motion, as the accurate identification of target positions and the provision of necessary positional parameters hinge on this perceptual process [57].

2.3.3. End-Effector

The actuating element for picking berry fruits is the end-effector. With the continuous development of science and technology, there are many end-effectors for fruit picking. The end-effector for berry-picking robots can be broadly classified, as shown in Figure 6. The end-effector for berry-picking robots can be roughly classified according to the drive mode, the number of fingers, the harvesting mode, and finger materials, which include electric gripper, hydraulic gripper, pneumatic gripper, and hybrid gripper, according to the form of drive, two-finger gripper, three-finger gripper, four-finger gripper, and bionic hand gripper for classification, according to the number of fingers. The harvesting mode can be divided into mechanical, flexible, vacuum, and parallel grippers. According to the division of fingers, clamping materials can consist of metal, plastic, rubber, and other materials for classification.
Based on the above categorization, several end-effectors with specific characteristics are enumerated, as shown in Figure 7. Dimeas F et al. proposed a three-finger end-effector for strawberry picking, as shown in Figure 7a, where the end-effector is equipped with force and pressure profile sensors using the fuzzy logic principle, which detects misplaced strawberries on the grippers or uneven distribution of the force. The fingers grasp the strawberries using a crank-slider mechanism driven by stepper motors [58]. As shown in Figure 7b, Gunderman A et al. designed a soft robotic three-finger end-effector for blackberry picking. The end-effector is covered with force sensors, and two parts of silicone material are used to fabricate the soft fingers, where one part is used to provide softness and another part to the design of the soft finger considers the variation in the size and shape of the blackberry. The contact area with the surface of the blackberry can be increased by changing the shape of the finger [59]. Zhang Y et al. designed a picking robot to achieve automated strawberry picking in Monopoly. The end-effector of the picking robot is shown in Figure 7c, and the end-effector is located at the end joints of the strawberry-picking manipulator. The drive motor of the end-effector controls the movement of the cam mechanism. It transmits the model through the reflective sensors to control the opening and closing. It ends, ultimately realizing the picking work of strawberries in Monopoly [60]. As shown in Figure 7d, Feng Q et al. designed a pneumatic end-effector for strawberry-picking robots. Its end-effector is paired with a vision sensor to complete the end-effector consists of a casing, a rotary cutter, a fixed cutter, a fixed finger, and an oscillating finger, where the oscillating finger is mounted under the oscillating cutter and the fixed finger is mounted under the fixed cutter. 
The oscillating cutter is driven by a cylinder to open and close the gripper. When the gripper closes, the stalk is cut off from the plant, the fingers grasp the stalk left on the fruit, and finally, the gripper opens to complete the picking [61]. As shown in Figure 7e, Williams H et al. designed a kiwifruit-picking robot incorporating visual sensors. The end-effector consists of an asymmetric four-bar mechanism driven by a cylinder and a soft silicone structure consisting of a finger. The end-effector apparatus rotates the kiwifruit around the stem and then pulls it downwards, ultimately picking it from the canopy without damage and removing it from the fruit [62].
From the mechanism of berry fruit-picking robots, it is easy to see that almost all have used some perception technology to pick and harvest berry fruits. Different perception techniques are required for all working units of berry fruit-picking robots to execute the picking commands, which reinforces the crucial importance of sensors, as expressed in the introduction of this review. Therefore, we have focused on applying different perception techniques in berry fruit-picking robots.

3. An Overview of Perception Technologies for Berry Fruit-Picking Robots

Perception technology is dependent on sensors. Perception technology acquires and processes environmental information using sensors, which are essential to perception technology. Sensors can be classified into tactile and non-tactile sensors. Tactile sensors require a force or moment acting directly on the measuring device, and non-tactile sensors depend on the distance between the sensor and the monitored object. Non-tactile sensors include visual sensors such as cameras. These sensors work with light as the measuring medium. Other principles conclude resistance to change, ultrasonic, inductive, capacitive, magnetic, and other effects. In automation systems, only the sensors on the end-effector contact the tactile information of the workpiece and, therefore, need to be integrated [63].
  • Visual Perception
Visual perception allows robotic systems to acquire images through visual sensors, which are converted into digital signals based on pixel distribution and information such as brightness and color. The imaging system then extracts the characteristics of the object and controls the field equipment based on the discriminatory results.
Fruit-picking robots detect and locate fruits using vision sensors such as binocular vision, laser vision, Kinect, and multispectral cameras. In detecting crops and fruits, vision cameras and their control systems serve as the hardware support for visual perception technology and as a communication interface between the external environment and the robot. In recent years, machine learning and classical image processing techniques have been used extensively for various fruit detection applications [64].
In computer vision, there are several fundamental visual recognition problems: image classification, object detection, instance segmentation, and semantic segmentation. Therefore, these four methods can recognize and detect berry fruits with vision sensors [65].
Image classification assigns category labels to whole images. As a classic research topic, it is one of the core problems of computer vision and the basis of the other visual recognition tasks; its primary pipeline includes image data preprocessing, feature extraction and representation, and classifier design. Object detection not only predicts category labels but also locates each object instance with a bounding box; its aim is to determine whether an object instance of a given category exists in an image and, if so, where. Semantic segmentation predicts the category label of each pixel without distinguishing between object instances, yielding fine-grained inference in which every pixel is labeled according to the object or region it belongs to. Instance segmentation is a particular setting of object detection that uses pixel-level segmentation masks to distinguish individual object instances: it assigns separate labels to different instances of the same object class and can therefore be defined as a simultaneous solution to object detection and semantic segmentation [65,66,67,68].
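For object detection in particular, predicted bounding boxes are usually matched to ground truth via intersection over union (IoU). The helper below is a generic sketch with made-up box coordinates, not code from any detector cited in this review.

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap rectangle (empty if the boxes are disjoint).
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    return inter / (area_a + area_b - inter)

# Hypothetical prediction vs. ground truth for one berry.
print(iou((10, 10, 50, 50), (30, 30, 70, 70)))  # ~0.143
```

A detection is typically counted as correct when its IoU with a ground-truth box exceeds a chosen threshold (0.5 is a common choice).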
  • Tactile Perception
Tactile perception involves converting tactile signals into electrical signals using flexible electronic devices such as tactile sensors. These sensors have significant potential applications in wearable electronic devices, health and motion monitoring, biomedicine, soft robotics, human–computer interaction, and other intelligent systems. Tactile sensors are perception elements that measure the tactile-related properties of an object through physical contact between the sensor and the object. A gripper with tactile perception can acquire various force–tactile information, such as pressure, strain, position, curvature, stiffness, hardness, roughness, temperature, shape, and size. Slip can also be detected by measuring an object’s motion relative to the sensor. Tactile perception is the detection and measurement of contact force at specific points; in its simplest form, it resembles a binary system that registers only the presence or absence of contact [69,70,71,72].
Because natural products vary in size, shape, and structure [73], applying tactile perception to berry fruit harvesting poses significant challenges for achieving stable grasping and flexible, non-invasive operation. It is essential to ensure that berry fruits are harvested and picked efficiently and effectively. Various tactile sensors are available for berry picking, including piezoresistive pressure sensors, capacitive pressure sensors, piezoelectric tactile sensors, and triboelectric tactile sensors. Table 1 displays the performance of some of these sensors.
Tactile sensor performance is directly related to stability, response, range, sensitivity, and detection limit. Stability indicates the degree of change in the sensor output under the same operating conditions; high-stability sensors provide more reliable and accurate measurements. Response measures the speed at which the sensor reacts to changes in the input signal; a fast-responding sensor can promptly capture environmental changes and output the corresponding signal. Range defines the maximum and minimum values the sensor can measure; wide-range sensors can be adapted to a broader range of measurement needs. Sensitivity measures how strongly a sensor responds to changes in the input signal; high-sensitivity sensors can detect minor changes, improving measurement accuracy. The detection limit indicates the minimum signal strength the sensor can detect; sensors with a lower detection limit can detect weaker signals, improving measurement accuracy.
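Sensitivity, in particular, is commonly reported as the slope of the sensor output versus the applied pressure over a calibration run. The least-squares fit below uses hypothetical calibration points, not data from any sensor in Table 1.

```python
# Sensitivity estimated as the slope of sensor output versus applied
# pressure, via an ordinary least-squares fit. All data are hypothetical.

pressures = [0.0, 1.0, 2.0, 3.0, 4.0]        # kPa (hypothetical)
outputs = [0.02, 0.52, 1.01, 1.49, 2.03]     # normalized signal (hypothetical)

n = len(pressures)
mean_p = sum(pressures) / n
mean_o = sum(outputs) / n
slope = sum((p - mean_p) * (o - mean_o) for p, o in zip(pressures, outputs)) \
        / sum((p - mean_p) ** 2 for p in pressures)
print(round(slope, 3))  # sensitivity in signal units per kPa
```

The same fit over repeated runs would also expose stability (scatter of the points) and the detection limit (the smallest pressure whose output rises clearly above the zero-pressure noise).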
Piezoresistive tactile sensors offer simple signal processing, simple structure, and low cost; their resistivity varies with external pressure stimulation. However, piezoresistive perception suffers from severe hysteresis, which degrades the frequency response [86,87]. Capacitive tactile sensors have attracted much attention because of their simple structure, good repeatability, low loss, high sensitivity, large fabrication area, and low sensitivity to temperature drift. However, they have problems such as a complicated calibration process, a relatively small dynamic range, and sensitivity to environmental disturbances [88,89,90]. Piezoelectric tactile sensors have an excellent high-frequency response and quickly generate a piezoelectric potential in response to external mechanical stimuli, making them the best choice for vibration measurement; they are commonly used to measure dynamic stresses such as acoustic vibration and slip. However, piezoelectric sensors cannot measure static deformation because of their high internal resistance, and they show non-negligible temperature sensitivity [71,87,91]. Triboelectric tactile sensors are mainly based on contact electrification and electrostatic induction effects that drive charge flow, and they offer high instantaneous power and self-powering capability. On the other hand, signal interference caused by external electrostatic induction is a significant problem that remains to be solved in their application [92,93].
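A piezoresistive element is typically read through a voltage divider, so the changing resistance can be recovered from a measured voltage and then mapped to force. The supply voltage, divider resistor, and the toy force calibration below are all hypothetical, and real force-sensitive resistors need an empirically fitted, usually nonlinear, calibration curve.

```python
# Voltage-divider readout of a piezoresistive element:
# Vout = VCC * R_FIXED / (R_FIXED + R_sensor), inverted to get R_sensor.
# All constants are hypothetical.

VCC = 5.0         # supply voltage (V)
R_FIXED = 10_000  # fixed divider resistor (ohm)

def sensor_resistance(v_out):
    """Invert the divider equation to recover the sensor resistance (ohm)."""
    return R_FIXED * (VCC - v_out) / v_out

def force_estimate(v_out, k=2.0e5):
    """Toy calibration F = k / R: resistance drops as applied force rises."""
    return k / sensor_resistance(v_out)

print(round(sensor_resistance(2.5)))  # 10000 ohm at the divider midpoint
print(force_estimate(2.5))           # toy force reading at that voltage
```

A gripper controller would poll `force_estimate` in its control loop and back off the fingers once the reading exceeds a berry-safe limit.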
  • Distance Measurement
Distance measurement is one of the essential parts of perception technology. Typically, displacement sensors are widely used to measure the distance between the gripper and the object so that the robot controller can update the object position according to the program; they can also be used to measure the size of the gripped object so that the gripper can evaluate and adjust the finger separation. In addition, these sensors can be used to measure the state of the gripper, including gripping speed, acceleration, force/torque, angle, and slip [14]. Distance measurement sensors that can be widely applied to berry-picking robots include infrared, ultrasonic, and Light Detection and Ranging (LiDAR) sensors.
An infrared sensor usually consists of an infrared emitter and an infrared receiver. The emitter emits infrared radiation, which the receiver detects and converts into an electrical signal. When an object enters the perception range of the infrared sensor, it absorbs or reflects the infrared radiation, causing the receiver to pick up different signals. The sensor detects changes in these signals to determine information such as the object’s presence, distance, or temperature. Infrared sensors are commonly used for measuring distance, making them well suited for obstacle avoidance in robots due to their low cost and fast response time [94]. Infrared sensors perform better indoors than outdoors, although performance also depends on the type and color of the object being detected [95].
The concept behind ultrasonic sensors is straightforward: they emit sound waves and receive them after interacting with the process being studied. The ultrasonic signal carries information about the measured parameter and arrives at the receiving end [96]. Ultrasound refers to sound waves produced by an object’s vibration. These waves have a frequency higher than 20 kHz, beyond the upper limit of human hearing. The perception of ultrasound is widely used in various fields, including medicine, industry, and science. Ultrasonic sensors convert electrical signals into ultrasound waves [97].
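The ranging principle reduces to one line of arithmetic: the one-way distance is the speed of sound multiplied by the round-trip echo time, divided by two. The sketch below uses an illustrative speed of sound and echo time.

```python
# Ultrasonic ranging: distance = (speed of sound * round-trip echo time) / 2.
# The speed of sound and the echo time below are illustrative values.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def ultrasonic_distance(echo_time_s):
    """Convert a round-trip echo time (s) to a one-way distance (m)."""
    return SPEED_OF_SOUND * echo_time_s / 2.0

# A 2.9 ms echo corresponds to roughly half a metre.
print(round(ultrasonic_distance(0.0029), 3))
```

In practice the speed of sound drifts with air temperature and humidity, so outdoor systems often correct the constant from an on-board temperature reading.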
LiDAR technology accurately determines an object’s distance and velocity using light waves, whose wavelengths are shorter than radio waves, enabling more precise three-dimensional perception [98]. LiDAR sensors emit laser pulses that reflect off objects; the sensor receives the reflected pulses and records the reception time. By measuring the time interval between the emission and reception of a laser pulse, the sensor calculates the distance between the object and itself. By continuously transmitting and receiving laser pulses, the LiDAR sensor builds a point cloud map of the object in three-dimensional space, enabling high-precision measurement and imaging of the object.
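The time-of-flight calculation mirrors the ultrasonic case with the speed of light, and each measured range, together with the beam's pointing angles, contributes one Cartesian point to the cloud. The timing and angle values below are illustrative only.

```python
import math

# LiDAR time-of-flight: distance = c * t / 2. Converting a measured range
# plus the beam's azimuth/elevation angles to (x, y, z) yields one point
# of the point cloud. All numbers here are illustrative.

C = 299_792_458.0  # speed of light, m/s

def lidar_range(round_trip_s):
    """One-way distance (m) from a round-trip pulse time (s)."""
    return C * round_trip_s / 2.0

def to_cartesian(r, azimuth_rad, elevation_rad):
    """Spherical (range, azimuth, elevation) -> Cartesian (x, y, z)."""
    x = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r * math.sin(elevation_rad)
    return x, y, z

r = lidar_range(20e-9)  # a 20 ns round trip is roughly 3 m
print(round(r, 3))
print(tuple(round(v, 3) for v in to_cartesian(r, 0.0, 0.0)))
```

Sweeping the two angles while repeating this conversion is precisely how a scanning LiDAR accumulates its point cloud map.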
Machine vision is also a distance measurement technique for berry fruit-picking robots. A stereo vision system uses two or more cameras to capture images of an object and then calculates the distance between the object and the camera by measuring the difference in the object’s position across the images (the disparity). This method provides accurate distance measurements. Alternatively, depth cameras, such as those based on structured light or time-of-flight technology, can be used; these cameras capture an object’s depth information and calculate the distance between the object and the camera by analyzing the depth information in the image.
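For a calibrated rectified stereo pair, the depth follows from the standard pinhole relation Z = f * B / d, where f is the focal length in pixels, B the camera baseline, and d the pixel disparity. The calibration values below are hypothetical.

```python
# Stereo depth from disparity: Z = f * B / d.
# Focal length, baseline, and disparity below are hypothetical values.

def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth (m) of a point observed with the given pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A berry whose image shifts 40 px between two cameras 6 cm apart,
# seen through an 800 px focal length, is about 1.2 m away.
print(stereo_depth(800.0, 0.06, 40.0))
```

The inverse relation between depth and disparity also explains why stereo accuracy degrades with distance: far objects produce disparities of only a pixel or two, so a small matching error becomes a large depth error.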
  • Switching Sensors
Switching sensors are frequently used to determine whether a specific position has been reached or to provide a trigger signal to the mechanical system [14]. Switching control is extensively used in real-world robotic systems and is commonly applied to actuators with only two states: on or off. The non-modelled on–off switching control, particularly popular in soft robots, can also be regarded as an open-loop control strategy [99]. Switch sensors used for berry fruit-picking robots include photoelectric, proximity, and pressure switches.
The photoelectric switch typically comprises a light-emitting diode and a photosensitive diode. The light from the diode is reflected to the photoelectric device when an object approaches, and the detection circuit generates a corresponding output signal [100]. The photoelectric sensor switch emits infrared or other types of light, which is then detected by a light-sensitive element, such as a photodiode. The photosensitive element generates a current that triggers the switch action when it senses the emitted light. Conversely, when the light is blocked or absent, the receiver stops generating current, and the switch turns off. The photoelectric sensor switch operates by detecting the presence or absence of light using the photoelectric effect, enabling switch control. These sensor switches are widely used in automated control systems to detect object presence or position and measure object distance and speed.
The proximity switch operates on the principle of wireless electromagnetic induction. The system comprises four primary circuits that generate a magnetic field throughout the working area by passing an alternating current through two power supplies. The wireless proximity switches in the working area receive energy from the magnetic field through small coils and convert it into electrical energy. These sensors have small radio transceivers and low-power electronics to facilitate wireless communication. The sensors communicate with the input module through an antenna mounted in the working area [101]. Proximity switches detect object activity within a perception field by generating discrete binary output signals. This signal coupling can be achieved through various methods, including capacitive, inductive, fluid, acoustic reflection, optical reflection, and optical transmission [102].
The pressure switch is a device that controls a circuit based on pressure changes. It operates by using a pressure sensor to convert the pressure signal into an electrical signal, which then controls the state of the switch through an electrical circuit. The switch changes state when the pressure sensor detects a change in external pressure, generating an electrical signal. The signal is transmitted to the control circuit, which determines whether the switch’s trigger conditions have been met based on a preset pressure threshold. If the pressure exceeds the set threshold, the control circuit triggers the switch to close and energize the circuit. Otherwise, the switch opens and de-energizes the circuit, thus controlling the switch. Real-time perception of ambient pressure is crucial for the aerospace, automotive, manufacturing, and medical industries [103].
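The threshold logic above can be sketched in a few lines. Adding a small hysteresis band between the close and open thresholds is a common design choice (the values here are hypothetical) that prevents rapid toggling when the pressure hovers near the set point.

```python
# Pressure switch with hysteresis: close (energize) above one threshold,
# open (de-energize) below a slightly lower one. Thresholds are hypothetical.

class PressureSwitch:
    def __init__(self, close_at=100.0, open_at=95.0):
        self.close_at = close_at  # kPa: close the circuit above this
        self.open_at = open_at    # kPa: open the circuit below this
        self.closed = False

    def update(self, pressure):
        """Feed one pressure reading; return the resulting switch state."""
        if not self.closed and pressure >= self.close_at:
            self.closed = True
        elif self.closed and pressure <= self.open_at:
            self.closed = False
        return self.closed

switch = PressureSwitch()
readings = [90, 98, 101, 97, 94, 99]
states = [switch.update(p) for p in readings]
print(states)  # stays closed at 97 (inside the band) but opens at 94
```

A single-threshold switch would chatter on the 97 and 99 kPa readings; the hysteresis band absorbs that noise.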

4. Methods and Analysis

4.1. Visual Perception

This section outlines the methodology and critically evaluates the advantages and disadvantages of three methods of visual perception—object detection, semantic segmentation, and instance segmentation—employed in berry fruit-picking robots.

4.1.1. Methods

Table 2 reviews visual perception methods for picking berry fruits. Image classification has been excluded from the table as it is considered less relevant for berry fruit picking than the other three methods.
The object detection technique applied to berry fruit picking is shown in Figure 8. Figure 8a shows an example of kiwifruit detection by a deep learning model on an image taken in the morning with a flash; the yellow circles denote undetected and incorrectly detected kiwifruit, and the purple wireframes denote recognized kiwifruit. The improved DY3TNet model accurately detects kiwifruits in orchards with minimal model weight [113]. Figure 8b shows the detection results of a cherry fruit detection method based on an improved YOLOv4 model; the purple, blue, and green wireframes indicate detections at different intersection-over-union ratios of the model. Ripe, semi-ripe, and unripe cherry fruits can be detected accurately [106]. Figure 8c shows the results of R-YOLO detection of strawberries; the R-YOLO model significantly improves localization precision, increasing the harvest rate and real-time performance of strawberry harvesting robots [112]. Figure 8d shows tomato detection results from the YOLO-Tomato model based on an improved YOLOv3; this method effectively detects tomatoes in complex environments, outperforming state-of-the-art methods [110].
The semantic segmentation technique applied to berry fruit picking is shown in Figure 9. Figure 9a shows the recognition of grapes by the network in darkness and in sunlight; the grapes are accurately recognized in both cases, and the method detects and masks single berry objects with a semantic segmentation network by using an ‘edge’ class to separate individual objects from each other [123]. Figure 9b shows the segmentation results of bayberry in real-life environments using a CNN-based model for fruit segmentation in complex environments that resists the limitations of light variations and occlusion [122]. Figure 9c shows the outcomes of strawberry segmentation and ripeness detection, assessing six distinct ripeness levels of strawberries in a challenging field environment with a high final average accuracy [116]. Figure 9d shows the identification of tomato picking points: the left figure illustrates the algorithm’s implementation, while the right figure demonstrates the model’s precise identification of tomato-picking points in natural light. Additionally, this model effectively identifies tomato varieties and ripening stages [121].
The instance segmentation technique applied to berry fruit picking is shown in Figure 10. Figure 10a shows the model assigning ripeness ratings to ten berries on a branch; ripe fruits are numbered 1, 2, 4, and 9, while unripe fruits are numbered 3, 5, 6, 7, 8, and 10. This method detects and masks single-berry objects by employing an ‘edge’ class to separate individual objects [124]. Figure 10b shows the test results for grape recognition and instance segmentation in various environments; despite not being trained on these conditions, the improved model achieved high detection and segmentation accuracy and inter-varietal generalization performance in complex growth environments [126]. Figure 10c displays the segmentation model, which accurately segments the strawberry; this method for strawberry instance segmentation demonstrates efficiency in a natural setting and generates a representative database with images and detailed entries [131]. Figure 10d illustrates the detection and segmentation of waxberries in various lighting conditions, including good light, leaf shade, and overlapping fruits; evaluation of the waxberry identification network demonstrated high precision and robustness to occlusion and overlap [127].

4.1.2. Analysis

The analysis of object detection applications has revealed that image recognition techniques can efficiently and accurately identify various fruit items, especially in contexts involving large volumes of fruit. However, these methods face limitations in detecting small, occluded, or densely packed objects and are prone to errors in complex backgrounds or varied lighting conditions. Future challenges in object detection include developing scalable proposal generation strategies, effectively encoding contextual information, automating machine learning for object detection, establishing new benchmarks, achieving low-shot detection capabilities, and designing robust backbone architectures for detection tasks [65].
Semantic segmentation facilitates the pixel-wise annotation of categories in berry fruit images, providing a more detailed semantic representation than object detection. This technique is beneficial for understanding complex scenes, image segmentation, and practical applications such as berry picking. However, it is computationally intensive and time-consuming, with suboptimal performance for small, complex, or indistinct objects. Future challenges in semantic segmentation include acquiring more challenging datasets, balancing accuracy with inference speed, enhancing domain-adaptive methods, leveraging contextual knowledge to improve model accuracy, exploring temporal correlations in video and image sequences, advancing weakly supervised segmentation techniques, and addressing the failure modes of deep structures, such as catastrophic forgetting [134].
Instance segmentation effectively segments individual berry fruit instances, precisely delineating multiple objects within the same category and detailed boundary data. This capability enhances object tracking and analysis. However, instance segmentation is computationally intensive and time-consuming, and its performance may be less reliable than semantic segmentation in scenarios with dense objects or complex backgrounds. Future advancements in instance segmentation may focus on single-location and single-mask approaches, multi-level feature integration, real-time processing, memory management, handling occlusions and disconnections, addressing small objects, developing unified segmentation frameworks, improving fine annotations, and implementing weakly and semi-supervised methods [135].

4.2. Tactile Perception

This section outlines the methodology and critically evaluates the advantages and disadvantages of four methods of tactile perception—piezoresistive, capacitive, piezoelectric, and triboelectric tactile sensors—employed in berry fruit-picking robots.

4.2.1. Methods

Table 3 shows a literature review of the methods used for tactile sensors in berry fruit-picking robots.
The related technologies of tactile sensors for berry fruit-picking robots are demonstrated in Figure 11.
Figure 11a–d show piezoresistive tactile sensors. Figure 11a demonstrates the finger of this end-effector exerting a pinch force, with the center of pressure located at the mid-point of the distal phalanx. The end-effector is equipped with a linear orbital system for in-hand measurement and resistive force sensors so that grapes can be held in contact with the fingers; this anthropomorphic end-effector combines the adhesion principle with a multi-contact design and piezoresistive tactile sensors to harvest grapes [137]. Figure 11b shows an end-effector with pressure sensors for real-time force estimation during strawberry-picking tasks, demonstrating the integration of a soft sensorized gripper with a robotic system capable of recognizing and picking small fruits such as strawberries [138]. Figure 11c shows that the flexible gripper can grip both soft and hard kiwifruit. Force-sensitive resistive sensors are attached to the inner surface of the flexible fingers to detect the pressure between the gripper and the fruit contact surface in real time during gripping. This approach allows the force sensors to detect and classify kiwifruit by hardness during picking [139]. Figure 11d shows a fully assembled artificial tactile feedback system with bimodal sensors. The system monitors the grasping force and the state of the tomato in real time, combining tactile perception information with algorithms to classify the size and ripeness of tomatoes [141].
Figure 11e,f show capacitive tactile sensors. Figure 11e demonstrates a lightweight gripping system that can reliably grip tomatoes and detect tomato size through the combination of a voltage switch and capacitive tactile sensors [143]. Figure 11f exploits the ability of anisotropic wedge microstructures to induce bending deformations, presenting capacitive pressure sensors with integrated anisotropic wedge-microstructure dielectric layers. The sensors consist of silicone rubber and silver; integrated with a robotic gripper, they allow gentle picking of strawberries, and their output is highly sensitive [144].
Figure 11g,h show piezoelectric tactile sensors. Figure 11g demonstrates a 3D-printed bionic manipulator with a perception system of flexible piezoelectric and strain sensors integrated on the fingers. The bionic manipulator has adaptive gripping capabilities and combines piezoelectric and strain sensors to measure tomato softness [146]. Figure 11h shows a versatile, low-cost, highly stable tactile sensor for static and dynamic load measurement. The sensor comprises a three-layer basic structure with a fiber core layer and electrically conductive fabric electrodes, which provide excellent electrical conductivity and allow the end-effector to grip a tomato firmly without additional support so the fruit does not break; the sensor can also measure the softness of the tomato [145].
Figure 11i,j show triboelectric tactile sensors. Figure 11i shows a TENG (triboelectric nanogenerator) sensor consisting of a buckling electrode, liquid metal, silicone, and a soft gripper, coupled with a robotic manipulator to pick fruits such as tomatoes; the finger exhibits excellent flexion morphology at different degrees of bending [149]. Figure 11j shows a flexible TENG-sensor finger made of silicone rubber, divided into four segments by three 45° triangular cuts to facilitate bending and to achieve a better envelope profile for conformal contact with objects of complex geometry. A good envelope profile during gripping improves maneuvering stability and facilitates pressure perception and energy harvesting in smart actuator designs. Adjacent knuckles with 45° triangular cuts at the shoulder provide 90° angular displacement during flexion. The fingertip phalanges are rounded, and a cable is threaded through the elastomer on the bending side and attached at one end. To protect the actuator body and reduce potential friction, a plastic tubing piece is placed inside each drilled hole. The end-effector consists of flexible fingers, a gripper stand, and a distance-adjustment mechanism; this three-finger actuator combined with a triboelectric tactile sensor was designed to monitor and clamp tomatoes with precision [147].

4.2.2. Analysis

Piezoresistive sensors are distinguished by affordability, durability, resilience, and capacity to accurately gauge static and dynamic pressure/strain [150,151]. They are widely utilized in berry picking due to their uncomplicated design, dependable operation, reliability, and economical cost. Furthermore, these sensors exhibit remarkable resilience to wear and corrosion, ensuring prolonged functionality in challenging field conditions without susceptibility to damage. However, piezoresistive sensors are sensitive to environmental variables like humidity and temperature, which can influence their performance. They also have a restricted measurement range, potentially limiting their ability to accommodate different sizes and hardness levels of berries. Furthermore, piezoresistive sensors can be influenced by external interferences like machine vibrations or mechanical collisions, compromising accuracy. A significant drawback is their reliance on an external power supply to drive the resistance signal detection process. In summary, while piezoresistive sensors offer simplicity, reliability, and cost-effectiveness advantages, they face challenges concerning environmental adaptability, accuracy, data processing requirements, and power dependency in berry-picking applications [152].
Capacitive sensors offer several significant advantages for berry fruit-picking robots, including precise accuracy, a wide measurement range, enhanced adaptability, rapid response times, and robust anti-interference capabilities. Furthermore, capacitive sensors can accurately detect and pick fruit while maintaining a relatively simple structure, exhibiting excellent linearity, very low hysteresis, and low power consumption [153]. However, capacitive sensors may experience performance fluctuations due to environmental factors such as temperature, humidity, and stray capacitance. These variations can lead to inconsistent sensor outputs. Precise installation and calibration are critical for capacitive sensors to assess fruit ripeness and hardness accurately. Capacitive sensors are susceptible to electromagnetic interference compared to piezoresistive sensors. They also generate substantial datasets that require advanced data processing algorithms for meaningful information extraction. Other challenges associated with capacitive sensors include ensuring biocompatibility, enhancing material durability, achieving precise structural perception, and developing self-powering capabilities.
Piezoelectric sensors can detect dynamic stress factors [124] and offer high sensitivity and a broad measuring range, enabling precise detection of fruit pressure dynamics with rapid response times. They also exhibit enhanced durability, stability, and lower power consumption. However, piezoelectric sensors are sensitive to environmental factors such as temperature and humidity, which can influence their operational effectiveness. They generally incur higher manufacturing costs compared to other sensor types. Accurately processing their output signals to obtain precise pressure information necessitates advanced computational algorithms. Additionally, piezoelectric sensors can be susceptible to interference from robot motion or vibrations, potentially compromising their ability to measure fruit pressure accurately. Addressing these challenges involves advancements in piezoelectric materials, optimizing sensor structures, exploring various vibration modes, and integrating multiple operational modes for improved robustness and accuracy in fruit pressure detection applications [154].
The most significant attribute of triboelectric sensors is their capacity to generate electrostatic charges through the friction between materials, facilitating energy conversion and sensing functions. These sensors employ the electrostatic effect generated by materials during contact and separation to detect and quantify an object’s contact force, state of motion, or surface properties. Furthermore, these sensors can detect the pressure exerted by a robotic manipulator or gripper on the fruit, ensuring gentle handling to prevent damage or crushing. However, practical triboelectric effects depend on the material’s response to charge separation during friction and contact. Addressing these challenges involves optimizing triboelectric materials for enhanced sensitivity and durability, mitigating sensitivity to environmental conditions, and integrating sensors effectively into robotic systems for optimal performance in fruit harvesting applications [155].

4.3. Distance Measurement

This section outlines the methodology and critically evaluates the advantages and disadvantages of four distance measurement methods—infrared, LiDAR, ultrasonic, and multimodal sensors—employed in berry fruit-picking robots.

4.3.1. Methods

Table 4 displays the methods used for distance measurement techniques related to berry fruit-picking robots. Figure 12 shows a review of distance measurement-related techniques applied to berry fruit-picking robots.
Figure 12a–c demonstrate three examples of infrared sensors used in berry fruit-picking robots. Figure 12a displays the end-effector of a cherry tomato-picking robot, which includes an RGB-D camera (with infrared sensors), a housing, a gripping device, and a drive unit. The end-effector successfully picked cherry tomatoes in the greenhouse thanks to the recognition and ranging capability of the RGB-D camera [159]. Figure 12b shows the detection of strawberries in a natural environment during an on-board strawberry detection task. The RGB-D camera captures the frame and displays the pixel coordinates of the center of mass of the strawberry in the lower right corner; RGB-D cameras use infrared and depth information to estimate the position of strawberries in three-dimensional space [158]. Figure 12c displays the internal view of the end-effector of the strawberry-picking robot, which comprises a cable housing, pulley, IR (infrared) sensor, torsion spring, curved blade, and sponge. The manipulator utilizes three internal infrared sensors to identify the object and then moves to the optimal cutting position [32].
Figure 12d–f present three LiDAR sensor applications for berry fruit-picking robots. Figure 12d shows an automated trussed tomato harvesting platform. The platform uses LiDAR and an IMU (Inertial Measurement Unit) for autonomous navigation; the figure also includes the laboratory map created by Cartographer, the raw LiDAR data, and the robot model using LiDAR. The mobile platform has environmental awareness and navigation capabilities for autonomous movement and object localization [162]. Figure 12e displays the end-effector of the strawberry-picking robot, which comprises a cutting blade, a laser beam sensor, a camera, a slideway, and a space cam gripper. The robot is equipped with laser sensors that accurately measure the distance to the strawberries and pick them; additionally, the fusion of laser sensors with monocular cameras improves navigation accuracy and system fault tolerance [160]. Figure 12f displays a tomato-picking robot in front of a greenhouse tomato plant. The robot is fixed to a platform and can travel along a track between rows of plants. An ABB IRB1200 robotic manipulator carries a RealSense L515 camera on its end-effector, which, with the aid of LiDAR sensing, can accurately detect and position crops, including tomatoes, even under occlusion [161].
Figure 12g presents the application of an ultrasonic sensor to a berry fruit-picking robot. The image displays an Automated Ultrasonic System mounted on a farm vehicle, which is used to detect blueberries. The field vegetation map of wild blueberries demonstrates the system’s ability to distinguish tall weeds and bare areas from the wild blueberry plants within the selected field extent. Ultrasonic sensors can detect weeds and empty areas and determine plant height by measuring the distance between the plant and the sensor [163].
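The height estimate described above follows from two simple relations: echo distance is half the round-trip time multiplied by the speed of sound, and plant height is the sensor’s mount height minus that distance. A minimal sketch, with hypothetical mount height and echo time:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C

def echo_distance(echo_time_s):
    """Range from an ultrasonic ping: the pulse travels to the target
    and back, so halve the round-trip time."""
    return SPEED_OF_SOUND * echo_time_s / 2.0

def plant_height(sensor_mount_height_m, echo_time_s):
    """Plant height inferred from a downward-facing sensor mounted at a
    known height above the ground."""
    return sensor_mount_height_m - echo_distance(echo_time_s)

# A 1.0 m mount and a 4.0 ms round trip give an echo distance of
# 0.686 m, i.e. a plant roughly 0.314 m tall; bare ground would return
# a distance close to the full mount height.
h = plant_height(1.0, 0.004)
```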
Figure 12. Distance measurement-related technology applied to berry fruit-picking robots: (a) A pneumatic finger-like end-effector and harvesting cherry tomato from Ref. [159]. (b) The strawberry localization in a ridge planting from Ref. [158]. (c) A strawberry harvesting robot with a cable-driven gripper from Ref. [32]. (d) A tomato harvesting robot with LiDAR from Ref. [162]. (e) A gripper with laser beam sensor and its application from Ref. [160]. (f) A tomato harvesting robot with LiDAR from Ref. [161]. (g) Detection of wild blueberry using an ultrasonic sensor from Ref. [163]. Copyright 2009, ASABE. (h) Optical sensing and LiDAR to determine tomato plant spacing from Ref. [164]. (i) Adaptive end-effector with multimodal sensors for tomato-harvesting robots from Ref. [165].
Figure 12h,i demonstrate the use of multimodal sensors in berry fruit-picking robots. Figure 12h shows the sensor platform used to measure tomato crops, which includes an emitter infrared sensor, a receiver infrared sensor, and a wheel with an encoder, while the tractor carries a platform consisting of a lateral-scanning LiDAR sensor and a Kinect sensor. The infrared sensors measure plant position and spacing, while the LiDAR acquires 3D information about the plants. RGB-D cameras are then used to validate the LiDAR results [164]. Figure 12i displays the hardware system of the tomato-picking robot, comprising a mobile platform, a 6-DOF UR3e robotic manipulator, an end-effector, a depth camera, a 3D LiDAR, and an industrial control computer. The infrared camera and projector of the LiDAR and depth camera enable navigation, mapping, tomato detection, and center-point location [165].

4.3.2. Analysis

From the above overview, it is evident that each of the distance measurement methods applied to berry fruit picking has certain advantages and disadvantages.
Infrared sensors are the most widely used distance measurement tool in berry fruit-picking robots. This preference is likely attributable to their ability to accurately detect fruit ripeness and location, thereby ensuring that only ripe fruits are harvested and enhancing overall efficiency. Furthermore, the automation afforded by infrared sensors minimizes the need for human intervention, resulting in cost savings. These sensors also help ensure the quality and ripeness of the fruits picked and can adapt to varying environmental conditions and lighting, enabling accurate fruit harvesting in diverse situations. Consequently, infrared sensors hold significant potential for application in berry fruit-picking robots. However, the use of infrared sensors does present certain disadvantages. Firstly, the high cost of infrared sensors can increase the overall manufacturing cost of the robot. Secondly, their performance may be compromised under adverse weather conditions, such as rain or snow. Additionally, infrared sensors may be susceptible to interference from stray light and other environmental factors, further affecting their accuracy in fruit picking. The main challenges facing infrared sensors include cost reduction, improved energy efficiency, enhanced flexibility, and size reduction [166].
LiDAR sensors are widely utilized for distance measurement in berry fruit picking due to their superior accuracy compared to infrared sensors. They provide precise information regarding the position and distance of fruits, enabling robots to accurately identify fruit location and size, perform precise picking, and minimize damage to plants. Additionally, LiDAR sensors can detect obstacles and perceive the environment in real-time, facilitating safe robot navigation and collision avoidance. These sensors operate effectively under various lighting conditions and can adapt to different orchard environments. However, LiDAR sensors are generally more expensive than infrared sensors. Furthermore, LiDAR sensors typically require more energy, leading to higher energy consumption. The installation and commissioning of LiDAR sensors may also be more complex, necessitating technical support and ongoing maintenance. There is a pressing need for compact LiDAR sensors that offer fast, high-resolution imaging with a large field of view and low power consumption [98,167].
Ultrasonic sensors offer a cost-effective alternative to LiDAR, making them suitable for budget-conscious fruit-picking robots. These sensors perform well in ranging for berry fruit harvesting, accurately detecting fruit distance and position while effectively identifying obstacles such as branches and leaves. This capability helps mitigate the risk of fruit damage and collisions. However, ultrasonic sensors have a limited range, necessitating the deployment of multiple sensors to cover more extensive orchards. Environmental factors can also impact sensor performance, underscoring the importance of effective environmental conditioning and noise filtering. Compared to LiDAR, ultrasonic sensors generally exhibit lower accuracy. Their adoption in berry fruit picking remains limited due to the complexity and constraints associated with ultrasound technology, particularly in challenging environmental conditions [148].
Incorporating multimodal sensors is crucial given the limitations of any single sensor. Integrating multiple sensors enhances a robot’s ability to comprehensively perceive and interpret its environment, including fruit location, distance, shape, and obstacles. Employing diverse sensor types improves the robot’s performance across various environments and conditions, reducing reliance on any single sensor. While the integration of multiple sensor fusion technologies can enhance the accuracy and authenticity of environmental data, it also brings its own set of challenges [168]. This integration also enables the robot to adapt to challenging scenarios such as low light, dense fruit clusters, and complex terrain. Effectively integrating data from multiple sensors and addressing future challenges remains critical [96].
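One standard way to combine distance estimates from heterogeneous sensors is inverse-variance weighting, sketched below under the simplifying assumption of independent noise; the sensor values and variances are illustrative only, not taken from any cited system:

```python
def fuse_ranges(measurements):
    """Fuse independent (value, variance) distance estimates by
    inverse-variance weighting: the more certain a sensor is, the more
    its reading counts. Returns the fused value and its variance."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused = sum(w * val for (val, _), w in zip(measurements, weights)) / total
    fused_var = 1.0 / total
    return fused, fused_var

# Hypothetical readings of the same fruit distance: a LiDAR reading
# (accurate) and an ultrasonic reading (noisier).
lidar = (0.50, 0.0001)       # metres, variance
ultrasonic = (0.56, 0.0025)
d, var = fuse_ranges([lidar, ultrasonic])
```

Because the LiDAR reading has the smaller variance, the fused estimate lands near 0.502 m, with a variance lower than that of either sensor alone.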

4.4. Switching Sensors

This section outlines the methodology and critically evaluates the advantages and disadvantages of three types of switching sensors—photoelectric, proximity, and pressure switches—employed in berry fruit-picking robots.

4.4.1. Methods

Table 5 presents a literature review of the methods for switching sensors in berry fruit picking. Figure 13 shows the technologies related to switching sensors applied to berry fruit-picking robots.
Figure 13a,b demonstrate the application of photoelectric switches in berry fruit-picking robots. Figure 13a depicts a laboratory experiment with a tomato-picking robot. The robot uses a photoelectric switch to guide the end-effector to the tomato fruits. A photoelectric switching sensor in the soft gripper offers positional feedback to the vision system for tomato gripping [169]. Figure 13b shows the kiwifruit-picking process using an end-effector. The end-effector’s finger includes a cutting mechanism and an infrared switch sensor. The sensor detects the kiwifruit’s position and signals the stepper motor to clamp the fruit [170].
Figure 13. Switching sensors related technology applied to berry fruit-picking robots. (a) The cherry tomato picking robot with photoelectric switches from Ref. [169]. (b) A soft gripper with photoelectric switches for kiwifruit from Ref. [170]. (c) A berry-picking robot with proximity switches from Ref. [173]. (d) A harvesting robot with a vacuum sensor for sweet pepper from Ref. [178]. (e) An underactuated gripper with a pressure sensor for grasping grape cluster from Ref. [174]. Copyright 2023, IEEE.
Figure 13c illustrates proximity switches in a berry fruit-picking robot. The figure shows a 1.8 mm OD magnetic sensor and source installed on the end-effector of the tomato-picking robot system. Magnetic proximity switches are primarily used for end-effector position control, enabling control of the tomato-picking position of the robot’s gripper [173].
Figure 13d,e demonstrate pressure switches in a sweet pepper-picking robot and a berry fruit-picking robot, respectively. Figure 13d displays the sweet pepper-picking robot, which is composed of an end-effector and a robotic manipulator. The end-effector mainly comprises a vacuum sensor and a knife. Vacuum-switching sensors detect successful fruit gripping, after which harvesting operations can proceed [178]. Figure 13e shows the end-effector of the grape-harvesting robot, which can flexibly clamp the grapes by switching its fingers based on the contact force with the grapes and the hydraulic actuator [174].
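The gating role described above, where cutting proceeds only after the switch confirms the grip, can be sketched as a minimal state machine; the states and transition logic are illustrative, not taken from the cited systems:

```python
from enum import Enum, auto

class State(Enum):
    APPROACH = auto()
    GRIP = auto()
    CUT = auto()
    RETRACT = auto()

def step(state, grip_confirmed):
    """Advance one step of a simplified harvest cycle. The vacuum or
    pressure switch gates the transition from GRIP to CUT, so the blade
    is never released on an unsecured fruit."""
    if state is State.APPROACH:
        return State.GRIP
    if state is State.GRIP:
        return State.CUT if grip_confirmed else State.GRIP
    if state is State.CUT:
        return State.RETRACT
    return State.APPROACH  # RETRACT -> start the next cycle

s = State.GRIP
s = step(s, grip_confirmed=False)  # switch not triggered: keep gripping
s = step(s, grip_confirmed=True)   # switch triggered: proceed to cut
```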

4.4.2. Analysis

Photoelectric switches enable precise fruit position and size detection, ensuring accurate picking through rapid response times that enhance harvesting efficiency. Their integration into robotic control systems facilitates automated harvesting, thereby minimizing the need for human intervention. These devices are characterized by consistent performance and extended operational lifespans, allowing reliable function in challenging environments. However, environmental factors such as light intensity and dust can adversely affect sensor sensitivity, impacting operational effectiveness. Regular maintenance and cleaning are essential to ensure optimal functionality, as these sensors play a critical role in robotics. The high costs associated with precision photoelectric switches can increase overall manufacturing expenses, and their specific environmental requirements may limit their applicability across various fruit-picking scenarios. While these sensors share fundamental principles with infrared and LiDAR technologies for distance measurement, they are employed in distinct applications. Future challenges for photoelectric switches include achieving lower costs, reducing energy consumption, increasing flexibility, minimizing size, and enhancing overall performance [98,166].
Proximity switches are more frequently employed in berry fruit-picking robots than photoelectric switches due to their effective performance in various environmental conditions, including light intensity and dust fluctuations. Their straightforward installation and operation require minimal commissioning and calibration, reducing manufacturing costs. Proximity switches are recognized for their high stability and reliability, maintaining consistent functionality over extended periods and exhibiting resistance to external interference. However, proximity switches have limitations, such as a restricted detection range that may be insufficient for long-distance applications. They may also exhibit reduced sensitivity and precision in object detection compared to photoelectric switches. While proximity switches can identify the presence of objects within their range, they cannot differentiate between distinct types of fruit, posing challenges in applications requiring specific identification. Additionally, metal interference can lead to erroneous detections or operational issues in certain proximity switches. Addressing challenges related to detection range, measurement accuracy, durability, and stability is crucial to ensuring the reliability and longevity of proximity sensors. Furthermore, effectively integrating these sensors with robotic systems for real-time data transmission and processing is essential, along with developing methods to extract valuable insights from the extensive data generated by these sensors [179].
Pressure switches are versatile tools used in the development of automated fruit-picking robots. These devices are unaffected by variations in light intensity, dust presence, or other environmental factors, allowing them to operate effectively under diverse conditions. Their installation and operation are straightforward, requiring no complex commissioning or calibration processes. Pressure switches are known for their high stability and reliability, maintaining consistent functionality over extended periods without significant influence from external factors. However, pressure switches typically necessitate direct contact with the fruit to activate, which may damage soft or delicate produce. As a result, the robot’s picking action may lack the necessary gentleness, increasing the likelihood of fruit damage. Furthermore, pressure switches can only indicate whether fruits have been picked and cannot accurately determine their position or shape. Moreover, pressure switches are susceptible to contamination from fruit residue or dust, necessitating regular cleaning and maintenance to ensure optimal operation. Since pressure switches differ from tactile sensors mainly in functionality, they share similar challenges, including accuracy, environmental adaptability, data processing, durability, and structural optimization [152,154,155].

5. Discussion and Prospect

5.1. The Technical Characteristics of the Perception Technology

In this review of perception techniques applied to berry-picking robots, we summarize the technical characteristics of each perception technology. Figure 14 describes the technical characteristics of these perception techniques when applied to berry fruit-picking robots.
  • Visual perception: The visual sensors exhibit high resolution and can accurately identify the fruit’s location, size, and ripeness. They facilitate real-time monitoring of fruit location and status, enabling the robot to adjust its actions accordingly. Additionally, visual perception can automatically distinguish between different types of fruits, ensuring the accuracy of the robot’s picking operations. Moreover, one of the primary challenges in practical visual perception is the accuracy of perception in different scenes and conditions [180]. It is, therefore, necessary to adapt visual perception to different light and background conditions in different environments to improve the reliability and accuracy of fruit-picking tasks.
  • Tactile perception: The fruit’s shape, hardness, and surface properties enable the robot to grasp precisely. The tactile sensor delivers real-time data feedback, assisting the robot in adjusting its picking actions to enhance efficiency and quality. Additionally, the tactile sensor detects the pressure exerted by the robot upon contact with the fruit, preventing excessive force that could potentially damage it. Furthermore, the tactile sensor adapts to the varying characteristics of different fruit types and sizes, ensuring precise and reliable harvesting.
  • Distance measurement: The distance measurement sensor accurately gauges the distance between the robot and the object fruit, ensuring the fruit can be grasped without damage. Additionally, the sensor monitors real-time changes in distance, enabling prompt adjustments to the robot’s movements for a stable picking process. The sensor also detects distances to obstacles, allowing the robot to avoid collisions and ensuring the safety of the robot and its environment. Furthermore, it aids the robot in accurately locating fruit, thereby enhancing the efficiency and precision of the picking process.
  • Switching sensors: Firstly, they enable control of the robot’s actions based on detected state information, such as initiating or halting the movement of the manipulator and directing the robot’s movement. Secondly, they provide precise control over the robot’s motions. The switch sensors monitor the status of various robot components, including the opening and closing of the robotic manipulator and overall movement. They can also detect contact between the robot and obstacles, immediately stopping movement upon collision to prevent damage to both the robot and its surroundings. Furthermore, the switch sensors assess the operational status of various components, facilitating timely fault detection, diagnosis, and repair.
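Taken together, the four perception channels above can be viewed as jointly gating a single go/no-go picking decision. A minimal sketch of such a gate, with all thresholds hypothetical rather than drawn from any cited system:

```python
def ready_to_pick(fruit_detected, ripeness, distance_m,
                  contact_force_n, gripper_closed):
    """Combine the four perception channels into one go/no-go decision:
    vision confirms a ripe target, ranging confirms it is reachable,
    tactile sensing guards against excessive grip force, and the
    switching sensor confirms the gripper state."""
    vision_ok = fruit_detected and ripeness >= 0.8   # ripeness in [0, 1]
    range_ok = 0.02 <= distance_m <= 0.05            # within gripper reach
    tactile_ok = contact_force_n <= 2.0              # avoid bruising
    return vision_ok and range_ok and tactile_ok and gripper_closed

ok = ready_to_pick(fruit_detected=True, ripeness=0.9, distance_m=0.03,
                   contact_force_n=1.5, gripper_closed=True)
```

In a real system each threshold would be tuned per berry variety and gripper design; the point of the sketch is only that all four channels must agree before the pick is executed.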

5.2. Advanced Technologies for Berry Fruit Picking

In berry fruit picking, it is essential to continuously integrate advanced technologies to address the limitations of existing ones. Figure 15 illustrates several advanced technologies applied to berry fruit picking.
Figure 15a displays a raspberry-harvesting robot by Kai Junge et al. based on physical twinning. Their work introduces the application of physical twinning to agricultural harvesting robots. Physical twinning involves modeling and simulating real-world systems in order to develop and transfer controllers. Physical twins enable a high success rate of controller transfer and reproduce the same faults observed in real-world scenarios. The end-effector consists of an electrically controlled parallel gripper and silicone fingers. Extending the physical-twin concept to more complex raspberry models can further improve the robot’s functionality. The process involves a human selecting a physical twin model, which the picking robot then uses to pick raspberries; the robot’s picking results are compared to those of the human to verify the accuracy and efficiency of the picking process [181].
Figure 15b illustrates a bubble-casting soft robot developed by Trevor Jones et al. to harvest blackberries. Panel (i) presents a schematic diagram of the bubble-casting manufacturing method, wherein a bubble is injected into a mold previously filled with a polymer melt. (ii) The excess polymer is subsequently drained and the remainder solidified, yielding an anisotropic actuator that is easy to demold. (iii) This actuator can also function as a clip for blackberry picking. The method is characterized by flexibility, robustness, and predictability, thereby accelerating the development of soft robots capable of assembling complex actuators. Additionally, incorporating elongated, dodging, or vascular structures can result in new functionalities derived from geometrical and material nonlinearities while minimizing costs [182].
Figure 15c displays a hydraulically amplified self-healing electrostatic (HASEL) actuator designed by E. Acome et al. The schematic shows a stack of five annular HASEL actuators with neighboring electrodes held at the same potential. The stack actuates a soft gripper for manipulating delicate objects. The potential of HASEL actuators for the next generation of soft robotic devices is illustrated by a gripper for raspberries, a soft gripper for manipulating delicate objects, and a self-sensing artificial muscle for powering a robotic manipulator [183].

5.3. Challenges and Future Opportunities

In recent decades, the development of manipulators, sensors, end-effectors, and algorithms has led to the widespread use of picking robots in berry fruit picking. Today’s picking perception technologies allow fruit-picking robots to significantly reduce costs, improve efficiency, and reduce harvest time, distinguishing them from manual and traditional mechanical picking. However, despite these advancements, challenges in picking perception technology persist, and several urgent problems require attention.
  • Accuracy: The accuracy of a robot in berry fruit picking hinges on its ability to precisely identify and locate fruits, assess their ripeness, and execute picking actions with minimal damage or errors. Reliable perception technology is essential for correctly identifying different types of fruits. However, current technology often fails to achieve the desired level of accuracy: visual perception errors caused by lighting conditions, occlusion, wind effects, and background interference lead to omissions and misdetections. As a result, ensuring consistent accuracy throughout the berry fruit-picking process remains a challenge.
  • Adaptability: Adaptability in robotics refers to the ability of a robot to adjust its operations and techniques in response to varying conditions and situations. This includes accommodating different types of berries, adapting to changes in weather conditions, navigating different soil types, and handling various crop layouts. The goal is to optimize efficiency and effectiveness in harvesting. During berry picking, environmental factors like light, humidity, and temperature can affect sensor performance, impacting perception capabilities. It is essential for perception technology to be adaptable in natural environments, enabling robots to perform tasks such as obstacle avoidance, object recognition, and precise object separation. These tasks can be challenging due to the unpredictable characteristics of natural environments.
  • Durability: Durability in robotics refers to a robot’s ability to withstand wear, corrosion, and physical damage in its operational environment. This includes resisting high temperatures, humidity, dust, and the vibrations and shocks encountered during berry picking. Ensuring sensors remain stable and functional over extended periods is crucial for reliable performance in challenging conditions.
  • Data processing: Data processing in robotics involves the collection, analysis, and utilization of sensor data. Sensors gather extensive information during operations, such as the location, ripeness, and size of berries. Real-time processing is essential to adjust the robot’s movements promptly. The application of numerical techniques for the acquisition of sensory data may be susceptible to the influence of internal bias [184]. Additionally, data may contain noise or errors, requiring cleaning and correction. Integration of data from various sensors is vital for comprehensive analysis. Thus, data processing poses a major challenge in perceptual technology.
  • Low cost: Low cost is defined as a cost-effectiveness that reduces research and development expenses, thereby making the technology affordable and accessible for farms of all sizes. The sensor is a crucial component of the perception system. However, high-quality sensors with solid performance are still expensive, and they may become faulty or damaged during use, requiring maintenance or replacement. Therefore, it is necessary to reduce the cost of sensors while ensuring their quality and performance.
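As an illustration of the cleaning step mentioned under data processing, a sliding-median filter is a common way to suppress impulsive outliers in a range-sensor stream before fusion or control; the readings below are synthetic:

```python
def median_filter(readings, window=3):
    """Slide a window over the stream and replace each reading with the
    (upper) median of its neighborhood, suppressing impulsive outliers
    such as spurious echoes or dropouts."""
    half = window // 2
    filtered = []
    for i in range(len(readings)):
        neighborhood = readings[max(0, i - half):i + half + 1]
        filtered.append(sorted(neighborhood)[len(neighborhood) // 2])
    return filtered

# A spurious 9.0 m spike in otherwise steady ~0.5 m distance readings
# is suppressed by the filter.
clean = median_filter([0.50, 0.51, 9.00, 0.52, 0.50])
```

A median is preferred over a mean here because a single extreme reading shifts a mean noticeably but leaves the median of the window untouched.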
The advancement of future perception technologies must address several challenges to enhance the efficiency and precision of berry fruit-picking robots. Firstly, developing more efficient, adaptable, and flexible algorithms that accurately identify the location, ripeness, and quality of fruits is essential. Secondly, integrating data from various complex sensors is necessary to improve data processing accuracy and reliability. New sensor materials should be utilized to enhance performance and quality. A perception system layout incorporating multimodal sensors must be designed to accommodate different types and shapes of berries, alongside various picking methods and operational modes, to fulfill diverse fruit-picking requirements. The quality and performance of the sensors must be ensured while minimizing costs.

6. Conclusions

In response to the issues associated with manual picking and conventional mechanical harvesting of berry fruits, this review delineated the fundamental principles, operational mechanisms, and structural components of berry fruit-picking robots. The pivotal role of perception technologies in the berry-picking process was underscored. Diverse perception technologies are employed in various ways within berry fruit-picking robots, each inherently distinguished by its technical advantages and disadvantages.
This review offers a comprehensive examination of the methodologies utilized in the development of perception technologies for berry fruit-picking robots, with a particular emphasis on visual perception, tactile perception, distance measurement, and switching sensors. It evaluates the advantages and disadvantages of these technologies and synthesizes their applications in berry picking. The technical requirements for perceptual technologies are contingent upon the operational condition. In the context of greenhouse cultivation, for instance, the importance of accuracy and efficiency cannot be overstated. In such conditions, semantic segmentation has been demonstrated to be a more effective visual method than other techniques. Conversely, field-grown berries necessitate adaptability and durability, rendering berry fruit-picking robots equipped with robust piezoresistive tactile sensors and adaptable LiDAR sensors a more suitable choice for variable field conditions. Moreover, the review presents an overview of advanced fruit-picking technologies and identifies potential future applications of perception technologies in berry fruit-picking robots. By elucidating the strengths and limitations of current technologies, this review offers valuable insights for practitioners and researchers alike, facilitating interdisciplinary knowledge integration and guiding technological advancements.

Author Contributions

Conceptualization, C.W., W.P. and Q.H.; methodology, C.W., X.Z. and T.Z.; investigation, W.P., C.L. and Q.H.; resources, W.P., H.W. and T.Z.; writing—original draft preparation, C.L., H.W. and J.Y.; writing—review and editing, C.W. and W.P.; project administration, T.Z. and X.Z.; funding acquisition, C.W. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Guangdong Province International Cooperation Project (grant number 2023A0505050133) and the Guangdong Basic and Applied Basic Research Foundation (grant number 2022A1515140162).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Yang, L.; Sun, G. Research progress of berry and berry juice. Food Res. Dev. 2008, 5, 183–188. [Google Scholar] [CrossRef]
  2. Kähkönen, M.P.; Hopia, A.I.; Vuorela, H.J.; Rauha, J.-P.; Pihlaja, K.; Kujala, T.S.; Heinonen, M. Antioxidant Activity of Plant Extracts Containing Phenolic Compounds. J. Agric. Food Chem. 1999, 47, 3954–3962. [Google Scholar] [CrossRef] [PubMed]
  3. Hertog, M.G.L.; Feskens, E.J.M.; Kromhout, D.; Hertog, M.G.L.; Hollman, P.C.H.; Hertog, M.G.L.; Katan, M.B. Dietary Antioxidant Flavonoids and Risk of Coronary Heart Disease: The Zutphen Elderly Study. Lancet 1993, 342, 1007–1011. [Google Scholar] [CrossRef] [PubMed]
  4. Li, J.; Gao, C.; Xiao, B. Wild Fruit Development and Comprehensive Utilization, 1st ed.; Scientific and Technical Documentation Press: Beijing, China, 1998; ISBN 7-5023-0922-5. [Google Scholar]
  5. Du, G.; Yao, F.; Cao, J.; Li, Y.; Li, F.; Jia, Q. Design of the artificial-assisted single-drive device for picking multi-fruit strawberries of ridge cultivation. J. Mach. Des. 2020, 37, 19–23. [Google Scholar] [CrossRef]
  6. Yuan, P.; Zhu, X.; You, J.; Han, C.; Zhang, X.; Guo, H. Development of crankshaft vibration threshing and harvesting equipment for wine grape. Trans. CSAE 2020, 36, 67–74. [Google Scholar] [CrossRef]
  7. Tai, K.; El-Sayed, A.-R.; Shahriari, M.; Biglarbegian, M.; Mahmud, S. State of the Art Robotic Grippers and Applications. Robotics 2016, 5, 11. [Google Scholar] [CrossRef]
  8. Brown, E.; Rodenberg, N.; Amend, J.; Mozeika, A.; Steltz, E.; Zakin, M.R.; Lipson, H.; Jaeger, H.M. Universal Robotic Gripper Based on the Jamming of Granular Material. Proc. Natl. Acad. Sci. USA 2010, 107, 18809–18814. [Google Scholar] [CrossRef]
  9. Muscato, G.; Prestifilippo, M.; Abbate, N.; Rizzuto, I. A Prototype of an Orange Picking Robot: Past History, the New Robot and Experimental Results. Ind. Robot 2005, 32, 128–138. [Google Scholar] [CrossRef]
  10. Clement, R.G.E.; Bugler, K.E.; Oliver, C.W. Bionic Prosthetic Hands: A Review of Present Technology and Future Aspirations. Surgeon 2011, 9, 336–340. [Google Scholar] [CrossRef] [PubMed]
  11. Pettersson, A.; Davis, S.; Gray, J.O.; Dodd, T.J.; Ohlsson, T. Design of a Magnetorheological Robot Gripper for Handling of Delicate Food Products with Varying Shapes. J. Food Eng. 2010, 98, 332–338. [Google Scholar] [CrossRef]
  12. Huang, S.; Wang, B.; Zhao, Z.; Wang, L.; Weng, L. Recognition of Magnetostrictive Tactile Sensor Array Applied to Manipulator. Trans. China Electrotech. Soc. 2021, 36, 1416–1424. [Google Scholar] [CrossRef]
  13. Iñiguez-Moreno, M.; González-González, R.B.; Flores-Contreras, E.A.; Araújo, R.G.; Chen, W.N.; Alfaro-Ponce, M.; Iqbal, H.M.N.; Melchor-Martínez, E.M.; Parra-Saldívar, R. Nano and Technological Frontiers as a Sustainable Platform for Postharvest Preservation of Berry Fruits. Foods 2023, 12, 3159. [Google Scholar] [CrossRef] [PubMed]
  14. Zhang, B.; Xie, Y.; Zhou, J.; Wang, K.; Zhang, Z. State-of-the-Art Robotic Grippers, Grasping and Control Strategies, as Well as Their Applications in Agricultural Robots: A Review. Comput. Electron. Agric. 2020, 177, 105694. [Google Scholar] [CrossRef]
  15. Jin, Y.; Liu, J.; Wang, J.; Xu, Z.; Yuan, Y. Far-near Combined Positioning of Picking-Point Based on Depth Data Features for Horizontal-Trellis Cultivated Grape. Comput. Electron. Agric. 2022, 194, 106791. [Google Scholar] [CrossRef]
  16. Ma, L.; He, Z.; Zhu, Y.; Jia, L.; Wang, Y.; Ding, X.; Cui, Y. A Method of Grasping Detection for Kiwifruit Harvesting Robot Based on Deep Learning. Agronomy 2022, 12, 3096. [Google Scholar] [CrossRef]
  17. Parsa, S.; Debnath, B.; Khan, M.A.; Amir, G.E. Modular Autonomous Strawberry Picking Robotic System. J. Field Robot. 2023, 1–21. [Google Scholar] [CrossRef]
  18. Tang, Q.; Liang, J.; Zhu, F. A Comparative Review on Multi-Modal Sensors Fusion Based on Deep Learning. Signal Process. 2023, 213, 109165. [Google Scholar] [CrossRef]
  19. Chlingaryan, A.; Sukkarieh, S.; Whelan, B. Machine Learning Approaches for Crop Yield Prediction and Nitrogen Status Estimation in Precision Agriculture: A Review. Comput. Electron. Agric. 2018, 151, 61–69. [Google Scholar] [CrossRef]
  20. Van Henten, E.J.; Schenk, E.J.; Van Willigenburg, L.G.; Meuleman, J.; Barreiro, P. Collision-Free Inverse Kinematics of the Redundant Seven-Link Manipulator Used in a Cucumber Picking Robot. Biosyst. Eng. 2010, 106, 112–124. [Google Scholar] [CrossRef]
  21. Ting, K.C.; Giacomelli, G.A.; Shen, S.J.; Kabala, W.P. Robot Workcell for Transplanting of Seedlings Part II—End-effector Development. Trans. ASAE 1990, 33, 1013–1017. [Google Scholar] [CrossRef]
  22. Aqeel-ur-Rehman; Abbasi, A.Z.; Islam, N.; Shaikh, Z.A. A Review of Wireless Sensors and Networks’ Applications in Agriculture. Comput. Stand. Interfaces 2014, 36, 263–270. [Google Scholar] [CrossRef]
  23. Bac, C.W.; van Henten, E.J.; Hemming, J.; Edan, Y. Harvesting Robots for High-value Crops: State-of-the-art Review and Challenges Ahead. J. Field Robot. 2014, 31, 888–911. [Google Scholar] [CrossRef]
  24. Weiss, U.; Biber, P. Plant Detection and Mapping for Agricultural Robots Using a 3D LIDAR Sensor. Robot. Auton. Syst. 2011, 59, 265–273. [Google Scholar] [CrossRef]
  25. Fantoni, G.; Gabelloni, D.; Tilli, J. Concept Design of New Grippers Using Abstraction and Analogy. Proc. Inst. Mech. Eng. Part B: J. Eng. Manuf. 2013, 227, 1521–1532. [Google Scholar] [CrossRef]
  26. Fantoni, G.; Santochi, M.; Dini, G.; Tracht, K.; Scholz-Reiter, B.; Fleischer, J.; Kristoffer Lien, T.; Seliger, G.; Reinhart, G.; Franke, J.; et al. Grasping Devices and Methods in Automated Production Processes. CIRP Ann. 2014, 63, 679–701. [Google Scholar] [CrossRef]
  27. Williamson, J.G.; Cline, W.O. Mechanized Harvest of Southern Highbush Blueberries for the Fresh Market: An Introduction and Overview of the Workshop Proceedings. HortTechnology 2013, 23, 416–418. [Google Scholar] [CrossRef]
28. Vishnu Rajendran, S.; Parsa, S.; Parsons, S.; Ghalamzan Esfahani, A. Peduncle Gripping and Cutting Force for Strawberry Harvesting Robotic End-Effector Design. In Proceedings of the 2022 4th International Conference on Control and Robotics (ICCR), Guangzhou, China, 2–4 December 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 59–64. [Google Scholar]
  29. Hayashi, S.; Shigematsu, K.; Yamamoto, S.; Kobayashi, K.; Kohno, Y.; Kamata, J.; Kurita, M. Evaluation of a Strawberry-Harvesting Robot in a Field Test. Biosyst. Eng. 2010, 105, 160–171. [Google Scholar] [CrossRef]
  30. Aliasgarian, S.; Ghassemzadeh, H.R.; Moghaddam, M.; Ghaffari, H. Mechanical Damage Of Strawberry During Harvest And Postharvest Operations. Acta Technol. Agric. 2015, 18, 1–5. [Google Scholar] [CrossRef]
  31. Kurpaska, S.; Sobol, Z.; Pedryc, N.; Hebda, T.; Nawara, P. Analysis of the Pneumatic System Parameters of the Suction Cup Integrated with the Head for Harvesting Strawberry Fruit. Sensors 2020, 20, 4389. [Google Scholar] [CrossRef] [PubMed]
  32. Xiong, Y.; Peng, C.; Grimstad, L.; From, P.J.; Isler, V. Development and Field Evaluation of a Strawberry Harvesting Robot with a Cable-Driven Gripper. Comput. Electron. Agric. 2019, 157, 392–402. [Google Scholar] [CrossRef]
  33. Nevliudov, I.; Novoselov, S.; Sychova, O.; Tesliuk, S. Development of the Architecture of the Base Platform Agricultural Robot for Determining the Trajectory Using the Method of Visual Odometry. In Proceedings of the 2021 IEEE XVIIth International Conference on the Perspective Technologies and Methods in MEMS Design (MEMSTECH), Polyana (Zakarpattya), Ukraine, 12–16 May 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 64–68. [Google Scholar]
  34. Wang, J.; Luo, Z.; Wang, Y.; Yang, B.; Assadian, F. Coordination Control of Differential Drive Assist Steering and Vehicle Stability Control for Four-Wheel-Independent-Drive EV. IEEE Trans. Veh. Technol. 2018, 67, 11453–11467. [Google Scholar] [CrossRef]
  35. De Santiago, J.; Bernhoff, H.; Ekergård, B.; Eriksson, S.; Ferhatovic, S.; Waters, R.; Leijon, M. Electrical Motor Drivelines in Commercial All-Electric Vehicles: A Review. IEEE Trans. Veh. Technol. 2012, 61, 475–484. [Google Scholar] [CrossRef]
36. Wu, X.; Xu, M.; Wang, L. Differential Speed Steering Control for Four-Wheel Independent Driving Electric Vehicle. In Proceedings of the 2013 IEEE International Symposium on Industrial Electronics, Taipei, Taiwan, 28–31 May 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 1–6. [Google Scholar]
  37. Veneri, M.; Massaro, M. The Effect of Ackermann Steering on the Performance of Race Cars. Veh. Syst. Dyn. 2021, 59, 907–927. [Google Scholar] [CrossRef]
  38. Xu, T.; Ma, S.; Xu, H.; Mo, S.; Li, Y. Application of Ackermann Steering in Obstacle Crossing Platform of Six-Wheeled Robots. In Proceedings of the 2023 2nd International Symposium on Control Engineering and Robotics (ISCER), Hangzhou, China, 17–19 October 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 239–243. [Google Scholar]
  39. Simionescu, P.A.; Beale, D. Optimum Synthesis of the Four-Bar Function Generator in Its Symmetric Embodiment: The Ackermann Steering Linkage. Mech. Mach. Theory 2002, 37, 1487–1504. [Google Scholar] [CrossRef]
  40. Zhao, J.-S.; Liu, X.; Feng, Z.-J.; Dai, J.S. Design of an Ackermann-Type Steering Mechanism. Proc. Inst. Mech. Eng. Part C: J. Mech. Eng. Sci. 2013, 227, 2549–2562. [Google Scholar] [CrossRef]
  41. Kuslits, M.; Bestle, D. Modelling and Control of a New Differential Steering Concept. Veh. Syst. Dyn. 2019, 57, 520–542. [Google Scholar] [CrossRef]
  42. Gfrerrer, A. Geometry and Kinematics of the Mecanum Wheel. Comput. Aided Geom. Des. 2008, 25, 784–791. [Google Scholar] [CrossRef]
43. Dickerson, S.L.; Lapin, B.D. Control of an Omni-Directional Robotic Vehicle with Mecanum Wheels. In Proceedings of the NTC ’91—National Telesystems Conference Proceedings, Atlanta, GA, USA, 26–27 March 1991; IEEE: Piscataway, NJ, USA, 1991; pp. 323–328. [Google Scholar]
  44. Hryniewicz, P.; Gwiazda, A.; Banaś, W.; Sękala, A.; Foit, K. Modelling of a Mecanum Wheel Taking into Account the Geometry of Road Rollers. IOP Conf. Ser. Mater. Sci. Eng. 2017, 227, 012060. [Google Scholar] [CrossRef]
  45. Ben-Tzvi, P.; Saab, W. A Hybrid Tracked-Wheeled Multi-Directional Mobile Robot. J. Mech. Robot. 2019, 11, 041008. [Google Scholar] [CrossRef]
  46. Sun, Y.; Xu, L.; Jing, B.; Chai, X.; Li, Y. Development of a Four-Point Adjustable Lifting Crawler Chassis and Experiments in a Combine Harvester. Comput. Electron. Agric. 2020, 173, 105416. [Google Scholar] [CrossRef]
47. Tinoco, V.; Silva, M.F.; Santos, F.N.; Valente, A.; Rocha, L.F.; Magalhães, S.A.; Santos, L.C. An Overview of Pruning and Harvesting Manipulators. Ind. Robot 2022, 49, 688–695. [Google Scholar] [CrossRef]
  48. Lu, J.; Zou, T.; Jiang, X. A Neural Network Based Approach to Inverse Kinematics Problem for General Six-Axis Robots. Sensors 2022, 22, 8909. [Google Scholar] [CrossRef] [PubMed]
  49. Boryga, M.; Kołodziej, P.; Graboś, A.; Gołacki, K. Mapping Accuracy of Trajectories of Manipulator Motion. ITM Web Conf. 2018, 21, 00009. [Google Scholar] [CrossRef]
  50. Kucuk, S.; Bingul, Z. Robot Kinematics: Forward and Inverse Kinematics. In Industrial Robotics: Theory, Modelling and Control; Cubero, S., Ed.; Pro Literatur Verlag: Berlin, Germany; ARS: Linz, Austria, 2006; ISBN 978-3-86611-285-8. [Google Scholar]
  51. Ames, B.; Morgan, J.; Konidaris, G. IKFlow: Generating Diverse Inverse Kinematics Solutions. IEEE Robot. Autom. Lett. 2022, 7, 7177–7184. [Google Scholar] [CrossRef]
  52. Fang, G.; Tian, Y.; Yang, Z.-X.; Geraedts, J.M.P.; Wang, C.C.L. Efficient Jacobian-Based Inverse Kinematics with Sim-to-Real Transfer of Soft Robots by Learning. IEEE/ASME Trans. Mechatron. 2022, 27, 5296–5306. [Google Scholar] [CrossRef]
  53. Marconi, G.M.; Camoriano, R.; Rosasco, L.; Ciliberto, C. Structured Prediction for CRiSP Inverse Kinematics Learning with Misspecified Robot Models. IEEE Robot. Autom. Lett. 2021, 6, 5650–5657. [Google Scholar] [CrossRef]
  54. Zhao, G.; Jiang, D.; Liu, X.; Tong, X.; Sun, Y.; Tao, B.; Kong, J.; Yun, J.; Liu, Y.; Fang, Z. A Tandem Robotic Arm Inverse Kinematic Solution Based on an Improved Particle Swarm Algorithm. Front. Bioeng. Biotechnol. 2022, 10, 832829. [Google Scholar] [CrossRef] [PubMed]
  55. Pfeiffer, F.; Johanni, R. A Concept for Manipulator Trajectory Planning. IEEE J. Robot. Autom. 1987, 3, 115–123. [Google Scholar] [CrossRef]
  56. Luh, J.; Walker, M.; Paul, R. Resolved-Acceleration Control of Mechanical Manipulators. IEEE Trans. Automat. Contr. 1980, 25, 468–474. [Google Scholar] [CrossRef]
  57. Meng, F.; Li, J.; Zhang, Y.; Qi, S.; Tang, Y. Transforming Unmanned Pineapple Picking with Spatio-Temporal Convolutional Neural Networks. Comput. Electron. Agric. 2023, 214, 108298. [Google Scholar] [CrossRef]
  58. Dimeas, F.; Sako, D.V.; Moulianitis, V.C.; Aspragathos, N.A. Design and Fuzzy Control of a Robotic Gripper for Efficient Strawberry Harvesting. Robotica 2014, 33, 1085–1098. [Google Scholar] [CrossRef]
  59. Gunderman, A.; Collins, J.; Myers, A.; Threlfall, R.; Chen, Y. Tendon-Driven Soft Robotic Gripper for Blackberry Harvesting. IEEE Robot. Autom. Lett. 2022, 7, 2652–2659. [Google Scholar] [CrossRef]
  60. Zhang, Y.; Zhang, K.; Yang, L.; Zhang, D.; Cui, T.; Yu, Y.; Liu, H. Design and Simulation Experiment of Ridge Planting Strawberry Picking Manipulator. Comput. Electron. Agric. 2023, 208, 107690. [Google Scholar] [CrossRef]
  61. Feng, Q.; Chen, J.; Zhang, M.; Wang, X. Design and Test of Harvesting Robot for Table-Top Cultivated Strawberry. In Proceedings of the 2019 WRC Symposium on Advanced Robotics and Automation (WRC SARA), Beijing, China, 21–22 August 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 80–85. [Google Scholar]
  62. Williams, H.; Ting, C.; Nejati, M.; Jones, M.H.; Penhall, N.; Lim, J.; Seabright, M.; Bell, J.; Ahn, H.S.; Scarfe, A.; et al. Improvements to and Large-scale Evaluation of a Robotic Kiwifruit Harvester. J. Field Robot. 2020, 37, 187–201. [Google Scholar] [CrossRef]
  63. Carbone, G. (Ed.) Grasping in Robotics; Mechanisms and Machine Science; Springer: London, UK, 2013; Volume 10, ISBN 978-1-4471-4663-6. [Google Scholar]
  64. Li, H.; Gu, Z.; He, D.; Wang, X.; Huang, J.; Mo, Y.; Li, P.; Huang, Z.; Wu, F. A Lightweight Improved YOLOv5s Model and Its Deployment for Detecting Pitaya Fruits in Daytime and Nighttime Light-Supplement Environments. Comput. Electron. Agric. 2024, 220, 108914. [Google Scholar] [CrossRef]
  65. Bello, R.-W.; Oladipo, M.A. Mask YOLOv7-Based Drone Vision System for Automated Cattle Detection and Counting. AIA 2024, 2, 129–139. [Google Scholar] [CrossRef]
  66. Chen, L.; Li, S.; Bai, Q.; Yang, J.; Jiang, S.; Miao, Y. Review of Image Classification Algorithms Based on Convolutional Neural Networks. Remote Sens. 2021, 13, 4712. [Google Scholar] [CrossRef]
  67. Russakovsky, O.; Deng, J.; Su, H.; Krause, J.; Satheesh, S.; Ma, S.; Huang, Z.; Karpathy, A.; Khosla, A.; Bernstein, M.; et al. ImageNet Large Scale Visual Recognition Challenge. Int. J. Comput. Vis. 2015, 115, 211–252. [Google Scholar] [CrossRef]
  68. Hafiz, A.M.; Bhat, G.M. A Survey on Instance Segmentation: State of the Art. Int. J. Multimed. Info. Retr. 2020, 9, 171–189. [Google Scholar] [CrossRef]
  69. Kim, D.-H.; Lu, N.; Ghaffari, R.; Kim, Y.-S.; Lee, S.P.; Xu, L.; Wu, J.; Kim, R.-H.; Song, J.; Liu, Z.; et al. Materials for Multifunctional Balloon Catheters with Capabilities in Cardiac Electrophysiological Mapping and Ablation Therapy. Nat. Mater. 2011, 10, 316–323. [Google Scholar] [CrossRef] [PubMed]
  70. Lee, M.H.; Nicholls, H.R. Review Article Tactile Sensing for Mechatronics—A State of the Art Survey. Mechatronics 1999, 9, 1–31. [Google Scholar] [CrossRef]
  71. Qu, J.; Mao, B.; Li, Z.; Xu, Y.; Zhou, K.; Cao, X.; Fan, Q.; Xu, M.; Liang, B.; Liu, H.; et al. Recent Progress in Advanced Tactile Sensing Technologies for Soft Grippers. Adv. Funct. Mater. 2023, 33, 2306249. [Google Scholar] [CrossRef]
  72. Dargahi, J.; Najarian, S. Advances in Tactile Sensors Design/Manufacturing and Its Impact on Robotics Applications—A Review. Ind. Robot 2005, 32, 268–281. [Google Scholar] [CrossRef]
  73. Sam, R.; Nefti, S. Design and Development of Flexible Robotic Gripper for Handling Food Products. In Proceedings of the 2008 10th International Conference on Control, Automation, Robotics and Vision, Hanoi, Vietnam, 17–20 December 2008; IEEE: Piscataway, NJ, USA, 2008; pp. 1684–1689. [Google Scholar]
  74. Shi, J.; Wang, L.; Dai, Z.; Zhao, L.; Du, M.; Li, H.; Fang, Y. Multiscale Hierarchical Design of a Flexible Piezoresistive Pressure Sensor with High Sensitivity and Wide Linearity Range. Small 2018, 14, 1800819. [Google Scholar] [CrossRef] [PubMed]
  75. Cao, M.; Su, J.; Fan, S.; Qiu, H.; Su, D.; Li, L. Wearable Piezoresistive Pressure Sensors Based on 3D Graphene. Chem. Eng. J. 2021, 406, 126777. [Google Scholar] [CrossRef]
  76. Yang, T.; Deng, W.; Chu, X.; Wang, X.; Hu, Y.; Fan, X.; Song, J.; Gao, Y.; Zhang, B.; Tian, G.; et al. Hierarchically Microstructure-Bioinspired Flexible Piezoresistive Bioelectronics. ACS Nano 2021, 15, 11555–11563. [Google Scholar] [CrossRef] [PubMed]
  77. Hwang, J.; Kim, Y.; Yang, H.; Oh, J.H. Fabrication of Hierarchically Porous Structured PDMS Composites and Their Application as a Flexible Capacitive Pressure Sensor. Compos. Part B Eng. 2021, 211, 108607. [Google Scholar] [CrossRef]
  78. Yang, J.C.; Kim, J.-O.; Oh, J.; Kwon, S.Y.; Sim, J.Y.; Kim, D.W.; Choi, H.B.; Park, S. Microstructured Porous Pyramid-Based Ultrahigh Sensitive Pressure Sensor Insensitive to Strain and Temperature. ACS Appl. Mater. Interfaces 2019, 11, 19472–19480. [Google Scholar] [CrossRef] [PubMed]
  79. Yang, J.; Luo, S.; Zhou, X.; Li, J.; Fu, J.; Yang, W.; Wei, D. Flexible, Tunable, and Ultrasensitive Capacitive Pressure Sensor with Microconformal Graphene Electrodes. ACS Appl. Mater. Interfaces 2019, 11, 14997–15006. [Google Scholar] [CrossRef] [PubMed]
  80. Lin, W.; Wang, B.; Peng, G.; Shan, Y.; Hu, H.; Yang, Z. Skin-Inspired Piezoelectric Tactile Sensor Array with Crosstalk-Free Row+Column Electrodes for Spatiotemporally Distinguishing Diverse Stimuli. Adv. Sci. 2021, 8, 2002817. [Google Scholar] [CrossRef] [PubMed]
81. Peng, Y.; Que, M.; Lee, H.E.; Bao, R.; Wang, X.; Lu, J.; Yuan, Z.; Li, X.; Tao, J.; Sun, J.; et al. Achieving High-Resolution Pressure Mapping via Flexible GaN/ZnO Nanowire LEDs Array by Piezo-Phototronic Effect. Nano Energy 2019, 58, 633–640. [Google Scholar] [CrossRef]
  82. Wang, X.; Zhang, H.; Yu, R.; Dong, L.; Peng, D.; Zhang, A.; Zhang, Y.; Liu, H.; Pan, C.; Wang, Z.L. Dynamic Pressure Mapping of Personalized Handwriting by a Flexible Sensor Matrix Based on the Mechanoluminescence Process. Adv. Mater. 2015, 27, 2324–2331. [Google Scholar] [CrossRef] [PubMed]
  83. Wang, X.; Zhang, H.; Dong, L.; Han, X.; Du, W.; Zhai, J.; Pan, C.; Wang, Z.L. Self-Powered High-Resolution and Pressure-Sensitive Triboelectric Sensor Matrix for Real-Time Tactile Mapping. Adv. Mater. 2016, 28, 2896–2903. [Google Scholar] [CrossRef] [PubMed]
  84. Wang, L.; Liu, Y.; Liu, Q.; Zhu, Y.; Wang, H.; Xie, Z.; Yu, X.; Zi, Y. A Metal-Electrode-Free, Fully Integrated, Soft Triboelectric Sensor Array for Self-Powered Tactile Sensing. Microsyst. Nanoeng. 2020, 6, 59. [Google Scholar] [CrossRef]
  85. Wang, X.; Zhang, Y.; Zhang, X.; Huo, Z.; Li, X.; Que, M.; Peng, Z.; Wang, H.; Pan, C. A Highly Stretchable Transparent Self-Powered Triboelectric Tactile Sensor with Metallized Nanofibers for Wearable Electronics. Adv. Mater. 2018, 30, 1706738. [Google Scholar] [CrossRef] [PubMed]
  86. Pang, C. A Flexible and Highly Sensitive Strain-Gauge Sensor Using Reversible Interlocking of Nanofibres. Nat. Mater. 2012, 11, 795–801. [Google Scholar] [CrossRef] [PubMed]
  87. Tiwana, M.I.; Redmond, S.J.; Lovell, N.H. A Review of Tactile Sensing Technologies with Applications in Biomedical Engineering. Sens. Actuators A Phys. 2012, 179, 17–31. [Google Scholar] [CrossRef]
  88. Duan, Y.; He, S.; Wu, J.; Su, B.; Wang, Y. Recent Progress in Flexible Pressure Sensor Arrays. Nanomaterials 2022, 12, 2495. [Google Scholar] [CrossRef] [PubMed]
  89. Gao, Y.; Xiao, T.; Li, Q.; Chen, Y.; Qiu, X.; Liu, J.; Bian, Y.; Xuan, F. Flexible Microstructured Pressure Sensors: Design, Fabrication and Applications. Nanotechnology 2022, 33, 322002. [Google Scholar] [CrossRef] [PubMed]
  90. Mishra, R.B.; El-Atab, N.; Hussain, A.M.; Hussain, M.M. Recent Progress on Flexible Capacitive Pressure Sensors: From Design and Materials to Applications. Adv. Mater. Technol. 2021, 6, 2001023. [Google Scholar] [CrossRef]
  91. Hammock, M.L.; Chortos, A.; Tee, B.C.-K.; Tok, J.B.-H.; Bao, Z. 25th Anniversary Article: The Evolution of Electronic Skin (E-Skin): A Brief History, Design Considerations, and Recent Progress. Adv. Mater. 2013, 25, 5997–6038. [Google Scholar] [CrossRef] [PubMed]
  92. Peng, Y.; Yang, N.; Xu, Q.; Dai, Y.; Wang, Z. Recent Advances in Flexible Tactile Sensors for Intelligent Systems. Sensors 2021, 21, 5392. [Google Scholar] [CrossRef] [PubMed]
  93. Zhou, K.; Zhao, Y.; Sun, X.; Yuan, Z.; Zheng, G.; Dai, K.; Mi, L.; Pan, C.; Liu, C.; Shen, C. Ultra-Stretchable Triboelectric Nanogenerator as High-Sensitive and Self-Powered Electronic Skins for Energy Harvesting and Tactile Sensing. Nano Energy 2020, 70, 104546. [Google Scholar] [CrossRef]
  94. Benet, G.; Blanes, F.; Simó, J.E.; Pérez, P. Using Infrared Sensors for Distance Measurement in Mobile Robots. Robot. Auton. Syst. 2002, 40, 255–266. [Google Scholar] [CrossRef]
  95. Abbas, I.; Liu, J.; Faheem, M.; Noor, R.S.; Shaikh, S.A.; Solangi, K.A.; Raza, S.M. Different Sensor Based Intelligent Spraying Systems in Agriculture. Sens. Actuators A Phys. 2020, 316, 112265. [Google Scholar] [CrossRef]
  96. Hauptmann, P.; Hoppe, N.; Püttmer, A. Application of Ultrasonic Sensors in the Process Industry. Meas. Sci. Technol. 2002, 13, R73–R83. [Google Scholar] [CrossRef]
  97. Jiang, Q.; Zhang, M.; Xu, B. Application of Ultrasonic Technology in Postharvested Fruits and Vegetables Storage: A Review. Ultrason. Sonochem. 2020, 69, 105261. [Google Scholar] [CrossRef] [PubMed]
  98. Li, N.; Ho, C.P.; Xue, J.; Lim, L.W.; Chen, G.; Fu, Y.H.; Lee, L.Y.T. A Progress Review on Solid-State LiDAR and Nano-photonics-Based LiDAR Sensors. Laser Photonics Rev. 2022, 16, 2100511. [Google Scholar] [CrossRef]
  99. Wang, J.; Chortos, A. Control Strategies for Soft Robot Systems. Adv. Intell. Syst. 2022, 4, 2100165. [Google Scholar] [CrossRef]
  100. Li, P.; Liu, X. Common Sensors in Industrial Robots: A Review. J. Phys. Conf. Ser. 2019, 1267, 012036. [Google Scholar] [CrossRef]
  101. Apneseth, C.; Dzung, D.; Kjesbu, S.; Scheible, G.; Zimmermann, W. Wireless—Introducing Wireless Proximity Switches. Sens. Rev. 2003, 23, 116–122. [Google Scholar] [CrossRef]
  102. Monkman, G.J.; Hesse, S.; Steinmann, R.; Schunk, H. Robot Grippers, 1st ed.; Wiley: Hoboken, NJ, USA, 2006; ISBN 978-3-527-40619-7. [Google Scholar]
  103. Pallay, M.; Miles, R.N.; Towfighian, S. A Tunable Electrostatic MEMS Pressure Switch. IEEE Trans. Ind. Electron. 2020, 67, 9833–9840. [Google Scholar] [CrossRef]
  104. An, Q.; Wang, K.; Li, Z.; Song, C.; Tang, X.; Song, J. Real-Time Monitoring Method of Strawberry Fruit Growth State Based on YOLO Improved Model. IEEE Access 2022, 10, 124363–124372. [Google Scholar] [CrossRef]
105. Chen, J.; Wang, Z.; Wu, J.; Hu, Q.; Zhao, C.; Tan, C.; Teng, L.; Luo, T. An Improved Yolov3 Based on Dual Path Network for Cherry Tomatoes Detection. J. Food Process Eng. 2021, 44, e13803. [Google Scholar] [CrossRef]
  106. Gai, R.; Chen, N.; Yuan, H. A Detection Algorithm for Cherry Fruits Based on the Improved YOLO-v4 Model. Neural Comput. Appl. 2023, 35, 13895–13906. [Google Scholar] [CrossRef]
  107. Yang, W.; Ma, X.; Hu, W.; Tang, P. Lightweight Blueberry Fruit Recognition Based on Multi-Scale and Attention Fusion NCBAM. Agronomy 2022, 12, 2354. [Google Scholar] [CrossRef]
  108. Fan, Y.; Zhang, S.; Feng, K.; Qian, K.; Wang, Y.; Qin, S. Strawberry Maturity Recognition Algorithm Combining Dark Channel Enhancement and YOLOv5. Sensors 2022, 22, 419. [Google Scholar] [CrossRef] [PubMed]
  109. Habaragamuwa, H.; Ogawa, Y.; Suzuki, T.; Shiigi, T.; Ono, M.; Kondo, N. Detecting Greenhouse Strawberries (Mature and Immature), Using Deep Convolutional Neural Network. Eng. Agric. Environ. Food 2018, 11, 127–138. [Google Scholar] [CrossRef]
  110. Liu, G.; Nouaze, J.C.; Touko Mbouembe, P.L.; Kim, J.H. YOLO-Tomato: A Robust Algorithm for Tomato Detection Based on YOLOv3. Sensors 2020, 20, 2145. [Google Scholar] [CrossRef] [PubMed]
  111. Lawal, M.O. Tomato Detection Based on Modified YOLOv3 Framework. Sci. Rep. 2021, 11, 1447. [Google Scholar] [CrossRef] [PubMed]
  112. Yu, Y.; Zhang, K.; Liu, H.; Yang, L.; Zhang, D. Real-Time Visual Localization of the Picking Points for a Ridge-Planting Strawberry Harvesting Robot. IEEE Access 2020, 8, 116556–116568. [Google Scholar] [CrossRef]
  113. Fu, L.; Feng, Y.; Wu, J.; Liu, Z.; Gao, F.; Majeed, Y.; Al-Mallahi, A.; Zhang, Q.; Li, R.; Cui, Y. Fast and Accurate Detection of Kiwifruit in Orchard Using Improved YOLOv3-Tiny Model. Precis. Agric. 2021, 22, 754–776. [Google Scholar] [CrossRef]
  114. Zabawa, L.; Kicherer, A.; Klingbeil, L.; Milioto, A.; Topfer, R.; Kuhlmann, H.; Roscher, R. Detection of Single Grapevine Berries in Images Using Fully Convolutional Neural Networks. In Proceedings of the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Long Beach, CA, USA, 16–17 June 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 2571–2579. [Google Scholar]
  115. Rong, Q.; Hu, C.; Hu, X.; Xu, M. Picking Point Recognition for Ripe Tomatoes Using Semantic Segmentation and Morphological Processing. Comput. Electron. Agric. 2023, 210, 107923. [Google Scholar] [CrossRef]
  116. Tang, C.; Chen, D.; Wang, X.; Ni, X.; Liu, Y.; Liu, Y.; Mao, X.; Wang, S. A Fine Recognition Method of Strawberry Ripeness Combining Mask R-CNN and Region Segmentation. Front. Plant Sci. 2023, 14, 1211830. [Google Scholar] [CrossRef]
  117. Peng, Y.; Wang, A.; Liu, J.; Faheem, M. A Comparative Study of Semantic Segmentation Models for Identification of Grape with Different Varieties. Agriculture 2021, 11, 997. [Google Scholar] [CrossRef]
  118. Ilyas, T.; Umraiz, M.; Khan, A.; Kim, H. DAM: Hierarchical Adaptive Feature Selection Using Convolution Encoder Decoder Network for Strawberry Segmentation. Front. Plant Sci. 2021, 12, 591333. [Google Scholar] [CrossRef] [PubMed]
  119. Roscher, R.; Herzog, K.; Kunkel, A.; Kicherer, A.; Töpfer, R.; Förstner, W. Automated Image Analysis Framework for High-Throughput Determination of Grapevine Berry Sizes Using Conditional Random Fields. Comput. Electron. Agric. 2014, 100, 148–158. [Google Scholar] [CrossRef]
  120. Milella, A. In-Field High Throughput Grapevine Phenotyping with a Consumer-Grade Depth Camera. Comput. Electron. Agric. 2019, 156, 293–306. [Google Scholar] [CrossRef]
121. Wang, C.; Yang, G.; Huang, Y.; Liu, Y.; Zhang, Y. A Transformer-Based Mask R-CNN for Tomato Detection and Segmentation. J. Intell. Fuzzy Syst. 2023, 44, 8585–8595. [Google Scholar] [CrossRef]
  122. Lei, H.; Huang, K.; Jiao, Z.; Tang, Y.; Zhong, Z.; Cai, Y. Bayberry Segmentation in a Complex Environment Based on a Multi-Module Convolutional Neural Network. Appl. Soft Comput. 2022, 119, 108556. [Google Scholar] [CrossRef]
  123. Zabawa, L.; Kicherer, A.; Klingbeil, L.; Töpfer, R.; Kuhlmann, H.; Roscher, R. Counting of Grapevine Berries in Images via Semantic Segmentation Using Convolutional Neural Networks. ISPRS J. Photogramm. Remote Sens. 2020, 164, 73–83. [Google Scholar] [CrossRef]
  124. Gonzalez, S.; Arellano, C.; Tapia, J.E. Deepblueberry: Quantification of Blueberries in the Wild Using Instance Segmentation. IEEE Access 2019, 7, 105776–105788. [Google Scholar] [CrossRef]
  125. Ni, X.; Li, C.; Jiang, H.; Takeda, F. Deep Learning Image Segmentation and Extraction of Blueberry Fruit Traits Associated with Harvestability and Yield. Hortic. Res. 2020, 7, 110. [Google Scholar] [CrossRef] [PubMed]
  126. Chen, Y.; Li, X.; Jia, M.; Li, J.; Hu, T.; Luo, J. Instance Segmentation and Number Counting of Grape Berry Images Based on Deep Learning. Appl. Sci. 2023, 13, 6751. [Google Scholar] [CrossRef]
  127. Wang, Y.; Lv, J.; Xu, L.; Gu, Y.; Zou, L.; Ma, Z. A Segmentation Method for Waxberry Image under Orchard Environment. Sci. Hortic. 2020, 266, 109309. [Google Scholar] [CrossRef]
  128. Luo, L.; Liu, W.; Lu, Q.; Wang, J.; Wen, W.; Yan, D.; Tang, Y. Grape Berry Detection and Size Measurement Based on Edge Image Processing and Geometric Morphology. Machines 2021, 9, 233. [Google Scholar] [CrossRef]
  129. Cai, C.; Tan, J.; Zhang, P.; Ye, Y.; Zhang, J. Determining Strawberries’ Varying Maturity Levels by Utilizing Image Segmentation Methods of Improved DeepLabV3+. Agronomy 2022, 12, 1875. [Google Scholar] [CrossRef]
  130. Xu, P.; Fang, N.; Liu, N.; Lin, F.; Yang, S.; Ning, J. Visual Recognition of Cherry Tomatoes in Plant Factory Based on Improved Deep Instance Segmentation. Comput. Electron. Agric. 2022, 197, 106991. [Google Scholar] [CrossRef]
  131. Pérez-Borrero, I.; Marín-Santos, D.; Gegúndez-Arias, M.E.; Cortés-Ancos, E. A Fast and Accurate Deep Learning Method for Strawberry Instance Segmentation. Comput. Electron. Agric. 2020, 178, 105736. [Google Scholar] [CrossRef]
  132. Hu, H.; Kaizu, Y.; Zhang, H.; Xu, Y.; Imou, K.; Li, M.; Huang, J.; Dai, S. Recognition and Localization of Strawberries from 3D Binocular Cameras for a Strawberry Picking Robot Using Coupled YOLO/Mask R-CNN. Int. J. Agric. Biol. Eng. 2022, 15, 175–179. [Google Scholar] [CrossRef]
  133. Afzaal, U.; Bhattarai, B.; Pandeya, Y.R.; Lee, J. An Instance Segmentation Model for Strawberry Diseases Based on Mask R-CNN. Sensors 2021, 21, 6565. [Google Scholar] [CrossRef] [PubMed]
  134. Mo, Y.; Wu, Y.; Yang, X.; Liu, F.; Liao, Y. Review the State-of-the-Art Technologies of Semantic Segmentation Based on Deep Learning. Neurocomputing 2022, 493, 626–646. [Google Scholar] [CrossRef]
  135. Gu, W.; Bai, S.; Kong, L. A Review on 2D Instance Segmentation Based on Deep Neural Networks. Image Vis. Comput. 2022, 120, 104401. [Google Scholar] [CrossRef]
  136. Li, Y.; Chen, Y.; Li, Y. Pre-Charged Pneumatic Soft Gripper with Closed-Loop Control. IEEE Robot. Autom. Lett. 2019, 4, 1402–1408. [Google Scholar] [CrossRef]
  137. Ruotolo, W.; Brouwer, D.; Cutkosky, M.R. From Grasping to Manipulation with Gecko-Inspired Adhesives on a Multifinger Gripper. Sci. Robot. 2021, 6, eabi9773. [Google Scholar] [CrossRef] [PubMed]
  138. Visentin, F.; Castellini, F.; Muradore, R. A Soft, Sensorized Gripper for Delicate Harvesting of Small Fruits. Comput. Electron. Agric. 2023, 213, 108202. [Google Scholar] [CrossRef]
  139. Jin, L.; Wang, Z.; Tian, S.; Feng, J.; An, C.; Xu, H. Grasping Perception and Prediction Model of Kiwifruit Firmness Based on Flexible Sensing Claw. Comput. Electron. Agric. 2023, 215, 108389. [Google Scholar] [CrossRef]
  140. Lehnert, C.; McCool, C.; Sa, I.; Perez, T. Performance Improvements of a Sweet Pepper Harvesting Robot in Protected Cropping Environments. J. Field Robot. 2020, 37, 1197–1223. [Google Scholar] [CrossRef]
  141. Min, Y.; Kim, Y.; Jin, H.; Kim, H.J. Intelligent Gripper Systems Using Air Gap-Controlled Bimodal Tactile Sensors for Deformable Object Classification. Adv. Intell. Syst. 2023, 5, 2300317. [Google Scholar] [CrossRef]
  142. Shih, B.; Christianson, C.; Gillespie, K.; Lee, S.; Mayeda, J.; Huo, Z.; Tolley, M.T. Design Considerations for 3D Printed, Soft, Multimaterial Resistive Sensors for Soft Robotics. Front. Robot. AI 2019, 6, 30. [Google Scholar] [CrossRef]
  143. Yoder, Z.; Macari, D.; Kleinwaks, G.; Schmidt, I.; Acome, E.; Keplinger, C. A Soft, Fast and Versatile Electrohydraulic Gripper with Capacitive Object Size Detection. Adv. Funct. Mater. 2023, 33, 2209080. [Google Scholar] [CrossRef]
  144. Hu, Z.; Chu, Z.; Chen, G.; Cui, J. Design of Capacitive Pressure Sensors Integrated with Anisotropic Wedge Microstructure-Based Dielectric Layer. IEEE Sens. J. 2023, 23, 21040–21049. [Google Scholar] [CrossRef]
  145. Fastier-Wooller, J.W.; Vu, T.-H.; Nguyen, H.; Nguyen, H.-Q.; Rybachuk, M.; Zhu, Y.; Dao, D.V.; Dau, V.T. Multimodal Fibrous Static and Dynamic Tactile Sensor. ACS Appl. Mater. Interfaces 2022, 14, 27317–27327. [Google Scholar] [CrossRef] [PubMed]
  146. Qiu, Y.; Sun, S.; Wang, X.; Shi, K.; Wang, Z.; Ma, X.; Zhang, W.; Bao, G.; Tian, Y.; Zhang, Z.; et al. Nondestructive Identification of Softness via Bioinspired Multisensory Electronic Skins Integrated on a Robotic Hand. NPJ Flex. Electron. 2022, 6, 45. [Google Scholar] [CrossRef]
  147. Chen, S.; Pang, Y.; Yuan, H.; Tan, X.; Cao, C. Smart Soft Actuators and Grippers Enabled by Self-Powered Tribo-Skins. Adv. Mater. Technol. 2020, 5, 1901075. [Google Scholar] [CrossRef]
  148. Li, N.; Yin, Z.; Zhang, W.; Xing, C.; Peng, T.; Meng, B.; Yang, J.; Peng, Z. A Triboelectric-Inductive Hybrid Tactile Sensor for Highly Accurate Object Recognition. Nano Energy 2022, 96, 107063. [Google Scholar] [CrossRef]
  149. Xu, J.; Xie, Z.; Yue, H.; Lu, Y.; Yang, F. A Triboelectric Multifunctional Sensor Based on the Controlled Buckling Structure for Motion Monitoring and Bionic Tactile of Soft Robots. Nano Energy 2022, 104, 107845. [Google Scholar] [CrossRef]
  150. Amjadi, M.; Kyung, K.; Park, I.; Sitti, M. Stretchable, Skin-Mountable, and Wearable Strain Sensors and Their Potential Applications: A Review. Adv. Funct. Mater. 2016, 26, 1678–1698. [Google Scholar] [CrossRef]
151. Trung, T.Q.; Lee, N. Flexible and Stretchable Physical Sensor Integrated Platforms for Wearable Human-Activity Monitoring and Personal Healthcare. Adv. Mater. 2016, 28, 4338–4372. [Google Scholar] [CrossRef] [PubMed]
  152. Li, J.; Fang, L.; Sun, B.; Li, X.; Kang, S.H. Review—Recent Progress in Flexible and Stretchable Piezoresistive Sensors and Their Applications. J. Electrochem. Soc. 2020, 167, 037561. [Google Scholar] [CrossRef]
  153. Hannigan, B.C.; Cuthbert, T.J.; Geng, W.; Tavassolian, M.; Menon, C. Understanding the Impact of Machine Learning Models on the Performance of Different Flexible Strain Sensor Modalities. Front. Mater. 2021, 8, 639823. [Google Scholar] [CrossRef]
  154. Kim, K.; Kim, J.; Jiang, X.; Kim, T. Static Force Measurement Using Piezoelectric Sensors. J. Sens. 2021, 2021, 1–8. [Google Scholar] [CrossRef]
  155. Wu, C.; Wang, A.C.; Ding, W.; Guo, H.; Wang, Z.L. Triboelectric Nanogenerator: A Foundation of the Energy for the New Era. Adv. Energy Mater. 2019, 9, 1802906. [Google Scholar] [CrossRef]
  156. Song, C.; Wang, K.; Wang, C.; Tian, Y.; Wei, X.; Li, C.; An, Q.; Song, J. TDPPL-Net: A Lightweight Real-Time Tomato Detection and Picking Point Localization Model for Harvesting Robots. IEEE Access 2023, 11, 37650–37664. [Google Scholar] [CrossRef]
  157. Zhu, Y.; Zhang, T.; Liu, L.; Liu, P.; Li, X. Fast Location of Table Grapes Picking Point Based on Infrared Tube. Inventions 2022, 7, 27. [Google Scholar] [CrossRef]
  158. Mejia, G.; Montes De Oca, A.; Flores, G. Strawberry Localization in a Ridge Planting with an Autonomous Rover. Eng. Appl. Artif. Intell. 2023, 119, 105810. [Google Scholar] [CrossRef]
  159. Gao, J.; Zhang, F.; Zhang, J.; Yuan, T.; Yin, J.; Guo, H.; Yang, C. Development and Evaluation of a Pneumatic Finger-like End-Effector for Cherry Tomato Harvesting Robot in Greenhouse. Comput. Electron. Agric. 2022, 197, 106879. [Google Scholar] [CrossRef]
  160. Ren, G.; Wu, H.; Bao, A.; Lin, T.; Ting, K.-C.; Ying, Y. Mobile Robotics Platform for Strawberry Temporal–Spatial Yield Monitoring within Precision Indoor Farming Systems. Front. Plant Sci. 2023, 14, 1162435. [Google Scholar] [CrossRef] [PubMed]
  161. Rapado-Rincón, D.; Van Henten, E.J.; Kootstra, G. Development and Evaluation of Automated Localisation and Reconstruction of All Fruits on Tomato Plants in a Greenhouse Based on Multi-View Perception and 3D Multi-Object Tracking. Biosyst. Eng. 2023, 231, 78–91. [Google Scholar] [CrossRef]
  162. Miao, Z.; Yu, X.; Li, N.; Zhang, Z.; He, C.; Li, Z.; Deng, C.; Sun, T. Efficient Tomato Harvesting Robot Based on Image Processing and Deep Learning. Precis. Agric 2023, 24, 254–287. [Google Scholar] [CrossRef]
  163. Swain, K.C.; Zaman, Q.U.; Schumann, A.W.; Percival, D.C. Detecting Weed and Bare-Spot in Wild Blueberry Using Ultrasonic Sensor Technology. In Proceedings of the 2009 ASABE Annual International Meeting, Reno, NV, USA, 21–24 June 2009; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2009; p. 096879. [Google Scholar]
  164. Martínez-Guanter, J.; Garrido-Izard, M.; Valero, C.; Slaughter, D.; Pérez-Ruiz, M. Optical Sensing to Determine Tomato Plant Spacing for Precise Agrochemical Application: Two Scenarios. Sensors 2017, 17, 1096. [Google Scholar] [CrossRef] [PubMed]
  165. Wang, D.; Dong, Y.; Lian, J.; Gu, D. Adaptive End-effector Pose Control for Tomato Harvesting Robots. J. Field Robot. 2023, 40, 535–551. [Google Scholar] [CrossRef]
  166. Popa, D.; Udrea, F. Towards Integrated Mid-Infrared Gas Sensors. Sensors 2019, 19, 2076. [Google Scholar] [CrossRef] [PubMed]
  167. Tang, Y.; Qi, S.; Zhu, L.; Zhuo, X.; Zhang, Y.; Meng, F. Obstacle Avoidance Motion in Mobile Robotics. J. Syst. Simul. 2024, 36, 1. [Google Scholar]
  168. Dalla Mura, M.; Prasad, S.; Pacifici, F.; Gamba, P.; Chanussot, J.; Benediktsson, J.A. Challenges and Opportunities of Multimodality and Data Fusion in Remote Sensing. Proc. IEEE 2015, 103, 1585–1601. [Google Scholar] [CrossRef]
  169. Navas, E.; Shamshiri, R.R.; Dworak, V.; Weltzien, C.; Fernández, R. Soft Gripper for Small Fruits Harvesting and Pick and Place Operations. Front. Robot. AI 2024, 10, 1330496. [Google Scholar] [CrossRef]
  170. Li, Z.; Yuan, X.; Yang, Z. Design, Simulation, and Experiment for the End Effector of a Spherical Fruit Picking Robot. Int. J. Adv. Robot. Syst. 2023, 20, 17298806231213442. [Google Scholar] [CrossRef]
  171. Mu, L.; Cui, G.; Liu, Y.; Cui, Y.; Fu, L.; Gejima, Y. Design and Simulation of an Integrated End-Effector for Picking Kiwifruit by Robot. Inf. Process. Agric. 2020, 7, 58–71. [Google Scholar] [CrossRef]
  172. Fu, M.; Guo, S.; Chen, A.; Cheng, R.; Cui, X. Design and Experimentation of Multi-Fruit Envelope-Cutting Kiwifruit Picking Robot. Front. Plant Sci. 2024, 15, 1338050. [Google Scholar] [CrossRef] [PubMed]
  173. Kumar Uppalapati, N.; Walt, B.; Havens, A.; Mahdian, A.; Chowdhary, G.; Krishnan, G. A Berry Picking Robot with A Hybrid Soft-Rigid Arm: Design and Task Space Control. In Proceedings of the Robotics: Science and Systems XVI, Virtual, 12–16 July 2020; Robotics: Science and Systems Foundation: Delft, The Netherlands, 2020. [Google Scholar]
  174. Chappell, D.; Bello, F.; Kormushev, P.; Rojas, N. The Hydra Hand: A Mode-Switching Underactuated Gripper with Precision and Power Grasping Modes. IEEE Robot. Autom. Lett. 2023, 8, 7599–7606. [Google Scholar] [CrossRef]
  175. Zhang, B.; Zhou, J.; Meng, Y.; Zhang, N.; Gu, B.; Yan, Z.; Idris, S.I. Comparative Study of Mechanical Damage Caused by a Two-Finger Tomato Gripper with Different Robotic Grasping Patterns for Harvesting Robots. Biosyst. Eng. 2018, 171, 245–257. [Google Scholar] [CrossRef]
  176. Zheng, Y.; Pi, J.; Guo, T.; Xu, L.; Liu, J.; Kong, J. Design and Simulation of a Gripper Structure of Cluster Tomato Based on Manual Picking Behavior. Front. Plant Sci. 2022, 13, 974456. [Google Scholar] [CrossRef] [PubMed]
  177. Gao, J.; Zhang, F.; Zhang, J.; Guo, H.; Gao, J. Picking Patterns Evaluation for Cherry Tomato Robotic Harvesting End-Effector Design. Biosyst. Eng. 2024, 239, 1–12. [Google Scholar] [CrossRef]
  178. Bac, C.W.; Hemming, J.; Van Tuijl, B.A.J.; Barth, R.; Wais, E.; Van Henten, E.J. Performance Evaluation of a Harvesting Robot for Sweet Pepper. J. Field Robot. 2017, 34, 1123–1139. [Google Scholar] [CrossRef]
  179. Wu, B.; Jiang, T.; Yu, Z.; Zhou, Q.; Jiao, J.; Jin, M.L. Proximity Sensing Electronic Skin: Principles, Characteristics, and Applications. Adv. Sci. 2024, 11, 2308560. [Google Scholar] [CrossRef] [PubMed]
  180. Mokayed, H.; Quan, T.Z.; Alkhaled, L.; Sivakumar, V. Real-Time Human Detection and Counting System Using Deep Learning Computer Vision Techniques. AIA 2022, 1, 221–229. [Google Scholar] [CrossRef]
  181. Junge, K.; Pires, C.; Hughes, J. Lab2Field Transfer of a Robotic Raspberry Harvester Enabled by a Soft Sensorized Physical Twin. Commun. Eng. 2023, 2, 40. [Google Scholar] [CrossRef]
  182. Jones, T.J.; Jambon-Puillet, E.; Marthelot, J.; Brun, P.-T. Bubble casting soft robotics. Nature 2021, 599, 229–233. [Google Scholar] [CrossRef]
  183. Acome, E.; Mitchell, S.K.; Morrissey, T.G.; Emmett, M.B.; Benjamin, C.; King, M.; Radakovitz, M.; Keplinger, C. Hydraulically Amplified Self-Healing Electrostatic Actuators with Muscle-like Performance. Science 2018, 359, 61–65. [Google Scholar] [CrossRef] [PubMed]
  184. Rebahi, Y.; Gharra, M.; Rizzi, L.; Zournatzis, I. Combining Computer Vision, Artificial Intelligence and 3D Printing in Wheelchair Design Customization: The Kyklos 4.0 Approach. AIA 2023. [Google Scholar] [CrossRef]
Figure 1. Traditional harvesting devices for berry fruits: (a) The artificial-assisted single-drive device for picking multi-fruit strawberries from Ref. [5]. (b) The crankshaft vibration threshing and harvesting equipment for wine grapes from Ref. [6].
Figure 2. Application of perception technology in berry fruit picking: (a) The vision system-based end-effector for grape-picking robots from Ref. [15]. (b) The deep learning-based kiwifruit harvesting robot from Ref. [16]. (c) A modular autonomous strawberry-picking robot and its end-effector from Ref. [17].
Figure 3. Picking steps and standardized grasping processes of berry-picking robots.
Figure 4. Classification of mobile platforms for berry fruit-picking robots. The mobile bases shown are from Yuhesen Technology and PAL Robotics.
Figure 5. Process of analyzing the kinematics of the manipulator of a berry fruit-picking robot.
Figure 6. Classification of end-effectors for berry fruit-picking robots.
Figure 7. Part of the end-effector for berry fruit picking: (a) The fuzzy control of a robotic gripper for efficient strawberry harvesting from Ref. [58]. (b) The tendon-driven soft robotic gripper for blackberry harvesting from Ref. [59]. (c) The ridge planting strawberry picking manipulator from Ref. [60]. (d) The harvesting robot for table-top cultivated strawberries and its end-effector from Ref. [61]. Copyright 2019, IEEE. (e) A robotic kiwifruit harvester and its end-effector from Ref. [62].
Figure 8. Object detection techniques applied to berry fruit picking: (a) Detection of kiwifruit in the orchard using improved YOLOv3-tiny model from Ref. [113]. Copyright 2021, Springer Nature. (b) Detection of cherry fruits based on the improved YOLOv4 model from Ref. [106]. Copyright 2023, Springer Nature. (c) Visual localization of the picking points for a ridge-planting strawberry from Ref. [112]. (d) Tomato detection based on modified YOLOv3 framework from Ref. [110].
Figure 9. Semantic segmentation techniques applied to berry fruit picking: (a) Counting of grapevine berries using convolutional neural networks from Ref. [123]. (b) Bayberry segmentation based on a multi-module convolutional neural network from Ref. [122]. (c) Recognition of strawberry ripeness combining Mask R-CNN from Ref. [116]. (d) A transformer-based Mask R-CNN for tomato detection and segmentation from Ref. [121].
Figure 10. Instance segmentation techniques applied to berry fruit picking: (a) Quantifying blueberries in the wild using instance segmentation from Ref. [124]. (b) Mask R-CNN for instance segmentation of grape cluster from Ref. [126]. (c) Deep learning method for strawberry instance segmentation from Ref. [131]. (d) A segmentation method for waxberry image from Ref. [127].
Figure 11. Tactile perception applied to berry fruit-picking robots: (a) A multi-finger gripper and its grasping grape cluster from Ref. [137]. (b) A gripper for delicate harvesting of strawberries from Ref. [138]. (c) Grasping perception of kiwifruit gripper and its structure from Ref. [139]. (d) A sweet pepper harvesting robot and its gripper’s structure from Ref. [141]. (e) A gripper with capacitive object size detection from Ref. [143]. (f) The structure of the capacitive sensor and its application from Ref. [144]. Copyright 2023, IEEE. (g) PE sensor integrated on a robotic hand and its grasping from Ref. [146]. (h) A dynamic tactile sensor and its application from Ref. [145]. (i) A finger of the end-effector with a triboelectric sensor is used to monitor the tactile aspect of the robot from Ref. [149]. (j) A finger of end-effector with triboelectric sensor and its application from Ref. [147].
Figure 14. The characteristics of the perception techniques.
Figure 15. Advanced technologies applied to berry fruit harvesting: (a) Fruit picking using a physical twin from Ref. [181]. (b) Bubble-cast soft robotics grasping small fruits from Ref. [182]. Copyright 2021, Springer Nature. (c) A hydraulically amplified self-healing electrostatic actuator with muscle-like performance and its application from Ref. [183].
Table 1. Performance of tactile sensors.
| Type of Tactile Sensors | Materials | Detection Limit | Sensitivity | Range | Response | Stability | Ref. |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Piezoresistive | Patterned graphene/PDMS | 5 Pa | 1.2 kPa⁻¹ | 0~25 kPa | — | 1000 | [74] |
| Piezoresistive | 3D graphene | 1 Pa | 0.152 kPa⁻¹ | 0~27 kPa | 96 ms | 9000 | [75] |
| Piezoresistive | Polyaniline/polyvinylidene | — | 53 kPa⁻¹ | 5.2~98.7 kPa | 38 ms | 50,000 | [76] |
| Capacitive | Silver/PDMS | 10 Pa | 0.18 kPa⁻¹ | 0~400 kPa | 100 ms | 10,000 | [77] |
| Capacitive | Silicon/polystyrene | 0.14 Pa | 44.5 kPa⁻¹ | 0~100 Pa | 9 ms | 5000 | [78] |
| Capacitive | Adhesive/graphene | 1 mg | 3.19 kPa⁻¹ | 0~4 kPa | 30 ms | 500 | [79] |
| Piezoelectric | PDMS/silver paste/PVDF | — | 7.7 mV kPa⁻¹ | — | 10 ms | 80,000 | [80] |
| Piezoelectric | Flexible GaN/ZnO NWs | — | — | — | 180 ms | 4000 | [81] |
| Piezoelectric | ZnS:Mn particles | — | 2.2 cps kPa⁻¹ | 0.6~50 MPa | 10 ms | 10,000 | [82] |
| Triboelectric | PDMS/Ag nanofibers | — | — | — | 70 ms | 2800 | [83] |
| Triboelectric | PET/PDMS/Ag electrodes | — | 0.06 kPa⁻¹ | 1 kPa | 70 ms | 10,000 | [84] |
| Triboelectric | Ecoflex and PVA/PEI | — | 0.063 V kPa⁻¹ | 5~50 kPa | — | 2250 | [85] |
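The sensitivity figures in Table 1 are typically used as linear calibration factors. As a minimal sketch, the snippet below assumes a hypothetical piezoresistive sensor with the 1.2 kPa⁻¹ sensitivity and 0~25 kPa range of the patterned-graphene/PDMS entry, and shows how a gripper controller might map a relative resistance change to contact pressure and check it against a berry-damage limit; the 8 kPa limit and the linear model are illustrative assumptions, not values from any cited sensor.

```python
# Hypothetical calibration sketch: relative resistance change (dR/R) -> pressure,
# assuming the linear model P = (dR/R) / S. Sensitivity S = 1.2 kPa^-1 and the
# 0~25 kPa range follow the patterned-graphene/PDMS entry in Table 1; real
# sensors need per-device calibration.

def pressure_kpa(dr_over_r: float, sensitivity_per_kpa: float = 1.2,
                 max_range_kpa: float = 25.0) -> float:
    """Map a relative resistance change to pressure, clipped to the sensor range."""
    p = dr_over_r / sensitivity_per_kpa
    return max(0.0, min(p, max_range_kpa))

def grip_is_safe(dr_over_r: float, damage_limit_kpa: float = 8.0) -> bool:
    """Flag whether the estimated contact pressure stays below a berry-damage limit."""
    return pressure_kpa(dr_over_r) < damage_limit_kpa

print(pressure_kpa(6.0))   # 6.0 / 1.2 = 5.0 kPa
print(grip_is_safe(6.0))   # True: 5.0 kPa is below the 8 kPa limit
print(grip_is_safe(12.0))  # False: 10.0 kPa exceeds the limit
```

In practice the mapping is rarely this linear, which is one reason several entries in Table 1 report a limited linear range.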
Table 2. Methods of berry fruit picking using visual perception.
| Methods | Object | Advantage | Disadvantage | Ref. |
| --- | --- | --- | --- | --- |
| Object Detection | Strawberry | Detects mature and immature strawberries in greenhouse images, yielding highly accurate test results. | The ripe category struggles with obscured strawberries, while the immature category faces confusion issues. | [104] |
| Object Detection | Cherry tomato | Uses an enhanced YOLOv3 algorithm for cherry tomato detection, achieving a precision of 94.29%. | Identifying heavily shaded fruits was difficult, and ripeness was not integrated for detecting fruits at various growth stages. | [105] |
| Object Detection | Cherry | Based on an improved YOLOv4 model; accurately detects ripe, semi-ripe, and unripe cherry fruits. | Object-detection speed for cherries is relatively slow. | [106] |
| Object Detection | Blueberry | A YOLOv5 network model for blueberry detection that improved accuracy by 2.4% over the original YOLOv5. | The method has more network parameters, and further research is needed to improve detection ability. | [107] |
| Object Detection | Strawberry | YOLOv5 combined with dark-channel enhancement improves strawberry fruit-picking accuracy and robustness in complex environments. | Does not apply different image-enhancement methods for different lighting conditions. | [108] |
| Object Detection | Strawberry | A strawberry growth-detection algorithm improves the precision and accuracy of growth-state monitoring in complex environments. | Deep learning-based strawberry growth-state monitoring is server-dependent and still has limitations. | [109] |
| Object Detection | Tomato | The YOLO-Tomato models effectively detect tomatoes in complex environmental conditions with excellent performance. | Designed to enumerate fruit; the YOLOv3 model performs poorly on small fruits. | [110] |
| Object Detection | Tomato | The YOLO-Tomato model effectively detects tomatoes in complex environments, outperforming state-of-the-art methods. | Poorly adapted to different conditions; may lose semantic information under heavy occlusion. | [111] |
| Object Detection | Strawberry | The R-YOLO model improves localization precision, increasing the harvest rate and real-time performance of strawberry-harvesting robots. | Detection accuracy under different environmental conditions is not discussed; lacks comparative analysis with existing methods. | [112] |
| Object Detection | Kiwifruit | The improved DY3TNet model accurately detects kiwifruits in orchards with minimal model weight. | Mentions the effect of flash on kiwifruit image detection, but no statistical tests were performed. | [113] |
| Semantic Segmentation | Grape | This CNN accurately detects grape berries for yield estimation and prediction in viticulture. | Notably worse accuracy (41.7%) for the edge class. | [114] |
| Semantic Segmentation | Tomato | Accurately segments tomatoes, with intersection-over-union and pixel accuracies of 82.5% and 89.79%, respectively. | Burrs may appear when extracting tomato stems and calyxes, which may affect identification of the final picking points. | [115] |
| Semantic Segmentation | Strawberry | Accurately assesses strawberry maturity in challenging fields with 93.7% accuracy. | Strawberries in the transition maturity stage may cause classification confusion. | [116] |
| Semantic Segmentation | Grape | DeepLabv3+ combined with transfer learning accurately segments grape clusters with better performance. | Mentions increased accuracy with HE image enhancement but lacks details on statistical tests. | [117] |
| Semantic Segmentation | Strawberry | The hierarchical adaptive feature fusion method significantly improves real-time strawberry segmentation compared to existing methods. | Provides no detailed comparative results or discussion with other related methods. | [118] |
| Semantic Segmentation | Grape | A conditional random field-based approach identifies grapes with adaptivity and multi-feature fusion. | Experiments used artificially selected reference berries, which may introduce subjective bias into identification. | [119] |
| Semantic Segmentation | Grape | Efficiently estimates canopy volume, detects clusters, and counts grapevines in the field. | Sensitive to light conditions and dependent on the RGB-D sensor's data quality. | [120] |
| Semantic Segmentation | Tomato | A transformer-based Mask R-CNN model accurately identifies tomato varieties and ripening stages. | Does not explore tomato detection in various occluded regions, raising practical applicability concerns. | [121] |
| Semantic Segmentation | Bayberry | A CNN-based model for bayberry segmentation in complex environments that resists light variations and occlusion. | Higher false and missed segmentation rates; poorer segmentation of small objects in fruit clusters. | [122] |
| Semantic Segmentation | Grape | Detects and masks single berry objects with a semantic segmentation network, using a class edge to separate single objects. | Training is time- and computation-intensive, with limited adaptability to different training systems. | [123] |
| Instance Segmentation | Blueberry | The ResNet50 backbone with Mask R-CNN accurately quantifies wild blueberries from high-definition images. | Many experiments are required to select the best hyperparameters, and recognition is slow. | [124] |
| Instance Segmentation | Blueberry | Effectively detects and segments blueberry fruits, extracting yield-related traits and monitoring fruit development. | Slower recognition speed. | [125] |
| Instance Segmentation | Grape | The enhanced model achieved high detection accuracy and robust generalization across diverse varieties in complex growth environments. | Accuracy is limited under varying light and occlusion; detection speed requires further improvement. | [126] |
| Instance Segmentation | Waxberry | The waxberry identification network achieved high precision and demonstrated strong robustness to occlusion. | Did not include immature waxberries for fruit counting and yield estimation; does not solve problems caused by fruit stacking. | [127] |
| Instance Segmentation | Grape | Versatile for grape berry counting and size detection, enabling precise discernment of berry features. | Segmentation errors may misclassify non-berry contours as a group. | [128] |
| Instance Segmentation | Strawberry | The improved DeepLabV3+ model accurately segments strawberries of different maturities, reducing environmental interference. | No experiments on strawberries at different locations; recognition speed is slow. | [129] |
| Instance Segmentation | Cherry tomato | Using bimodal eigenmaps and a balanced multitask loss, this model enhances stem-segmentation accuracy in cherry tomato picking. | Challenged by color and shape similarity, loss of stem features, and category differences. | [130] |
| Instance Segmentation | Strawberry | Demonstrated efficient strawberry segmentation in a natural system and created a database with images and entries. | Long computational times make rigorous performance comparison with emerging methods impossible. | [131] |
| Instance Segmentation | Strawberry | Identified and localized strawberries and provided location information. | Drawbacks in error detection and speed during recognition. | [132] |
| Instance Segmentation | Strawberry | A Mask R-CNN model detects strawberries and diseases, providing plant-disease detection. | Low accuracy and extended recognition time. | [133] |
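Most of the object-detection methods in Table 2 are YOLO variants, which share a non-maximum suppression (NMS) post-processing step that collapses overlapping detections of the same fruit into one. The plain-Python sketch below illustrates that step; the box format and the 0.5 IoU threshold are illustrative assumptions, not taken from any cited model.

```python
# Minimal non-maximum suppression sketch for fruit detections.
# Each detection is (x1, y1, x2, y2, score) in pixel coordinates.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms(boxes, iou_thresh=0.5):
    """Keep the highest-scoring box in each cluster of overlapping detections."""
    keep = []
    for box in sorted(boxes, key=lambda b: b[4], reverse=True):
        if all(iou(box[:4], k[:4]) <= iou_thresh for k in keep):
            keep.append(box)
    return keep

dets = [(10, 10, 50, 50, 0.9),    # berry A, high confidence
        (12, 12, 52, 52, 0.6),    # duplicate detection of berry A
        (80, 80, 120, 120, 0.8)]  # berry B
print(nms(dets))  # the duplicate is suppressed: two boxes remain
```

Frameworks usually provide this step (e.g. as a library routine), but the heavy-occlusion failure modes noted in Table 2 are partly NMS artifacts: two genuinely overlapping berries can be merged into one detection.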
Table 3. Methods of berry fruit-picking robots using tactile perception.
| Methods | Object | Advantage | Disadvantage | Ref. |
| --- | --- | --- | --- | --- |
| Piezoresistive | Tomato | Servo motors control the bending angle and speed of the soft gripper, with tactile sensors for tomato gripping. | The sensor is non-stretchable, which limits the gripper's flexibility. | [136] |
| Piezoresistive | Grape | An anthropomorphic end-effector combining the adhesion principle with a multi-contact design and piezoresistive tactile sensors. | Drawbacks in maximizing the contact area; lacks a direct measurement and adjustment process. | [137] |
| Piezoresistive | Strawberry | A soft, sensorized gripper with a robotic system for picking small fruits like strawberries. | The gripper jaws' piezoresistive tactile sensors are unsuitable for handling fragile berries within the normal force range. | [138] |
| Piezoresistive | Blackberry | A tendon-driven gripper for automated blackberry harvesting, incorporating a flexible resistive force sensor for force feedback. | Complex calibration process and poor immunity to signal interference. | [59] |
| Piezoresistive | Kiwifruit | Detects and classifies kiwifruit by hardness using data from force and bending sensors. | The gripper may damage fruit during gripping, and perception is limited by the small number of sensors. | [139] |
| Piezoresistive | Sweet pepper | The end-effector successfully picks the pepper, with a vacuum pressure sensor providing feedback that the suction cup has grabbed it. | Vacuum pressure sensors are less sensitive at low pressures and highly temperature-dependent. | [140] |
| Piezoresistive | Tomato | Grasps tomatoes while combining tactile perception information and algorithms to classify size and ripeness. | Lacks practical scenarios and performance evaluations, raising reliability and robustness concerns. | [141] |
| Piezoresistive | Strawberry | A 3D-printed piezoresistive tactile sensor, used without modification, lets a soft robot grasp strawberries. | The print material's increased resistance lowers sensitivity; material deformation causes readings to drift and oscillate. | [142] |
| Capacitive | Tomato | Electrohydraulic bending actuators with tactile sensors for real-time grip detection and fruit-size estimation. | Manual fabrication process and multi-component design. | [143] |
| Capacitive | Strawberry | Capacitive sensors in the gripper jaws enable gentle strawberry picking with highly sensitive data output. | Increasing pressure raises compressive stiffness, impairs sensor performance, and limits the linear measurement range. | [144] |
| Piezoelectric | Tomato | Piezoelectric sensors measure the end-effector's initial contact with the tomato, signal oscillations, and vibrations. | Does not solve the poor surface contact and adhesion problems that affect pressure sensors. | [145] |
| Piezoelectric | Tomato | A bionic manipulator with adaptive gripping, using piezoelectric and strain sensors to measure tomato softness. | Without a sensor module the robot cannot adjust contact force, and a single piezo is less accurate for softness recognition. | [146] |
| Triboelectric | Tomato | A three-finger actuator and a triboelectric tactile sensor monitor and clamp tomatoes precisely. | Sensor performance is strongly influenced by other factors and adapts poorly to shape changes of the gripped object. | [147] |
| Triboelectric | Kiwifruit | A hybrid sensor integrating a triboelectric perception unit and an inductive sensor; combined with vision, it accurately detects kiwifruit. | Limited with complex input signals, including the inability to recognize other object features. | [148] |
| Triboelectric | Tomato | End-effector fingers equipped with TENG sensors: simple structure, high sensitivity, and durability for harvesting tomatoes. | Non-linearity, creep issues, and dependency on external light sources and magnetic fields. | [149] |
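The tactile methods in Table 3 all close a feedback loop between a force reading and gripper actuation: close slowly, watch the sensed force, stop at a grip setpoint, and abort before a damage threshold. A simplified sketch of that loop, using an assumed linear fruit-stiffness model and invented force thresholds rather than any cited system's values:

```python
# Illustrative tactile-feedback grasping loop. The linear fruit-stiffness
# model (force = stiffness * closure) and every constant are assumptions
# standing in for a real tactile sensor reading.

def grasp(stiffness_n_per_mm=0.5, target_n=1.0, damage_n=2.0,
          step_mm=0.2, max_steps=100):
    """Close the finger step by step; return (closure_mm, sensed_force_n)."""
    closure = 0.0
    for _ in range(max_steps):
        force = stiffness_n_per_mm * closure  # simulated tactile reading
        if force >= damage_n:
            raise RuntimeError("damage threshold reached; releasing fruit")
        if force >= target_n:
            return closure, force             # stable grasp achieved
        closure += step_mm
    raise RuntimeError("no contact detected")

closure, force = grasp()
print(f"grasp held at {closure:.1f} mm, {force:.2f} N")
```

Real controllers replace the linear model with the live sensor signal and often add the softness/ripeness classification step described for Refs. [139,141,146].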
Table 4. Methods used for distance measurement techniques related to berry fruit-picking robots.
| Methods | Object | Advantage | Disadvantage | Ref. |
| --- | --- | --- | --- | --- |
| Infrared | Strawberry | The manipulator uses three internal infrared sensors to optimally identify and position the object. | Occlusion may cause incorrect positioning, and an unsuccessful cut may leave the stem undetected by the sensor. | [32] |
| Infrared | Tomato | A camera with a pair of stereo infrared sensors measures the 3D coordinates of the target tomato in the camera coordinate system. | Environment, background, lighting, occlusion, and overlap can make detecting and locating target tomatoes difficult. | [156] |
| Infrared | Grape | The infrared sensor detects and synchronizes the position of the grapes for harvesting. | Limited detection range and narrow transmission and reception angles. | [157] |
| Infrared | Strawberry | RGB-D cameras estimate strawberry positions in three-dimensional space by combining infrared with depth information. | Position errors arise from the offset between the image center and the camera's mounting position. | [158] |
| Infrared | Cherry tomato | RGB-D cameras locate cherry tomatoes with high precision and accuracy by acquiring multimodal images. | Positioning becomes more complicated when the camera is directly in front of the fruit. | [159] |
| LiDAR | Strawberry | The finger carries a pair of LiDAR sensors; when the fruit stem blocks the laser beam, the control module cuts the stem. | LiDAR sensors have errors, and the small detection distance prevents adaptation to narrow, low-channel environments. | [160] |
| LiDAR | Strawberry | Fusing laser sensors with monocular cameras gives accurate navigation and improved system fault tolerance. | LiDAR provides no semantic information for scene recovery in complex environments and is limited by accuracy and light changes. | [161] |
| LiDAR | Tomato | LiDAR enables accurate representation and positioning of crops such as tomatoes under occlusion. | Tracking may suffer from errors, identity switches between objects, or missed and false detections. | [162] |
| LiDAR | Tomato | LiDAR gives the mobile platform environmental awareness and navigation capabilities for autonomous movement. | LiDAR data require conversion and fusion with other data, and LiDAR struggles with background interference. | [163] |
| Ultrasonic | Blueberry | Ultrasonic sensors detect weeds and determine plant height by measuring the distance between the plant and the sensor. | Height measurements carry errors, and the output is limited by the input voltage and the sensor's own characteristics. | [164] |
| Multimodal | Tomato | Infrared sensors measure plant position, LiDAR acquires 3D plant information, and RGB-D cameras validate LiDAR results. | Occlusion limits the infrared sensor's field of view and light signal; lighting and incident light affect camera image quality. | [165] |
| Multimodal | Tomato | LiDAR and infrared cameras support navigation, mapping, tomato detection, and locating the tomato's center point. | Outdoor tomato recognition with LiDAR and depth cameras is affected by light. | [166] |
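Two conversions underlie most of the ranging methods in Table 4: time-of-flight ranging (ultrasonic) and back-projecting an RGB-D pixel with known depth into camera coordinates via the pinhole model. The sketch below uses made-up camera intrinsics (fx, fy, cx, cy); real systems read them from the camera's calibration.

```python
# Hedged sketch of two range-to-position conversions.
# Camera intrinsics are invented for illustration.

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C

def ultrasonic_distance_m(echo_time_s: float) -> float:
    """Round-trip time of flight -> one-way distance to the target."""
    return SPEED_OF_SOUND * echo_time_s / 2.0

def backproject(u, v, depth_m, fx=600.0, fy=600.0, cx=320.0, cy=240.0):
    """Pinhole model: pixel (u, v) with depth Z -> point (X, Y, Z) in the camera frame."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return x, y, depth_m

print(ultrasonic_distance_m(0.01))  # 343 * 0.01 / 2 = 1.715 m
print(backproject(380, 240, 0.5))   # fruit 0.05 m right of the optical axis
```

The back-projected point is in the camera frame; the mounting-offset error noted for Ref. [158] corresponds to an imperfect transform from this frame to the robot base frame.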
Table 5. Methods of berry fruit-picking robots using switching sensors.
| Methods | Object | Advantage | Disadvantage | Ref. |
| --- | --- | --- | --- | --- |
| Photoelectric | Kiwifruit | Position data from the switch sensor drives the stepper motor, which moves the gripping mechanism to clamp the fruit. | There may be a discrepancy between the finger and the kiwifruit due to light restriction from the switch mounting position. | [169] |
| Photoelectric | Grape | The photoelectric switching sensor, integrated into the gripper, provides feedback enabling precise fruit gripping. | Sensor errors may result from occlusion and fruit-stacking problems. | [170] |
| Proximity | Kiwifruit | A robot equipped with a Hall sensor discerns magnetic-field changes to determine the kiwifruit's position. | Hall-sensor detection may lack sensitivity because the kiwifruit is non-magnetic. | [171] |
| Proximity | Kiwifruit | Hall sensors discern magnetic-field changes and regulate the direction and step count of the stepper motors. | When catching fruit, kiwifruit-picking robots tend to draw in surrounding debris such as branches and leaves. | [172] |
| Proximity | Tomato | Magnetic proximity switches control the end-effector and the gripper's tomato-picking position. | Proximity switches are difficult to apply in cluttered, messy, and fragile plant environments. | [173] |
| Pressure | Grape | Fingers flexibly switch to clamp the grapes using contact-force information fed back to the actuator. | The hydraulic brake's low control accuracy may damage the fruit during clamping. | [174] |
| Pressure | Tomato | A motor-driven gripper measures and precisely controls the end-effector's grip on the tomato using a resistance strain gauge. | Temperature variations and electromagnetic interference affect resistance strain gauges, introducing errors. | [175] |
| Pressure | Tomato | A membrane pressure switch stops the finger from closing when the pressure reaches a minimum destructive value. | The pressure switch's lack of flexibility can easily damage the tomato rind. | [176] |
| Pressure | Cherry tomato | The pressure switch controls the end-effector's picking force after measuring the force applied by a human finger. | The switch is not mounted on the end-effector, and individual tomato differences may lead to damaged fruit. | [177] |
| Pressure | Sweet pepper | Vacuum switching sensors detect the result of fruit gripping; harvesting continues after successful gripping. | Vacuum switch sensors are costlier and have longer response times. | [178] |
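The switching sensors in Table 5 deliver a binary signal that is usually debounced before it is allowed to start or stop an actuator, since beam-break and membrane-switch readings chatter near the trigger point. An illustrative sketch, where the 3-sample stability window is an assumption rather than a value from any cited system:

```python
# Illustrative debouncing of a binary switching sensor (photoelectric beam
# or membrane pressure switch): the gripper only reacts once the raw signal
# has held the same level for `stable_n` consecutive samples.

def debounce(samples, stable_n=3):
    """Return the debounced state sequence for a stream of raw 0/1 readings."""
    state, run, last = 0, 0, None
    out = []
    for s in samples:
        run = run + 1 if s == last else 1  # length of the current run of equal readings
        last = s
        if run >= stable_n:
            state = s  # accept the new level only after stable_n agreeing samples
        out.append(state)
    return out

raw = [0, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1]  # noisy beam-break signal
print(debounce(raw))  # flips to 1 only after three consecutive 1-readings
```

The same filtering idea mitigates the false triggers from occlusion and stacking noted for the photoelectric switches in Refs. [169,170].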
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Wang, C.; Pan, W.; Zou, T.; Li, C.; Han, Q.; Wang, H.; Yang, J.; Zou, X. A Review of Perception Technologies for Berry Fruit-Picking Robots: Advantages, Disadvantages, Challenges, and Prospects. Agriculture 2024, 14, 1346. https://doi.org/10.3390/agriculture14081346


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
