Article

Laser-Powered Harvesting Tool for Tabletop Grown Strawberries

by Mohamed Sorour 1,* and Pål From 2
1 Insect Robotics Group—Institute for Perception, Action and Behaviour, School of Informatics, University of Edinburgh, Informatics Forum, 10 Crichton St, Edinburgh EH8 9AB, UK
2 Robotics Group, Faculty of Science and Technology, Norwegian University of Life Sciences (NMBU), 1432 Ås, Norway
* Author to whom correspondence should be addressed.
Electronics 2025, 14(9), 1708; https://doi.org/10.3390/electronics14091708
Submission received: 26 March 2025 / Revised: 19 April 2025 / Accepted: 21 April 2025 / Published: 23 April 2025
(This article belongs to the Section Computer Science & Engineering)

Abstract

In this paper, a novel tool prototype for harvesting tabletop-grown strawberries is presented. Demonstrating resilience to localization inaccuracies of up to ±15 mm and achieving an average cycle time of 8.02 s at half its maximum operational speed, the tool represents a promising step forward in automating strawberry harvesting. It features a compact 35 mm fruit-engagement width, improving accessibility through its small operational footprint. A complete harvesting system is also proposed that can be mounted on a mobile platform for field tests. An experimental demonstration is performed to showcase the new methodology and derive relevant metrics.

1. Introduction

On account of the seasonal nature of manual harvesting [1,2], increasing demand for labor [3,4] in competitive industrial sectors, and an aging farming population [5], the need for harvesting automation is well established. This unmet need raises the cost of living [6], owing to the higher costs incurred to secure labor, and reduces crop supply, as produce is damaged or lost when the optimal harvesting window is missed. Harvesting robots developed so far, on the other hand, have an average success rate as low as 66% [7], a figure that drops drastically in cluttered environments, mostly due to bulky harvesting tools [8].
Automated fruit harvesting essentially requires a tool that can (1) capture the fruit and (2) detach it. For capture, grasping is conventionally employed in the literature, harvesting apples [9,10,11], plums [12], kiwifruit [13], tomatoes [14], sweet pepper [15,16], and strawberries [17,18,19,20,21,22,23,24,25]. Mechanical grasping mostly results in a bulky tool interacting with the produce, which hinders reachability. Employing vacuum suction instead [26,27,28,29,30] reduces the tool size but complicates finding the correct spot on the produce to engage, and applies considerable force to the fruit. For detachment, a sharp mechanical cutter is by far the most commonly used method; it enlarges the harvesting tool since the driving source (motor) and transmission must sit very close to the cutting blade. Non-conventional cutting means are rare in the literature, including an oscillating blade to cut sweet pepper in [31] and thermal cutting devices in [32,33] for cucumber and pepper, respectively. A laser beam is used in [34,35] to showcase the potential for cutting tomato peduncles and in [36,37,38,39] for weed control. Although these studies employ unconventional cutting methods, the thermal cutters still require bulky grasping mechanisms because the heat source must be embedded directly in the cutting apparatus. Furthermore, the laser cutting systems referenced in the literature are implemented in a cumbersome, close-proximity configuration, which forfeits a key potential advantage of laser-based systems, namely their capacity for remote deployment, and thereby limits their operational efficiency and scalability. On the other hand, while human strawberry pickers can achieve cycle times as brief as 1.2 s per fruit [40], this pace cannot be sustained beyond 4 h of continuous work. By contrast, a robotic system with a 6 s harvesting cycle achieves parity with human labor productivity when operating over extended 20 h shifts (accounting for battery charging and routine maintenance). The authors believe that any robotic system achieving such a cycle time or lower is practical for real-world deployment.
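As a quick arithmetic check of this parity claim, assuming uninterrupted picking in both cases: a 4 h human session yields $4 \times 3600 / 1.2 = 12{,}000$ fruit, while a 20 h robot shift at 6 s per fruit yields $20 \times 3600 / 6 = 12{,}000$ fruit.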
In this work, the authors present a novel harvesting tool that captures the fruit virtually, by surrounding it and trapping its stem in a groove, and detaches it by applying a highly focused laser beam from a large distance, resulting in minimal interaction with the fruit and its local environment. In addition, the authors propose the prototype harvesting system shown in Figure 1a, with the relative distances provided in Figure 2 replicating those normally found in strawberry-growing poly-tunnel setups. The system can thus be mounted on a mobile robot for field harvesting, which is beyond the scope of the current work. The contribution of our approach is threefold:
  • Productive: an average cycle time of 8 s at 50% of the maximum robot velocity. Since the laser cutting time averages 2.3 s, the remaining manipulation time of roughly 5.7 s can be halved to about 2.9 s at 100% robot velocity, yielding a total cycle time of 5.2 s.
  • Small footprint: a slim tool with an effective interaction width of 35 mm greatly enhances fruit reachability.
  • Robust: precise stem entrapment tolerates strawberry localization errors of up to ±15 mm, in addition to the near maintenance-free laser cutting.

2. Materials and Methods

In this section, we present the developed harvesting tool and the associated systems that enable the autonomous harvesting demonstration. We also present the experimental setup used to determine the laser spot diameter that yields the shortest possible stem-cutting time at a given laser power setting.

2.1. Minimal Footprint Harvesting Tool

The developed harvesting tool is shown in Figure 1b; two components interact with the strawberry, namely the stem-trapping groove and the stem trapper. Both are made of mild steel owing to its low thermal conductivity (compared to aluminum, for example); however, any ferrous metal would be equally effective. When the laser beam is activated during the stem-cutting process, this property keeps the heat localized, which aids the cut, kills plant viruses, and prevents the plastic components of the tool from melting. The widths of the two components are 35 mm and 30 mm, respectively, matching the dimensions of a large strawberry; as such, minimal fruit dislocation occurs during harvesting and fruit reachability is improved. The opening between the two parts acts as the strawberry entry side, as in [41]: the tool approaches the fruit from below, completely surrounding it, before performing the stem cut. A convex lens with a focal length of 25 cm focuses the fiber laser beam generated at the laser head below [42]. The lens is servo-controlled to perform a lateral, reciprocating, straight-line motion that sweeps the focal point along a line of controllable length, effectively cutting the trapped strawberry stem. Two infrared photo interrupters detect the free-falling strawberry after detachment; the fruit is then guided to a storage unit through a smooth plastic passage that protects its surface from damage.
The harvesting tool is mounted on a cost-effective 6-DOF collaborative robot arm, shown in Figure 2 in side and top views. Two RGB-D cameras are mounted at different viewpoints facing the strawberry-growing trough, one to the front and one from below. The objective of this arrangement is to obtain a realistic point cloud of the scene with enough information to determine the location and size of the strawberries. The fields of view of the two cameras are depicted in the same figure, colored in light grey and red, respectively, showing the common area of focus in which strawberries can be detected by both. This setup is based on an average tabletop height of 103 cm, readily available at the poly-tunnel of the host institution, and the average height of the Thorvald mobile robot [43] widely used in field robotics. The robot arm base frame $F_b$ is related to the first camera frame $F_{c1}$ and the second camera frame $F_{c2}$ by the constant transformation matrices $T_{c1}^{b}$ and $T_{c2}^{b}$, respectively, the cameras being rigidly connected via the arm base plate. As such, the setup (arm, tool, and cameras) can be fitted to a mobile robot for field testing with minor modifications to the control algorithm. The tool tip frame $F_t$, aligned with the tip of the stem-trapping groove shown in Figure 1b and Figure 2, is controlled by manipulating the arm so as to surround the fruit prior to harvesting. Let the pose $\mathbf{p}_t^b = [\mathbf{r}_t^b \;\; \boldsymbol{\theta}_t^b] \in \mathbb{R}^6$ of the manipulator's tool tip frame $F_t$, expressed in the base frame $F_b$, define the task-space coordinates, with $\mathbf{r}_t^b = [x \; y \; z]$ being the position vector and $\boldsymbol{\theta}_t^b = [\gamma \; \beta \; \alpha]$ denoting a minimal representation of orientation (the roll, pitch, yaw variation of the Euler angles). In this work, the tool tip orientation is fixed and identical to that of the base frame; as such, the pose notation $\mathbf{p}_t^b$ is dropped, and only the 3D tool position $\mathbf{r}_t^b$ is used in the sequel. The arm configuration shown in Figure 2 corresponds to the HOME tool position $\mathbf{r}_{\mathrm{HOME}}^b$, at which the arm does not obstruct the cameras' field of view and the strawberry localization algorithm (presented in the next section) can run.
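As a concrete illustration of this frame bookkeeping, the short sketch below (not the authors' code) transforms and fuses two camera clouds into the base frame using constant homogeneous transforms; the numerical extrinsics are placeholders, not calibrated values from the paper.

```python
import numpy as np

def to_homogeneous(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def transform_points(T, pts):
    """Apply the 4x4 transform T to an (N, 3) array of points."""
    pts_h = np.hstack([pts, np.ones((pts.shape[0], 1))])  # homogeneous coordinates
    return (T @ pts_h.T).T[:, :3]

# Placeholder extrinsics T_c1^b and T_c2^b (identity rotation, translations in meters).
T_c1_b = to_homogeneous(np.eye(3), np.array([0.40, 0.00, 0.60]))
T_c2_b = to_homogeneous(np.eye(3), np.array([0.55, 0.00, 0.30]))

cloud_c1 = np.random.rand(1000, 3)  # stand-in for the CAM1 points (camera frame)
cloud_c2 = np.random.rand(1000, 3)  # stand-in for the CAM2 points (camera frame)

# Augmented scene cloud expressed in the arm base frame F_b.
cloud_b = np.vstack([transform_points(T_c1_b, cloud_c1),
                     transform_points(T_c2_b, cloud_c2)])
```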
The operation logic of the harvesting tool is depicted in Figure 3. Following strawberry localization, the v-shaped stem-trapping groove (note that the sliding trapper is also v-shaped but inverted; refer to Figure 3c(left)) is positioned immediately behind the berry to harvest, at a lower z-axis position, in Figure 3a; the tool tip then elevates to isolate the fruit from its surroundings to the rear, as shown in Figure 3b. The trapper then slides to trap the stem, with Figure 3c(left) showing the first contact between the two and Figure 3c(right) the fully trapped stem. The laser beam is then activated at full power, as in Figure 3d, and the convex lens reciprocates laterally with a stroke of 6 mm, driving the focal point back and forth across the stem until it is cut. The mid-stroke is aligned with the groove center point. Although the current tool design can force the stem into a precise location, in contrast to the authors' previous work [41], a laser beam with a large focal distance is still needed to minimize the volume of hardware interacting with the fruit during harvesting. The trapper's end of stroke is adjusted to minimize relocation of the stem, and hence of the strawberry, while maintaining a metallic background shield as the laser beam performs the cut; this is shown in the enlarged square in Figure 3d. The tool ideally approaches the berry from below perfectly aligned with the trapper groove. However, inaccuracies in the point cloud calculation and in fusing depth information from two different sensors can add up, in addition to naturally occurring stem bending. The 30 mm width of the trapper ensures robustness against such errors within a tolerance of ±15 mm of the actual strawberry location; this is shown in Figure 3b,c(left) with imperfect strawberry localization. Fruit detachment is detected using two photo interrupters, whose infrared beams are shown schematically in red in Figure 3b,c for clarity. Once either beam is interrupted, the laser source is disengaged, terminating the cutting cycle, and the arm moves either to the next fruit to cut or to the home position. The hardware comprises the 6-DOF xARM6 collaborative robot [44], two RealSense D435 depth cameras, and a 50 W Raycus RFL-P50QB module [42] as the fiber laser source. The harvesting tool is 3D printed, except for the stem trapper and the stem-trapping groove, which are subjected to extensive laser heat. A microcontroller controls the convex lens and trapper movements and the laser activation, and monitors the strawberry-detachment photo interrupters. On the software side, we use the RealSense SDK for interfacing with the cameras and the Point Cloud Library (PCL) [45] for point cloud processing.
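The paper states only that a microcontroller drives the trapper, the lens servo, and the laser, and monitors the photo interrupters; the sketch below is a hedged MicroPython-style (ESP32 API) illustration of that cut routine, with pin numbers, servo duty values, and function names assumed for the example.

```python
from machine import Pin, PWM
import time

LASER_EN = Pin(5, Pin.OUT, value=0)   # laser enable line (assumed wiring)
IR1 = Pin(12, Pin.IN, Pin.PULL_UP)    # photo interrupter 1 (1 = beam intact)
IR2 = Pin(13, Pin.IN, Pin.PULL_UP)    # photo interrupter 2
trapper = PWM(Pin(14), freq=50)       # stem-trapper servo (assumed actuator)
lens = PWM(Pin(15), freq=50)          # lens-carriage servo (6 mm stroke)

def cut_cycle(timeout_ms=10000):
    """Trap the stem, then sweep the lens with the laser on until a falling
    fruit breaks either IR beam (or a safety timeout expires)."""
    trapper.duty(115)                 # slide the trapper forward (assumed duty value)
    time.sleep_ms(300)
    LASER_EN.value(1)
    start = time.ticks_ms()
    pos, step = 40, 5                 # reciprocation endpoints are assumed duty values
    while IR1.value() and IR2.value():         # both beams intact -> fruit still attached
        lens.duty(pos)                         # sweep the focal point across the stem
        pos += step
        if pos >= 80 or pos <= 40:
            step = -step
        if time.ticks_diff(time.ticks_ms(), start) > timeout_ms:
            break                              # safety stop if no detachment is seen
        time.sleep_ms(20)
    LASER_EN.value(0)                 # disengage the laser on detachment (or timeout)
    trapper.duty(40)                  # retract the trapper
```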

2.2. Laser Spot Dynamics

To determine the optimal laser spot diameter, two sets of experiments are performed, for coarse and fine tuning, respectively. The setup is shown in Figure 4: the laser beam originating from the source on the left is focused through a high-quality convex lens en route to a pre-hung strawberry placed at a set distance from the lens, corresponding to a particular laser spot diameter, as shown in the figure. The focused laser beam is aimed at the middle of the strawberry stem, corresponding to the longest travel distance through the stem, until it penetrates and emerges from the other side. A to-scale focused laser beam, colored in red, depicts the gradual reduction in laser spot size, and in turn the effective stem-piercing diameter, at seven different locations. These spot diameters range approximately from 3.8 mm to 0.1 mm. Per spot diameter, 10 experimental iterations are performed, making a total of 70 trials for the coarse tuning set of experiments; videos are available at https://www.youtube.com/watch?v=MUS7bZu477s, accessed on 25 March 2025. This experiment aims to compute the average piercing velocity $\bar{v}_p$ for each candidate laser spot diameter: the recorded videos are used to time the duration from engaging the laser power to the moment the laser pierces through the stem, designated the piercing time $t_p$ in seconds. A calibrated ruler is used to measure the stem diameter $\phi_s$ in millimeters, assuming a cylindrical stem, and from these two quantities $\bar{v}_p$ is computed as the stem diameter divided by the piercing time, averaged over the trials.

3. Results

In this section, we present the results of the laser spot diameter experiments. The authors also detail here the algorithms for strawberry localization and motion control used in the harvesting experiment, in which the system described in the previous section harvests a collection of nine strawberries in full autonomy. A video of the harvesting experiment is available at https://www.youtube.com/watch?v=W3UyDt_7erA, accessed on 25 March 2025.

3.1. Optimal Laser Spot Diameter

The authors believe that the average stem-piercing velocity $\bar{v}_p$ alone is not indicative of the actual cutting speed; however, when multiplied by the spot diameter $\phi_{ls}$, it yields what we refer to in the sequel as the stem pierce constant $C_p$:
$$C_p = \bar{v}_p \, \phi_{ls},$$
which gives a more accurate figure for the stem-area etching speed. The results for the coarse tuning set of experiments are provided in Table 1. In this table, we observe that although a laser spot diameter of 0.09 mm is roughly 44 times faster than a 3.79 mm spot in piercing velocity, it can be equally poor in stem etching, as indicated by an almost identical pierce constant $C_p$. The spot diameter with the most promising result is 0.71 mm, which is used as the starting point for fine tuning in a subsequent set of identical experiments.
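A quick numerical check of this definition (not the authors' analysis script) using two rows of Table 1; small deviations from the tabulated values come from rounding of the published averages.

```python
def pierce_metrics(stem_diameter_mm, pierce_time_s, spot_diameter_mm):
    """Return (average piercing velocity v_p in mm/s, pierce constant C_p in mm^2/s)."""
    v_p = stem_diameter_mm / pierce_time_s   # velocity through a cylindrical stem
    c_p = v_p * spot_diameter_mm             # C_p = v_p * phi_ls
    return v_p, c_p

print(pierce_metrics(2.3, 32.0, 3.79))  # ~(0.07, 0.27): cf. 0.07 and 0.26 in Table 1
print(pierce_metrics(2.3, 0.96, 0.71))  # ~(2.40, 1.70): cf. 2.42 and 1.72 in Table 1
```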
The final results of the laser spot diameter fine tuning are provided in Table 2 for six more diameter values; videos of those experiments are available at https://www.youtube.com/watch?v=rR-wyuchMrM, accessed on 25 March 2025. One initial observation is that the near-optimal value of 0.9 mm has a lower pierce constant than the 0.71 mm spot in the previous set of experiments. This is because the stem specimens are kept in cold storage and then left at room temperature for varying durations before each set of experiments. Figure 5 shows the combined results of the two experiment sets after linear interpolation of the second set of results in Table 2. From this graph, it can be concluded that the optimal laser spot diameter lies in the range of 0.8 to 1.0 mm. We use a spot diameter of 0.9 mm in our demonstration; this is set in the design by controlling the distance between the convex lens and the trapper.
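The fine-tuning analysis can be reproduced from the published Table 2 values with a few lines of NumPy; this sketch (not the authors' script) linearly interpolates the pierce constant over the spot diameter and locates the peak region.

```python
import numpy as np

phi_ls = np.array([0.5, 0.6, 0.8, 0.9, 1.0, 1.1])        # laser spot diameter (mm)
c_p    = np.array([0.90, 1.07, 1.14, 1.36, 1.23, 1.02])  # pierce constant (mm^2/s), Table 2

grid = np.linspace(phi_ls.min(), phi_ls.max(), 601)      # 1 micrometer resolution
c_p_interp = np.interp(grid, phi_ls, c_p)                # piecewise-linear interpolation

peak = grid[np.argmax(c_p_interp)]
near_optimal = grid[c_p_interp >= 0.9 * c_p_interp.max()]
print(f"peak near {peak:.2f} mm; within 90% of the peak from "
      f"{near_optimal.min():.2f} to {near_optimal.max():.2f} mm")
# With these data the peak sits at the 0.9 mm sample and the near-optimal band
# spans roughly 0.84-1.00 mm, consistent with the reported 0.8-1.0 mm range.
```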

3.2. Strawberry Localization

While fruit segmentation is not the primary focus of this study, it remains a critical component for enabling fully autonomous operation. To ensure the demonstration's completeness, we propose a simple method tailored to this limited objective. The authors recognize that existing deep learning-based approaches, such as those in [46,47], would yield superior performance to the simplified solution presented here, particularly for strawberries growing in dense clusters. The pseudo-code for strawberry localization is provided in Algorithm 1; it takes as input the raw scene point clouds acquired by the two RGB-D cameras, $C_{s1}^{c1}$ and $C_{s2}^{c2}$, expressed in the corresponding camera frames and depicted in Figure 6. These are transformed to the arm base frame and augmented (code line 1) to form the scene point cloud $C_s^b$. From this, a reduced scene cloud $C_{rs}^b \subset C_s^b$ is constructed, where a point $c_{si}^b$ in the scene cloud $C_s^b$ is added to the reduced cloud if it resides in a spatial window characterized by the maximum and minimum limits $x_l^+$, $x_l^-$, $y_l^+$, $y_l^-$, $z_l^+$, $z_l^-$ in the x, y, and z coordinates, respectively, of the arm base frame. This restricts the subsequent computation to the area of interest that is dexterously accessible by the robot arm; the reduced point cloud is shown in Figure 6.
Algorithm 1 Strawberry localization algorithm.
Input: colored scene point cloud $C_{s1}^{c1}$ from RGB-D CAM1, colored scene point cloud $C_{s2}^{c2}$ from RGB-D CAM2.
Output: vectors defining the bounding boxes of the localized strawberries in the arm base frame.
1: $C_s^b = T_{c1}^{b} C_{s1}^{c1} + T_{c2}^{b} C_{s2}^{c2}$
2: $C_{rs}^b = \emptyset$
3: for each point $c_{si}^b$ in $C_s^b$ do
4:   if ($c_{si}^b.x < x_l^+$ and $c_{si}^b.x > x_l^-$ and $c_{si}^b.y < y_l^+$ and $c_{si}^b.y > y_l^-$ and $c_{si}^b.z < z_l^+$ and $c_{si}^b.z > z_l^-$) then
5:     $C_{rs}^b \leftarrow C_{rs}^b \cup \{c_{si}^b\}$
6:   end if
7: end for
8: $C_{red}^b = \emptyset$
9: for each point $c_{rsi}^b$ in $C_{rs}^b$ do
10:   if ($c_{rsi}^b.r > r_{th}$ and $c_{rsi}^b.g < g_{th}$ and $c_{rsi}^b.b < b_{th}$) then
11:     $C_{red}^b \leftarrow C_{red}^b \cup \{c_{rsi}^b\}$
12:   end if
13: end for
14: $C_{straw}^b = \mathrm{EuCS}(t, s_{min}, s_{max}, C_{red}^b)$
15: for each cluster $c_{strawi}^b$ in $C_{straw}^b$ do
16:   $x_{strawmin}^b \leftarrow \mathrm{MIN}(c_{strawi}^b, 0)$
17:   $x_{strawmax}^b \leftarrow \mathrm{MAX}(c_{strawi}^b, 0)$
18:   $y_{strawmin}^b \leftarrow \mathrm{MIN}(c_{strawi}^b, 1)$
19:   $y_{strawmax}^b \leftarrow \mathrm{MAX}(c_{strawi}^b, 1)$
20:   $z_{strawmin}^b \leftarrow \mathrm{MIN}(c_{strawi}^b, 2)$
21:   $z_{strawmax}^b \leftarrow \mathrm{MAX}(c_{strawi}^b, 2)$
22: end for
23: return $x_{strawmin}^b$, $x_{strawmax}^b$, $y_{strawmin}^b$, $y_{strawmax}^b$, $z_{strawmin}^b$, $z_{strawmax}^b$
Ripe strawberries are extracted simply by thresholding the red color using the RGB thresholds $r_{th}$, $g_{th}$, $b_{th}$ to form the red point cloud $C_{red}^b \subset C_{rs}^b$. These thresholds are identified empirically for the indoor experiments under both outdoor sunlight and indoor lighting conditions. We then use the Euclidean cluster segmentation algorithm implemented in PCL to segment each individual strawberry. The output of the function $\mathrm{EuCS}(t, s_{min}, s_{max}, C_{red}^b)$ is a set of strawberry clusters $C_{straw}^b$ (a set of point clouds, each representing a single strawberry), arranged in ascending order of the y-axis coordinate values, with $t$, $s_{min}$, and $s_{max}$ denoting the segmentation tolerance and the minimum and maximum cluster sizes, respectively. The function $\mathrm{MIN}(c_{strawi}^b, 0)$, supplied with a point cloud $c_{strawi}^b$ and an index, returns the minimum value at that index over all points in the cloud, whereas, supplied with a vector, it returns the smallest value irrespective of the index. The lengthy pseudo-code for these two functions is omitted for brevity. Values of the parameters used in Algorithm 1 are provided in Table 3. The output is a set of vectors defining the bounding boxes of the localized strawberries; these boxes are shown in Figure 6.
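For completeness, the window-threshold-cluster pipeline of Algorithm 1 can be re-implemented compactly outside PCL; the NumPy/SciPy sketch below is an illustrative stand-in (the authors use PCL in C++), and the default limits, thresholds, and helper names are placeholders that only loosely mirror Table 3.

```python
import numpy as np
from scipy.spatial import cKDTree

def localize_strawberries(cloud_b, colors,
                          x_lim=(-0.25, 0.55), y_lim=(-0.30, 0.30), z_lim=(-0.30, 0.50),
                          rgb_th=(100, 70, 70), tol=0.02, s_min=20, s_max=1000):
    """cloud_b: (N, 3) points in the arm base frame [m]; colors: (N, 3) uint8 RGB.
    Returns a list of (min_xyz, max_xyz) bounding boxes, one per detected ripe berry."""
    # 1. Spatial window (reduced scene cloud).
    lo = np.array([x_lim[0], y_lim[0], z_lim[0]])
    hi = np.array([x_lim[1], y_lim[1], z_lim[1]])
    in_box = np.all((cloud_b > lo) & (cloud_b < hi), axis=1)
    pts, cols = cloud_b[in_box], colors[in_box]

    # 2. Red-color threshold (ripe-fruit cloud).
    red = (cols[:, 0] > rgb_th[0]) & (cols[:, 1] < rgb_th[1]) & (cols[:, 2] < rgb_th[2])
    pts = pts[red]
    if len(pts) == 0:
        return []

    # 3. Euclidean clustering: points closer than `tol` belong to the same cluster.
    tree = cKDTree(pts)
    labels = -np.ones(len(pts), dtype=int)
    cluster_id = 0
    for seed in range(len(pts)):
        if labels[seed] != -1:
            continue
        labels[seed] = cluster_id
        frontier = [seed]
        while frontier:
            idx = frontier.pop()
            for nb in tree.query_ball_point(pts[idx], tol):
                if labels[nb] == -1:
                    labels[nb] = cluster_id
                    frontier.append(nb)
        cluster_id += 1

    # 4. One bounding box per valid-size cluster, sorted along y as EuCS does.
    boxes = [(pts[labels == cid].min(axis=0), pts[labels == cid].max(axis=0))
             for cid in range(cluster_id)
             if s_min <= np.sum(labels == cid) <= s_max]
    boxes.sort(key=lambda box: box[0][1])
    return boxes
```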

3.3. Harvesting Motion

Pseudo-code of the harvesting algorithm is provided in Algorithm 2, and it is also depicted in Figure 7. It takes as input the vectors defining the bounding boxes of the identified and localized strawberries, i.e., the output of the localization algorithm (Algorithm 1). The output is a series of commands controlling the robot arm movement as well as the stem trapper and the laser module. In this work, a strawberry cutting cycle, and hence the cycle time, starts and ends with the detachment of a single strawberry. Initially, the algorithm moves the robot arm to the home position $\mathbf{r}_{\mathrm{HOME}}^b$ (the configuration depicted in Figure 2); the ROBOT_MOVE() function in Algorithm 2 is blocking (it must finish before the next line of code executes). The algorithm then computes a minimal z-axis point $z_{min}^b$, corresponding to the lowest hanging strawberry observed.
In Figure 7a, a strawberry has just been cut, signaling the start of a new cycle, at which point the robot arm moves to $z_{min}^b$, as shown in Figure 7b. By setting $\mathbf{r}_{td}^b(2) = z_{min}^b$, only the z-axis component of the desired tool position vector $\mathbf{r}_{td}^b$ is changed, while the other two components (x and y axes) retain their previous values. The arm then moves to the x-y coordinates of the next detected strawberry, using the information of the corresponding bounding box, before moving upwards along the z-axis, as shown in Figure 7c,d, respectively. To compensate for probable inaccuracy in measuring the actual depth of the strawberry, an empirically determined safety margin is added to the maximum boundary computed along the x and z axes (code lines 6 and 9 in Algorithm 2). At this point, the robot arm motion for this cycle is finished; what remains is the cutting procedure, which is implemented on the microcontroller. When given the signal to perform the cut, the microcontroller actuates the trapper forward using TRAP_STEM(), then activates the laser module and the lateral movement of the convex lens using LASER(ON) until either of the photo interrupters IR1 or IR2 (refer to Figure 3) detects the detached fruit, at which instant the laser beam is deactivated and the trapper moves backwards.
Algorithm 2 Harvesting algorithm.
Input: $x_{strawmax}^b$, $y_{strawmin}^b$, $y_{strawmax}^b$, $z_{strawmin}^b$, $z_{strawmax}^b$.
Output: desired tool tip position in the arm base frame $\mathbf{r}_{td}^b$.
1: ROBOT_MOVE($\mathbf{r}_{\mathrm{HOME}}^b$)
2: $z_{min}^b = \mathrm{MIN}(z_{strawmin}^b) - 10$ mm
3: for each strawberry unit $i$ in $n_{straw}$ do
4:   $\mathbf{r}_{td}^b(2) = z_{min}^b$
5:   ROBOT_MOVE($\mathbf{r}_{td}^b$)
6:   $\mathbf{r}_{td}^b(0) = x_{strawmax}^b(i) + 10$ mm
7:   $\mathbf{r}_{td}^b(1) = (y_{strawmin}^b(i) + y_{strawmax}^b(i))/2$
8:   ROBOT_MOVE($\mathbf{r}_{td}^b$)
9:   $\mathbf{r}_{td}^b(2) = z_{strawmax}^b(i) + 15$ mm
10:   ROBOT_MOVE($\mathbf{r}_{td}^b$)
11:   TRAP_STEM()
12:   while IR1 and IR2 do
13:     LASER(ON)
14:   end while
15:   LASER(OFF)
16:   RELEASE_STEM()
17: end for
18: ROBOT_MOVE($\mathbf{r}_{\mathrm{HOME}}^b$)
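The host-side sequencing of Algorithm 2 can be sketched as follows (Python, not the authors' implementation); `robot.move_blocking` and `micro.trap_and_cut` are hypothetical wrappers for the blocking ROBOT_MOVE primitive and the microcontroller's trap/laser routine, and the margins are those listed in the algorithm.

```python
import numpy as np

def harvest_all(robot, micro, boxes, r_home):
    """boxes: list of (min_xyz, max_xyz) strawberry bounding boxes [m], sorted along y.
    r_home: HOME tool position in the arm base frame [m]."""
    robot.move_blocking(r_home)                      # line 1: move to HOME
    z_min = min(b[0][2] for b in boxes) - 0.010      # line 2: below the lowest berry
    r = np.array(r_home, dtype=float)
    for p_min, p_max in boxes:                       # line 3: per strawberry unit
        r[2] = z_min                                 # line 4: retract along z
        robot.move_blocking(r)                       # line 5
        r[0] = p_max[0] + 0.010                      # line 6: behind the berry (+x margin)
        r[1] = 0.5 * (p_min[1] + p_max[1])           # line 7: center along y
        robot.move_blocking(r)                       # line 8
        r[2] = p_max[2] + 0.015                      # line 9: rise to surround the fruit
        robot.move_blocking(r)                       # line 10
        micro.trap_and_cut()                         # lines 11-16: trapper + laser until an
                                                     # IR photo interrupter sees the fruit fall
    robot.move_blocking(r_home)                      # line 18: return to HOME
```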
The average cycle and stem-cutting times are 8.02 s and 2.3 s, respectively, with the robot arm operating at 50% of its maximum velocity for hardware safety reasons. The localization algorithm takes at most 100 ms on a standard laptop with an 8th-generation Intel Core i7 processor.

4. Discussion

Manual harvesters can operate at cycle times as fast as 1.2 s per fruit for at most a 4 h continuous work session [40]. A robotic system requiring 6 s per harvest matches this human productivity benchmark when functioning continuously over a 20 h shift (assuming 4 h for charging and maintenance). The authors contend that any automated solution meeting or exceeding this performance threshold is viable for real-world implementation.
The authors have demonstrated that the developed system can achieve a strawberry harvesting cycle as fast as 5.2 s at 100% robot velocity, with laser cutting and manipulation times averaging 2.3 s and 2.9 s, respectively. This is a major milestone for laser cutting-based solutions, which typically feature longer cutting times than conventional methods [20,21]. The largely maintenance-free laser cutting also has a long-term positive impact on productivity compared to the mechanical cutting tools that dominate the literature and require frequent replacement. In addition, the high temperatures, reaching up to 188 °C as measured during the experiments, help prevent local plant diseases from spreading across the whole crop by killing germs. Another major advantage is that the burnt stem wound preserves the water content of the fruit better than a sharp-cut wound, in turn prolonging shelf life.
To this end, the authors believe that a 3D linear system replacing the robot arm, together with a 100 W laser module, could reduce the average cycle time below the 4 s mark, greatly enhancing the chances of commercial adoption. This follows a general conclusion from the authors' prior work [48], where increasing the laser power resulted in an exponential decrease in cutting time. It is worth noting that those prior results will need to be revisited, and further experiments may be required in light of the new findings in this study, specifically the notion of the optimal pierce constant: the previous experiments were not performed at the optimal spot diameter, so the relationship between laser power and cutting time might not in fact be exponential. Experiments to optimize the laser spot diameter will also be needed to identify how increasing the laser power shifts its optimal value. Further enhancing the fruit identification and localization algorithm would greatly advance the current state of development [25], aided by the minimal footprint of the developed tool. The focus of this work is to present the case for laser cutting in harvesting hanging produce and to develop a tool that leverages the advantages of this technology, namely the capability to cut from a distance and the resulting small tool footprint and, in turn, cutting success; the fruit identification algorithm was therefore kept simple, as it is beyond the scope of this work. In future work, the authors also plan to study the economics of laser cutting compared with its mechanical counterpart, in terms of the expected savings in maintenance cost, disease prevention, and crop shelf life, weighed against its higher power consumption.

Author Contributions

Conceptualization, M.S.; methodology, M.S.; software, M.S.; validation, M.S.; formal analysis, M.S.; investigation, M.S. and P.F.; resources, M.S. and P.F.; data curation, M.S.; writing—original draft preparation, M.S.; writing—review and editing, M.S. and P.F.; visualization, M.S.; supervision, P.F.; project administration, P.F.; funding acquisition, P.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author(s).

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Nolte, K.; Ostermeier, M. Labour Market Effects of Large-Scale Agricultural Investment: Conceptual Considerations and Estimated Employment Effects. World Dev. 2017, 98, 430–446. [Google Scholar] [CrossRef]
  2. Government, U. The Impact on the Horticulture and Food Processing Sectors of Closing the Seasonal Agricultural Workers Scheme and the Sectors Based Scheme. 2013. Available online: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/257242/migrant-seasonal-workers.pdf (accessed on 14 March 2025).
  3. Bank, T.W. Employment in Agriculture (Percentage of Total Employment) (Modeled ILO Estimate). Available online: https://data.worldbank.org/indicator/SL.AGR.EMPL.ZS (accessed on 12 June 2021).
  4. UK, NFUonline. Establishing the Labour Availability Issues of the UK Food and Drink Sector. 2021. Available online: https://www.nfuonline.com/archive?treeid=152097 (accessed on 18 February 2022).
  5. Duckett, T.; Pearson, S.; Blackmore, S.; Grieve, B. Agricultural Robotics: The Future of Robotic Agriculture. arXiv 2018, arXiv:1806.06762. [Google Scholar]
  6. Cassey, A.; Lee, K.; Sage, J.; Tozer, P. Assessing post-harvest labor shortages, wages, and welfare. Agric. Food Econ. 2018, 6, 17. [Google Scholar] [CrossRef]
  7. Bac, C.W.; van Henten, E.J.; Hemming, J.; Edan, Y. Harvesting Robots for High-value Crops: State-of-the-art Review and Challenges Ahead. J. Field Robot. 2014, 31, 888–911. [Google Scholar] [CrossRef]
  8. Kootstra, G.; Wang, X.; Blok, P.M.; Hemming, J.; van Henten, E. Selective Harvesting Robotics: Current Research, Trends, and Future Directions. Curr. Robot. Rep. 2021, 2, 95–104. [Google Scholar] [CrossRef]
  9. De-An, Z.; Jidong, L.; Wei, J.; Ying, Z.; Yu, C. Design and control of an apple harvesting robot. Biosyst. Eng. 2011, 110, 112–122. [Google Scholar] [CrossRef]
  10. Silwal, A.; Davidson, J.R.; Karkee, M.; Mo, C.; Zhang, Q.; Lewis, K. Design, integration, and field evaluation of a robotic apple harvester. J. Field Robot. 2017, 34, 1140–1159. [Google Scholar] [CrossRef]
  11. Onishi, Y.; Yoshida, T.; Kurita, H.; Fukao, T.; Arihara, H.; Iwai, A. An automated fruit harvesting robot by using deep learning. ROBOMECH J. 2019, 6, 13. [Google Scholar] [CrossRef]
  12. Brown, J.; Sukkarieh, S. Design and evaluation of a modular robotic plum harvesting system utilizing soft components. J. Field Robot. 2021, 38, 289–306. [Google Scholar] [CrossRef]
  13. Mu, L.; Cui, G.; Liu, Y.; Cui, Y.; Fu, L.; Gejima, Y. Design and simulation of an integrated end-effector for picking kiwifruit by robot. Inf. Process. Agric. 2020, 7, 58–71. [Google Scholar] [CrossRef]
  14. Feng, Q.; Wang, X.; Wang, G.; Li, Z. Design and test of tomatoes harvesting robot. In Proceedings of the 2015 IEEE International Conference on Information and Automation, Lijiang, China, 8–10 August 2015; pp. 949–952. [Google Scholar]
  15. Arad, B.; Balendonck, J.; Barth, R.; Ben-Shahar, O.; Edan, Y.; Hellström, T.; Hemming, J.; Kurtser, P.; Ringdahl, O.; Tielen, T.; et al. Development of a sweet pepper harvesting robot. J. Field Robot. 2020, 37, 1027–1039. [Google Scholar] [CrossRef]
  16. Bac, C.W.; Hemming, J.; van Tuijl, B.; Barth, R.; Wais, E.; van Henten, E.J. Performance Evaluation of a Harvesting Robot for Sweet Pepper. J. Field Robot. 2017, 34, 1123–1139. [Google Scholar] [CrossRef]
  17. Xiong, Y.; Peng, C.; Grimstad, L.; From, P.J.; Isler, V. Development and field evaluation of a strawberry harvesting robot with a cable-driven gripper. Comput. Electron. Agric. 2019, 157, 392–402. [Google Scholar] [CrossRef]
  18. Parsa, S.; Debnath, B.; Khan, M.A.; E., A.G. Modular autonomous strawberry picking robotic system. J. Field Robot. 2024, 41, 2226–2246. [Google Scholar] [CrossRef]
  19. Tituaña, L.; Gholami, A.; He, Z.; Xu, Y.; Karkee, M.; Ehsani, R. A small autonomous field robot for strawberry harvesting. Smart Agric. Technol. 2024, 8, 100454. [Google Scholar] [CrossRef]
  20. Ochoa, E.; Mo, C. Design and Field Evaluation of an End Effector for Robotic Strawberry Harvesting. Actuators 2025, 14, 42. [Google Scholar] [CrossRef]
  21. Chang, C.L.; Huang, C.C. Design and Implementation of an AI-Based Robotic Arm for Strawberry Harvesting. Agriculture 2024, 14, 2057. [Google Scholar] [CrossRef]
  22. Li, Z.; Yuan, X.; Yang, Z. Design, simulation, and experiment for the end effector of a spherical fruit picking robot. Int. J. Adv. Robot. Syst. 2023, 20, 2057. [Google Scholar] [CrossRef]
  23. Xiong, Y.; Ge, Y.; Grimstad, L.; From, P.J. An autonomous strawberry-harvesting robot: Design, development, integration, and field evaluation. J. Field Robot. 2020, 37, 202–224. [Google Scholar] [CrossRef]
  24. De Preter, A.; Anthonis, J.; De Baerdemaeker, J. Development of a Robot for Harvesting Strawberries. IFAC-PapersOnLine 2018, 51, 14–19. [Google Scholar] [CrossRef]
  25. Zhou, H.; Wang, X.; Au, W.; Kang, H.; Chen, C. Intelligent robots for fruit harvesting: Recent developments and future challenges. Precis. Agric. 2022, 23, 1573–1618. [Google Scholar] [CrossRef]
  26. Baeten, J.; Donné, K.; Boedrij, S.; Beckers, W.; Claesen, E. Autonomous Fruit Picking Machine: A Robotic Apple Harvester. In Field and Service Robotics: Results of the 6th International Conference, Chamonix, France, 9–12 July 2003; Laugier, C., Siegwart, R., Eds.; Springer: Berlin/Heidelberg, Germany, 2008; pp. 531–539. [Google Scholar]
  27. Tanigaki, K.; Fujiura, T.; Akase, A.; Imagawa, J. Cherry-harvesting robot. Comput. Electron. Agric. 2008, 63, 65–72. [Google Scholar] [CrossRef]
  28. Hayashi, S.; Shigematsu, K.; Yamamoto, S.; Kobayashi, K.; Kohno, Y.; Kamata, J.; Kurita, M. Evaluation of a strawberry-harvesting robot in a field test. Biosyst. Eng. 2010, 105, 160–171. [Google Scholar] [CrossRef]
  29. Hayashi, S.; Yamamoto, S.; Tsubota, S.; Ochiai, Y.; Kobayashi, K.; Kamata, J.; Kurita, M.; Inazumi, H.; Peter, R. Automation technologies for strawberry harvesting and packing operations in Japan 1. J. Berry Res. 2014, 4, 19–27. [Google Scholar] [CrossRef]
  30. Hu, G.; Chen, C.; Chen, J.; Sun, L.; Sugirbay, A.; Chen, Y.; Jin, H.; Zhang, S.; Bu, L. Simplified 4-DOF manipulator for rapid robotic apple harvesting. Comput. Electron. Agric. 2022, 199, 107177. [Google Scholar] [CrossRef]
  31. Lehnert, C.; English, A.; McCool, C.; Tow, A.W.; Perez, T. Autonomous Sweet Pepper Harvesting for Protected Cropping Systems. IEEE Robot. Autom. Lett. 2017, 2, 872–879. [Google Scholar] [CrossRef]
  32. van Henten, E.; Hemming, J.; van Tuijl, B.; van Tuijl, B.; Kornet, J.; Meuleman, J.; Bontsema, J.; van Os, E. An Autonomous Robot for Harvesting Cucumbers in Greenhouses. Auton. Robot. 2002, 13, 241–258. [Google Scholar] [CrossRef]
  33. Bachche, S.; Oka, K. Performance Testing of Thermal Cutting Systems for Sweet Pepper Harvesting Robot in Greenhouse Horticulture. J. Syst. Des. Dyn. 2013, 7, 36–51. [Google Scholar] [CrossRef]
  34. Liu, J.; Li, Z.; Li, P.; Mao, H. Design of a laser stem-cutting device for harvesting robot. In Proceedings of the 2008 IEEE International Conference on Automation and Logistics, Qingdao, China, 1–3 September 2008; pp. 2370–2374. [Google Scholar] [CrossRef]
  35. Liu, J.; Hu, Y.; Xu, X.; Li, P. Feasibility and influencing factors of laser cutting of tomato peduncles for robotic harvesting. Afr. J. Biotechnol. 2011, 10, 15552–15563. [Google Scholar] [CrossRef]
  36. Heisel, T.; Schou, J.; Andreasen, C.; Christensen, S. Using laser to measure stem thickness and cut weed stems. Weed Res. 2002, 42, 242–248. [Google Scholar] [CrossRef]
  37. Mathiassen, S.K.; Bak, T.; Christensen, S.; Kudsk, P. The Effect of Laser Treatment as a Weed Control Method. Biosyst. Eng. 2006, 95, 497–505. [Google Scholar] [CrossRef]
  38. Coleman, G.; Betters, C.; Squires, C.; Leon-Saval, S.; Walsh, M. Low Energy Laser Treatments Control Annual Ryegrass (Lolium rigidum). Front. Agron. 2021, 2, 35. [Google Scholar] [CrossRef]
  39. Nadimi, M.; Sun, D.; Paliwal, J. Recent applications of novel laser techniques for enhancing agricultural production. Laser Phys. 2021, 31, 053001. [Google Scholar] [CrossRef]
  40. Woo, S.; Uyeh, D.D.; Kim, J.; Kim, Y.; Kang, S.; Kim, K.C.; Lee, S.Y.; Ha, Y.; Lee, W.S. Analyses of Work Efficiency of a Strawberry-Harvesting Robot in an Automated Greenhouse. Agronomy 2020, 10, 1751. [Google Scholar] [CrossRef]
  41. Sorour, M.; From, P.J.; Elgeneidy, K.; Kanarachos, S.; Sallam, M. Produce Harvesting by Laser Stem-Cutting. In Proceedings of the 2022 IEEE 18th International Conference on Automation Science and Engineering (CASE), Mexico City, Mexico, 22–26 August 2022; pp. 487–492. [Google Scholar] [CrossRef]
  42. Raycus. 50W Q-Switched Pulse Fiber Laser. Available online: https://en.raycuslaser.com/products/50w-q-switched-pulse-fiber-laser.html (accessed on 22 March 2025).
  43. Grimstad, L.; From, P.J. Thorvald II—A Modular and Re-configurable Agricultural Robot. IFAC-PapersOnLine 2017, 50, 4588–4593. [Google Scholar] [CrossRef]
  44. xARM. xARM Collaborative Robot. Available online: https://www.ufactory.cc/xarm-collaborative-robot (accessed on 14 March 2025).
  45. Rusu, R.B.; Cousins, S. 3D is here: Point Cloud Library (PCL). In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Shanghai, China, 9–13 May 2011. [Google Scholar]
  46. Ge, Y.; Xiong, Y.; From, P.J. Instance Segmentation and Localization of Strawberries in Farm Conditions for Automatic Fruit Harvesting. IFAC-PapersOnLine 2019, 52, 294–299. [Google Scholar] [CrossRef]
  47. Meng, Z.; Du, X.; Sapkota, R.; Ma, Z.; Cheng, H. YOLOv10-pose and YOLOv9-pose: Real-time strawberry stalk pose detection models. Comput. Ind. 2025, 165. [Google Scholar] [CrossRef]
  48. Sorour, M.; From, P.J.; Elgeneidy, K.; Kanarachos, S.; Sallam, M. Compact Strawberry Harvesting Tube Employing Laser Cutter. In Proceedings of the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Kyoto, Japan, 23–27 October 2022; pp. 8956–8962. [Google Scholar]
Figure 1. The overall developed harvesting system in (a) and a closeup of the laser-powered trapping/cutting tool in (b).
Figure 2. The harvesting system anatomy in side and top views. Relevant coordinate frames are shown with the x, y, and z axes in red, green and blue arrows, respectively. Vision cone of first and second RGB-D cameras in grey and light red, respectively. The selected dimensions of interest are in centimeters.
Figure 3. Operation logic, with the harvesting tool at the x-y coordinates of the strawberry in (a), eventually encapsulating it by moving upwards in z-axis (b). The stem is precisely entrapped into a groove in (c) while tolerating fruit localization error, followed by triggering the laser beam in (d) until fruit detachment.
Figure 4. Setup for evaluating the impact of laser spot diameter on the stem piercing velocity, featuring 7 hangers corresponding to differing spot diameters which are tested for coarse tuning.
Figure 5. Effect of laser spot diameter on the pierce constant.
Figure 6. Reduced scene point cloud focusing on the area of interest with a black bounding box marking the located strawberries.
Figure 7. Snapshots of a complete strawberry harvesting cycle, starting in (a) and ending in (g) with the detachment of a single strawberry. Steps in between feature the z-axis retract (b) and approach (d), moving to the x, y-axis strawberry location (c), stem trapping (e), and laser cutter engagement (f). The lower row shows close-up images captured by the tool-observing camera. Tool movement along the x, y, and z axes is indicated by green, red, and blue arrows, respectively.
Table 1. Average stem piercing velocity v ¯ p and the stem pierce constant C p at different laser spot diameters ϕ l s .
| $\phi_{ls}$ (mm) | $\bar{\phi}_s$ (mm) | $\bar{t}_p$ (s) | $\bar{v}_p$ (mm/s) | $C_p$ (mm²/s) |
|---|---|---|---|---|
| 3.79 | 2.3 | 32.0 | 0.07 | 0.26 |
| 3.18 | 2.3 | 19.2 | 0.12 | 0.38 |
| 2.56 | 2.2 | 8.92 | 0.25 | 0.64 |
| 1.94 | 2.3 | 6.93 | 0.34 | 0.66 |
| 1.32 | 2.4 | 2.61 | 0.92 | 1.21 |
| 0.71 | 2.3 | 0.96 | 2.42 | 1.72 |
| 0.09 | 2.2 | 0.71 | 3.09 | 0.28 |
Table 2. Results of fine tuning the laser spot diameter ϕ l s , with the best performing in bold.
| $\phi_{ls}$ (mm) | $\bar{\phi}_s$ (mm) | $\bar{t}_p$ (s) | $\bar{v}_p$ (mm/s) | $C_p$ (mm²/s) |
|---|---|---|---|---|
| 0.5 | 2.1 | 1.16 | 1.81 | 0.90 |
| 0.6 | 2.4 | 1.34 | 1.79 | 1.07 |
| 0.8 | 2.1 | 1.50 | 1.4 | 1.14 |
| **0.9** | **2.2** | **1.47** | **1.49** | **1.36** |
| 1.0 | 2.1 | 1.74 | 1.21 | 1.23 |
| 1.1 | 2.3 | 2.52 | 0.91 | 1.02 |
Table 3. Localization algorithm parameters.
| Parameter | Value | Parameter | Value |
|---|---|---|---|
| $x_l^+$ | 55 ¹ | $r_{th}$ | 100 |
| $x_l^-$ | 25 ¹ | $g_{th}$ | 70 |
| $y_l^+$ | 30 ¹ | $b_{th}$ | 70 |
| $y_l^-$ | 30 ¹ | $t$ | 0.02 ¹ |
| $z_l^+$ | 50 ¹ | $s_{min}$ | 20 |
| $z_l^-$ | 30 ¹ | $s_{max}$ | 1000 |

¹ Value in centimeters.