Identification of Technical Requirements for the Initial Design of an Autonomous Fruit-Harvesting Robot
Abstract
1. Introduction
2. Methodology
3. Operational Requirements
3.1. Fruit Identification and Selection
3.1.1. Advanced Computer Vision Systems
- RGB cameras: These capture color images, from which fruit color, a key visual indicator of ripeness, can be extracted; the images are processed to check whether the fruit has reached its characteristic ripe color [10].
- Hyperspectral cameras: These capture images across a wide range of wavelengths, beyond what the human eye can see, making it possible to analyze internal characteristics of the fruit, such as sugar content and firmness, that are important in determining ripeness [11]. Figure 2 shows a fruit recognition system based on computer vision.
- Pre-processing: Improves image quality and removes noise. Techniques such as illumination normalization and noise filtering are common at this stage [12].
- Image segmentation: Separates the fruit from the image background. This can be achieved with segmentation algorithms such as color thresholding, edge-based segmentation, or neural networks for semantic segmentation.
- Feature extraction: Identifies specific characteristics of the fruit, such as color, texture, shape, and size, which are then used to determine its state of ripeness [13]. A minimal sketch of this three-stage pipeline is shown after this list.
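To make the pipeline concrete, the following is a minimal sketch in Python with OpenCV of the three stages just described. The HSV "ripe" band, the blur kernel, and the input file name are illustrative assumptions for a red-ripening fruit, not values from this article.

```python
import cv2
import numpy as np

def analyze_fruit(path: str) -> dict:
    """Pre-process, segment by color threshold, and extract simple features."""
    img = cv2.imread(path)

    # Pre-processing: suppress sensor noise before segmentation.
    img = cv2.GaussianBlur(img, (5, 5), 0)

    # Segmentation by color threshold: keep pixels whose hue falls in an
    # assumed "ripe" band (red-orange here; crop-specific in practice).
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 80, 80]), np.array([15, 255, 255]))

    # Feature extraction: size, shape, and mean hue of the largest region.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return {"detected": False}
    fruit = max(contours, key=cv2.contourArea)
    area = cv2.contourArea(fruit)
    perimeter = cv2.arcLength(fruit, True)
    circularity = 4 * np.pi * area / perimeter**2 if perimeter else 0.0
    mean_hue = float(cv2.mean(hsv, mask=mask)[0])
    return {"detected": True, "area_px": area,
            "circularity": circularity, "mean_hue": mean_hue}
```

In a deployed system, the fixed color threshold would typically be replaced by the learned segmentation or classification models discussed in Section 3.1.2.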
3.1.2. Machine Learning Algorithms
3.1.3. Proximity and Touch Sensors
Proximity Sensors
- Ultrasonic sensors: They emit high-frequency sound waves and measure the time it takes for the echo to return after bouncing off an object. The distance is calculated from the flight time of the sound [27]; a time-of-flight sketch is given after this list.
- Infrared (IR) sensors: They emit infrared light and detect the amount of light reflected by a nearby object. The distance is determined by the intensity of the reflected light [28].
- Inductive sensors: They generate a magnetic field and detect disturbances in it when a metal object approaches. They are less common in agriculture but useful in industrial settings.
- Fruit detection: Identifying the presence of fruits in the robot's field of action.
- Navigation and obstacle avoidance: Helping the robot move through the field without colliding with plants, branches, or other objects.
- Precise positioning: Guiding the robotic arm's approach to the fruit to ensure damage-free harvesting [29].
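As an illustration of the time-of-flight principle behind ultrasonic ranging [27], the sketch below converts a round-trip echo time into a distance. The temperature model is a standard approximation, and the example echo time is an assumed value.

```python
# The echo travels to the object and back, so the one-way distance
# is half the round-trip path covered at the speed of sound.
def sound_speed(temp_c: float) -> float:
    """Approximate speed of sound in air (m/s) at a given temperature."""
    return 331.3 + 0.606 * temp_c

def tof_distance(echo_time_s: float, temp_c: float = 20.0) -> float:
    """Distance (m) from the round-trip echo time of an ultrasonic pulse."""
    return sound_speed(temp_c) * echo_time_s / 2.0

# Example: a 5.83 ms echo at 20 degC corresponds to roughly 1 m.
print(f"{tof_distance(5.83e-3):.2f} m")
```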
Touch Sensors
- Piezoelectric sensors: They generate an electrical charge when pressure is applied. The amount of charge generated is proportional to the force applied [30].
- Resistive sensors: They change their electrical resistance in response to pressure, and the variation in resistance is used to measure the force applied [31]; a reading sketch is given after this list.
- Capacitive sensors: They detect changes in capacitance when pressure is applied to a sensitive surface. They are sensitive and accurate, suitable for detecting light forces [32].
- Firmness assessment: Determining ripeness from firmness, which is a key indicator in some fruits.
- Gentle handling: Allowing the robot to apply the correct amount of force when grasping and picking fruit, preventing damage and ensuring gentle picking [33].
- Real-time feedback: Providing real-time information on contact and applied force, allowing immediate adjustments during picking to improve accuracy and reduce damage to fruits [34].
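The following sketch illustrates how a resistive touch sensor reading could be turned into a grip-force estimate for gentle handling. The voltage-divider wiring, the F ≈ k/R force model, and the 2 N damage threshold are assumptions made for illustration, not calibrations from this work.

```python
# Reading a resistive touch sensor through a voltage divider: the sensor
# resistance drops under pressure, raising the ADC output voltage.
V_SUPPLY = 3.3       # supply voltage (V), assumed
R_FIXED = 10_000.0   # fixed divider resistor (ohms), assumed

def sensor_resistance(v_out: float) -> float:
    """Sensor resistance (ohms) from the divider output voltage."""
    return R_FIXED * (V_SUPPLY - v_out) / v_out

def grip_force(v_out: float, k: float = 5e4) -> float:
    """Approximate force (N) assuming F ~ k / R, a common FSR-style model."""
    return k / sensor_resistance(v_out)

# Example: decide whether the gripper is pressing too hard on the fruit.
if grip_force(v_out=1.8) > 2.0:   # 2 N damage threshold, assumed
    print("Reduce grip force")
```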
3.2. Navigation and Mobility
3.2.1. Sensors and Awareness of the Environment
- Cameras and vision systems: They capture images of the environment to detect ripe fruits, obstacles, and other relevant elements in the field [38]. These may be monocular, stereoscopic, or even multispectral cameras, depending on the specific detection needs. Beyond fruit recognition, the same cameras can feed algorithms that steer the robot along an unobstructed route around the crop field [39].
- Proximity sensors: They detect the presence of nearby objects, such as plants or structures, to avoid collisions and plan safe routes [40].
- Location systems: They use GPS, GNSS, or other methods to determine the robot's precise position in the field, which must be known accurately so that the robot does not deviate from the path laid out for harvesting [41]. If the robot does drift from its planned route, these systems allow real-time corrections that maintain accuracy and efficiency, optimizing the operating time and resources used in the field.
- Odometry sensors: They monitor the robot's movement and direction to calculate the distance traveled and the current orientation. This information allows the robot to move precisely along the rows of plants and between crops, ensuring that each fruit is harvested efficiently and without damage [42]. By tracking the distance traveled, odometry also helps optimize the use of energy and resources during operation, increasing autonomy and reducing downtime; a dead-reckoning sketch is given after this list.
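The sketch below shows minimal dead reckoning for a differential-drive base, turning encoder ticks on each wheel into a pose update. Wheel radius, track width, and encoder resolution are assumed example values, not specifications from this article.

```python
import math

WHEEL_RADIUS = 0.15      # m, assumed
TRACK_WIDTH = 0.60       # m between wheels, assumed
TICKS_PER_REV = 2048     # encoder resolution, assumed

def update_pose(x, y, theta, left_ticks, right_ticks):
    """Advance the robot pose (x, y in m, theta in rad) from encoder ticks."""
    d_left = 2 * math.pi * WHEEL_RADIUS * left_ticks / TICKS_PER_REV
    d_right = 2 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
    d_center = (d_left + d_right) / 2            # distance traveled
    d_theta = (d_right - d_left) / TRACK_WIDTH   # heading change
    # Integrate along the arc using the mid-point heading.
    x += d_center * math.cos(theta + d_theta / 2)
    y += d_center * math.sin(theta + d_theta / 2)
    return x, y, theta + d_theta
```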
3.2.2. Route Planning and Navigation
- Mapping the environment: Before starting any harvesting operation, the robot uses its sensors, such as cameras and perception systems, to create a detailed map of the agricultural environment. This map includes information on the arrangement of the rows of plants, the location of ripe fruits, obstacles such as trees or agricultural structures, and other relevant elements [43].
- Route planning: Based on the generated map and the harvesting objectives, the robot plans the best route to move efficiently between the rows of plants and collect the identified fruits. The robot decides the order in which it will pick the fruits based on criteria such as proximity, accessibility, and picking efficiency; for example, it can prioritize fruits that are closer or in a more accessible position. Using route-planning algorithms, such as the A* algorithm, the robot calculates the shortest and safest route to reach each point of interest (see the sketch after this list). During route planning, the robot also considers how to avoid obstacles detected in the environment [44].
- Autonomous navigation: During operation, the robot uses navigation algorithms that combine information from the map of the environment, the robot’s current position, and sensor data to make real-time decisions about direction and speed of movement.
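The sketch below illustrates grid-based route planning with A*, as referenced above. The field is modeled as an occupancy grid (0 = free, 1 = obstacle); the grid layout and unit step costs are illustrative assumptions, not the authors' implementation.

```python
import heapq

def astar(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid, or None if blocked."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, [start])]               # (f, g, node, path)
    seen = set()
    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = node[0] + dr, node[1] + dc
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == 0:
                heapq.heappush(open_set, (g + 1 + h((r, c)), g + 1,
                                          (r, c), path + [(r, c)]))
    return None

# Example: plan around a row of plants blocking the middle of a small field.
field = [[0, 0, 0, 0],
         [1, 1, 1, 0],
         [0, 0, 0, 0]]
print(astar(field, (0, 0), (2, 0)))
```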
- GPS (Global Positioning System): GPS works by receiving signals from satellites in orbit. A GPS receiver in the robot picks up these signals and uses them to calculate the robot's position in terms of geographic coordinates (latitude, longitude, and altitude) [45]. Using the trilateration technique, the GPS receiver calculates its exact position by measuring the time it takes for signals to travel from several satellites to the receiver. GPS provides the robot with information about its location in the crop field, which is crucial for planning efficient harvesting routes and ensuring that the robot covers the entire growing area without skipping any sections. In large fields where multiple robots may be operating simultaneously, GPS helps coordinate movements between robots to avoid collisions and optimize field coverage. The data collected, such as the locations of ripe fruits, can be georeferenced, allowing subsequent analysis of the distribution and condition of the crop [46].
- LiDAR (Light Detection and Ranging): LiDAR works by emitting pulses of laser light into the environment and measuring the time it takes for the light to reflect and return to the sensor. By measuring these distances in multiple directions, LiDAR creates a three-dimensional "point cloud" that accurately represents the structure of the environment [47]. LiDAR allows the robot to detect obstacles in its immediate environment, such as plants, trees, and agricultural structures, in real time, which is essential to avoid collisions and damage to both the robot and the crop. By generating a detailed three-dimensional map of the environment, LiDAR helps the robot navigate precisely between rows of plants and locate ripe fruits for picking. The detailed environmental information provided by LiDAR also allows the robot to position its harvesting arms and tools with pinpoint accuracy, ensuring effective harvesting without damaging surrounding fruits or plants [48].
- GPS and LiDAR integration: While GPS provides the robot's global location in the field, LiDAR provides detailed mapping of the immediate environment. By combining these data, the robot can plan and execute harvesting routes more efficiently and safely. GPS and LiDAR data are integrated using sensor fusion algorithms that combine the advantages of both systems to improve navigation accuracy and obstacle detection; a minimal fusion sketch is given after this list. As the robot moves, GPS and LiDAR data are continuously updated, allowing the robot to adapt to changes in the environment and maintain an optimal trajectory. Figure 4 shows a robot in the process of harvesting; to achieve this, it must have a very precise positioning system due to the difficult conditions of its environment.
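The sketch below illustrates the sensor-fusion idea in one dimension: a Kalman-style update that blends a noisy GPS fix with a more precise LiDAR-derived position estimate. The noise variances are assumed example values, not figures from this article.

```python
def fuse(gps_pos, gps_var, lidar_pos, lidar_var):
    """Variance-weighted fusion of two position estimates (m, m^2)."""
    k = gps_var / (gps_var + lidar_var)   # gain toward the LiDAR estimate
    fused_pos = gps_pos + k * (lidar_pos - gps_pos)
    fused_var = (1 - k) * gps_var         # fused estimate is more certain
    return fused_pos, fused_var

# Example: a coarse GPS fix (sigma ~0.5 m) corrected by LiDAR row
# matching (sigma ~0.05 m); the result tracks the LiDAR estimate closely.
pos, var = fuse(gps_pos=12.40, gps_var=0.25, lidar_pos=12.62, lidar_var=0.0025)
print(f"fused position: {pos:.2f} m (variance {var:.4f})")
```

The same weighting generalizes to the multidimensional Kalman filters commonly used in field robot navigation stacks.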
3.3. Environmental Conditions
3.3.1. Ground Conditions
3.3.2. Characteristics of the Crop
3.3.3. Environmental Factors
3.3.4. Interaction with Other Elements
3.4. Technical Specifications
3.4.1. Dimensions
3.4.2. Mobility
3.4.3. Perception Systems
3.4.4. Navigation Systems
3.4.5. Processing and Control
3.4.6. Gathering Capacity
3.4.7. Autonomy
3.4.8. Interaction and Security
3.4.9. Environmental Conditions
3.4.10. Different Types of Soil
4. Real Example
5. Economic Evaluation
6. Discussion
7. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Kragh, M.; Hansen, S.M.; Jørgensen, R.N. Agricultural Robot Navigation using 3D Vision and LiDAR. Robot. Auton. Syst. 2017, 92, 195–206. [Google Scholar]
- Bargoti, S.; Underwood, J. Deep Fruit Detection in Orchards. IEEE Robot. Autom. Lett. 2017, 2, 940–947. [Google Scholar]
- Green, M.A.; Hishikawa, Y.; Dunlop, E.D.; Levi, D.H.; Hohl-Ebinger, J.; Ho-Baillie, A.W.Y. Solar Cell Efficiency Tables (Version 45). Prog. Photovoltaics Res. Appl. 2015, 23, 3–15. [Google Scholar] [CrossRef]
- Kang, S.; Kim, J.; Park, H. Immersion Testing of Robotics in Water Environments. IEEE Robot. Autom. Lett. 2018, 3, 965–972. [Google Scholar]
- Romero, P.; García, E.; González, R. Weather-Resistant Design for Outdoor Robotics. J. Field Robot. 2019, 36, 943–959. [Google Scholar]
- Gonzalez, R.C.; Woods, R.E. Digital Image Processing; Pearson Prentice Hall: London, UK, 2008. [Google Scholar]
- Alur, R.; Gupta, P.; Patel, K.; Sharma, N. Ingress Protection (IP) Testing and Classification. Int. J. Adv. Res. Eng. Technol. 2020, 11, 17–24. [Google Scholar]
- Alvarado, J.V.A.; Molina, M.A.C. Classification of fruits based on convolutional neural networks. Polo Conoc. Rev. Científica-Prof. 2020, 5, 3–22. [Google Scholar]
- Bac, C.W.; Hemming, J.; van Tuijl, B.A.J.; Barth, R.; Wais, E.; van Henten, E.J. Harvesting Robots for High-Value Crops: State-of-the-Art Review and Challenges Ahead. J. Field Robot. 2014, 31, 888–911. [Google Scholar] [CrossRef]
- Vasconez, J.P.; Kantor, G.A.; Pérez, J.A. Agricultural Robotics: Unmanned Robotic Service Units in Agricultural Tasks. IEEE Ind. Electron. Mag. 2019, 13, 48–58. [Google Scholar]
- Bakhtiari, F.; Fathian, M.; Khalilian, A.; Ahmad, D. Modular Robotics for Agriculture: From Concept to Implementation. Agric. Eng. Int. CIGR J. 2020, 22, 50–61. [Google Scholar]
- Van Henten, E.J.; Hemming, J.; Van Tuijl, B.A.J.; Kornet, J.G.; Meuleman, J.; Bontsema, J.; Van Os, E.A. An Autonomous Robot for Harvesting Cucumbers in Greenhouses. Auton. Robot. 2003, 13, 241–258. [Google Scholar] [CrossRef]
- Gu, L.; Zhang, Q.; Liu, H.; Zhang, D. Waterproofing Technology for Robotic Applications. IEEE Trans. Robot. 2016, 32, 627–636. [Google Scholar]
- Radhakrishnan, S.; Balasubramanian, R.; Subramanian, V. Terrain Adaptation for Agricultural Robots. Int. J. Agric. Robot. 2016, 2, 45–59. [Google Scholar]
- Riek, L.D. Healthcare Robotics. Commun. ACM 2017, 60, 68–78. [Google Scholar] [CrossRef]
- Zhang, Z.; He, L.; Wang, S. Development and Field Evaluation of a Robotic Apple Harvester. Biosyst. Eng. 2020, 193, 166–179. [Google Scholar]
- Bechar, A.; Vigneault, C.; Ben-Zion, M. Safety Mechanisms in Agricultural Robotics. Robot. Auton. Syst. 2021. [Google Scholar]
- Boykov, Y.; Funka-Lea, G. Graph Cuts and Efficient N-D Image Segmentation. Int. J. Comput. Vis. 2006, 70, 109–131. [Google Scholar] [CrossRef]
- Bishop, C.M.; Nasrabadi, N.M. Pattern Recognition and Machine Learning; Springer: New York, NY, USA, 2006; Volume 4, p. 738. [Google Scholar]
- Centenaro, M.; Vangelista, L.; Zanella, A.; Zorzi, M. Long-Range Communications in Unlicensed Bands: The Rising Stars in the IoT and Smart City Scenarios. IEEE Wirel. Commun. 2016, 23, 60–67. [Google Scholar] [CrossRef]
- Scheiner, L.; Plapper, P.; Lasnier, A. Proximity Sensing for Safe Human-Robot Interaction in Collaborative Working Cells. Robot. Auton. Syst. 2017, 98, 219–234. [Google Scholar]
- Kempkes, F.L.K.; Wopereis, H.A.N.; van Dijk, C.J. Robotic Harvesting of Fruit in Orchards: Simulation and Field Tests. Int. J. Agric. Robot. 2018, 4, 77–92. [Google Scholar]
- Lehnert, C.; English, A.; Perez, T.; McCool, C. Field Robotics: Achieving Robustness in Agricultural Environments. Robot. Auton. Syst. 2018. [Google Scholar]
- Chen, W.; Liu, H.; He, Y. Design and Simulation of Ventilation Systems for Robotic Applications. J. Mech. Des. 2017, 139, 071405. [Google Scholar]
- Pérez, L.J.; Núñez, F.; Sánchez, J.; Guerrero, J. Grape Detection and Grading Based on Hyperspectral Imaging and Machine Learning. Comput. Electron. Agric. 2019, 162, 12–19. [Google Scholar]
- Sabatini, M.; Giorgioni, M.; Rinaldi, R. Wireless Communication in Precision Agriculture. Comput. Electron. Agric. 2018, 151, 141–150. [Google Scholar]
- Underwood, J.; Hung, C.; Sukkarieh, S. Mapping and Classification of Orchards Using LiDAR Data. J. Field Robot. 2016, 33, 927–945. [Google Scholar]
- Qin, J.; Burks, T.F.; Kim, D.G.; Choi, D. Hyperspectral Imaging for Detecting Bruises in Apples. J. Food Eng. 2013, 114, 1–10. [Google Scholar]
- Hsieh, C.H.; Wu, S.H.; Chen, C.P. Energy-Efficient Robotic Harvesting Systems. IEEE Trans. Autom. Sci. Eng. 2014, 11, 806–816. [Google Scholar]
- Chin, H.L.; Su, Y.H.; Li, W.L. Solar Energy Harvesting for Agricultural Robots. J. Field Robot. 2011, 28, 111–120. [Google Scholar]
- Cubero, S.; Aleixos, N.; Molto, E.; Blasco, J. Advances in Machine Vision Applications for Automatic Inspection and Quality Evaluation of Fruits and Vegetables. Food Bioprocess Technol. 2016, 9, 280–298. [Google Scholar] [CrossRef]
- Wang, D.; Zhang, Y.; Sun, Y. Obstacle Detection and Avoidance for Mobile Robots in Agriculture. Int. J. Adv. Robot. Syst. 2017, 14, 1729881417717054. [Google Scholar]
- Dong, S.; Wang, X.; Li, Y. Energy-Efficient Design for Field Robots. Appl. Energy 2020. [Google Scholar]
- Silwal, A.; Gongal, A.; Karkee, M.; Zhang, Q.; Lewis, K. Design of an Apple Harvesting Robot: Adaptive and Delicate Picking Mechanism. Trans. ASABE 2017, 60, 1501–1510. [Google Scholar]
- Dupont, C.; Moreau, R.; Langlois, G. Real-Time Data Processing in Agricultural Robotics. IEEE Trans. Autom. Sci. Eng. 2021. [Google Scholar]
- Esram, T.; Chapman, P.L. Comparison of Photovoltaic Array Maximum Power Point Tracking Techniques. IEEE Trans. Energy Convers. 2007, 22, 439–449. [Google Scholar] [CrossRef]
- Jiang, H.; Yu, P.; Wang, J. Corrosion Resistant Materials for Robotics in Harsh Environments. Mater. Res. Express 2019, 6, 026402. [Google Scholar]
- Almada, J.M.; Rodríguez, P.L.; González, A.M.; Hernández, S.R.; Martínez, T.F.; Pérez, C.R. Advances in Materials for Robotic Applications. J. Mater. Sci. 2018, 53, 3989–4014. [Google Scholar]
- Fraisse, G.; Boulard, T.; Baille, A. Energy Harvesting in Agriculture: Solar Power for Agricultural Robots. Renew. Energy 2011, 36, 1374–1379. [Google Scholar]
- Shukla, S.; Verma, P.; Sharma, A. Autonomous Navigation in Agricultural Fields Using GPS and LIDAR Sensors. Sensors 2015, 15, 26891–26906. [Google Scholar]
- Gupta, M.; Chauhan, N.; Kaur, R. Advances in Sensing and Communication Technologies for Robotic Fruit Harvesting: A Review. IEEE Access 2019, 7, 95392–95412. [Google Scholar]
- Häberli, L.; Lange, S.; Hutter, M.; Siegwart, R. Climate Chamber Testing of Robotic Systems for Extreme Environments. Robot. Auton. Syst. 2019, 117, 240–249. [Google Scholar]
- Hammoudeh, M.; Newman, R.; Mount, M. A Wireless Sensor Network Border Monitoring System: Deployment Issues and Routing Protocols. IEEE Sens. J. 2015, 15, 3863–3872. [Google Scholar] [CrossRef]
- Hellström, T.; Ringdahl, O.; Svensson, A. An Autonomous Robot for Harvesting Strawberries in Greenhouses. Robot. Auton. Syst. 2012, 61, 1122–1130. [Google Scholar]
- Zhang, Y.; Jiang, L.; Lim, S. Advances in Silicon Solar Cells. Annu. Rev. Mater. Res. 2017, 47, 61–92. [Google Scholar]
- van Henten, E.J.; Barth, R.; Hemming, J.; Edan, Y. Field Robots for Harvesting: State of the Art and Future Challenges. Int. J. Robot. Autom. 2016, 31, 299–324. [Google Scholar]
- Hinton, G.; Deng, L.; Yu, D.; Dahl, G.E.; Mohamed, A.; Jaitly, N.; Kingsbury, B. Deep Learning for Vision-Based Fruit Recognition. IEEE Trans. Neural Netw. Learn. Syst. 2012. [Google Scholar]
- Tang, L.; Zhou, J.; Liu, S. Cost-Benefit Analysis of Robotic Systems in Agriculture. Precis. Agric. 2021, 22, 1247–1262. [Google Scholar]
- Johnson, R.B.; Wiles, J. Effective User Interface Design for Robotic Systems. Hum. Comput. Interact. Stud. Robot. Autom. Syst. 2003, 5, 133–149. [Google Scholar]
- Kamei, K.; Nishio, S.; Hagita, N.; Sato, M. Cloud Networked Robotics. IEEE Netw. 2012, 26, 28–34. [Google Scholar] [CrossRef]
- Kennedy, J.; Eberhart, R. Particle Swarm Optimization. In Proceedings of the ICNN’95—International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar]
- Kim, J.; Lee, S.; Park, C. Solar-Powered Autonomous Robots for Agriculture. Renew. Energy 2019. [Google Scholar]
- Kondo, N. Automation on Fruit and Vegetable Grading System and Food Traceability. Trends Food Sci. Technol. 2010, 21, 145–152. [Google Scholar] [CrossRef]
- Siciliano, B.; Khatib, O. Springer Handbook of Robotics; Springer: Berlin/Heidelberg, Germany, 2016. [Google Scholar]
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet Classification with Deep Convolutional Neural Networks. Adv. Neural Inf. Process. Syst. 2012, 25, 1097–1105. [Google Scholar] [CrossRef]
- Li, H.; Lee, W.S.; Burks, T. Autonomous Orchard Vehicle for Fruit Harvesting: Navigation and Localisation. Biosyst. Eng. 2017, 153, 109–120. [Google Scholar]
- Long, J.; Shelhamer, E.; Darrell, T. Fully Convolutional Networks for Semantic Segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 3431–3440. [Google Scholar]
- Pesaran, A.; Santhanagopalan, S.; Kim, G.H. Battery Thermal Management System Design Modeling. J. Power Sources 2013, 238, 301–312. [Google Scholar]
- Pedersen, S.M.; Fountas, S.; Blackmore, B.S.; Gylling, M.; Pedersen, J.L. Agricultural Robots – System Analysis and Economic Feasibility. Precis. Agric. 2008, 9, 341–366. [Google Scholar] [CrossRef]
- Marucci, A.; Caprara, C.; Frontoni, E.; Iacchetti, A.; Longhi, S. Safety Protocols for Human-Robot Interaction in Agriculture. Agron. Res. 2019, 17, 347–356. [Google Scholar]
- Miller, T.; Redfield, S.; Smith, B. System-Level Fault Detection for Autonomous Robots. Robot. Auton. Syst. 2017, 91, 112–122. [Google Scholar]
- Palacin, M.R. Recent Advances in Rechargeable Battery Materials: A Chemist’s Perspective. Chem. Soc. Rev. 2016, 45, 2747–2786. [Google Scholar] [CrossRef] [PubMed]
- Murphy, R.R. Disaster Robotics; MIT Press: Cambridge, MA, USA, 2014. [Google Scholar]
- Neretti, A.; Di Carlo, M.; Messina, P. Performance of Solar Panels in Dusty Environments. Sol. Energy 2016, 135, 158–166. [Google Scholar]
- LaValle, S.M. Planning Algorithms; Cambridge University Press: Cambridge, UK, 2006. [Google Scholar]
- Nitta, N.; Wu, F.; Lee, J.T.; Yushin, G. Li-ion Battery Materials: Present and Future. Mater. Today 2015, 18, 252–264. [Google Scholar] [CrossRef]
- Sa, I.; Ge, Z.; Dayoub, F.; Upcroft, B.; Perez, T.; McCool, C.; Corke, P. DeepFruits: A Fruit Detection System Using Deep Neural Networks. Sensors 2016, 16, 1222. [Google Scholar] [CrossRef]
- Zhou, Q.; Liu, X.; Meng, Q. Environmental Sensing and Adaptation in Robotic Systems. IEEE Sens. J. 2021, 21, 11372–11381. [Google Scholar]
- Pajarola, R. Efficient Level-of-Detail Computation of Point-Based Representations. IEEE Trans. Vis. Comput. Graph. 2002, 10, 70–83. [Google Scholar]
- Otsu, N. A Threshold Selection Method from Gray-Level Histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66. [Google Scholar] [CrossRef]
- Van Henten, E.J.; Bac, C.W.; Van Tuijl, B.A.J.; Barth, R.; Hemming, J.; Bontsema, J. Robotic Harvesting of Fruit: Precision and Delicacy. Biosyst. Eng. 2006, 95, 135–147. [Google Scholar]
- Huang, Y.; Shen, W.; Jiang, L. Active Cooling Systems for Robotic Applications. J. Therm. Sci. Eng. Appl. 2015, 7, 021009. [Google Scholar]
- Patil, D.; Oza, K. Application of Image Processing in Fruit and Vegetable Analysis: A Review. J. Adv. Res. Comput. Commun. Eng. 2018, 7, 567–570. [Google Scholar]
- Shamshiri, R.R.; Weltzien, C.; Hameed, I.A.; Jablonowski, N.D.; Ball, A.; Cheein, F.A. Research and Development in Agricultural Robotics: A Perspective of Digital Farming. Int. J. Agric. Biol. Eng. 2018, 11, 4. [Google Scholar] [CrossRef]
- Slaughter, D.C.; Giles, D.K.; Downey, D. Vision-Based Sensing for Automation of Agricultural Vehicles. Comput. Electron. Agric. 2008, 59, 64–78. [Google Scholar]
- Pérez-Ruiz, M.; Agüera, J.; Rodríguez-Lizana, A. Mapping Orchard Yield and Efficiency using a Robot. Comput. Electron. Agric. 2013, 95, 54–66. [Google Scholar]
- Norman, D.A. The Design of Everyday Things: Revised and Expanded Edition; Basic Books: New York, NY, USA, 2013. [Google Scholar]
- Rotz, S.; Fraser, E.D.G.; Martin, R.C. Economic Analysis of Robot Harvesters in Agriculture. Agric. Syst. 2020, 183, 102874. [Google Scholar]
Category | Samples | Percentage (%)
---|---|---
Banana | 1067 | 7.89
Cherry | 1001 | 7.41
Strawberry | 1004 | 7.43
Lemon | 1029 | 7.61
Tangerine | 1005 | 7.44
Mango | 1013 | 7.49
Apple | 1032 | 7.64
Blackberry | 1086 | 8.03
Orange | 1038 | 7.68
Papaya | 1026 | 7.59
Pear | 1070 | 7.92
Pineapple | 1018 | 7.53
Grape | 1127 | 8.34
Specification | Detail |
---|---
Dimensions | |
Length | 1.5 m |
Width | 0.8 m |
Height | 1.2 m |
Weight | 120 kg |
Mobility | |
Traction type | Wheels with all-terrain capability |
Wheel drive | 4-wheel drive (4WD) for better terrain adaptability |
Maximum speed | 1.5 km/h |
Slope capacity | Up to 20 degrees |
Turning radius | 0.75 m |
Obstacle clearance | 25 cm |
Perception Systems | |
Cameras | RGB camera, multispectral camera |
LiDAR | 3D LiDAR with 30 m range
Proximity sensors | Ultrasound, infrared |
Touch sensors | Pressure sensors on the harvesting arm |
Additional sensors | Depth camera for enhanced spatial perception |
Navigation System | |
GPS | Accuracy of ±2 cm |
RTK-GPS | Real-time kinematic GPS for precise positioning |
Odometry sensors | Encoders on the wheels |
IMU | Inertial measurement unit for motion tracking |
Localization and mapping | SLAM (Simultaneous Localization and Mapping)
Processing and Control | |
Processing unit | Multicore CPU, GPU for image processing |
AI algorithms | Neural networks for fruit detection, route planning |
Communication protocols | ROS (Robot Operating System) |
Control system | PID and fuzzy control for accurate motor coordination |
Gathering Capacity | |
Robotic arms | 1–2 robotic arms with adaptive grippers |
Actuation system | Electric actuators with feedback control |
End-effector type | Soft robotic gripper for delicate fruit handling |
Time per pickup | 5–10 s per fruit |
Load capacity | Up to 20 kg of fruit harvested |
Maximum reach | 1.5 m |
Power System and Autonomy | |
Battery type | Lithium-ion battery, 48 V, 100 Ah |
Battery life | 10 h of continuous operation |
Power consumption | 500 W to 1000 W (depending on tasks) |
Charging time | 4–6 h for a full charge |
Solar charging | Optional solar panels for extended range |
Energy management | Smart power distribution to critical components |
Interaction and Security | |
Communication | Wi-Fi, Bluetooth, 4G connectivity for remote monitoring |
User interface | Touch screen, mobile app, voice command option |
Safety sensors | Proximity sensors, emergency stop buttons, vision-based obstacle detection |
Security features | Encrypted communication, password-protected access |
Compliance | CE certification for machinery safety |
Environmental Conditions | |
Operating temperature | −10 °C to 40 °C |
Water resistance | IP65-rated for protection against dust and water jets |
Relative humidity | 10–90% non-condensing |
Wind resistance | Operational under wind speeds of up to 15 m/s |
Ground adaptability | Capable of operating on loose soil, gravel, and wet surfaces |
Maintenance and Lifespan | |
Maintenance interval | Routine check every 100 h of operation |
Spare parts | Modular design for easy replacement of damaged parts |
Expected lifespan | 10 years with regular maintenance |
Software updates | Over-the-air (OTA) updates for control software |