Optimizing Waste Sorting for Sustainability: An AI-Powered Robotic Solution for Beverage Container Recycling
Abstract
1. Introduction
2. Literature Review
2.1. Current State of Waste Classification Systems
2.2. Gripper and Sensor Mounted on Automatic Waste Sorting Robot
2.3. Algorithms for Waste Identification
3. Materials and Methods
3.1. Configuration and Control of the Automatic Beverage Container Sorting Robot
3.2. Detection of Objects and Determination of Material (PET, Bottles, Cans)
3.3. Preliminary Validation to Identify Pick Points and Determine Color Sorting Methods for Glass Bottles
3.3.1. Pick Point Identification
3.3.2. Color Sorting
3.4. The Pick Point Identification and Glass Bottle Color Sorting Method Finally Adopted
3.4.1. Image Moment Calculation Method
3.4.2. Color Sorting Method Integrating SAM and CNN
Algorithm 1: Glass Bottle Detection and Color Classification

```
Input:  Image containing a glass bottle
Output: Predicted color label for the bottle (e.g., Transparent, Brown, Other)

Step 1: Detect glass bottle in image
    Load detection model (e.g., YOLO, Faster R-CNN)
    detections ← model.detect(image)
    bottle_box ← find_bottle_bounding_box(detections)

Step 2: Create bounding box
    Extract bounding box coordinates: (xmin, ymin, xmax, ymax)
    cropped_image ← crop_image(image, bottle_box)

Step 3: Segment bottle using SAM within bounding box
    Load SAM segmentation model
    mask ← sam_model.apply_segmentation(cropped_image)
    segmented_bottle ← apply_mask(cropped_image, mask)
    Save segmented_bottle as "segmented_bottle.png"

Step 4: Preprocess segmented image for color classification CNN
    resized_image ← resize(segmented_bottle, target_size = (224, 224))
    normalized_image ← normalize(resized_image)

Step 5: Classify bottle color with CNN
    Load color classification CNN model
    color_class ← color_cnn.predict(normalized_image)

Step 6: Output classification result
    if color_class = 0 then
        Label as "Transparent"
    else if color_class = 1 then
        Label as "Brown"
    else
        Label as "Other"
    end if
    Print classification result
```
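The control flow of Algorithm 1 can be sketched in Python. This is an illustrative sketch, not the authors' code: the detector, SAM segmenter, and color CNN are passed in as plain callables so the step-by-step logic can be shown and tested without model weights, and all function names and the nearest-neighbour resize are assumptions made for brevity.

```python
import numpy as np

# Class-index-to-label mapping from Step 6; any other index maps to "Other".
COLOR_LABELS = {0: "Transparent", 1: "Brown"}

def classify_bottle_color(image, detector, segmenter, color_cnn):
    """Steps 1-6 of Algorithm 1 (illustrative sketch).

    detector(image)     -> (xmin, ymin, xmax, ymax) of the bottle, or None
    segmenter(crop)     -> boolean mask with the crop's height/width
    color_cnn(crop_224) -> integer class index
    """
    # Steps 1-2: detect the bottle and crop the image to its bounding box.
    box = detector(image)
    if box is None:
        return None  # no bottle found in the frame
    xmin, ymin, xmax, ymax = box
    cropped = image[ymin:ymax, xmin:xmax]

    # Step 3: segment the bottle inside the box and zero out the background.
    mask = segmenter(cropped)
    segmented = np.where(mask[..., None], cropped, 0)

    # Step 4: resize to the CNN input size (nearest-neighbour for brevity)
    # and normalize pixel values to [0, 1].
    h, w = segmented.shape[:2]
    rows = np.arange(224) * h // 224
    cols = np.arange(224) * w // 224
    resized = segmented[rows][:, cols]
    normalized = resized.astype(np.float32) / 255.0

    # Steps 5-6: classify the color and map the class index to a label.
    class_idx = color_cnn(normalized)
    return COLOR_LABELS.get(class_idx, "Other")
```

With real models, `detector` could wrap a YOLO forward pass and `segmenter` a SAM call prompted with the detected bounding box, as the paper describes; injecting them as callables keeps the pipeline logic independent of any particular model library.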
3.5. Composition and Creation Methods of Training Data
3.6. Performance Evaluation Experimental Setup
- To evaluate both the image-recognition AI and the complete robot system, and to identify challenges in the sorting process, a full-scale robot performance experiment is designed as follows.
- The only manual step is the placement of beverage containers, which is performed while the robot system is stopped.
- After placement, operators start the robot sorting and record video of both the robot and the image-recognition AI screen, which displays the identification results. Sorting results are also collected as the experiment progresses. The data comprise counts of acquisition actions, correct sorts, acquisition misses, AI identification errors, and missortings, together with the experiment time in seconds.
- All experiments are conducted in the laboratory under constant room lighting, a constant conveyor belt speed of 156 mm/s, and fixed robot settings.
- The robot is designed to sort PET bottles, cans, clear glass, brown glass, and other glass simultaneously.
- Experimental results are analyzed from the data collected during the experiment and from counts derived from the video recordings.
- The image-recognition AI is evaluated by counting misrecognitions and analyzing their characteristics.
- The robot system is evaluated by counting acquisition operations, successful picks, correct sorts, and picking misses. Picking misses are further characterized by computing the count and percentage of each failure type among the total number of missed acquisitions.
- A total of 495 beverage containers are used as inputs.
- The acquisition operation picks up an object at an interval of 180 to 260 ms after the previous operation. Theoretically, 180 ms is the shortest possible pick interval, but to protect the robot, the interval is set to 260 ms.
- To avoid malfunctions caused by collisions between the robot and the conveyor belt, the pneumatic suction cup attached to the robot descends to about 4 cm above the belt surface after reaching the pick point.
- The robot is evaluated by calculating the percentage of each sorting and image-recognition outcome. Sorting efficiency is calculated by dividing the number of acquisitions by the experiment time in seconds.
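The pick interval above bounds the robot's theoretical throughput. The short calculation below (an illustrative check, not from the paper) converts the 180 ms and 260 ms intervals into picks per minute:

```python
def picks_per_minute(interval_ms: float) -> float:
    """Maximum number of acquisition actions per minute at a fixed pick interval."""
    return 60_000.0 / interval_ms

# Theoretical ceiling at the 180 ms minimum interval ...
fastest = picks_per_minute(180)  # ~333 picks/min
# ... versus the 260 ms interval actually used to protect the robot.
actual = picks_per_minute(260)   # ~230 picks/min
```

Running at the protective 260 ms interval thus gives up roughly 100 picks per minute of theoretical capacity in exchange for mechanical safety.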
4. Results and Discussion
4.1. The Results of the Experiments
4.2. Discussion of the AI Experiment Results
4.3. Discussion of the Results of the Robot System Experiment
5. Conclusions and Further Study Recommendations
5.1. Conclusions
5.2. Further Study Recommendations
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Onoda, H. Prospects for Contactless and Automatic Waste Collection. Mater. Cycles Waste Manag. Res. 2021, 32, 155–162.
- Kiyokawa, T.; Takamatsu, J.; Koyanaka, S. Challenges for Future Robotic Sorters of Mixed Industrial Waste: A Survey. IEEE Trans. Autom. Sci. Eng. 2022, 21, 1023–1040.
- Kim, D.; Lee, S.; Park, M.; Lee, K.; Kim, D.-Y. Designing of reverse vending machine to improve its sorting efficiency for recyclable materials for its application in convenience stores. J. Air Waste Manag. Assoc. 2021, 71, 1312–1318.
- Dong, L.; Dong, C.; Zhi, W.; Li, W.; Gu, B.; Yang, T. Combining the Fine Physical Purification Process with Photoelectric Sorting to Recycle PET Plastics from Waste Beverage Containers. ACS Sustain. Chem. Eng. 2024, 12, 11377–11384.
- Madsen, A.M.; Raulf, M.; Duquenne, P.; Graff, P.; Cyprowski, M.; Beswick, A.; Laitinen, S.; Rasmussen, P.U.; Hinker, M.; Kolk, A.; et al. Review of biological risks associated with the collection of municipal wastes. Sci. Total Environ. 2021, 791, 148287.
- Xiao, W.; Yang, J.; Fang, H.; Zhuang, J.; Ku, Y. Classifying construction and demolition waste by combining spatial and spectral features. Proc. Inst. Civ. Eng. Waste Resour. Manag. 2020, 173, 79–90.
- Satav, A.G.; Kubade, S.; Amrutkar, C.; Arya, G.; Pawar, A. A state-of-the-art review on robotics in waste sorting: Scope and challenges. Int. J. Interact. Des. Manuf. (IJIDeM) 2023, 17, 2789–2806.
- Ihsanullah, I.; Alam, G.; Jamal, A.; Shaik, F. Recent advances in applications of artificial intelligence in solid waste management: A review. Chemosphere 2022, 309, 136631.
- Ji, T.; Fang, H.; Zhang, R.; Yang, J.; Fan, L.; Li, J. Automatic sorting of low-value recyclable waste: A comparative experimental study. Clean Technol. Environ. Policy 2022, 25, 949–961.
- Mao, W.-L.; Chen, W.-C.; Wang, C.-T.; Lin, Y.-H. Recycling waste classification using optimized convolutional neural network. Resour. Conserv. Recycl. 2020, 164, 105132.
- Li, N.; Chen, Y. Municipal solid waste classification and real-time detection using deep learning methods. Urban Clim. 2023, 49, 101462.
- Bobulski, J.; Kubanek, M. Deep Learning for Plastic Waste Classification System. Appl. Comput. Intell. Soft Comput. 2021, 2021, 1–7.
- Mohammed, M.A.; Abdulhasan, M.J.; Kumar, N.M.; Abdulkareem, K.H.; Mostafa, S.A.; Maashi, M.S.; Khalid, L.S.; Abdulaali, H.S.; Chopra, S.S. Automated waste-sorting and recycling classification using artificial neural network and features fusion: A digital-enabled circular economy vision for smart cities. Multimed. Tools Appl. 2022, 82, 39617–39632.
- Chen, X.; Huang, H.; Liu, Y.; Li, J.; Liu, M. Robot for automatic waste sorting on construction sites. Autom. Constr. 2022, 141, 104387.
- Koskinopoulou, M.; Raptopoulos, F.; Papadopoulos, G.; Mavrakis, N.; Maniadakis, M. Robotic Waste Sorting Technology: Toward a Vision-Based Categorization System for the Industrial Robotic Separation of Recyclable Waste. IEEE Robot. Autom. Mag. 2021, 28, 50–60.
- Chen, J.; Fu, Y.; Lu, W.; Pan, Y. Augmented reality-enabled human-robot collaboration to balance construction waste sorting efficiency and occupational safety and health. J. Environ. Manag. 2023, 348, 119341.
- Lin, Y.-H.; Mao, W.-L.; Fathurrahman, H.I.K. Development of intelligent Municipal Solid waste Sorter for recyclables. Waste Manag. 2023, 174, 597–604.
- Bonello, D.; Saliba, M.A.; Camilleri, K.P. An Exploratory Study on the Automated Sorting of Commingled Recyclable Domestic Waste. Procedia Manuf. 2017, 11, 686–694.
- Gupta, T.; Joshi, R.; Mukhopadhyay, D.; Sachdeva, K.; Jain, N.; Virmani, D.; Garcia-Hernandez, L. A deep learning approach based hardware solution to categorise garbage in environment. Complex Intell. Syst. 2021, 8, 1129–1152.
- Inamura, T.; Kojo, N.; Hatao, N.; Tokutsu, S.; Fujimoto, J.; Sonoda, T.; Okada, K.; Inaba, M. Realization of Trash Separation of Bottles and Cans for Humanoids using Eyes, Hands and Ears. J. Robot. Soc. Jpn. 2007, 25, 813–821.
- Calvini, R.; Orlandi, G.; Foca, G.; Ulrici, A. Development of a classification algorithm for efficient handling of multiple classes in sorting systems based on hyperspectral imaging. J. Spectr. Imaging 2018, 7, 1–15.
- Lu, W.; Chen, J. Computer vision for solid waste sorting: A critical review of academic research. Waste Manag. 2022, 142, 29–43.
- Nakano, H.; Kawamoto, N.; Umemoto, T.; Katsuragi, I. Development of a Collaborative Robot-Based Support System for Sorting Recyclable Waste; Japan Society of Waste Management: Tokyo, Japan, 2021.
- Zhong, Z.; Zheng, L.; Kang, G.; Li, S.; Yang, Y. Random Erasing Data Augmentation. In Proceedings of the AAAI Conference on Artificial Intelligence, New York, NY, USA, 7–12 February 2020.
- Bochkovskiy, A.; Wang, C.-Y.; Liao, H.-Y.M. YOLOv4: Optimal Speed and Accuracy of Object Detection. arXiv 2020, arXiv:2004.10934.
- Kiyokawa, T.; Katayama, H.; Tatsuta, Y.; Takamatsu, J.; Ogasawara, T. Robotic Waste Sorter with Agile Manipulation and Quickly Trainable Detector. IEEE Access 2021, 9, 124616–124631.
- Nikko Petris (PET Bottle Sorting Machine). Available online: https://www.nikko-net.co.jp/product/environment/petris.html (accessed on 26 November 2023).
- N. Craft Rattling: Automatic Beverage Container Sorting Machine. Available online: https://www.n-craft.biz/product12.html (accessed on 26 November 2023).
- Nihon Cim Sorting Machine. Available online: https://www.nihon-cim.co.jp/product/sorting-machine/hisen.html (accessed on 13 September 2021).
- Paszke, A.; Gross, S.; Massa, F.; Lerer, A.; Bradbury, J.; Chanan, G.; Killeen, T.; Lin, Z.; Gimelshein, N.; Antiga, L.; et al. PyTorch: An imperative style, high-performance deep learning library. arXiv 2019, arXiv:1912.01703.
- Wang, C.-Y.; Bochkovskiy, A.; Liao, H.-Y.M. YOLOv7: Trainable Bag-of-Freebies Sets New State-of-the-Art for Real-Time Object Detectors. In Proceedings of the 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Vancouver, BC, Canada, 17–24 June 2023; pp. 7464–7475.
- Lin, T.-Y.; Maire, M.; Belongie, S.; Bourdev, L.; Girshick, R.; Hays, J.; Perona, P.; Ramanan, D.; Zitnick, C.L.; Dollár, P. Microsoft COCO: Common Objects in Context. In Computer Vision—ECCV 2014; Fleet, D., Pajdla, T., Schiele, B., Tuytelaars, T., Eds.; Springer: Cham, Switzerland, 2014; Volume 8693, pp. 740–755. ISBN 978-3-319-10601-4.
- Supervisely. Available online: https://supervisely.com/ (accessed on 26 November 2023).
- Zivkovic, Z.; van der Heijden, F. Efficient adaptive density estimation per image pixel for the task of background subtraction. Pattern Recognit. Lett. 2006, 27, 773–780.
- Kirillov, A.; Mintun, E.; Ravi, N.; Mao, H.; Rolland, C.; Gustafson, L.; Xiao, T.; Whitehead, S.; Berg, A.C.; Lo, W.-Y.; et al. Segment Anything. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Paris, France, 1–6 October 2023.
| Automatic Beverage Container Sorting Robot | Sorting Targets | Recognition Method | Ideal Placement Location |
|---|---|---|---|
| This study | PET bottles, cans, clear glass, brown glass, other glass | Image recognition | Emission site and intermediate treatment facility |
| PETRIS | PET, other bottles | Object detection/transmission detection sensor | Intermediate treatment facility |
| GARAGARAPON | PET, aluminum cans, steel cans | Air knife, aluminum separator | Emission site |
| HISEN | PET, other bottles | Suction blower | Intermediate treatment facility |
| Robot System Components | Details |
|---|---|
| CPU | 12th Gen Intel Core i9-12900K × 24; Intel, Santa Clara, CA, USA |
| GPU | RTX 3090 Ti; NVIDIA, Santa Clara, CA, USA |
| RAM | CFD Crucial DDR4-3200MHz 16 GB × 2; Crucial, Boise, ID, USA |
| Storage | Western Digital SN770 500 GB SSD; Western Digital, San Jose, CA, USA |
| OS | Ubuntu 20.04 64-bit; Canonical, London, UK |
| Camera | RealSense D435; Intel, Santa Clara, CA, USA |
| Robot | Hiwin delta robot; Hiwin, Taiwan, China |
| Conveyor belt | MMW-H-340-1200-400-1VH-20; Okura Yusoki Co., Ltd., Hyogo, Japan |
| Relay | SONGLE SRD-05VDC-SL-G; National Institute of Advanced Industrial Science and Technology, Ibaraki, Japan |
| Recall | Precision | mAP 0.5–0.95 | mAP 0.5 |
|---|---|---|---|
| 0.9948 | 0.984 | 0.8968 | 0.9967 |
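For reference, the recall and precision reported above follow the standard definitions over true positives (TP), false positives (FP), and false negatives (FN). The sketch below states those formulas; the counts in it are hypothetical for illustration, since the paper reports only the aggregated metrics:

```python
def precision(tp: int, fp: int) -> float:
    """Fraction of predicted detections that are correct."""
    return tp / (tp + fp)

def recall(tp: int, fn: int) -> float:
    """Fraction of ground-truth objects that are detected."""
    return tp / (tp + fn)

# Hypothetical counts for illustration only.
tp, fp, fn = 96, 2, 4
p = precision(tp, fp)  # 96 / 98
r = recall(tp, fn)     # 96 / 100 = 0.96
```

mAP 0.5 averages precision over recall at an IoU threshold of 0.5 per class, while mAP 0.5–0.95 averages that quantity over IoU thresholds from 0.5 to 0.95, which is why it is the stricter of the two figures.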
| False Recognition Type | | With Label | Without Label | Total Amount |
|---|---|---|---|---|
| Misrecognition | PET → Glass | 3 | 23 | 26 |
| | Glass → PET | 0 | 1 | 1 |
| | Glass → Can | 1 | 0 | 1 |
| Total | | 4 | 24 | 28 |
| False positive rate (false positives/total) | | 1% | 24% | 6% |
| Total number of experiments | | 394 | 101 | 495 |
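The false positive rates in the table follow directly from the counts: misrecognitions divided by the number of containers processed in each column, rounded to a whole percent. A quick check of that arithmetic:

```python
def false_positive_rate(errors: int, total: int) -> int:
    """Misrecognitions as a whole-number percentage of containers processed."""
    return round(errors / total * 100)

with_label = false_positive_rate(4, 394)      # -> 1 (%)
without_label = false_positive_rate(24, 101)  # -> 24 (%)
overall = false_positive_rate(28, 495)        # -> 6 (%)
```

The large gap between the labeled (1%) and unlabeled (24%) columns is what drives the overall 6% rate.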
| Metric | Quantity | Rate | Calculation Method |
|---|---|---|---|
| Acquisition actions | 413 | 83% | Acquisition actions/inputs received |
| Successful picking | 322 | 78% | Successful picks/acquisition actions |
| Correct sorting | 299 | 93% | Correct sorts/successful picks |
| Picking misses | 168 | 34% | Acquisition misses/inputs |
| Misrecognition | 23 | 6% | Identification errors/inputs |
| Missorting | 5 | 1% | (Identification + acquisition errors)/inputs |
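The first three rows of the table form a funnel over the 495 inputs: containers trigger acquisition actions, actions yield successful picks, and picks yield correct sorts. The check below (illustrative, rounding to whole percent as the table does) reproduces those tabulated rates from the raw counts:

```python
counts = {"inputs": 495, "acquisitions": 413, "picks": 322,
          "correct": 299, "misses": 168}

def pct(numerator: int, denominator: int) -> int:
    """Ratio as a whole-number percentage, matching the table's rounding."""
    return round(numerator / denominator * 100)

acquisition_rate = pct(counts["acquisitions"], counts["inputs"])  # 83
picking_rate = pct(counts["picks"], counts["acquisitions"])       # 78
sorting_rate = pct(counts["correct"], counts["picks"])            # 93
miss_rate = pct(counts["misses"], counts["inputs"])               # 34
```

Note that each rate uses a different denominator, so the end-to-end success rate is the product of the stages (299 correct sorts out of 495 inputs, about 60%), not any single row.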
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Cheng, T.; Kojima, D.; Hu, H.; Onoda, H.; Pandyaswargo, A.H. Optimizing Waste Sorting for Sustainability: An AI-Powered Robotic Solution for Beverage Container Recycling. Sustainability 2024, 16, 10155. https://doi.org/10.3390/su162310155