Geometric based apple suction strategy for robotic packaging
Keywords: apple, suction cup, robotic packaging, robotic manipulation, point cloud

Abstract
Packaging is among the least automated of all fruit postharvest processes, and it remains time-consuming and labor-intensive; a robust suction strategy for robotic manipulation is therefore needed. In this research, a geometric-based apple suction strategy for robotic packaging was studied, covering suction cup design, an optimal suction region selection algorithm, and robot system integration. First, based on the geometric features of spheroid fruit, the structure of the suction cups was designed to provide reliable suction force. Suction force measurement experiments were then conducted on both acrylic balls and apples, and the suction cup parameters were determined from the results. The results also indicated that the curvature radius of the suction region should be larger than that of the suction cup. Furthermore, a robust suction region selection algorithm was developed, which involves four steps: RGB-D information acquisition, object detection and point cloud generation, spherical fitting, and suction region selection. Finally, the above methods were integrated into a robotic packaging system, and early-stage bruising was detected for validation using spatial-frequency domain imaging (SFDI). The results showed that the proposed suction strategy and system are promising for robust robotic apple packaging.

DOI: 10.25165/j.ijabe.20241703.7947

Citation: Wang Z, Wang Q Y, Lou M Z, Wu F, Zhu Y N, Hu D, et al. Geometric based apple suction strategy for robotic packaging. Int J Agric & Biol Eng, 2024; 17(3): 12-20.
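The spherical-fitting step of the selection algorithm lends itself to a short sketch. The following is a minimal illustration, not the authors' implementation: a linear least-squares sphere fit to an apple point cloud, which yields the curvature radius that the abstract says must exceed the suction cup's radius for a region to be acceptable. The function name and the comparison against a cup radius are assumptions for illustration only.

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit to an (N, 3) point cloud.

    Linearizes |p - c|^2 = r^2 into 2 p.c + (r^2 - |c|^2) = |p|^2
    and solves the resulting linear system for the center c and
    the radius r.
    """
    p = np.asarray(points, dtype=float)
    # Design matrix: [2x, 2y, 2z, 1] per point; target: |p|^2.
    A = np.hstack([2.0 * p, np.ones((len(p), 1))])
    b = (p ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, d = sol[:3], sol[3]
    radius = float(np.sqrt(d + center @ center))
    return center, radius
```

A candidate region could then be screened with something like `fit_sphere(region_points)[1] > cup_radius` (where `cup_radius` is whatever value the cup design fixes), reflecting the finding that the suction region's curvature radius should be larger than that of the cup.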
Published
2024-07-11
How to Cite
Wang, Z., Wang, Q., Lou, M., Wu, F., Zhu, Y., Hu, D., … Ying, Y. (2024). Geometric based apple suction strategy for robotic packaging. International Journal of Agricultural and Biological Engineering, 17(3), 12–20. Retrieved from https://ijabe.migration.pkpps03.publicknowledgeproject.org/index.php/ijabe/article/view/7947
Section
Applied Science, Engineering and Technology
License
IJABE is an international peer-reviewed open-access journal that adopts the following Creative Commons copyright notices.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).