Recognition and localization of strawberries from 3D binocular cameras for a strawberry picking robot using coupled YOLO/Mask R-CNN
Keywords:
strawberry detection, 3D point cloud, mean-shift, clustering method

Abstract
To address the high labour cost of strawberry picking, this study proposes a method for a strawberry-picking robot to recognize and localize strawberries. First, 1000 images containing mature, immature, single, multiple, and occluded strawberries were collected, and two networks were trained to classify strawberries as mature or immature: the two-stage Mask R-CNN instance segmentation network and the one-stage YOLOv3 object detection network. Detection accuracy was 93.4% for YOLOv3 and 94.5% for Mask R-CNN. Second, a ZED stereo camera, triangulation, and a neural network were used to locate the strawberries in three dimensions; localization accuracy was 3.1 mm with YOLOv3, compared with 3.9 mm with Mask R-CNN. The proposed detection and positioning method can supply a picking robot with the precise location of ripe strawberries.

DOI: 10.25165/j.ijabe.20221506.7306

Citation: Hu H M, Kaizu Y, Zhang H D, Xu Y W, Imou K, Li M, et al. Recognition and localization of strawberries from 3D binocular cameras for a strawberry picking robot using coupled YOLO/Mask R-CNN. Int J Agric & Biol Eng, 2022; 15(6): 175–179.
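The stereo localization step summarized above rests on standard pinhole triangulation: a point matched between the left and right images yields a disparity d, from which depth follows as Z = f·B/d. A minimal sketch of that geometry is below; the focal length, principal point, and baseline are illustrative placeholders, not the paper's actual ZED calibration, and the detected pixel is hypothetical:

```python
def triangulate(u, v, disparity, fx, fy, cx, cy, baseline):
    """Back-project a left-image pixel (u, v) with the given stereo
    disparity (in pixels) into camera-frame coordinates (in metres)."""
    z = fx * baseline / disparity  # depth from disparity: Z = f*B/d
    x = (u - cx) * z / fx          # lateral offset from the optical axis
    y = (v - cy) * z / fy          # vertical offset from the optical axis
    return x, y, z

# Illustrative parameters (NOT the paper's calibration): a 0.12 m
# baseline and ~700 px focal length, roughly ZED-like at HD resolution.
fx = fy = 700.0
cx, cy = 640.0, 360.0
baseline = 0.12

# A hypothetical strawberry centre detected at pixel (700, 400)
# with a 42 px disparity:
x, y, z = triangulate(700, 400, 42.0, fx, fy, cx, cy, baseline)
print(round(x, 3), round(y, 3), round(z, 3))  # → 0.171 0.114 2.0
```

In practice the detector (YOLOv3 or Mask R-CNN) supplies the pixel coordinates of each fruit, and the stereo matcher supplies the disparity; millimetre-level accuracy then depends on calibration quality and sub-pixel disparity estimation.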
Published
2022-12-27
Section
Information Technology, Sensors and Control Systems
License
IJABE is an international peer reviewed open access journal, adopting Creative Commons Copyright Notices as follows.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).