Skeleton extraction and pose estimation of piglets using ZS-DLC-PAF

Authors

  • Chengqi Liu 1. College of Information and Electrical Engineering, China Agricultural University, Beijing 100083, China
  • Haijian Ye 1. College of Information and Electrical Engineering, China Agricultural University, Beijing 100083, China
  • Shuhan Lu 2. School of Information, University of Michigan, Ann Arbor 48109, USA
  • Zhan Tang 1. College of Information and Electrical Engineering, China Agricultural University, Beijing 100083, China
  • Zhao Bai 1. College of Information and Electrical Engineering, China Agricultural University, Beijing 100083, China
  • Lei Diao 1. College of Information and Electrical Engineering, China Agricultural University, Beijing 100083, China
  • Longhe Wang 3. National Research Facility for Phenotypic and Genotypic Analysis of Model Animals (Beijing), Beijing 100083, China
  • Lin Li 1. College of Information and Electrical Engineering, China Agricultural University, Beijing 100083, China; 2. School of Information, University of Michigan, Ann Arbor 48109, USA

Keywords

piglets, skeleton extraction, pose estimation, Zhang-Suen, DeepLabCut, part affinity field

Abstract

The postures that piglets adopt in daily life are directly reflected in their skeleton morphology, so accurately identifying these postures is necessary for studying the behavioral characteristics of pigs. Accordingly, this study proposed a novel approach to skeleton extraction and pose estimation of piglets. First, an improved Zhang-Suen (ZS) thinning algorithm based on morphology was used, with a chain-code mechanism for burr removal and templates for deleting redundant information, to extract pig skeletons at single-pixel width. Then, body nodes were extracted with an improved DeepLabCut (DLC) algorithm, and a part affinity field (PAF) was added to connect the body nodes and thereby construct a database of pig behavior and postures. Finally, a support vector machine was used for pose matching to recognize the main behaviors of piglets. In this study, 14,000 images of piglets with different types of behavior were used in posture recognition experiments. Results showed that the improved ZS-DLC-PAF algorithm achieved the best thinning rate compared with distance transformation, medial axis transformation, morphological refinement, and the traditional ZS algorithm. The node tracking accuracy reached 85.08%, the stress test accurately detected up to 35 nodes across 5 pigs, and the average accuracy of posture matching was 89.60%. This study not only realized single-pixel extraction of piglet skeletons but also connected the body nodes of an individual sow and multiple piglets across different behaviors. Furthermore, it established a database of pig posture behavior, which provides a reference for studying animal behavior identification, classification, and anomaly detection.

DOI: 10.25165/j.ijabe.20231603.6930

Citation: Liu C Q, Ye H J, Lu S H, Tang Z, Bai Z, Diao L, et al. Skeleton extraction and pose estimation of piglets using ZS-DLC-PAF. Int J Agric & Biol Eng, 2023; 16(3): 180–193.
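For readers unfamiliar with the baseline thinning step, the sketch below shows the classic Zhang-Suen iteration [9] in plain Python/NumPy. It is a minimal illustration only and does not reproduce the paper's improvements (the chain-code burr removal and redundant-information deletion templates); the function name and the assumption that the input is a binary NumPy array with foreground = 1 are ours, not the authors'.

```python
import numpy as np

def zhang_suen_thinning(img):
    """Thin a binary image (1 = foreground, 0 = background) to a
    single-pixel-wide skeleton using the classic Zhang-Suen rules.
    Baseline ZS only; burr/redundancy post-processing is not included."""
    skel = img.astype(np.uint8).copy()
    changed = True
    while changed:
        changed = False
        for step in (0, 1):                      # two sub-iterations per pass
            to_delete = []
            rows, cols = skel.shape
            for r in range(1, rows - 1):
                for c in range(1, cols - 1):
                    if skel[r, c] != 1:
                        continue
                    # 8-neighbourhood in the standard p2..p9 clockwise order
                    p2, p3, p4 = skel[r-1, c], skel[r-1, c+1], skel[r, c+1]
                    p5, p6, p7 = skel[r+1, c+1], skel[r+1, c], skel[r+1, c-1]
                    p8, p9 = skel[r, c-1], skel[r-1, c-1]
                    nb = [p2, p3, p4, p5, p6, p7, p8, p9]
                    B = sum(nb)                                  # foreground neighbours
                    A = sum(nb[i] == 0 and nb[(i + 1) % 8] == 1  # 0 -> 1 transitions
                            for i in range(8))
                    if step == 0:
                        cond = (p2 * p4 * p6 == 0) and (p4 * p6 * p8 == 0)
                    else:
                        cond = (p2 * p4 * p8 == 0) and (p2 * p6 * p8 == 0)
                    if 2 <= B <= 6 and A == 1 and cond:
                        to_delete.append((r, c))
            for r, c in to_delete:               # delete in parallel after the scan
                skel[r, c] = 0
            changed = changed or bool(to_delete)
    return skel
```

The two sub-iterations delete south-east and north-west boundary pixels in turn, which is what keeps the result centred and one pixel wide; the paper's improvement targets the burrs and redundant pixels that this baseline leaves behind.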
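The PAF step follows the idea of Cao et al. [32]: a candidate connection between two detected body nodes is scored by integrating the predicted 2D vector field along the segment joining them, and high-scoring pairs are kept. A simplified sketch is given below; the function and array names (paf_x, paf_y for the two field channels, keypoints as (x, y) pixel coordinates) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def paf_association_score(paf_x, paf_y, kp_a, kp_b, n_samples=10):
    """Approximate the PAF line integral for connecting kp_a -> kp_b
    (e.g. shoulder -> elbow of one pig): the mean dot product between
    the predicted field and the unit vector along the candidate limb,
    sampled at n_samples points on the segment."""
    kp_a, kp_b = np.asarray(kp_a, float), np.asarray(kp_b, float)
    limb = kp_b - kp_a
    norm = np.linalg.norm(limb)
    if norm < 1e-6:
        return 0.0
    unit = limb / norm
    score = 0.0
    for t in np.linspace(0.0, 1.0, n_samples):
        x, y = (kp_a + t * limb).round().astype(int)   # sample point on the segment
        score += paf_x[y, x] * unit[0] + paf_y[y, x] * unit[1]
    return score / n_samples
```

Scoring every candidate pair this way and solving the resulting assignment per limb is what allows the nodes of several piglets in the same frame to be grouped into separate skeletons.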

Author Biography

Chengqi Liu, College of Information and Electrical Engineering, China Agricultural University, Beijing 100083, China

Department of Computer Science and Technology

References

[1] Nasirahmadi A, Edwards S A, Matheson S M, Sturm B. Using automated image analysis in pig behavioural research: Assessment of the influence of enrichment substrate provision on lying behaviour. Applied Animal Behaviour Science, 2017; 196: 30-35.
[2] Naseri M, Heidari S, Gheibi R, Gong L H, Sadri A. A novel quantum binary images thinning algorithm: a quantum version of the Hilditch's algorithm. Optik-International Journal for Light and Electron Optics, 2016; 131: 678-686.
[3] Chen C, Zhu W X, Norton T. Behaviour recognition of pigs and cattle: journey from computer vision to deep learning. Computers and Electronics in Agriculture, 2021; 187: 106255. doi: 10.1016/j.compag.2021.106255.
[4] Kustra J, Jalba A, Telea A. Computing refined skeletal features from medial point clouds. Pattern Recognition Letters, 2016; 76: 13-21.
[5] Gronskyte R, Clemmensen L H, Hviid M S, Kulahci M. Pig herd monitoring and undesirable tripping and stepping prevention. Computers and Electronics in Agriculture, 2015; 119: 51-60. doi: 10.1016/j.compag.2015.09.021.
[6] Nasirahmadi A, Hensel O, Edwards S A, Sturm B. Automatic detection of mounting behaviours among pigs using image analysis. Computers and Electronics in Agriculture, 2016; 124: 295-302.
[7] Nasirahmadi A, Edwards S A, Sturm B. Implementation of machine vision for detecting behaviour of cattle and pigs. Livestock Science, 2017; 202: 25-38.
[8] Kusuma W A, Husniah L. Skeletonization using thinning method for human motion system. 2015 International Seminar on Intelligent Technology and Its Applications (ISITIA), Surabaya: IEEE, 2015; pp.103-106. doi: 10.1109/ISITIA.2015.7219962
[9] Zhang T Y, Suen C Y. A fast parallel algorithm for thinning digital patterns. Communications of the ACM, 1984; 27(3): 236-239.
[10] Ramya P, Rajeswari R. Human action recognition using distance transform and entropy based features. Multimedia Tools and Applications, 2021; 80(21): 8147-8173.
[11] Shi C W, Zhao J Y, Chang J S. Skeleton feature extraction algorithm based on medial axis transformation. Computer Engineering, 2019; 45(7): 242-250. (in Chinese)
[12] Lynda B B, Basel S, Abdelkamel T. A modified ZS thinning algorithm by a hybrid approach. The Visual Computer, 2018; 34(5): 689-706.
[13] Lynda B B, Basel S, Abdelkamel T. Implementation and comparison of binary thinning algorithms on GPU. Computing, 2018; 101(8): 1091-1117.
[14] Li R, Zhang X Y. Research on the improvement of EPTA parallel thinning algorithm. Proceedings of the 2018 International Conference on Network, Communication, Computer Engineering (NCCE2018), 2018; pp.994-1001. doi: 10.2991/ncce-18.2018.167
[15] Pfister T, Charles J, Zisserman A. Flowing convnets for human pose estimation in videos. 2015 IEEE International Conference on Computer Vision (ICCV), Santiago: IEEE, 2015; pp.1913-1921. doi: 10.1109/ICCV.2015.222
[16] Newell A, Huang Z A, Deng J. Associative embedding: end-to-end learning for joint detection and grouping. In: Proceedings of the 31st International Conference on Neural Information Processing Systems, 2017; pp.2274-2284. doi: 10.5555/3294771.3294988.
[17] Fang H S, Xie S Q, Tai Y W, Lu C W. RMPE: Regional multi-person pose estimation. In: 2017 IEEE International Conference on Computer Vision (ICCV), Venice: IEEE, 2017; pp.2353-2362. doi: 10.1109/ICCV.2017.256.
[18] Liao R J, Cao C S, Garcia E B, Yu S Q, Huang Y Z. Pose-based temporal-spatial network (PSTN) for gait recognition with carrying and clothing variations. In: Proceedings of the 12th Chinese Conference on Biometric Recognition (CCVR 2017), 2017; pp.474-483. doi: 10.1007/978-3-319-69923-3_51.
[19] Cowton J, Kyriazakis I, Bacardit J. Automated individual pig localisation, tracking and behaviour metric extraction using deep learning. IEEE Access, 2019; 7: 108049-108060.
[20] Gan H M, Ou M Q, Zhao F Y, Xu C G, Li S M, Chen C X, et al. Automated piglet tracking using a single convolutional neural network. Biosystems Engineering, 2021; 205(1): 48-63.
[21] Gan H M, Ou M Q, Huang E D, Xu C G, Li S Q, Li J P, et al. Automated detection and analysis of social behaviors among preweaning piglets using key point-based spatial and temporal features. Computers and Electronics in Agriculture, 2021; 188: 106357. doi: 10.1016/j.compag.2021.106357.
[22] Gan H M, Li S M, Ou M Q, Yang X F, Huang B, Liu K, et al. Fast and accurate detection of lactating sow nursing behavior with CNN-based optical flow and features. Computers and Electronics in Agriculture, 2021; 189: 106384. doi: 10.1016/j.compag.2021.106384.
[23] Mathis A, Mamidanna P, Cury K M, Abe T, Murthy V N, Mathis M W, et al. DeepLabCut: Markerless pose estimation of user-defined body parts with deep learning. Nature Neuroscience, 2018; 21(9): 1281-1289.
[24] Nath T, Mathis A, Chen A C, Patel A, Bethge M, Mathis M W. Using DeepLabCut for 3D markerless pose estimation across species and behaviors. Nature Protocols, 2019; 14(7): 476531. doi: 10.1101/476531.
[25] Alameer A, Kyriazakis I, Bacardit J. Automated recognition of postures and drinking behaviour for the detection of compromised health in pigs. Scientific Reports, 2020; 10: 13665. doi: 10.1038/s41598-020-70688-6.
[26] Cheng F, Zhang T M, Zheng H K, Huang J D, Cuan K X. Pose estimation and behavior classification of broiler chickens based on deep neural networks. Computers and Electronics in Agriculture, 2021; 180: 105863. doi: 10.1016/j.compag.2020.105863
[27] Romero-Ferrero F, Bergomi M G, Hinz R C, Heras F J H, de Polavieja G G. Idtracker.ai: Tracking all individuals in large collectives of unmarked animals. Nature Methods, 2019; 16: 179-182.
[28] Sun S J, Akhtar N, Song H S, Mian A, Shah M. Deep affinity network for multiple object tracking. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2019; 43(1): 104-119.
[29] Zhang Y F, Wang C Y, Wang X J, Zeng W J, Liu W Y. A simple baseline for multi-object tracking. International Journal of Computer Vision, 2021; 129(11): 3069-3087.
[30] Jiang Y Q, Wang P, Gao H W, Jin L, Liu X J. Study on the method for removing boundary burr based on relevance of chain code. In: 2011 International Conference on Electronic Commerce, Web Application and Communication (ECWAC 2011), 2011; 144: 188-194. doi: 10.1007/978-3-642-20370-1_31.
[31] Kipf T N, Welling M. Semi-supervised classification with graph convolutional networks. In: The 5th International Conference on Learning Representations (ICLR), 2017. arXiv:1609.02907.
[32] Cao Z, Simon T, Wei S E, Sheikh Y. Realtime multi-person 2D pose estimation using part affinity fields. In: 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu: IEEE, 2017; pp.1302-1310. doi: 10.1109/CVPR.2017.143.

Published

2023-08-17

How to Cite

Liu, C., Ye, H., Lu, S., Tang, Z., Bai, Z., Diao, L., … Li, L. (2023). Skeleton extraction and pose estimation of piglets using ZS-DLC-PAF. International Journal of Agricultural and Biological Engineering, 16(3), 180–193. Retrieved from https://ijabe.migration.pkpps03.publicknowledgeproject.org/index.php/ijabe/article/view/6930

Issue

Vol. 16 No. 3 (2023)

Section

Information Technology, Sensors and Control Systems