Novel tracking method for the drinking behavior trajectory of pigs
Keywords: tracking method, drinking behavior trajectory, pigs, L-K optical flow, KCF, DeepLabCut
Abstract
Identifying and tracking the drinking behavior of pigs is of great significance for welfare feeding and piggery management. Research on pigs' drinking behavior not only needs to indicate whether the snout is in contact with the water fountain, but also needs to establish whether the pig is actually drinking and for how long. To solve target loss and identification errors, a novel method for tracking the drinking behavior of pigs based on L-K Pyramid Optical Flow (L-K OPT), Kernelized Correlation Filters (KCF), and DeepLabCut (DLC) was proposed. First, the feature model of the drinking behavior of a sow was established by L-K OPT, and the water flow vector was used to determine whether the animal drank water and to reveal the details of the movements. Then, on the basis of the improved KCF, a relocation model of the sow's snout was established to resolve the problem of tracking loss of the snout. Finally, the tracking model of piglets' drinking behavior was established by DLC to build the mapping association between the pig's snout and the drinking fountain. The proposed method was verified on 200 drinking-behavior video episodes (30–60 s each), and the results showed that: 1) according to the two key drinking indexes, the Down (−135°, −45°) direction feature and the V2 (>10 pixels) speed feature, the drinking time could be determined at the frame level, with an error within 30 frames; 2) the overlap precision (OP) was 95%, the center location error (CLE) was 3 pixels, and the tracking speed was 300 fps, all superior to traditional algorithms; 3) the optimal learning rate was 0.005, and the loss value was 0.0002. The method proposed in this study achieved accurate and automatic monitoring of the drinking behavior of pigs and could provide a reference for monitoring the behavior of other animals.
DOI: 10.25165/j.ijabe.20231606.7450
Citation: Liu C Q, Ye H J, Wang L H, Lu S H, Li L. Novel tracking method for the drinking behavior trajectory of pigs. Int J Agric & Biol Eng, 2023; 16(6): 67–76.
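The two drinking indices summarized above can be illustrated with a short, hedged sketch. The Python code below is a minimal illustration, not the authors' implementation: it uses OpenCV's pyramidal Lucas-Kanade optical flow (cv2.calcOpticalFlowPyrLK) to estimate motion vectors inside an assumed water-fountain region of interest, then flags frames whose dominant flow points in the Down direction (−135° to −45°) at a speed above 10 pixels per frame. The video path, ROI coordinates, feature-detector settings, and the majority-vote threshold are all placeholders.

```python
# Minimal sketch, not the authors' implementation: pyramidal L-K optical flow
# over an assumed water-fountain ROI, with the abstract's two drinking indices
# (Down direction in (-135 deg, -45 deg) and speed > 10 pixels per frame).
import cv2
import numpy as np

VIDEO_PATH = "drinking_episode.mp4"   # hypothetical input clip
FOUNTAIN_ROI = (300, 220, 120, 90)    # hypothetical (x, y, w, h) of the fountain


def drinking_frames(video_path=VIDEO_PATH, roi=FOUNTAIN_ROI, min_speed=10.0):
    """Return indices of frames whose dominant ROI motion matches the drinking indices."""
    x, y, w, h = roi
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        return []
    prev_gray = cv2.cvtColor(prev[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    hits, frame_idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame_idx += 1
        gray = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        # Detect corner features in the previous ROI and track them with pyramidal L-K.
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100,
                                      qualityLevel=0.01, minDistance=5)
        if pts is not None:
            nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None,
                                                      winSize=(15, 15), maxLevel=3)
            mask = status.flatten() == 1
            old, new = pts[mask].reshape(-1, 2), nxt[mask].reshape(-1, 2)
            if len(new):
                d = new - old                          # displacement vectors (dx, dy)
                speed = np.linalg.norm(d, axis=1)      # pixels per frame
                # Image y grows downward; negate dy so downward motion falls in (-135, -45) deg.
                angle = np.degrees(np.arctan2(-d[:, 1], d[:, 0]))
                down = (angle > -135) & (angle < -45) & (speed > min_speed)
                if down.mean() > 0.5:                  # assumed majority-vote rule
                    hits.append(frame_idx)
        prev_gray = gray
    cap.release()
    return hits
```

Consecutive flagged frames could then be merged into bouts to report drinking durations at the frame level, in the spirit of the index-based timing described in the abstract.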
References
[1] Larsen M L V, Wang M Q, Norton T. Information technologies for welfare monitoring in pigs and their relation to welfare quality. Sustainability, 2021; 13(2): 692.
[2] Yang Q M, Xiao D Q, Zhang G X. Automatic pig drinking behavior recognition based on Machine Vision. Transactions of the CSAM, 2018; 49(6): 232–238. (in Chinese)
[3] Martínez-Avilés M, Fernández-Carrión E, López García-Baones J M, Sánchez-Vizcaíno J M. Early detection of infection in pigs through an online monitoring system. Transboundary and Emerging Diseases, 2017; 64(2): 364–373.
[4] Stavrakakis S, Li W, Guy J H, Morgan G, Ushaw G, Johnson G R, et al. Validity of the Microsoft Kinect sensor for assessment of normal walking patterns in pigs. Computers and Electronics in Agriculture, 2015; 117: 1–7.
[5] Tuyttens F A M, Stadig L, Heerkens J L T, Van Laer E, Buijs S, Ampe B. Opinion of applied ethologists on expectation bias, blinding observers and other debiasing techniques. Applied Animal Behaviour Science, 2016; 181: 27–33.
[6] Hu Z W, Yang H, Lou T T. Dual attention-guided feature pyramid network for instance segmentation of group pigs. Computers and Electronics in Agriculture, 2021; 186: 106140.
[7] Yang A Q, Huang H S, Zheng B, Li S M, Gan H M. An automatic recognition framework for sow daily behaviours based on motion and image analyses. Biosystems Engineering, 2020; 192: 56–71.
[8] Küster S, Kardel M, Ammer S, Brünger J, Koch R, Traulsen I. Usage of computer vision analysis for automatic detection of activity changes in sows during final gestation. Computers and Electronics in Agriculture, 2020; 169: 105177.
[9] Yang Q M, Xiao D Q, Cai J H. Pig mounting behaviour recognition based on video spatial-temporal features. Biosystems Engineering, 2021; 206: 55–66.
[10] Nasirahmadi A, Hensel O, Edwards S A, Sturm B. A new approach for categorizing pig lying behaviour based on a Delaunay triangulation method. Animal, 2017; 11(1): 131–139.
[11] Gronskyte R, Clemmensen L H, Hviid M S, Kulahci M. Monitoring pig movement at the slaughterhouse using optical flow and modified angular histograms. Biosystems Engineering, 2016; 141: 19–30.
[12] Liu D, Oczak M, Maschat K, Baumgartner J, Pletzer B, He D J, et al. A computer vision-based method for spatial-temporal action recognition of tail-biting behaviour in group-housed pigs. Biosystems Engineering, 2020; 195: 27–41.
[13] Mittek M, Psota E T, Carlson J D, Pérez L C, Schmidt T, Mote B. Tracking of group-housed pigs using multi-ellipsoid expectation maximization. IET Computer Vision, 2018; 12(2): 121–128.
[14] Alameer A, Kyriazakis I, Dalton H A, Miller A L, Bacardit J. Automatic recognition of feeding and foraging behaviour in pigs using deep learning. Biosystems Engineering, 2020; 197: 91–104.
[15] Gao Y, Yu H A, Lei M G, Li X, Guo X, Diao Y P. Trajectory tracking for group housed pigs based on locations of head/tail. Transactions of the CSAE, 2017; 33(2): 220–226. (in Chinese)
[16] Insafutdinov E, Pishchulin L, Andres B, Andriluka M, Schiele B, et al. DeeperCut: A deeper, stronger, and faster multi-person pose estimation model. In: 2016 European Conference on Computer Vision (ECCV 2016), Springer, 2016; pp.34–50. doi: 10.1007/978-3-319-46466-4_3.
[17] Zheng C, Zhu X M, Yang X F, Wang L N, Tu S Q, Xue S Q. Automatic recognition of lactating sow postures from depth images by deep learning detector. Computers and Electronics in Agriculture, 2018; 147: 51–63.
[18] Chen C, Zhu W X, Steibel J, Siegford J, Wurtz K, Han J J, et al. Recognition of aggressive episodes of pigs based on convolutional neural network and long short-term memory. Computers and Electronics in Agriculture, 2020; 169: 105166.
[19] Chen C, Zhu W X, Steibel J, Siegford J, Han J J, Norton T. Classification of drinking and drinker-playing in pigs by a video-based deep learning method. Biosystems Engineering, 2020; 196: 1–14.
[20] Tu S Q, Yuan W J, Liang Y, Wang F, Wan H. Automatic detection and segmentation for group-housed pigs based on PigMS R-CNN. Sensors, 2021; 21(9): 3251.
[21] Liao B, Hu J L, Gilmore R O. Optical flow estimation combining with illumination adjustment and edge refinement in livestock UAV videos. Computers and Electronics in Agriculture, 2021; 180: 105910.
[22] Zhou T, Song Y Y, Qin J, Wu J, Yu H. Improved L-K optical flow method for moving target detection. Journal of Fujian Computer, 2020; 36(8): 10–13. (in Chinese)
[23] Li Y Z Z, Johnston L J, Dawkins M S. Utilization of optical flow algorithms to monitor development of tail biting outbreaks in pigs. Animals, 2020; 10(2): 323.
[24] Lian Z C, Feng C J, Liu Z G, Huang C Y, Xu C S, Sun J. A novel scale insensitive KCF tracker based on HOG and color features. Journal of Circuits, Systems and Computers, 2020; 29(11): 2050183.
[25] Mathis A, Mamidanna P, Cury K M, Abe T, Murthy V N, Mathis M W, et al. DeepLabCut: Markerless pose estimation of user-defined body parts with deep learning. Nature Neuroscience, 2018; 21(9): 1281–1289.
[26] Nath T, Mathis A, Chen A C, Patel A, Bethge M, Mathis M W. Using DeepLabCut for 3D markerless pose estimation across species and behaviors. Nature Protocols, 2019; 14: 2152–2176.
[27] Alameer A, Kyriazakis I, Bacardit J. Automated recognition of postures and drinking behaviour for the detection of compromised health in pigs. Scientific Reports, 2020; 10: 13665.
[28] Cheng F, Zhang T M, Zheng H K, Huang J D, Cuan K X. Pose estimation and behavior classification of broiler chickens based on deep neural networks. Computers and Electronics in Agriculture, 2021; 180: 105863.
[29] Ma C G, Guo Y Y, Wu P, Liu H B. Review of image enhancement based on generative adversarial networks. Netinfo Security, 2019; 5: 10–21. (in Chinese)
[30] Liu F, Yang C Y, Yu X C, Qi J Y. Spectral graph convolutional neural network for decentralized dual differential privacy. Netinfo Security, 2022; 22(2): 39–46. (in Chinese)
[31] Liu S, Zhang X L. Intrusion detection system based on dual attention. Netinfo Security, 2022; 22(1): 80–86. (in Chinese)
Published
2024-02-06
How to Cite
Liu, C., Ye, H., Wang, L., Lu, S., & Li, L. (2024). Novel tracking method for the drinking behavior trajectory of pigs. International Journal of Agricultural and Biological Engineering, 16(6), 67–76. Retrieved from https://ijabe.migration.pkpps03.publicknowledgeproject.org/index.php/ijabe/article/view/7450
Issue
Vol. 16 No. 6 (2023)
Section
Animal, Plant and Facility Systems
License
IJABE is an international peer-reviewed open access journal, adopting Creative Commons copyright notices as follows.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).