Novel method for real-time detection and tracking of pig body and its different parts

Authors

  • Fuen Chen School of Electronic and Information Engineering, Beijing Jiaotong University, Beijing 100044, China
  • Xiaoming Liang School of Electronic and Information Engineering, Beijing Jiaotong University, Beijing 100044, China
  • Longhan Chen Department of Mechanical Engineering, Oakland University, Rochester, MI 48309, USA
  • Baoyuan Liu School of Electronics and Electrical Engineering, Beijing Jiaotong University Haibin College, Huanghua 061199, Hebei, China
  • Yubin Lan College of Engineering, South China Agricultural University, Guangzhou 510642, China

Keywords:

computer vision, CNN, pig, YOLACT, detection and tracking

Abstract

Detecting and tracking all major parts of the pig body can make the analysis of pig behavior more productive. To achieve this goal, a real-time algorithm based on You Only Look At CoefficienTs (YOLACT) was proposed. A pig body was divided into ten parts: one head, one trunk, four thighs, and four shanks. The key points of each part were calculated by a novel algorithm based mainly on a combination of the Zhang-Suen thinning algorithm and the gravity (center-of-gravity) algorithm. The experimental results showed that these parts of the pig body could be detected and tracked, and their contributions to overall pig activity could also be identified. The detection accuracy of the algorithm on the data set reached up to 90%, and the processing speed reached 30.5 fps. Furthermore, the algorithm was robust and adaptive.

DOI: 10.25165/j.ijabe.20201306.5820

Citation: Chen F E, Liang X M, Chen L H, Liu B Y, Lan Y B. Novel method for real-time detection and tracking of pig body and its different parts. Int J Agric & Biol Eng, 2020; 13(6): 144–149.
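The key-point computation described in the abstract combines the Zhang-Suen thinning algorithm with a center-of-gravity (centroid) calculation. As an illustration of those two building blocks only (not the authors' implementation, whose details are in the full paper), here is a minimal pure-Python sketch, assuming each body-part mask is given as a 0/1 grid with a zero border:

```python
def zhang_suen_thin(img):
    """Zhang-Suen thinning of a binary image (list of lists of 0/1).

    Iteratively peels boundary pixels in two sub-passes until a
    one-pixel-wide skeleton remains.  Border pixels are assumed 0.
    """
    img = [row[:] for row in img]          # work on a copy
    h, w = len(img), len(img[0])

    def neighbours(y, x):
        # P2..P9, clockwise starting from the pixel directly above
        return [img[y-1][x], img[y-1][x+1], img[y][x+1], img[y+1][x+1],
                img[y+1][x], img[y+1][x-1], img[y][x-1], img[y-1][x-1]]

    def transitions(n):
        # number of 0 -> 1 transitions in the circular sequence P2..P9..P2
        return sum((a, b) == (0, 1) for a, b in zip(n, n[1:] + n[:1]))

    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            to_delete = []
            for y in range(1, h - 1):
                for x in range(1, w - 1):
                    if img[y][x] != 1:
                        continue
                    n = neighbours(y, x)
                    P2, _, P4, _, P6, _, P8, _ = n
                    if not (2 <= sum(n) <= 6 and transitions(n) == 1):
                        continue
                    if step == 0:   # first sub-iteration conditions
                        ok = P2 * P4 * P6 == 0 and P4 * P6 * P8 == 0
                    else:           # second sub-iteration conditions
                        ok = P2 * P4 * P8 == 0 and P2 * P6 * P8 == 0
                    if ok:
                        to_delete.append((y, x))
            for y, x in to_delete:
                img[y][x] = 0
                changed = True
    return img


def centroid(img):
    """Center of gravity (x, y) of the foreground pixels."""
    pts = [(x, y) for y, row in enumerate(img)
           for x, v in enumerate(row) if v]
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))
```

On a binary part mask, `centroid(zhang_suen_thin(mask))` yields a skeleton-based key point; how exactly the paper fuses the two steps per body part is described in the full text.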

Author Biographies

Fuen Chen, School of Electronic and Information Engineering, Beijing Jiaotong University, Beijing 100044, China

Fuen Chen, PhD, Associate Professor. Email: fechen@bjtu.edu.cn; Tel: 18601145108

Baoyuan Liu, School of Electronics and Electrical Engineering, Beijing Jiaotong University Haibin College, Huanghua 061199, Hebei, China

Baoyuan Liu, Master, Lecturer, Beijing Jiaotong University Haibin College

Yubin Lan, College of Engineering, South China Agricultural University, Guangzhou 510642, China

Yubin Lan, PhD, Agricultural Engineer, USDA-ARS-SPARC-APMRU, 2771 F&B Road, College Station, TX 77845, USA. Email: ylan@scau.edu.cn; Tel: 13922707507; Fax: (979) 260-9386

References

[1] OECD, FAO. OECD-FAO Agricultural Outlook 2019-2028. https://doi.org/10.1787/agr_outlook-2019-en.
[2] Zhang Y F. The transformation and innovation of the management concept of the current large-scale pig farm. Feed and Animal Husbandry-Scale Pig Raising, 2011; 9: 21–25. (in Chinese)
[3] Li Y Y, Sun L Q, Zou Y B, Li Y. Individual pig object detection algorithm based on Gaussian mixture model. Int J Agric & Biol Eng, 2017; 10(5): 186–193.
[4] Sun L, Li Z, Duan Q, Sun X, Li J. Automatic monitoring of pig excretory behavior based on motion feature. Sensor Letters, 2014; 12(3): 673–677.
[5] Porto S M C, Arcidiacono C, Anguzza U, Cascone G. A computer vision-based system for the automatic detection of lying behavior of dairy cows in free-stall barns. Biosystems Engineering, 2013; 115(2): 184–194.
[6] Zuo S, Jin L, Chung Y, Park D. An index algorithm for tracking pigs in pigsty. In: International Conference on Industrial Electronics and Engineering, 2015; pp.797–804.
[7] Ma C, Wang Y, Ying G. The pig breeding management system based on RFID and WSN. In: 2011 Fourth International Conference on Information and Computing, IEEE, 2011; pp.30–33.
[8] Zhu W, Zhong F, Li X. Automated monitoring system of pig behavior based on RFID and ARM-LINUX. In: 2010 Third International Symposium on Intelligent Information Technology and Security Informatics. IEEE, 2010; pp.431–434.
[9] Chen Y-R, Chao K, Kim M S. Machine vision technology for agricultural applications. Computers and Electronics in Agriculture, 2002; 36(2): 173–191.
[10] Shao B, Xin H. A real-time computer vision assessment and control of thermal comfort for group-housed pigs. Computers and Electronics in Agriculture, 2008; 62(1): 15–21.
[11] Nasirahmadi A, Richter U, Hensel O, Edwards S, Sturm B. Using machine vision for investigation of changes in pig group lying patterns. Computers and Electronics in Agriculture, 2015; 119: 184–190.
[12] Kashiha M, Bahr C, Ott S, Moons C, Niewold T, Odberg F, et al. Automatic identification of marked pigs in a pen using image pattern recognition. Computers and Electronics in Agriculture, 2013; 93: 111–120.
[13] Ahrendt P, Gregersen T, Karstoft H. Development of a real-time computer vision system for tracking loose-housed pigs. Computers and Electronics in Agriculture, 2011; 76(2): 169–174.
[14] Xiao D Q, Feng A J, Liu J. Detection and tracking of pigs in natural environments based on video analysis. Int J Agric & Biol Eng, 2019; 12(4): 116–126.
[15] Dong X, Shen J, Yu D, Wang W, Liu J, Huang H. Occlusion-aware real-time object tracking. IEEE Transactions on Multimedia, 2017; 19(4): 763–771.
[16] Hua Y, Alahari K, Schmid C. Occlusion and motion reasoning for long-term tracking. Computer Vision – ECCV 2014. Springer International Publishing, 2014; pp.172–187.
[17] Yang H, Alahari K, Schmid C. Online object tracking with proposal selection. In: IEEE International Conference on Computer Vision. IEEE, 2015; pp.3092–3100.
[18] Girshick R, Donahue J, Darrell T, Malik J. Rich feature hierarchies for accurate object detection and semantic segmentation. In: Tech Report (v5). UC Berkeley, 2013; pp.580–587.
[19] Ren S, He K, Girshick R, Sun J. Faster R-CNN: Towards real-time object detection with region proposal networks. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2017; 39(6): 1137–1149.
[20] Bolya D, Zhou C, Xiao F, Lee Y J. YOLACT: real-time instance segmentation. In: The IEEE International Conference on Computer Vision (ICCV), 2019; pp.9157–9166.
[21] Liu W, Anguelov D, Erhan D, Szegedy C, Reed S, Fu C Y, et al. SSD: single shot multibox detector. In: 14th European Conference on Computer Vision (ECCV), Proceedings, Part I, Springer, 2016; pp.21–37.
[22] Redmon J, Farhadi A. YOLO9000: better, faster, stronger. In: 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, Hawaii, USA, 2017; Vol.1, pp.6517–6525.
[23] Silvera A M, Knowles T G, Butterworth A, Berckmans D, Vranken E, Blokhuis H J. Lameness assessment with automatic monitoring of activity in commercial broiler flocks. Poultry Science, 2017; 96(7): 2013–2017.
[24] Kongsro J. Development of a computer vision system to monitor pig locomotion. Open Journal of Animal Sciences, 2013; 3(3): 254–260.
[25] Oczak M, Viazzi S, Ismayilova G, Sonoda L, Roulston N, Fels M, Bahr C, Hartung J, Guarino M, Berckmans D, Vranken E. Classification of aggressive behavior in pigs by activity index and multilayer feed forward neural network. Biosystems Engineering, 2014; 119(4): 89–97.
[26] Ojukwu C C, Feng Y Z, Jia G F, Zhao H T, Tan H Q. Development of a computer vision system to detect inactivity in group-housed pigs. Int J Agric & Biol Eng, 2020; 13(1): 42–46.
[27] Anguelov D, Srinivasan P, Koller D, et al. SCAPE: shape completion and animation of people. ACM Transactions on Graphics, 2005; 24(3): 408–416.
[28] Ma M, Li Y B. 2D human pose estimation using multi-level dynamic model. ROBOT, 2016; 38: 587. (in Chinese)
[29] Xiao D, Feng A, Yang Q, Liu J, Zhang Z. Fast motion detection for pigs based on video tracking. Transactions of the CSAM, 2016; 47(10): 331, 351–357. (in Chinese)
[30] Kentaro Wada. Labelme: image polygonal annotation with Python. 2016. Available: https://github.com/wkentaro/labelme. Accessed on July 5, 2019.
[31] Abadi M. TensorFlow: learning functions at scale. ACM Sigplan Notices, 2016; 51(9): 1. doi:10.1145/3022670.2976746.
[32] Li Y B, Hao Y J, Liu E H. Calculation method of polygon center of gravity. Computer Application, 2005; 25(S1): 391–393. (in Chinese)
[33] Gu X D, Yu D H, Zhang L M. Image thinning using pulse coupled neural network. Pattern Recognition Letters, 2004; 25(9): 1075–1084.
[34] Rublee E, Rabaud V, Konolige K, Bradski G. ORB: An efficient alternative to SIFT or SURF. In: 2011 International Conference on Computer Vision, Barcelona, 2011; pp. 2564–2571.
[35] Sun L Q, Zou Y B, Li Y, Cai Z D, Li Y, Luo B, et al. Multi target pigs tracking loss correction algorithm based on Faster R-CNN. Int J Agric & Biol Eng, 2018; 11(5): 192–197.
[36] Čehovin L, Leonardis A, Kristan M. Visual object tracking performance measures revisited. IEEE Transactions on Image Processing, 2016; 25(3): 1261–1274.
[37] He K M, Gkioxari G, Dollár P, Girshick R. Mask R-CNN. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020; 42(2): 386–397.
[38] Tian Y Y. Precision of edge detection affected by smoothing operator of image. Computer Engineering and Applications, 2009; 45(32): 161–163. (in Chinese)
[39] Qu Y C, Deng C Y, Liu G L. The exploration on the characteristics of porcine behaviors and the improvement of pig feeding as well as management. Guizhou Animal Science and Veterinary Medicine, 2001; 25(5): 9–10. (in Chinese)
[40] Forkman B, Furuhaug I L, Jensen P. Personality, coping patterns, and aggression in piglets. Applied Animal Behaviour, 1995; 45(1-2): 31–42.
[41] Zhang Y F. The transformation and innovation of the current large-scale pig farm management concept. Feed and Animal Husbandry Large Scale Pig Raising, 2011; 9: 21–25. (in Chinese)

Published

2020-12-03

How to Cite

Chen, F., Liang, X., Chen, L., Liu, B., & Lan, Y. (2020). Novel method for real-time detection and tracking of pig body and its different parts. International Journal of Agricultural and Biological Engineering, 13(6), 144–149. Retrieved from https://ijabe.migration.pkpps03.publicknowledgeproject.org/index.php/ijabe/article/view/5820

Section

Information Technology, Sensors and Control Systems