Development of phenotyping system using low altitude UAV imagery and deep learning
Keywords:
panicle detection, vision-based phenotyping, deep learning, unmanned aerial vehicle (UAV)
Abstract
In this study, a lightweight phenotyping system was proposed that combines the advantages of deep learning-based panicle detection with photogrammetry from light consumer-level UAVs. A two-year experiment was conducted for data collection and accuracy validation. A deep learning model, Mask Region-based Convolutional Neural Network (Mask R-CNN), was trained to detect panicles in complex scenes of paddy fields. A total of 13 857 images were fed into Mask R-CNN, with 80% used for training and 20% used for validation. The precision, recall, Average Precision (AP), and F1-score of Mask R-CNN were 82.46%, 80.60%, 79.46%, and 79.66%, respectively. A complete workflow was proposed to preprocess flight trajectories and to remove repeated detections and noise. Eventually, the evident changes in rice growth during the heading stage were visualized as geographic distributions, and the total number of panicles was predicted before harvest. The average error of the predicted panicle counts was 33.98%. Experimental results showed the feasibility of using the developed system as a high-throughput phenotyping approach.
DOI: 10.25165/j.ijabe.20211401.6025
Citation: Lyu S X, Noguchi N, Ospina R, Kishima Y. Development of phenotyping system using low altitude UAV imagery and deep learning. Int J Agric & Biol Eng, 2021; 14(1): 207–215.
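As a rough illustration of the detection step, the following is a minimal inference sketch built on the Matterport Keras/TensorFlow implementation of Mask R-CNN cited as reference [30]; it is not the authors' released code, and the config class, weight file, and image file names are hypothetical.

```python
# Minimal sketch (not the authors' code) of panicle detection with the
# Matterport Mask R-CNN implementation (reference [30]).
import skimage.io
import mrcnn.model as modellib
from mrcnn.config import Config

class PanicleConfig(Config):          # hypothetical config for illustration
    NAME = "panicle"
    NUM_CLASSES = 1 + 1               # background + rice panicle
    GPU_COUNT = 1                     # single-image inference
    IMAGES_PER_GPU = 1

model = modellib.MaskRCNN(mode="inference", config=PanicleConfig(),
                          model_dir="logs")
model.load_weights("mask_rcnn_panicle.h5", by_name=True)  # hypothetical weights

image = skimage.io.imread("uav_frame.jpg")                # one UAV frame (RGB)
r = model.detect([image], verbose=0)[0]
print(f"{len(r['rois'])} panicles detected; scores: {r['scores']}")
# r['masks'] holds one binary mask per detected panicle (H x W x N)
```

The workflow step that removes repeated detections can also be illustrated: because overlapping UAV frames see the same panicle more than once, detections must be merged after georeferencing, and reference [35] (DBSCAN) suggests density-based clustering as one plausible mechanism. A minimal sketch under that assumption, with illustrative parameter values rather than the paper's settings:

```python
# Sketch of duplicate-detection removal across overlapping frames.
# Assumption: a DBSCAN-style clustering step (reference [35]) merges
# georeferenced detections of the same panicle; eps is illustrative.
import numpy as np
from sklearn.cluster import DBSCAN

def merge_repeated_detections(centroids_m, eps_m=0.10):
    """centroids_m: (N, 2) detection centroids in a metric projection
    (e.g., local east/north in meters). Returns one point per cluster."""
    labels = DBSCAN(eps=eps_m, min_samples=1).fit_predict(centroids_m)
    return np.vstack([centroids_m[labels == k].mean(axis=0)
                      for k in np.unique(labels)])

pts = np.array([[0.00, 0.00], [0.03, 0.02],   # same panicle seen twice
                [1.50, 2.10]])                # a distinct panicle
print(merge_repeated_detections(pts))         # -> 2 merged detections
```

Raising `min_samples` above 1 would additionally flag isolated points as noise, which is one plausible reading of the abstract's noise-removal step.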
References
[1] Mosleh M K, Hassan Q K, Chowdhury E H. Application of remote sensors in mapping rice area and forecasting its production: A review. Sensors, 2015; 15(1): 769–791.
[2] Shamshiri R R, Ibrahim B, Balasundram S K, Taheri S, Weltzien C. Evaluating system of rice intensification using a modified transplanter: A smart farming solution toward sustainability of paddy fields in Malaysia. Int J Agric & Biol Eng, 2019; 12(2): 54–67.
[3] Shamshiri R R, Ibrahim B, Ahmad D, Che Man H, Wayayok A. An overview of the System of Rice Intensification for Paddy Fields of Malaysia. Indian J Sci Technol, 2018; 11(18): 1–16.
[4] Feng X, Jiang Y, Yang X, Du M, Li X. Computer vision algorithms and hardware implementations: A survey. Integration, the VLSI Journal, 2019; 69: 309–320.
[5] Li L, Zhang Q, Huang D. A review of imaging techniques for plant phenotyping. Sensors, 2014; 14(11): 20078–20111.
[6] Haw C L, Ismail W I W, Kairunniza-Bejo S, Putih A, Shamshiri R. Colour vision to determine paddy maturity. Int J Agric & Biol Eng, 2014; 7(5): 55–63.
[7] Mochida K, Koda S, Inoue K, Hirayama T, Tanaka S, Nishii R, et al. Computer vision-based phenotyping for improvement of plant productivity: a machine learning perspective. GigaScience, 2019; 8(1): 1–12.
[8] Tang L, Zhu Y, Hannaway D, Meng Y, Liu L, Chen L, et al. RiceGrow: A rice growth and productivity model. NJAS - Wageningen J Life Sci, 2009; 57(1): 83–92.
[9] Yoshida S. Fundamentals of Rice Crop Science. Los Baños: The International Rice Research Institute, 1981; 279p.
[10] Duan L, Huang C, Chen G, Xiong L, Liu Q, Yang W. Determination of rice panicle numbers during heading by multi-angle imaging. Crop J, 2015; 3(3): 211–219.
[11] Xiong X, Duan L, Liu L, Tu H, Yang P, Wu D, et al. Panicle-SEG: A robust image segmentation method for rice panicles in the field based on deep learning and superpixel optimization. Plant Methods, 2017; 13(1): 1–15.
[12] Guo W, Fukatsu T, Ninomiya S. Automated characterization of flowering dynamics in rice using field-acquired time-series RGB images. Plant Methods, 2015; 11(1): 7. doi: 10.1186/s13007-015-0047-9.
[13] Desai S V, Balasubramanian V N, Fukatsu T, Ninomiya S, Guo W. Automatic estimation of heading date of paddy rice using deep learning. Plant Methods, 2019; 15(1): 76. doi: 10.1186/s13007-019-0457-1.
[14] Zhou C, Ye H, Hu J, Shi X, Hua S, Yue J, et al. Automated counting of rice panicle by applying deep learning model to images from unmanned aerial vehicle platform. Sensors, 2019; 19(14): 3106. doi: 10.3390/s19143106.
[15] Ikeda M, Hirose Y, Takashi T, Shibata Y, Yamamura T, Komura T, et al. Analysis of rice panicle traits and detection of QTLs using an image analyzing method. Breed Sci, 2010; 60(1): 55–64.
[16] Zhu Y, Cao Z, Lu H, Li Y, Xiao Y. In-field automatic observation of wheat heading stage using computer vision. Biosyst Eng, 2016; 143: 28–41.
[17] Sadeghi-Tehran P, Sabermanesh K, Virlet N, Hawkesford M J. Automated method to determine two critical growth stages of wheat: Heading and flowering. Front Plant Sci, 2017; 8: 252. doi: 10.3389/fpls.2017.00252.
[18] Kamilaris A, Prenafeta-Boldú F X. Deep learning in agriculture: A survey. Comput Electron Agric, 2018; 147: 70–90.
[19] Araus J L, Cairns J E. Field high-throughput phenotyping: the new crop breeding frontier. Trends Plant Sci, 2014; 19(1): 52–61.
[20] Yang G, Liu J, Zhao C, Li Z, Huang Y, Yu H, et al. Unmanned aerial vehicle remote sensing for field-based crop phenotyping: Current status and perspectives. Front Plant Sci, 2017; 8: 1111. doi: 10.3389/fpls.2017.01111.
[21] Ampatzidis Y, Partel V. UAV-based high throughput phenotyping in citrus utilizing multispectral imaging and artificial intelligence. Remote Sens, 2019; 11(4): 410. doi: 10.3390/rs11040410.
[22] Gnädinger F, Schmidhalter U. Digital counts of maize plants by unmanned aerial vehicles (UAVs). Remote Sens, 2017; 9(6): 544. doi: 10.3390/rs9060544.
[23] Jin X L, Liu S Y, Baret F, Hemerlé M, Comar A. Estimates of plant density of wheat crops at emergence from very low altitude UAV imagery. Remote Sens Environ, 2017; 198: 105–114.
[24] He K, Gkioxari G, Dollár P, Girshick R. Mask R-CNN. IEEE Trans Pattern Anal Mach Intell, 2020; 42(2): 386–397.
[25] Girshick R, Donahue J, Darrell T, Malik J. Rich feature hierarchies for accurate object detection and semantic segmentation. In: 2014 IEEE Conference on Computer Vision and Pattern Recognition. Columbus: IEEE, 2014; pp.580–587.
[26] Ren S, He K, Girshick R, Sun J. Faster R-CNN: Towards real-time object detection with region proposal networks. IEEE Trans Pattern Anal Mach Intell, 2017; 39(6): 1137–1149.
[27] Shelhamer E, Long J, Darrell T. Fully convolutional networks for semantic segmentation. IEEE Trans Pattern Anal Mach Intell, 2017; 39(4): 640–651.
[28] He K, Zhang X, Ren S, Sun J. Deep residual learning for image recognition. In: 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2016; pp.770–778.
[29] Lin T-Y Y, Dollár P, Girshick R, He K, Hariharan B, Belongie S. Feature pyramid networks for object detection. In: Proceedings - 30th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2017. 2017; pp.936–944.
[30] Waleed A. Mask R-CNN for object detection and instance segmentation on Keras and TensorFlow. GitHub, 2017; Available: https://github.com/matterport/Mask_RCNN. Accessed on [2018-12-01].
[31] Neubeck A, Van Gool L. Efficient non-maximum suppression. In: 18th International Conference on Pattern Recognition (ICPR'06). Hong Kong: IEEE, 2006; pp.850–855.
[32] Fletcher S J. Kalman filter and smoother. In: Data Assimilation for the Geosciences. Elsevier, 2017; pp.765–782.
[33] Lee W C, Krumm J. Trajectory preprocessing. In: Zheng Y, Zhou X, editors. Computing with Spatial Trajectories. New York: Springer New York, 2011; pp.3–33.
[34] Sugiura R, Noguchi N, Ishii K. Remote-sensing technology for vegetation monitoring using an unmanned helicopter. Biosyst Eng, 2005; 90(4): 369–379.
[35] Ester M, Kriegel H P, Sander J, Xu X. A density-based algorithm for discovering clusters in large spatial databases with noise. In: Proceedings of the 2nd International Conference on Knowledge Discovery and Data Mining, 1996; pp.226–231.
[36] Archontoulis S V, Miguez F E. Nonlinear regression models and applications in agricultural research. Agron J, 2015; 107(2): 786. doi: 10.2134/agronj2012.0506.
[37] Sheehy J E. Bi-phasic growth patterns in rice. Ann Bot, 2004; 94(6): 811–817.