Automatic greenhouse pest recognition based on multiple color space features
Keywords: ensemble learning classifier, greenhouse sticky trap, automated pest recognition and counting, HSI and Lab color spaces, multiple color space features

Abstract
Recognition and counting of greenhouse pests are important for monitoring and forecasting pest population dynamics. This study used image processing techniques to recognize and count whiteflies and thrips on sticky traps in a greenhouse environment. Digital images of the sticky traps were collected with an image-acquisition system under different greenhouse conditions. When a single color space is used, the small pests are difficult to segment correctly because of non-uniform illumination in complex scenes. Therefore, a method was proposed that first segments the target pests in two color spaces, applying the Prewitt operator to the I component of the hue-saturation-intensity (HSI) color space and the Canny operator to the B component of the Lab color space. The segmentation results from the two color spaces were then combined, achieving 91.57% segmentation accuracy. Next, because different features contribute differently to the classification of pest species, multiple features (e.g., color and shape features) were extracted in different color spaces for each segmented pest region to improve recognition performance. Twenty decision trees were combined into a strong ensemble learning classifier with a majority voting mechanism, obtaining 95.73% recognition accuracy. The proposed method is a feasible and effective way to process greenhouse pest images: the system accurately recognized and counted pests in sticky trap images captured under real greenhouse conditions.

DOI: 10.25165/j.ijabe.20211402.5098
Citation: Yang Z K, Li W Y, Li M, Yang X T. Automatic greenhouse pest recognition based on multiple color space features. Int J Agric & Biol Eng, 2021; 14(2): 188–195.
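As a rough illustration of the pipeline summarized above, the sketch below chains the two-color-space edge segmentation (Prewitt on the HSI intensity component, Canny on the Lab B component), per-region color/shape feature extraction in several color spaces, and a 20-tree majority-voting ensemble. It assumes OpenCV (4.x) and scikit-learn; all thresholds, the morphological kernel, the minimum-area filter, and the individual tree settings are illustrative assumptions, not the authors' published parameters.

```python
# Minimal sketch (not the authors' code) of the pipeline described in the abstract.
import cv2
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.tree import DecisionTreeClassifier


def segment_pests(bgr):
    """Return a binary mask and contours of candidate pest regions."""
    # HSI intensity component I = (R + G + B) / 3, filtered with Prewitt kernels
    intensity = bgr.mean(axis=2).astype(np.float32)
    kx = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], dtype=np.float32)
    gx = cv2.filter2D(intensity, -1, kx)
    gy = cv2.filter2D(intensity, -1, kx.T)
    prewitt = (np.hypot(gx, gy) > 40).astype(np.uint8) * 255   # threshold assumed

    # B component of the Lab color space, filtered with the Canny operator
    lab_b = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)[:, :, 2]
    canny = cv2.Canny(lab_b, 50, 150)                          # thresholds assumed

    # Combine the two edge maps, close small gaps, and extract regions
    edges = cv2.bitwise_or(prewitt, canny)
    mask = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return mask, [c for c in contours if cv2.contourArea(c) > 10]  # drop tiny noise


def region_features(bgr, contour):
    """Mean color in BGR/HSV/Lab plus simple shape descriptors for one region."""
    region_mask = np.zeros(bgr.shape[:2], np.uint8)
    cv2.drawContours(region_mask, [contour], -1, 255, thickness=cv2.FILLED)
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)
    color = (cv2.mean(bgr, region_mask)[:3]
             + cv2.mean(hsv, region_mask)[:3]
             + cv2.mean(lab, region_mask)[:3])
    area = cv2.contourArea(contour)
    perimeter = cv2.arcLength(contour, True)
    circularity = 4.0 * np.pi * area / (perimeter ** 2 + 1e-6)
    return list(color) + [area, perimeter, circularity]


def build_ensemble(n_trees=20):
    """Twenty decision trees combined by hard (majority) voting."""
    trees = [(f"tree_{i}", DecisionTreeClassifier(max_features="sqrt", random_state=i))
             for i in range(n_trees)]
    return VotingClassifier(estimators=trees, voting="hard")


# Usage sketch: fit on labelled training features, then classify and count
# pests (e.g., 0 = whitefly, 1 = thrips) on a new trap image.
# bgr = cv2.imread("trap.jpg")
# mask, contours = segment_pests(bgr)
# X = np.array([region_features(bgr, c) for c in contours])
# clf = build_ensemble(); clf.fit(X_train, y_train)
# counts = np.bincount(clf.predict(X))
```

The hard-voting ensemble of differently seeded trees is used here only as a stand-in for the paper's 20-tree majority-voting classifier; the original feature set, thresholds, and training protocol are those reported in the full article.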
References
[1] Qiao M, Lim J, Ji C W, Chung B K, Kim H Y, Uhm K B, et al. Density estimation of Bemisia tabaci (Hemiptera: Aleyrodidae) in a greenhouse using sticky traps in conjunction with an image processing system. J. Asia-Pacific Entomol, 2008; 11(1): 25–29.
[2] Pinto-Zevallos D M, Vänninen I. Yellow sticky traps for decision-making in whitefly management: what has been achieved? Crop Prot, 2013; 47: 74–84.
[3] Allen W A, Rajotte E G. The changing role of extension entomology in the IPM era. Annu. Rev. Entomol, 1990; 35: 379–397.
[4] Hu Y Q, Song L T, Zhang J, Xie C J, Li R. Pest image recognition of multi-feature fusion based on sparse representation. Pattern Recognition and Artificial Intelligence, 2014; 27(5): 985–992. (in Chinese)
[5] Kang S H, Song S H, Lee S H. Identification of butterfly species with a single neural network system. J. Asia-Pacific Entomol, 2012; 15(3): 431–435.
[6] Kang S H, Cho J H, Lee S H. Identification of butterfly based on their shapes when viewed from different angles using an artificial neural network. J. Asia-Pacific Entomol, 2014; 17(2): 143–149.
[7] Tofilski A. DrawWing, a program for numerical description of insect wings. Journal of Insect Science, 2004; 4(1): 17. doi: 10.1673/031.004.1701.
[8] Wang J, Lin C, Ji L, Liang A. A new automatic identification system of insect images at the order level. Knowl.-Based Syst, 2012; 33: 102–110.
[9] Li W Y, Li M, Qian J P, Sun C H, Du S F, Chen M X. Segmentation method for touching pest images based on shape factor and separation points location. Transactions of the Chinese Society of Agricultural Engineering, 2015; 31(6): 175–180. (in Chinese)
[10] Larios N, Soran B, Shapiro L G, Martínez-Muñoz G, Lin J, Dietterich T G. Haar random forest features and SVM spatial matching kernel for stonefly species identification. In: 2010 20th International Conference on Pattern Recognition. Istanbul: IEEE, 2010; pp.2624–2627.
[11] Larios N, Deng H L, Zhang W, Sarpola M, Yuen J, Paasch R, et al. Automated insect identification through concatenated histograms of local appearance features: feature vector generation and region detection for deformable objects. Machine Vision and Applications, 2008; 19: 105–123.
[12] Martinez-Munoz G, Larios N, Mortensen E, Zhang W, Yamamuro A, Paasch R, et al. Dictionary-free categorization of very similar objects via stacked evidence trees. In: 2009 IEEE Conference on Computer Vision and Pattern Recognition. Miami, FL: IEEE, 2009; pp.549–556.
[13] Lytle D A, Martínez-Muñoz G, Zhang W, Larios N, Shapiro L, Paasch R, et al. Automated processing and identification of benthic invertebrate samples. J. North Am. Benthol. Soc, 2010; 29(3): 867–874.
[14] Boissard P, Martin V, Moisan S. A cognitive vision approach to early pest detection in greenhouse crops. Comput. Electron. Agric, 2008; 62(2): 81–93.
[15] Xia C L, Chon T S, Ren Z M, Lee J M. Automatic identification and counting of small size pests in greenhouse conditions with low computational cost. Ecol Inform, 2015; 29(6): 139–146.
[16] Wang X, Hänsch R, Ma L Z, Hellwich O. Comparison of different color spaces for image segmentation using graph-cut. In: 2014 International Conference on Computer Vision Theory and Applications. Lisbon: IEEE, 2014; pp.301–308.
[17] Maharlooei M, Sivarajan S, Bajwa S G, Harmon J P, Nowatzki J. Detection of soybean aphids in a greenhouse using an image processing technique. Comput. Electron. Agric, 2017; 132: 63–70.
[18] Ebrahimi M A, Khoshtaghaza M H, Minaei S, Jamshidi B. Vision-based pest detection based on SVM classification method. Comput. Electron. Agric, 2017; 137(6): 52–58.
[19] Sun Y R, Cheng H, Cheng Q, Zhou H Y, Li M H, Fan Y H, et al. A smart-vision algorithm for counting whiteflies and thrips on sticky traps using two-dimensional Fourier transform spectrum. Biosystems Engineering, 2017; 153: 82–88.
[20] Heinz K M, Parrella M P, Newman J P. Time-efficient use of yellow sticky traps in monitoring insect populations. Journal of Economic Entomology, 1992; 85(6): 2263–2269.
[21] Steiner M Y, Spohr L J, Barchia I, Goodwin S. Rapid estimation of numbers of whiteflies (Hemiptera: Aleurodidae) and thrips (Thysanoptera: Thripidae) on sticky traps. Australian Journal of Entomology, 1999; 38: 367–372.
[22] Cho J, Choi J, Qiao M, Ji C W, Kim H Y, Uhm K, et al. Automatic identification of whiteflies, aphids and thrips in greenhouse based on image analysis. International Journal of Mathematics and Computers in Simulation, 2007; 1(1): 46–53.
[23] Espinoza E, Valera D L, Torres J A, López A, Molina-Aiz F D. Combination of image processing and artificial neural networks as a novel approach for the identification of Bemisia tabaci and Frankliniella occidentalis on sticky traps in greenhouse agriculture. Comput. Electron. Agric, 2016; 127(3): 495–505.
[24] Solis-Sánchez L O, García-Escalante J J, Castañeda-Miranda R, Torres-Pacheco I, Guevara-González R. Machine vision algorithm for whiteflies (Bemisia tabaci Genn.) scouting under greenhouse environment. Journal of Applied Entomology, 2009; 133: 546–552.
[25] Solis-Sánchez L O, Castañeda-Miranda R, García-Escalante J J, Torres-Pacheco I, Guevara-González R G, Castañeda Miranda C L, et al. Scale invariant feature approach for insect monitoring. Comput. Electron. Agric, 2011; 75(1): 92–99.
[26] Lowe D G. Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision, 2004; 60: 91–110.
[27] Xiao D Q, Feng J Z, Lin T Y, Pang C H, Ye Y W. Classification and recognition scheme for vegetable pests based on the BOF-SVM model. Int J Agric & Biol Eng, 2018; 11(3): 190–196.
[28] Deng L M, Wang Y J, Han Z Z, Yu R S. Research on insect pest image detection and recognition based on bio-inspired methods. Biosystems Engineering, 2018; 169(8): 139–148.
[29] Kohavi R, Provost F. Glossary of terms: special issue on applications of machine learning and the knowledge discovery process. Mach. Learn, 1998; 30: 271–274.
[30] Xia C L, Lee J M, Li Y, Chung B K, Chon T S. In situ detection of small-size insect pests sampled on traps using multifractal analysis.
Published
2021-04-03
How to Cite
Yang, Z., Li, W., Li, M., & Yang, X. (2021). Automatic greenhouse pest recognition based on multiple color space features. International Journal of Agricultural and Biological Engineering, 14(2), 188–195. Retrieved from https://ijabe.migration.pkpps03.publicknowledgeproject.org/index.php/ijabe/article/view/5098
Section
Information Technology, Sensors and Control Systems
License
IJABE is an international peer-reviewed open access journal that adopts the following Creative Commons copyright notices.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).