Measurement of the distance from grain divider to harvesting boundary based on dynamic regions of interest
Abstract
Combine harvesters need to work along the crop boundary line during operation. Lateral deviation (the distance between the grain divider and the harvesting boundary line) is important navigation information, and improving the accuracy and real-time performance of its measurement is an effective way to improve navigation accuracy. To address the poor real-time performance and low measurement accuracy of existing lateral deviation measurement methods, a method that dynamically selects the region of interest during image processing was proposed and verified by field experiments. The calculation of lateral deviation includes the following stages: analyzing the average gray value of each column in the field image; delimiting a dynamic region of interest around the column with the maximum average gray value; extracting the rice boundary line with the probabilistic Hough transform; predicting the location of the boundary line with a Kalman filter; and measuring the lateral deviation with an inverse perspective transform. Analysis of images from different rice fields showed that the method can effectively identify crop boundary lines. In tests with cameras at different installation positions, the highest boundary-line extraction success rate was 96.9%, the average success rate was 94.8%, and real-time measurement of the lateral deviation took 0.065 s per frame. At a driving speed of 0.4 m/s, the straight-line tracking detection error was less than 4.3 cm, and the error increased gradually with speed. The algorithm has good real-time performance and high accuracy at low driving speeds.
Keywords: distance, harvesting boundary, dynamic, region of interest, combine harvester, measurement
DOI: 10.25165/j.ijabe.20211404.6138
Citation: Chen J, Song J, Guan Z H, Lian Y. Measurement of the distance from grain divider to harvesting boundary based on dynamic regions of interest. Int J Agric & Biol Eng, 2021; 14(4): 226–232.
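The abstract above outlines a five-stage pipeline. The sketch below illustrates the first three stages (column-wise gray analysis, dynamic ROI placement, probabilistic Hough extraction) under stated assumptions: it relies on OpenCV and NumPy with a grayscale input frame, and the ROI half-width, Canny thresholds, and Hough parameters are illustrative values, not values taken from the paper.

# Minimal sketch of the dynamic-ROI boundary extraction described in the abstract.
# Assumptions (not from the paper): grayscale uint8 input; ROI_HALF_WIDTH, Canny
# thresholds, and Hough parameters are illustrative values only.
import cv2
import numpy as np

ROI_HALF_WIDTH = 80  # half width (pixels) of the dynamic region of interest

def extract_boundary_line(gray_frame):
    """Return one boundary-line segment (x1, y1, x2, y2) in image coordinates, or None."""
    h, w = gray_frame.shape

    # Stage 1: average gray value of each image column.
    column_means = gray_frame.mean(axis=0)

    # Stage 2: centre a dynamic ROI on the column with the maximum average gray value.
    peak_col = int(np.argmax(column_means))
    left = max(0, peak_col - ROI_HALF_WIDTH)
    right = min(w, peak_col + ROI_HALF_WIDTH)
    roi = gray_frame[:, left:right]

    # Stage 3: edge detection + probabilistic Hough transform, restricted to the ROI.
    edges = cv2.Canny(roi, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=60,
                            minLineLength=h // 3, maxLineGap=20)
    if lines is None:
        return None

    # Keep the longest detected segment as the crop-boundary candidate.
    def seg_len(line):
        x1, y1, x2, y2 = line[0]
        return np.hypot(x2 - x1, y2 - y1)

    x1, y1, x2, y2 = max(lines, key=seg_len)[0]
    # Shift x-coordinates from ROI space back to full-image space.
    return x1 + left, y1, x2 + left, y2

Restricting edge detection and the Hough search to a narrow ROI keeps the per-frame cost low, which is the motivation for selecting the region of interest dynamically.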
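For the prediction stage, a minimal Kalman-filter sketch is given below, assuming the boundary line is parameterised by the x-coordinates where it crosses the top and bottom image rows and follows a constant-velocity model between frames; the noise covariances are illustrative assumptions, not values from the paper.

# Minimal sketch of the Kalman prediction step: track the boundary line between frames
# so a missed detection can be bridged by the predicted line position.
# Assumptions (not from the paper): state = [x_top, x_bottom, vx_top, vx_bottom],
# constant-velocity model, illustrative noise covariances.
import cv2
import numpy as np

kf = cv2.KalmanFilter(4, 2)  # 4 state variables, 2 measured variables
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], dtype=np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], dtype=np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
kf.errorCovPost = np.eye(4, dtype=np.float32)

def track_line(measurement):
    """Predict the line for this frame; correct with the detection when one is available."""
    predicted = kf.predict()                      # prior estimate of [x_top, x_bottom, ...]
    if measurement is not None:                   # measurement = (x_top, x_bottom) from the Hough step
        kf.correct(np.array(measurement, dtype=np.float32).reshape(2, 1))
    return float(predicted[0, 0]), float(predicted[1, 0])

The prediction could also be used to bound the next frame's ROI when the gray-value peak is ambiguous.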
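For the final stage, the sketch below converts a detected boundary point from image pixels to ground-plane coordinates with an inverse perspective (homography) mapping and reads off the lateral deviation; the calibration points, their ground coordinates, and the divider_offset_m parameter are hypothetical values for illustration.

# Minimal sketch of the inverse perspective step: map a detected boundary point from
# image pixels to ground-plane coordinates and read off the lateral deviation.
# Assumptions (not from the paper): the homography comes from a one-time calibration
# with four ground reference points; all coordinates below are illustrative.
import cv2
import numpy as np

# Four pixel positions and their known ground-plane positions (metres), measured once
# for the camera mounting used on the harvester (hypothetical values).
image_pts = np.float32([[420, 700], [860, 700], [760, 420], [520, 420]])
ground_pts = np.float32([[-0.5, 1.0], [0.5, 1.0], [0.5, 3.0], [-0.5, 3.0]])
H = cv2.getPerspectiveTransform(image_pts, ground_pts)

def lateral_deviation(boundary_px, divider_offset_m=0.0):
    """Distance (m) from the grain divider to the boundary point given in pixels."""
    pt = np.float32([[boundary_px]])              # shape (1, 1, 2) for perspectiveTransform
    gx, gy = cv2.perspectiveTransform(pt, H)[0, 0]
    # Lateral deviation is the ground-plane x offset of the boundary relative to the
    # divider tip, whose own lateral position is captured by divider_offset_m.
    return gx - divider_offset_m

The homography only needs to be computed once for a fixed camera mounting, so mapping a single point per frame adds negligible cost.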
Published
2021-07-31
How to Cite
Chen, J., Song, J., Guan, Z., & Lian, Y. (2021). Measurement of the distance from grain divider to harvesting boundary based on dynamic regions of interest. International Journal of Agricultural and Biological Engineering, 14(4), 226–232. Retrieved from https://ijabe.migration.pkpps03.publicknowledgeproject.org/index.php/ijabe/article/view/6138
Issue
Vol. 14 No. 4 (2021)
Section
Information Technology, Sensors and Control Systems
License
IJABE is an international peer-reviewed open access journal that adopts the following Creative Commons copyright notices.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).