Design and experiment of visual navigated UGV for orchard based on Hough matrix and RANSAC
Keywords: visual navigation, unmanned ground vehicle, Hough matrix, RANSAC algorithm, orchard, H-component

Abstract
The objective of this study was to develop a visual navigation system capable of guiding an unmanned ground vehicle (UGV) travelling between tree rows in an outdoor orchard. While most previous research has developed algorithms based on ground structures in the orchard, this study focused on the canopy-plus-sky background to eliminate interference factors such as inconsistent lighting, shadows, and color similarity between features. Because the traditional Hough transform and the least squares method are difficult to apply under outdoor conditions, an algorithm combining the Hough matrix and random sample consensus (RANSAC) was proposed to extract the navigation path. In the image segmentation stage, the H-component was used to extract the target region of canopy plus sky. After denoising and smoothing the image with morphological operations, line scanning was used to determine the midpoints of the target region. For navigation path extraction, feature points were selected through the Hough matrix to eliminate redundant points, and RANSAC was used to reduce the influence of noise points caused by differing canopy shapes and to fit the navigation path. The path acquisition experiment showed that the accuracy of the Hough matrix and RANSAC method was 90.36%-96.81% and that the program ran within 0.55 s under different sunlight intensities. This method was superior to the traditional Hough transform in both real-time performance and accuracy, and it was more accurate, though slightly slower, than the least squares method. Furthermore, an OPENMV camera was used to capture ground information in the orchard; experiments showed that its recognition rate for identifying turning information was 100%, with a program running time of 0.17-0.19 s. Field experiments showed that the UGV could autonomously navigate the rows with a maximum lateral error of 0.118 m and turn automatically. The algorithm satisfied the practical operation requirements of autonomous vehicles in the orchard, so the system has the potential to guide multipurpose agricultural vehicles in outdoor orchards in the future.

DOI: 10.25165/j.ijabe.20211406.5953

Citation: Zhou M K, Xia J F, Yang F, Zheng K, Hu M J, Li D, et al. Design and experiment of visual navigated UGV for orchard based on Hough matrix and RANSAC. Int J Agric & Biol Eng, 2021; 14(6): 176–184.
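As an illustration of the pipeline described above (not the authors' implementation), the following Python sketch assembles the main steps with OpenCV and NumPy: H-component thresholding, morphological denoising and smoothing, row-by-row midpoint scanning, and a basic RANSAC line fit on the midpoints. The Hough-matrix feature-point selection step is omitted, and the input file name, hue thresholds, and RANSAC parameters are hypothetical.

```python
# Illustrative sketch only: extract a navigation line from an orchard image
# following the pipeline in the abstract. All thresholds and parameters are
# hypothetical; the Hough-matrix feature-point step is not reproduced here.
import cv2
import numpy as np

def segment_canopy_sky(bgr, h_low=90, h_high=130):
    """Threshold the H component of the HSV image to keep the canopy-plus-sky region."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv[:, :, 0], h_low, h_high)
    # Morphological opening and closing to denoise and smooth the binary image.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (9, 9))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    return mask

def row_midpoints(mask):
    """Scan the mask row by row and return the midpoint of the target region on each row."""
    points = []
    for y in range(mask.shape[0]):
        xs = np.flatnonzero(mask[y] > 0)
        if xs.size > 0:
            points.append((0.5 * (xs[0] + xs[-1]), float(y)))
    return np.asarray(points)

def ransac_line(points, n_iter=200, inlier_tol=5.0, seed=0):
    """Fit x = a*y + b to the midpoints with a basic RANSAC loop (outlier-tolerant)."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iter):
        i, j = rng.choice(len(points), size=2, replace=False)
        (x1, y1), (x2, y2) = points[i], points[j]
        if abs(y2 - y1) < 1e-6:
            continue
        a = (x2 - x1) / (y2 - y1)
        b = x1 - a * y1
        residuals = np.abs(points[:, 0] - (a * points[:, 1] + b))
        inliers = residuals < inlier_tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit on the inlier set with least squares for the final navigation line.
    a, b = np.polyfit(points[best_inliers, 1], points[best_inliers, 0], 1)
    return a, b

if __name__ == "__main__":
    image = cv2.imread("frame.jpg")  # hypothetical input frame from the UGV camera
    mask = segment_canopy_sky(image)
    pts = row_midpoints(mask)
    slope, intercept = ransac_line(pts)
    print(f"navigation line: x = {slope:.3f} * y + {intercept:.3f}")
```

The fitted line can then be compared with the image centerline to derive a lateral offset and heading signal for the steering controller; the actual control law used in the paper is not reproduced here.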