Sensor fusion-based approach for the field robot localization on Rovitis 4.0 vineyard robot
Keywords:
localization, odometry, IMU, RTK GPS, vineyard, robot, sensor fusion, ROS, precision farming

Abstract
This study proposed an approach for robot localization using data from multiple low-cost sensors, with two goals in mind: to produce accurate localization data and to keep the computation as simple as possible. The approach used wheel odometry, inertial motion data from an inertial measurement unit (IMU), and a location fix from a real-time kinematic global positioning system (RTK GPS). Each of these sensors is prone to errors in certain situations, resulting in inaccurate localization. Wheel odometry suffers from wheel slip when the robot turns or drives on slippery ground, the IMU drifts due to vibrations, and the RTK GPS does not return an accurate fix in (semi-)occluded areas. None of these sensors alone is accurate enough for sound localization of the robot in an outdoor environment. To address this challenge, sensor fusion was implemented on the robot: at each moment it selects the most accurate available readings to produce a precise pose estimate. To evaluate the approach, two different tests were performed, one with the robot_localization package from the Robot Operating System (ROS) repository and the other with the presented Field Robot Localization. The first did not perform well, while the second did; it was evaluated by comparing its position and orientation estimates against ground truth captured by a drone hovering above the testing ground, which revealed an average error of 0.005 m±0.220 m in position and 0.6°±3.5° in orientation. The tests proved that the developed Field Robot Localization is accurate and robust enough to be used on the ROVITIS 4.0 vineyard robot.

DOI: 10.25165/j.ijabe.20221506.6415
Citation: Rakun J, Pantano M, Lepej P, Lakota M. Sensor fusion-based approach for the field robot localization on Rovitis 4.0 vineyard robot. Int J Agric & Biol Eng, 2022; 15(6): 91–95.
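The abstract only outlines the fusion logic. As a rough illustration of the selection idea it describes (prefer the RTK GPS fix when the receiver reports a reliable fix, otherwise dead-reckon with wheel odometry along the IMU heading), the following minimal Python sketch may help. All names, thresholds, and the choice of the IMU yaw as the trusted heading are illustrative assumptions, not the authors' implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float      # east position, m
    y: float      # north position, m
    theta: float  # heading, rad

def fuse_step(pose, wheel_dist, imu_yaw, gps_xy, gps_fix_ok):
    """One fusion step: pick the most trustworthy sources available.

    wheel_dist: distance travelled since the last step (wheel odometry), m
    imu_yaw:    absolute heading from the IMU, rad (trusted over odometry
                yaw, which degrades with wheel slip when turning)
    gps_xy:     (x, y) from the RTK GPS in the local frame, or None
    gps_fix_ok: True when the receiver reports an RTK fix (not occluded)
    """
    # Dead-reckon the position with wheel odometry along the IMU heading.
    x = pose.x + wheel_dist * math.cos(imu_yaw)
    y = pose.y + wheel_dist * math.sin(imu_yaw)

    # When an accurate RTK fix is available, it overrides the dead-reckoned
    # position and removes the drift accumulated while the fix was lost.
    if gps_fix_ok and gps_xy is not None:
        x, y = gps_xy

    return Pose(x, y, imu_yaw)

# Example: one step with a good RTK fix, then one step in an occluded row.
p = Pose(0.0, 0.0, 0.0)
p = fuse_step(p, wheel_dist=0.10, imu_yaw=0.05, gps_xy=(0.11, 0.01), gps_fix_ok=True)
p = fuse_step(p, wheel_dist=0.10, imu_yaw=0.05, gps_xy=None, gps_fix_ok=False)
print(p)
```

A production system (such as the EKF-based robot_localization package the authors compare against) would weight sources by their covariances rather than switching between them outright; the hard switch above is only meant to convey the "select the most accurate reading at a given moment" behaviour the abstract describes.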
Published
2022-12-27
How to Cite
Rakun, J., Pantano, M., Lepej, P., & Lakota, M. (2022). Sensor fusion-based approach for the field robot localization on Rovitis 4.0 vineyard robot. International Journal of Agricultural and Biological Engineering, 15(6), 91–95. Retrieved from https://ijabe.migration.pkpps03.publicknowledgeproject.org/index.php/ijabe/article/view/6415
Issue
Vol. 15 No. 6 (2022)
Section
Power and Machinery Systems
License
IJABE is an international peer reviewed open access journal, adopting Creative Commons Copyright Notices as follows.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).