
Study improves agricultural robotics using 2D LiDAR technology

In an article published in the journal AgriEngineering, researchers presented solutions to the crop-row tracking problem using a four-wheel-steering mobile agricultural robot that straddles the crops.

Study: Autonomous Vineyard Tracking Using a Four-Wheel-Steering Mobile Robot and 2D LiDAR. Image Credit: MrDDK /

Given the movement of the four-wheel-steering mobile robot, a 2D LiDAR laser-scanner-based navigation approach was used to track crop rows in 3D. This made it possible to define a reference path, which was then followed using two different control strategies. The primary application of the new agricultural robotics approach was navigating vineyards to carry out various tasks, such as monitoring, cultivation, or micro-spraying. The first section details a row-tracking approach based on a 2D LiDAR tilted in front of the four-wheel-steering mobile robot, fitting an expected vine-row shape in the robot's frame.

A control architecture for a four-wheel-steering mobile robot is proposed in the second section. Two strategies were examined: one based on a backstepping approach, and one that servos the front and rear steering-axle positions independently. Using 3D reconstructions of realistic vineyards across different seasons, the results of these control laws were compared in a large-scale simulation framework.

Agricultural robotics in the current context

A vital issue for the ecological transformation of agriculture, in the context of vineyard production, is the autonomous navigation of off-road robots. By reducing the amount of chemicals needed and adopting bio-control solutions, field tasks can be completed more accurately and more frequently. The development of autonomous robots is further encouraged by the scarcity of labor and the arduousness of field work, especially in vineyards.

Farms are now deploying robots intensively to help farmers by relieving the strain of some manual activities. Agricultural robots also play various other roles, including improving agricultural productivity, reducing workers' exposure to chemicals sprayed on farms, and enabling more accurate and efficient farming. Centimeter-level accuracy with respect to the vegetation is essential for vineyard navigation, especially for straddle-type robots.

For robust navigation in the vineyard while accounting for weather conditions, lighting fluctuations, and vegetation growth, a 2D laser-scanner-based navigation approach was adopted in this paper. The authors were interested in investigating a generalizable, year-round LiDAR-scanner-based navigation technology. Since the task depends on vegetation development rather than precise global positioning, a GPS-free solution was proposed.

The laser-scanner-based navigation method determined the expected shape of the vine structure in each scan. Successive scans were then combined into a global map near the robot's position, and the path to follow was determined by accumulating points in 3D space while the robot moved. With the proposed navigation algorithm, a four-wheel-steering mobile robot can work in vineyards even though the vegetation changes with the seasons.
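The scan-accumulation step described above can be sketched as follows: each 2D LiDAR beam endpoint is projected into the global frame using the robot's pose at scan time, and successive scans are appended to a rolling map. This is a minimal illustration of the idea, not the authors' implementation; the function names and the flat point list are assumptions.

```python
import math

def scan_to_points(ranges, angle_min, angle_step, pose):
    """Project one 2D LiDAR scan into the global frame.

    ranges: list of range readings (m); pose: (x, y, yaw) of the robot
    when the scan was taken. Returns a list of (x, y) global points.
    """
    x0, y0, yaw = pose
    points = []
    for i, r in enumerate(ranges):
        if math.isinf(r) or math.isnan(r):
            continue  # drop invalid returns
        a = angle_min + i * angle_step
        # beam endpoint in the robot frame, then rotate/translate to global
        lx, ly = r * math.cos(a), r * math.sin(a)
        gx = x0 + lx * math.cos(yaw) - ly * math.sin(yaw)
        gy = y0 + lx * math.sin(yaw) + ly * math.cos(yaw)
        points.append((gx, gy))
    return points

# Successive scans are simply appended to a map local to the robot.
global_map = []
global_map += scan_to_points([1.0, 1.0], -0.1, 0.2, (0.0, 0.0, 0.0))
global_map += scan_to_points([1.0], 0.0, 0.2, (0.5, 0.0, 0.0))
```

In the paper, the accumulated points additionally carry the scanner's tilt, which turns the moving 2D scans into a 3D reconstruction of the row.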

Two distinct control schemes were introduced to demonstrate the effectiveness of the laser-scanner-based navigation algorithm. The first method took a backstepping approach, organizing the longitudinal and lateral controls separately so that the robot's positioning and its steering are managed independently. In contrast, the second strategy considered split-axle control, ensuring that the front axle and the rear axle follow the same path.
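A lateral control of the kind described above can be illustrated with a toy steering law that servos the lateral offset and heading error with respect to the reference row path. The gains and saturation value below are illustrative assumptions, and this simple proportional form stands in for the paper's full backstepping derivation.

```python
def front_steering(lateral_err, heading_err, k_y=0.5, k_psi=1.0, max_steer=0.4):
    """Toy lateral controller: steer to cancel the lateral offset (m) and
    heading error (rad) relative to the reference row path.

    k_y, k_psi and max_steer (rad) are illustrative values, not taken
    from the paper. Returns a saturated front steering angle.
    """
    delta = -(k_y * lateral_err + k_psi * heading_err)
    return max(-max_steer, min(max_steer, delta))
```

Longitudinal control (speed along the row) would run as a separate loop, which is the decoupling the first strategy exploits; the second strategy would instead compute a rear steering command so the rear axle tracks the same reference path.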

Methodology

The first step of the row-tracking technique described in this study was to use the LiDAR-scanner-based navigation framework to identify crop shapes fitting a predicted model. The goal was thus to have a mobile, self-guided, four-wheel-steering robot, equipped with a tilted 2D LiDAR navigation system, follow a path defined by the vine rows.

The strategy then entailed detecting the expected shape in each laser scan and combining successive scans into a comprehensive map. Finally, the path was determined by accumulating points in 3D space as the robot moved.


A four-wheel-steering mobile robot improves the mobility and maneuverability of vehicles, especially those operating in confined environments such as warehouses or agricultural settings such as vineyards. The four-wheel-steering mobile robot is modeled as a bicycle, that is, reduced to two wheels, one representing the front axle and the other the rear axle, with steering angles denoted δF and δR, respectively.
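The bicycle reduction with both axles steered can be sketched as a kinematic model: the body slip angle comes from the two steering angles, and the yaw rate from their difference. The specific discretization and parameter values below are assumptions for illustration, not the paper's model.

```python
import math

def step(state, v, delta_f, delta_r, L=1.2, dt=0.1):
    """One Euler step of a four-wheel-steering kinematic bicycle model.

    state = (x, y, yaw); v is forward speed (m/s); delta_f and delta_r
    are the front and rear steering angles (rad); L is the wheelbase (m).
    Illustrative parameter values, not taken from the paper.
    """
    x, y, yaw = state
    # body slip angle from the two steering angles
    beta = math.atan2(math.tan(delta_f) + math.tan(delta_r), 2.0)
    x += v * math.cos(yaw + beta) * dt
    y += v * math.sin(yaw + beta) * dt
    # yaw rate driven by the difference of the steering angles
    yaw += v * math.cos(beta) * (math.tan(delta_f) - math.tan(delta_r)) / L * dt
    return (x, y, yaw)
```

Note that equal front and rear angles (δF = δR) produce a pure "crab" translation with no yaw change, which is what lets the split-axle strategy place both axles on the same path.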

A sophisticated simulation testbed was developed to reduce expense and risk for both the four-wheel-steering mobile robot and the crops. The virtual vineyard must accurately reflect reality for the robot to behave as it would in a real vineyard. Hence, the proposed laser-scanner-based navigation was tried in several field settings, particularly those influencing plant development. Even in areas with missing vegetation, the proposed method showed accuracy within a few centimeters.

Four-wheel-steering mobile robots and the future of agricultural robots

Several contributions were made in this work regarding the use of agricultural robots for row tracking in a viticultural environment. The authors specifically addressed the tracking of crop rows using a four-wheel-steering mobile robot running between the crops. A 2D LiDAR tilted in front of the robot was introduced as the basis of the row-tracking method, matching the expected shape of the crop row in the frame of the four-wheel-steering mobile robot.

Through the distance-measurement system, the recognized regions of interest were grouped successively along the robot's local direction of travel. This assembly made it possible to determine the local course for the four-wheel-steering robot to follow. A control architecture was also proposed that enables the control of a four-wheel-steering mobile robot.
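One simple way to turn the grouped points into a local course, as described above, is a least-squares line fit through the accumulated row points, whose slope gives the local heading to follow. This is a minimal sketch of the idea with no outlier handling; the paper's regression operates on its detected regions of interest.

```python
import math

def local_course(points):
    """Fit a least-squares line through accumulated row points (x, y)
    and return the heading (rad) of the local course.

    Assumes the points are ordered roughly along the travel direction
    and that the row is not vertical in this frame (finite slope).
    """
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    # slope of the best-fit line y = my + slope * (x - mx)
    sxx = sum((p[0] - mx) ** 2 for p in points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points)
    return math.atan(sxy / sxx)
```

The need for a minimum number of points before this regression is meaningful is precisely the "blind" start-up phase the authors identify as a limitation below.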

A limitation of the proposed method is the mandatory blind phase at the start of the scanner-based navigation, caused by the requirement for a minimum number of distant points before the regression can be estimated and a relevant control computed. Since 3D LiDAR has the potential to shorten this blind phase, future studies will mainly focus on extending the laser-scanner-based navigation algorithm into 3D space.

Future studies will also address controlling a four-wheel-steering mobile robot while simultaneously controlling an attached implement, such as a sprayer. They will further involve using a reconstructed digital terrain model to identify obstacles, analyze row crossings, and avoid obstacles using 3D data. In addition, although validation was performed on a sophisticated simulation testbed built from a digital vineyard, future work will focus more on experiments in actual vineyards.


Iberraken, D., Gaurier, F., Roux, J.-C., Chaballier, C., and Lenain, R. (2022). Autonomous Vineyard Tracking Using a Four-Wheel-Steering Mobile Robot and a 2D LiDAR. AgriEngineering, 4(4), 826–846.

