Consistent lidar-only SLAM for legged agricultural robots in arboreal environments via robust dimensionality reduction
Nazate-Burgos, P., Torres-Torriti, M., Huang, S. and Auat Cheein, F. (2026) Consistent lidar-only SLAM for legged agricultural robots in arboreal environments via robust dimensionality reduction. Computers and Electronics in Agriculture, 247. ISSN 01681699
Published Version (PDF, 11 MB), available under a Creative Commons Attribution license.
Abstract
Simultaneous localization and mapping (SLAM) in arboreal environments presents unique challenges due to dense foliage, uneven terrain, and GNSS signal degradation. While state-of-the-art 3D lidar SLAM methods exist, they typically require IMU integration and assume smooth platform motion, with peak accuracy as a primary goal, making them unsuitable for legged robots on irregular terrain or when inertial sensors are unreliable. This paper presents a SLAM approach for robots with non-smooth motion in arboreal environments whose goal is consistent localization rather than peak accuracy, achieved via dimensionality reduction. The method requires spatially persistent vertical structures such as tree trunks. We adapt the modified Hausdorff distance as a robust metric for scan-to-map matching of 2D projections of 3D lidar data slices, eliminating the need for complex feature extraction, IMU integration, or GNSS corrections. This approach is particularly suited to legged robots and handheld mapping systems, where discontinuous motion invalidates the assumptions of existing methods. It is less suitable when distinctive vertical structures are absent, such as in very young orchards, espalier systems, or very sparse plantations. We evaluate our approach across simulated environments, controlled field tests, and three real-world datasets (CitrusFarm, Bacchus, and Pullally), including a novel dataset collected with a quadruped robot.
Our comprehensive evaluation demonstrates that when IMU data are unavailable or unreliable due to impulsive platform motion, the proposed approach maintains positioning accuracy between 0.40 m and 1.34 m across environments, with particularly robust performance on the challenging Pullally dataset featuring a legged robot (0.40 ± 0.05 m over three trials). By contrast, methods designed for IMU integration and dense 3D point-cloud matching either achieve superior performance on smooth terrain with wheeled robots (0.16–0.28 m on CitrusFarm) or fail catastrophically on legged platforms (>27 m on Pullally). The proposed method is compared against A-LOAM, LeGO-LOAM, DLO, MOLA, and AG-LOAM. Results show that our approach trades peak performance for consistent robustness across datasets and platform types. We release our code and the Pullally dataset to support further research in agricultural robotics: https://github.com/RAL-UC/RoSA_SLAM
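The matching metric named in the abstract can be sketched as follows. This is a minimal illustration of the modified Hausdorff distance (Dubuisson & Jain, 1994) on 2D point sets, not the paper's implementation: the lidar slicing, 2D projection, and pose search are omitted, and all names are illustrative.

```python
import numpy as np

def directed_mhd(A: np.ndarray, B: np.ndarray) -> float:
    """Directed term: mean distance from each point in A to its nearest neighbour in B."""
    # Pairwise Euclidean distances via broadcasting: shape (len(A), len(B))
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    return float(d.min(axis=1).mean())

def modified_hausdorff(A: np.ndarray, B: np.ndarray) -> float:
    """Modified Hausdorff distance: the larger of the two directed terms.
    Averaging nearest-neighbour distances (rather than taking their max, as
    the classical Hausdorff distance does) makes the metric robust to
    outliers, e.g. stray foliage returns around persistent trunk points."""
    return max(directed_mhd(A, B), directed_mhd(B, A))

# Toy example: a 2D "scan" of trunk cross-sections vs. the same points shifted 0.5 m
scan = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])
map_pts = scan + np.array([0.5, 0.0])
print(modified_hausdorff(scan, map_pts))  # 0.5 for this pure translation
```

In a scan-to-map pipeline, a candidate pose would transform the projected scan points and the pose minimizing this distance would be selected; that search loop is not shown here.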
| Item Type: | Article |
|---|---|
| Keywords: | SLAM, Lidar, Agricultural robotics, Legged robots, Arboreal environments, Dimensionality reduction, Robust distance metrics |
| Divisions: | Engineering |
| Depositing User: | Miss Anna Cope |
| Date Deposited: | 28 Apr 2026 13:51 |
| Last Modified: | 28 Apr 2026 13:51 |
| URI: | https://hau.repository.guildhe.ac.uk/id/eprint/18359 |

