Publication detail
Collar Line Segments for Fast Odometry Estimation from Velodyne Point Clouds
VEĽAS, M.; ŠPANĚL, M.; HEROUT, A.
Original Title
Collar Line Segments for Fast Odometry Estimation from Velodyne Point Clouds
English Title
Collar Line Segments for Fast Odometry Estimation from Velodyne Point Clouds
Type
conference paper
Language
en
Original Abstract
We present a novel way of estimating odometry from Velodyne LiDAR point cloud scans. The aim of our work is to overcome the most troublesome issues of Velodyne data - the sparsity and the sheer quantity of data points - in an efficient way, enabling more precise registration. Alignment of the point clouds, which yields the final odometry, is based on random sampling of the clouds using Collar Line Segments. The closest line segment pairs are identified in two sets of line segments obtained from two consecutive Velodyne scans. From each pair of correspondences, a transformation aligning the matched line segments into a 3D plane is estimated. In this way, significant planes (ground, walls, etc.) are preserved across the aligned point clouds. Evaluation on the KITTI dataset shows that our method outperforms GICP, a publicly available and commonly used state-of-the-art method for point cloud registration, in both accuracy and speed, especially when the scene lacks significant landmarks or typical urban elements. For such environments, the registration error of our method is reduced by 75% compared to the original GICP error.
English abstract
We present a novel way of estimating odometry from Velodyne LiDAR point cloud scans. The aim of our work is to overcome the most troublesome issues of Velodyne data - the sparsity and the sheer quantity of data points - in an efficient way, enabling more precise registration. Alignment of the point clouds, which yields the final odometry, is based on random sampling of the clouds using Collar Line Segments. The closest line segment pairs are identified in two sets of line segments obtained from two consecutive Velodyne scans. From each pair of correspondences, a transformation aligning the matched line segments into a 3D plane is estimated. In this way, significant planes (ground, walls, etc.) are preserved across the aligned point clouds. Evaluation on the KITTI dataset shows that our method outperforms GICP, a publicly available and commonly used state-of-the-art method for point cloud registration, in both accuracy and speed, especially when the scene lacks significant landmarks or typical urban elements. For such environments, the registration error of our method is reduced by 75% compared to the original GICP error.
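The registration pipeline outlined in the abstract (sample the scans into collar line segments, match the closest segments between consecutive scans, estimate a rigid transform from the matches, iterate) can be illustrated with a short sketch. This is not the authors' implementation: as a simplification it matches segment midpoints with a brute-force nearest-neighbour search and estimates a point-to-point rigid transform (Kabsch/Umeyama), whereas the paper aligns the matched line segments into a common 3D plane. All names below are hypothetical.

import numpy as np

# Hypothetical sketch, not the authors' code: midpoint-based registration of
# two sets of collar line segments, represented here only by their midpoints.

def closest_pairs(src_mid, dst_mid):
    # Brute-force nearest-neighbour matching of segment midpoints
    # (assumption for brevity; a k-d tree would be used for efficiency).
    pairs = []
    for i, p in enumerate(src_mid):
        j = int(np.argmin(np.linalg.norm(dst_mid - p, axis=1)))
        pairs.append((i, j))
    return pairs

def rigid_transform(src_pts, dst_pts):
    # Least-squares rigid alignment (Kabsch/Umeyama) of matched midpoints,
    # standing in for the paper's segment-to-plane alignment step.
    mu_s, mu_d = src_pts.mean(axis=0), dst_pts.mean(axis=0)
    H = (src_pts - mu_s).T @ (dst_pts - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return R, t

def register_scans(src_mid, dst_mid, iterations=10):
    # Iteratively re-match and re-estimate, ICP-style, accumulating the pose.
    R_total, t_total = np.eye(3), np.zeros(3)
    src = src_mid.copy()
    for _ in range(iterations):
        pairs = closest_pairs(src, dst_mid)
        s = np.array([src[i] for i, _ in pairs])
        d = np.array([dst_mid[j] for _, j in pairs])
        R, t = rigid_transform(s, d)
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

The relative pose estimated between each pair of consecutive scans is then chained over the sequence to obtain the odometry trajectory.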
Keywords
Velodyne LiDAR, point cloud registration, odometry estimation, collar line segments, ICP, generalized ICP, SLAM
Released
16.05.2016
Publisher
IEEE Computer Society
Location
Stockholm
ISBN
978-1-4673-8025-6
Book
Proceedings of IEEE International Conference on Robotics and Automation
Edition
Not stated
Edition number
Not stated
Pages from
4486
Pages to
4491
Pages count
6
URL
Documents
BibTeX
@inproceedings{BUT130914,
author="Martin {Veľas} and Michal {Španěl} and Adam {Herout}",
title="Collar Line Segments for Fast Odometry Estimation from Velodyne Point Clouds",
annote="We present a novel way of odometry estimation from Velodyne LiDAR point cloud
scans. The aim of our work is to overcome the most painful issues of Velodyne
data - the sparsity and the quantity of data points - in an efficient way,
enabling more precise registration. Alignment of the point clouds which yields
the final odometry is based on random sampling of the clouds using Collar Line
Segments. The closest line segment pairs are identified in two sets of line
segments obtained from two consequent Velodyne scans. From each pair of
correspondences, a transformation aligning the matched line segments into a 3D
plane is estimated. By this, significant planes (ground, walls, ...) are
preserved among aligned point clouds.
Evaluation using the KITTI dataset shows that our method outperforms publicly
available and commonly used state-of-the-art method GICP for point cloud
registration in both accuracy and speed, especially in cases where the scene
lacks significant landmarks or in typical urban elements. For such environments,
the registration error of our method is reduced by 75% compared to the original
GICP error.",
address="IEEE Computer Society",
booktitle="Proceedings of IEEE International Conference on Robotics and Automation",
chapter="130914",
doi="10.1109/ICRA.2016.7487648",
edition="NEUVEDEN",
howpublished="print",
institution="IEEE Computer Society",
year="2016",
month="may",
pages="4486--4491",
publisher="IEEE Computer Society",
type="conference paper"
}