Publication detail

Convolutional Neural Networks for the Odometry Estimation

VEĽAS, M.; ŠPANĚL, M.; HRADIŠ, M.; HEROUT, A.

Original Title

Convolutional Neural Networks for the Odometry Estimation

English Title

Convolutional Neural Networks for the Odometry Estimation

Type

journal article - other

Language

en

Original Abstract

This article presents a novel method for odometry estimation from the 3D data of a Velodyne LiDAR scanner using convolutional neural networks. For training and forward evaluation of the proposed networks, the original data are encoded into 2D matrices. In experiments with the KITTI dataset, our networks show significantly higher accuracy in the estimation of the translational motion parameters compared to the state-of-the-art LOAM method. In addition, they achieve higher speed and real-time performance. Using data provided by the IMU sensor, it is possible to estimate odometry and align the point cloud with high precision. The proposed method can replace odometry estimation from wheel encoders or supplement missing GPS data when the GNSS signal is not available (for example, during indoor mapping). In addition, we propose alternative CNNs for the estimation of the rotational motion that achieve results comparable to the state of the art. Our solution delivers the real-time performance and accuracy needed to provide an online preview of the mapping and to verify the completeness of the map during the mission.
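The abstract mentions encoding the scanner's 3D data into 2D matrices for the CNNs. A minimal sketch of one common such encoding — a spherical range-image projection, with assumed Velodyne-like resolutions and field of view; the paper's exact encoding and channels may differ — is:

```python
import numpy as np

def encode_scan(points, h_res=1.0, v_res=0.4, v_fov=(-24.8, 2.0)):
    """Spherically project an (N, 3) LiDAR point cloud into a 2D depth
    matrix. Resolutions (degrees per cell) and the vertical field of
    view are illustrative Velodyne-class values, not the paper's."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    depth = np.linalg.norm(points, axis=1)           # range to each point
    yaw = np.degrees(np.arctan2(y, x))               # horizontal angle, [-180, 180]
    pitch = np.degrees(np.arcsin(z / np.maximum(depth, 1e-9)))
    cols = ((yaw + 180.0) / h_res).astype(int)       # column = horizontal bin
    rows = ((v_fov[1] - pitch) / v_res).astype(int)  # row = vertical bin, top down
    width = int(round(360.0 / h_res))
    height = int(round((v_fov[1] - v_fov[0]) / v_res)) + 1
    img = np.zeros((height, width), dtype=np.float32)
    keep = (rows >= 0) & (rows < height) & (cols >= 0) & (cols < width)
    img[rows[keep], cols[keep]] = depth[keep]        # cell value = measured range
    return img
```

Each laser ring maps to a row and each horizontal bearing to a column, so consecutive scans become fixed-size matrices that a standard CNN can consume.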

English abstract

This article presents a novel method for odometry estimation from the 3D data of a Velodyne LiDAR scanner using convolutional neural networks. For training and forward evaluation of the proposed networks, the original data are encoded into 2D matrices. In experiments with the KITTI dataset, our networks show significantly higher accuracy in the estimation of the translational motion parameters compared to the state-of-the-art LOAM method. In addition, they achieve higher speed and real-time performance. Using data provided by the IMU sensor, it is possible to estimate odometry and align the point cloud with high precision. The proposed method can replace odometry estimation from wheel encoders or supplement missing GPS data when the GNSS signal is not available (for example, during indoor mapping). In addition, we propose alternative CNNs for the estimation of the rotational motion that achieve results comparable to the state of the art. Our solution delivers the real-time performance and accuracy needed to provide an online preview of the mapping and to verify the completeness of the map during the mission.

Keywords

Odometry, Velodyne, LiDAR, CNN, KITTI

Released

24.01.2019

Publisher

Not stated

Location

Not stated

ISSN

0921-0296

Periodical

Journal of Intelligent and Robotic Systems

Volume

2019

Number

93

Country

NL

Pages from

1

Pages to

22

Pages count

22

BibTeX


@article{BUT162266,
  author="Martin {Veľas} and Michal {Španěl} and Michal {Hradiš} and Adam {Herout}",
  title="Convolutional Neural Networks for the Odometry Estimation",
  annote="This article presents a novel method for odometry estimation from the 3D
data of a Velodyne LiDAR scanner using convolutional neural networks. For
training and forward evaluation of the proposed networks, the original data are
encoded into 2D matrices. In experiments with the KITTI dataset, our networks
show significantly higher accuracy in the estimation of the translational motion
parameters compared to the state-of-the-art LOAM method. In addition, they
achieve higher speed and real-time performance. Using data provided by the IMU
sensor, it is possible to estimate odometry and align the point cloud with high
precision. The proposed method can replace odometry estimation from wheel
encoders or supplement missing GPS data when the GNSS signal is not available
(for example, during indoor mapping). In addition, we propose alternative CNNs
for the estimation of the rotational motion that achieve results comparable to
the state of the art. Our solution delivers the real-time performance and
accuracy needed to provide an online preview of the mapping and to verify the
completeness of the map during the mission.",
  address="Not stated",
  chapter="162266",
  edition="Not stated",
  howpublished="online",
  institution="Not stated",
  number="93",
  volume="2019",
  year="2019",
  month="jan",
  pages="1--22",
  publisher="Not stated",
  type="journal article - other"
}