
Dimitri Schachmann:

Visual IMU - Estimating Ego-Motion Using Optical Flow and Depth Images

Abstract

For robots it is crucial to accurately determine their own motion, be it for localization, trajectory generation or collision avoidance. This problem of ego-motion estimation is in general hard to solve. Different kinds of sensors can be used to estimate the motion of a robot, but the choice is always a trade-off between multiple quality factors and the monetary cost of the system. One approach is to use cameras and optical flow, which has proved challenging due to inherent ambiguities in the observed motion.

This work presents a new approach to the ego-motion problem that uses visual data only and a unique combination of existing computer vision techniques. Depth information from stereo data is used to remove the scale ambiguity, and a known calibration with respect to the ground makes it possible to filter out foreign motion. Focusing on applications with approximately downward-looking cameras, a transformation of the camera images into a top-view perspective aids a feature-based optical flow algorithm in matching points in the environment. RANSAC (Random Sample Consensus) and an Unscented Kalman Filter are then applied to fit a nonholonomic motion model to the matched points.
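To illustrate the front end of such a pipeline, the following is a minimal sketch (not the thesis implementation): consecutive frames are warped into a top view with an assumed ground-plane homography `H_topview` obtained from the camera-to-ground calibration, features are matched between the warped frames, and a planar rigid motion (yaw plus translation) is fitted with RANSAC so that outliers, e.g. foreign motion, are rejected. The metric scale per pixel (`m_per_px`) is assumed known from the calibration and depth data; the nonholonomic model and the Unscented Kalman Filter of the thesis are not reproduced here.

```python
# Sketch only: top-view warp, ORB matching, RANSAC fit of a 2D rigid motion.
# H_topview, out_size and m_per_px are assumed inputs from the calibration.
import cv2
import numpy as np

def rigid_fit(p, q):
    """Least-squares 2D rigid transform q ~ R @ p + t (Kabsch/Procrustes)."""
    cp, cq = p.mean(axis=0), q.mean(axis=0)
    U, _, Vt = np.linalg.svd((q - cq).T @ (p - cp))
    d = np.sign(np.linalg.det(U @ Vt))
    R = U @ np.diag([1.0, d]) @ Vt
    return R, cq - R @ cp

def ransac_rigid(p, q, iters=200, thresh=3.0):
    """RANSAC over 2-point samples; returns (R, t) and the inlier mask."""
    rng = np.random.default_rng(0)
    best, best_inl = None, np.zeros(len(p), dtype=bool)
    for _ in range(iters):
        i = rng.choice(len(p), size=2, replace=False)
        R, t = rigid_fit(p[i], q[i])
        err = np.linalg.norm(q - (p @ R.T + t), axis=1)
        inl = err < thresh
        if best is None or inl.sum() > best_inl.sum():
            best, best_inl = (R, t), inl
    if best_inl.sum() >= 2:                      # refit on all inliers
        best = rigid_fit(p[best_inl], q[best_inl])
    return best, best_inl

def ego_motion_step(prev_img, curr_img, H_topview, out_size, m_per_px):
    """Estimate planar ego-motion (translation in metres, yaw in radians)."""
    prev_top = cv2.warpPerspective(prev_img, H_topview, out_size)
    curr_top = cv2.warpPerspective(curr_img, H_topview, out_size)

    orb = cv2.ORB_create(nfeatures=1000)
    k1, d1 = orb.detectAndCompute(prev_top, None)
    k2, d2 = orb.detectAndCompute(curr_top, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)

    p = np.float32([k1[m.queryIdx].pt for m in matches])
    q = np.float32([k2[m.trainIdx].pt for m in matches])

    (R, t), inliers = ransac_rigid(p, q)         # outliers ~ foreign motion
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return t * m_per_px, yaw, inliers
```

In a full system, the per-frame estimates from such a step would feed the motion-model filter (in the thesis, an Unscented Kalman Filter over a nonholonomic model) rather than being used directly.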

A working implementation was developed and evaluated on experimental data in the course of this work. The results indicate that the approach can be applied successfully as an ego-motion estimator during low-speed, start-up and braking maneuvers.


Degree
Master of Science (M.Sc.)
Submission date
11.07.2016