Acoustic/Lidar Sensor Fusion for Car Tracking in City Traffic Scenarios

In this paper we describe a sound source localization approach which, combined
with data from lidar sensors, can be used to improve object tracking for an
autonomous car. After explaining the chosen sensor setup, we show how acoustic data from
two Kinect cameras, i.e., multiple microphones, mounted on top of a car can be
combined to derive an object's direction and distance. Part of this work focuses on a method
for handling unsynchronized data between the multiple acoustic sensors. Finally, we describe
how the sound source localization approach was evaluated using data from the lidar sensors.
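To illustrate the direction-estimation idea summarized above, the following is a minimal sketch of microphone-pair sound source localization via time difference of arrival (TDOA). It is not the paper's actual method; the microphone spacing, sample rate, far-field plane-wave model, and plain cross-correlation (rather than, e.g., GCC-PHAT) are all assumptions made for illustration.

```python
# Illustrative TDOA sketch only; all constants below are assumed values,
# not taken from the paper's sensor setup.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C
MIC_DISTANCE = 0.2       # m, assumed microphone spacing
SAMPLE_RATE = 16000      # Hz, assumed

def estimate_doa(left: np.ndarray, right: np.ndarray) -> float:
    """Return the direction of arrival in degrees (0 = broadside),
    assuming a far-field source and a two-microphone array."""
    # Cross-correlate the two channels; the lag of the correlation
    # peak is the TDOA in samples.
    corr = np.correlate(left, right, mode="full")
    lag = int(np.argmax(corr)) - (len(right) - 1)
    tdoa = lag / SAMPLE_RATE
    # Clip to the physically possible range before taking the arcsine.
    sin_theta = np.clip(tdoa * SPEED_OF_SOUND / MIC_DISTANCE, -1.0, 1.0)
    return float(np.degrees(np.arcsin(sin_theta)))

# Synthetic check: the same noise signal, delayed by 5 samples on the
# left channel, should yield a positive off-broadside angle.
rng = np.random.default_rng(0)
sig = rng.standard_normal(1024)
delay = 5
left = np.concatenate([np.zeros(delay), sig])
right = np.concatenate([sig, np.zeros(delay)])
angle = estimate_doa(left, right)
```

A real system would additionally need the cross-sensor time alignment the paper addresses, since a clock offset between the two Kinect devices shifts the apparent TDOA and hence the estimated direction.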