
Asynchronous Planar Motion Estimation using Address Event Sensors

Abstract

Motion estimation is a central problem of computer vision essential
to many applications, such as optical flow and egomotion estimation.
Traditional frame-based cameras are limited to a temporal
resolution bounded by the camera's fixed frame rate.
Address Event Sensors (AES) are bio-inspired vision sensors characterized
by low latency, high dynamic range and high resilience
to motion blur. In contrast to traditional frame-based cameras, which
output frames at fixed time intervals, an AES generates asynchronous
events at microsecond resolution each time the local brightness of
a pixel changes. However, most current Address Event (AE)
approaches to motion estimation have not been effective at exploiting
these characteristics. They mostly rely on spatial smoothness
assumptions that require accumulating events into grid-like representations for
processing, eliminating most of the advantages of AES. We conjecture
that processing events asynchronously as they arrive should
lead to better use of the camera’s temporal resolution and hence
result in motion estimates that are more resilient to rapid and shaky
motions. In this paper, we present an asynchronous particle filter
approach with a BCE-based likelihood function to solve for planar
motion velocities using an AES. It uses AE data as its only source
of information, relying on a single event track, while freeing events
from the spatial smoothness assumption. It is thus capable of exploiting
the advantages offered by AES for motion estimation. Our
results for general planar motion estimation are on par with state-of-the-art results.
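
As a rough illustration of the per-event processing described above, the following Python sketch weights planar-velocity particles with a binary-cross-entropy-style likelihood as each event arrives. This is a minimal sketch only, not the authors' implementation: the observation model, noise parameters, and helper names (Event, predict_event_prob, on_event) are assumptions introduced for illustration.

from collections import namedtuple
import numpy as np

# Sketch of an asynchronous, per-event particle filter for planar velocity
# (vx, vy) in pixels/second. The event model, noise values, and helper names
# below are illustrative assumptions, not the paper's implementation.

Event = namedtuple("Event", ["x", "y", "t"])  # pixel coordinates, timestamp (s)

N = 200
rng = np.random.default_rng(0)
particles = rng.normal(0.0, 50.0, size=(N, 2))   # initial velocity hypotheses
weights = np.full(N, 1.0 / N)

def bce_likelihood(p_pred, observed=1.0, eps=1e-6):
    # Turn a binary cross-entropy term into a likelihood: large when the
    # predicted event probability agrees with the observed event (1 = fired).
    p = np.clip(p_pred, eps, 1.0 - eps)
    bce = -(observed * np.log(p) + (1.0 - observed) * np.log(1.0 - p))
    return np.exp(-bce)

def predict_event_prob(velocities, event, prev_event, dt, sigma=2.0):
    # Placeholder observation model: how well does each velocity hypothesis
    # explain the displacement between consecutive events on a single track?
    dx = np.array([event.x - prev_event.x, event.y - prev_event.y])
    residual = dx - velocities * dt
    return np.exp(-0.5 * np.sum(residual**2, axis=1) / sigma**2)

def on_event(event, prev_event):
    # Asynchronous update: called once per incoming address event.
    global particles, weights
    dt = max(event.t - prev_event.t, 1e-6)
    particles += rng.normal(0.0, 1.0, size=particles.shape)   # process noise
    p_pred = predict_event_prob(particles, event, prev_event, dt)
    weights *= bce_likelihood(p_pred)
    weights /= weights.sum()
    if 1.0 / np.sum(weights**2) < N / 2:                      # resample if ESS low
        idx = rng.choice(N, size=N, p=weights)
        particles, weights = particles[idx].copy(), np.full(N, 1.0 / N)
    return np.average(particles, axis=0, weights=weights)     # velocity estimate

# Example: two events 1 ms apart along a single event track.
v_hat = on_event(Event(101.0, 100.0, 0.001), Event(100.0, 100.0, 0.0))
print("estimated planar velocity (px/s):", v_hat)

Because the update runs once per event rather than once per accumulated frame, the estimate can be refreshed at the sensor's microsecond timescale, which is the property the abstract argues makes the approach resilient to rapid and shaky motions.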
