Asynchronous Optical Flow and Egomotion Estimation from Address Events Sensors

Date

2022-05-18

Authors

Azzi, Charbel

Publisher

University of Waterloo

Abstract

Motion estimation is essential for many applications, including robotics, automation, and augmented reality. The low-cost sensors commonly used for motion estimation all have significant shortcomings. Event cameras are a recent development in imaging sensor technology, characterized by low latency, high dynamic range, low power consumption, and high resilience to motion blur. These advantages give them the potential to fill some of the gaps left by other low-cost motion sensors, offering alternatives for motion estimation that are worth exploring. Current event-based approaches estimate motion by assuming that events in a neighborhood encode the local structure of the imaged scene and then tracking the evolution of this structure over time. This is problematic because events are only an approximation of the local structure and can be very sparse in some cases. In this thesis, we tackle the problem in a fundamentally different way by considering that the events generated by the motion of the same scene point relative to the camera constitute an event track. We show that consistency with a single camera motion is sufficient for correct data association of events with their previous firings along event tracks, resulting in more accurate and robust motion estimation. Towards that end, we present new voting-based solutions that evaluate all potential data association candidates consistent with a single camera motion, handling each event individually without assuming any relationship to its neighbors beyond the shared camera motion. We first exploit this in a particle filtering framework for the simple case of a camera undergoing planar motion, and show that our approach can yield motion estimates that are an order of magnitude more accurate than optical flow based approaches. Furthermore, we show that the consensus-based approach can be extended to the case of arbitrary camera motion and unknown scene depth. Our general motion framework significantly outperforms other approaches in terms of accuracy and robustness.
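
The voting idea described in the abstract can be illustrated with a small conceptual sketch. The snippet below is not the thesis implementation; it is a minimal, assumed Python illustration of a particle filter over planar-motion hypotheses (image-plane translation plus rotation rate), where each hypothesis receives a vote from an event whenever some earlier event, propagated forward under that hypothesis, lands close to it. The event format, the warp model, the window and radius thresholds, and all function names are illustrative assumptions, not the author's method.

    # Conceptual sketch only; not the thesis implementation.
    # Events are assumed to be (x, y, t) tuples; a motion hypothesis is
    # v = (vx, vy, omega) in pixels/s and rad/s. All thresholds are assumptions.
    import numpy as np

    def warp(xy, v, dt):
        """Predict where a pixel that fired at xy would reappear after dt
        seconds under the planar motion hypothesis v."""
        vx, vy, omega = v
        c, s = np.cos(omega * dt), np.sin(omega * dt)
        R = np.array([[c, -s], [s, c]])
        return R @ xy + np.array([vx, vy]) * dt

    def score_hypothesis(v, events, window=0.02, radius=1.5):
        """Vote for v: an event supports the hypothesis if some earlier event,
        warped forward under v, lands within `radius` pixels of it."""
        votes = 0
        for i, (x, y, t) in enumerate(events):
            for xp, yp, tp in events[:i]:
                if t - tp > window:
                    continue
                pred = warp(np.array([xp, yp]), v, t - tp)
                if np.linalg.norm(pred - np.array([x, y])) < radius:
                    votes += 1
                    break  # one consistent predecessor is enough for this event
        return votes

    def pf_step(particles, events, noise=(0.05, 0.05, 0.01)):
        """One particle-filter update: perturb, weight by votes, resample."""
        particles = particles + np.random.randn(*particles.shape) * noise
        w = np.array([score_hypothesis(v, events) for v in particles], float) + 1e-9
        w /= w.sum()
        idx = np.random.choice(len(particles), size=len(particles), p=w)
        return particles[idx], (particles * w[:, None]).sum(axis=0)

    # Example usage with three synthetic events moving at ~100 px/s.
    events = [(120.0, 64.0, 0.000), (121.0, 64.0, 0.010), (122.0, 64.0, 0.020)]
    particles = np.column_stack([
        np.random.uniform(-200, 200, 200),   # vx (px/s)
        np.random.uniform(-200, 200, 200),   # vy (px/s)
        np.random.uniform(-1, 1, 200),       # omega (rad/s)
    ])
    particles, estimate = pf_step(particles, events)

Note that, in line with the abstract, each event is handled individually: a hypothesis is scored only by its consistency with candidate associations along event tracks, with no assumption about an event's spatial neighbors beyond the shared camera motion.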

Keywords

Optical Flow, Egomotion, Motion Estimation, Event Camera, Planar Motion, General Motion
