Asynchronous Optical Flow and Egomotion Estimation from Address Events Sensors
Motion estimation is essential for many applications, including robotics, automation, and augmented reality. The low-cost sensors commonly used for motion estimation have many shortcomings. Event cameras are a recent development in imaging sensor technology, characterized by low latency, high dynamic range, low power consumption, and high resilience to motion blur. These advantages give them the potential to fill some of the gaps left by other low-cost motion sensors, offering alternatives for motion estimation that are worth exploring. Current event-based approaches estimate motion by assuming that events in a neighborhood encode the local structure of the imaged scene, and then tracking the evolution of this structure over time. This is problematic because events are only an approximation of the local structure, and that approximation can be very sparse in some cases. In this thesis, we tackle the problem in a fundamentally different way by considering that events generated by the motion of the same scene point relative to the camera constitute an event track. We show that consistency with a single camera motion is sufficient for correct data association of events with their previous firings along event tracks, resulting in more accurate and robust motion estimation. Toward that end, we present new voting-based solutions that evaluate all potential data-association candidates consistent with a single camera motion, handling each event individually without assuming any relationship to its neighbors beyond the camera motion. We first exploit this in a particle filtering framework for the simple case of a camera undergoing planar motion, and show that our approach can yield motion estimates an order of magnitude more accurate than optical-flow-based approaches.
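To make the planar-motion idea concrete, the following is a minimal illustrative Python sketch (not code from the thesis; all function names, the Gaussian scoring, and the rotation-about-the-image-origin model are our own simplifying assumptions). Each particle hypothesizes a planar camera motion (vx, vy, omega); an incoming event is scored against a particle by checking whether some earlier event lies where that hypothesized motion would have placed the same scene point, i.e., consistency with a single camera motion, without assuming anything about the event's neighbors:

```python
import numpy as np

def predict_prev_position(xy, t, t_prev, motion):
    """Where an event at pixel xy at time t would have been at t_prev
    under a hypothesized planar motion (vx, vy, omega) about the origin."""
    vx, vy, om = motion
    dt = t - t_prev
    # Undo the translation, then undo the rotation.
    p = xy - dt * np.array([vx, vy])
    c, s = np.cos(-om * dt), np.sin(-om * dt)
    return np.array([c * p[0] - s * p[1], s * p[0] + c * p[1]])

def update(particles, weights, event, prev_events, sigma=1.0):
    """Reweight each motion hypothesis by its best data-association
    candidate: the previous event closest to the predicted past position.
    Gaussian scoring of the residual is an illustrative choice."""
    xy, t = event
    for i, motion in enumerate(particles):
        best = min(
            np.linalg.norm(predict_prev_position(xy, t, pt, motion) - pxy)
            for (pxy, pt) in prev_events
        )
        weights[i] *= np.exp(-0.5 * (best / sigma) ** 2)
    return weights / weights.sum()
```

A hypothesis matching the true motion predicts past positions that coincide with earlier events on the same track, so its weight grows; inconsistent hypotheses accumulate large residuals and are suppressed over successive events.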
Furthermore, we show that the consensus-based approach can be extended to the case of arbitrary camera motion and unknown scene depth. Our general motion framework significantly outperforms other approaches in terms of accuracy and robustness.
Cite this version of the work
Charbel Azzi (2022). Asynchronous Optical Flow and Egomotion Estimation from Address Events Sensors. UWSpace. http://hdl.handle.net/10012/18293