BEST LINEAR UNBIASED ESTIMATION FOR 2D AND 3D FLOW WITH
EVENT-BASED CAMERAS
Juan L. Valerdi, Xabier Iturbe
IKERLAN TECHNOLOGY CENTRE
ABSTRACT
Dynamic Vision Sensors (DVS) provide low-latency, high-dynamic-range motion estimation, but their real-time applicability is often limited by the computational complexity and latency overheads introduced by iterative motion compensation techniques. In this work, we propose a novel probabilistic model that leverages the stochastic distribution of events along moving edges. Using our model, we introduce a lightweight patch-based algorithm that estimates flow from a linear combination of event spatial coordinates, making it highly suitable for implementation on specialized hardware. Our approach exhibits linear scalability with dimensionality, and is therefore well suited to emerging event-based 3D sensors, such as Light-Field DVS (LF-DVS). Experimental results validate the efficiency and scalability of our method, establishing a solid foundation for real-time, ultra-efficient event-based 2D and 3D motion estimation.
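To illustrate the flavor of linear, patch-based flow estimation from event coordinates, the sketch below fits a plane to event timestamps in a synthetic patch via linear least squares, which is the best linear unbiased estimator under i.i.d. noise by the Gauss-Markov theorem. This is a minimal illustration under assumed synthetic data, not the paper's actual estimator; the patch size, noise level, and plane-fitting formulation are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic patch: a locally translating edge produces
# event timestamps that vary linearly with position, t = a*x + b*y + c.
a_true, b_true, c_true = 0.02, -0.01, 5.0   # assumed ground truth (s/px, s)
n = 200
xy = rng.uniform(0.0, 31.0, size=(n, 2))    # event coordinates in a 32x32 patch
t = a_true * xy[:, 0] + b_true * xy[:, 1] + c_true
t += rng.normal(scale=1e-4, size=n)         # assumed timestamp jitter

# Linear least-squares plane fit: the estimate is a linear combination
# of the event coordinates (BLUE under i.i.d. noise, Gauss-Markov).
A = np.column_stack([xy, np.ones(n)])
theta, *_ = np.linalg.lstsq(A, t, rcond=None)
a_hat, b_hat, _ = theta

# The fitted spatial gradient (a_hat, b_hat) encodes the local edge
# motion: its direction and magnitude give the normal flow.
print(a_hat, b_hat)
```

Because the fit is a single linear solve over the patch's events, it avoids iterative motion compensation and maps naturally onto specialized hardware.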